Updates to GitHub Copilot interaction data usage policy

(github.blog)

390 points | by prefork 19 days ago

60 comments

  • stefankuehnel
    19 days ago
    If you scroll down to "Allow GitHub to use my data for AI model training" in GitHub settings, you can enable or disable it. However, what really gets me is how they pitch it like it’s some kind of user-facing feature:

    Enabled = You will have access to the feature

    Disabled = You won't have access to the feature

    As if handing over your data for free is a perk. Kinda hilarious.

    • data-ottawa
      19 days ago
      It’s not so bad, there’s no double negative and it’s not a confusing “switch” that is always ambiguous as to whether it’s enabled or not.

      In contrast, when you create a GCS bucket it uses a checkmark for enabling “public access prevention”. Who designed that modal? It takes me a solid minute to figure out whether I’m publishing private data or not.

    • a1o
      19 days ago
      I went to check on this, and even though I have everything Copilot-related disabled, in the two bars that measure usage my Copilot Chat usage was somehow at 2%. How is this possible?

      Before anyone comes to sell me on AI: this is my personal account. I have and use it on my business account (a completely different user account); I just make it a point not to use it in my personal time so I can keep my skills sharp.

      • hakunin
        19 days ago
        I wonder, does GitHub count it as Copilot Chat usage when you use the AI search form on their website?
        • a1o
          19 days ago
          I wonder if that’s it! I occasionally do some code search on GitHub, then remember it doesn’t work well and go back to searching in the IDE. I usually need to search branches other than main, because a lot of my projects have a develop branch where things actually happen. That would explain it, so I guess this is it.
      • saratogacx
        19 days ago
        If you're talking about the quota bar, that is only measuring your premium request usage (models with a #.#x multiplier next to the name). If you only use the free models and code completion, you won't actually consume any "usage". AI code review consumes a single request (now). Same with the GitHub Copilot web chat: if you use a free model, it doesn't count; if you use a premium model, you get charged the usage cost.
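The accounting described above can be sketched in a few lines; the model names and multiplier values below are illustrative, not GitHub's actual pricing table:

```python
# Illustrative sketch of premium-request accounting: only models with a
# multiplier consume "usage"; free models and code completion count as 0x.
# These model names and multiplier values are made up for illustration.
MULTIPLIERS = {
    "free-model": 0.0,
    "standard-premium": 1.0,
    "frontier-model": 10.0,
}

def premium_requests(usage):
    """usage: iterable of (model_name, request_count) pairs."""
    return sum(MULTIPLIERS[model] * count for model, count in usage)

# 500 free-model requests cost nothing; 3 premium requests cost 3 units.
print(premium_requests([("free-model", 500), ("standard-premium", 3)]))  # 3.0
```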
    • martin-t
      19 days ago
      A few days ago, I unchecked it, only to see it checked again when I reloaded the page.

      It could be incompetence, but it shouldn't matter. This level of incompetence should be punished the same as malice.

    • petcat
      19 days ago
      I guess the "perk" is that maybe their models get retrained on your data making them slightly more useful to you (and everyone else) in the future? idk
    • mirekrusin
      19 days ago
      The feature is that your coding style will be in the next models!
      • rzmmm
        19 days ago
        I wish my GPL license would travel along with my code.
        • mirekrusin
          19 days ago
          I said a few years back that code licenses don't exist anymore; some people just haven't realized it yet.
          • Imustaskforhelp
            19 days ago
            Previously, big tech still had to somehow find loopholes around the GPL, and licenses still had some value.

            Nowadays it genuinely feels like licenses are worth a lot less, because there are now services that will rewrite code just to get around the license.

            I used to think that somewhat non-proprietary licenses like the SSPL might be interesting approaches, but I feel like they aren't much less prone to this now either.

            So now I am not exactly sure.

        • UqWBcuFx6NV4r
          19 days ago
          If you are wholly confident that model training is a violation of the GPL then go sue.
          • tglman
            19 days ago
            I guess freedom of study and use may also include training AI, but it would be cool if all the derivative work, such as AI models and code generated from them, had to be licensed as GPL. Lawyers needed here.
    • 7bit
      19 days ago
      It's worded that way to create FOMO in the hopes people keep it enabled.

      Dark pattern and dick move.

    • Rapzid
      19 days ago
      Is that not some stock feature-flag verbiage?
      • bigiain
        19 days ago
        Stock dark pattern verbiage...

        I'm a little surprised the options aren't "Enable" and "Ask me later".

      • NewJazz
        19 days ago
        But it isn't a feature, so using a feature flag is a bit weird.
        • Rapzid
          18 days ago
          How is it not a feature from a development standpoint? Colloquially any bit of intended functionality qualifies as a "feature" and certainly any functionality you conditionally enable/disable would be controlled by a "feature flag" regardless.
          • NewJazz
            18 days ago
            Because the user sees no difference in experience.
        • UqWBcuFx6NV4r
          19 days ago
          [flagged]
          • NewJazz
            19 days ago
            "Please think like a developer" lmao if I said this to someone at my dayjob I'd be gone.
    • Imustaskforhelp
      19 days ago
      Thanks to your comment, I have disabled it now :-)

      I agree that it feels like a dark pattern for the most part; makes me want to use Codeberg / self-hosted git.

    • TJ_FLEET
      19 days ago
      [flagged]
  • mentalgear
    19 days ago
    > On April 24 we'll start using GitHub Copilot interaction data for AI model training unless you opt out. Review this update and manage your preferences in your GitHub account settings.

    Now "Allow GitHub to use my data for AI model training" is enabled by default.

    Turn it off here: https://github.com/settings/copilot/features

    Do they have this set on business accounts also by default? If so, this is really shady.

    • lenova
      19 days ago
      Ugh, can't believe they made this opt-in by default, and didn't even post the direct URLs to disable in their blog post.

      To add on to your (already helpful!) instructions:

      - Go to https://github.com/settings/copilot/features
      - Go to the "Privacy" section
      - Find: "Allow GitHub to use my data for AI model training"
      - Set to disabled

      • thrdbndndn
        19 days ago
        I always thought "opt-in" (not "opt in") meant something you have to actively choose to enable; otherwise, it stays off. So calling something "opt-in by default" sounds like a misnomer to me.

        But English is not my first language so please correct me if I'm wrong.

      • inetknght
        19 days ago
        > can't believe they made this opt-in by default

        You can't believe Microslop is force-feeding people Copilot in yet another way?

        > and didn't even post the direct URLs to disable in their blog post

        You can't believe Microshaft didn't tell you how to not get shafted?

        • miohtama
          19 days ago
          He must be new here
    • g947o
      19 days ago
      https://github.com/orgs/community/discussions/188488

      > Why are you only using data from individuals while excluding businesses and enterprises?

      > Our agreements with Business and Enterprise customers prohibit using their Copilot interaction data for model training, and we honor those commitments. Individual users on Free, Pro, and Pro+ plans have control over their data and can opt out at any time.

      • dormento
        19 days ago
        Aka "they have lawyers and you usually don't, so we think we can get away with it."
        • gentleman11
          19 days ago
          only big companies have access to the legal system. nobody else can afford it
          • apublicfrog
            17 days ago
            Most of the world aren't American and we can afford our legal systems ;).
            • kyleee
              17 days ago
              Thankfully shariah court is free. Inshallah
      • themafia
        19 days ago
        > and we honor those commitments.

        Ah, so when the inevitable "bug" appears, and we all learn that you've completely failed to honor anything, what will be your "commitment" then? An apology and a few free months?

        Time to start pushing for a self hosted git service again.

    • parkersweb
      19 days ago
      Yes - not impressed at all that this is opt-in by default for business users. We have a policy in place with clients that code we write for them won’t be used in AI training, so expecting us to opt out isn’t an acceptable approach for a business relationship where the expectation is security and privacy.
      • aksss
        19 days ago
        It is not opt-in by default for business users. The feature flag doesn't show in org policies and github states that it's not scoped to business users.
        • parkersweb
          19 days ago
          Gah - you’re right. But given that I don’t use personal Copilot, that I do manage an organisation that gives Copilot to some of our developers, AND that the email I was sent this evening made no mention at all of business Copilot being excluded, it could definitely have been communicated better…
          • Palmik
            19 days ago
            My email does mention it clearly:

            > Again, your organization's Copilot interaction data is not included in model training under this new policy, but we are excited for you to enjoy the product improvements it will unlock.

    • gentleman11
      19 days ago
      What did everyone expect? I can't understand this community's trust of microsoft or startups. It's the typical land grab: start off decent, win people over, build a moat, then start shaking everybody down in the most egregious way possible.

      It's just unusual how quickly they're going for the shakedown this time

    • martinwoodward
      19 days ago
      Just confirming, we do not use Copilot interaction data for model training of Copilot Business or Enterprise customers.
      • verdverm
        19 days ago
        You shouldn't do it for the public by default either; it should be opt-in, not opt-out. But that is the Microslop effect on GitHub: users are an afterthought.
    • whynotmaybe
      19 days ago
      Per their blog post

      > Business and Copilot Enterprise users are not affected by this update.

    • archb
      19 days ago
      Interestingly, it is disabled by default for me.
      • crashingintoyou
        19 days ago
        Reading the github blog post "If you previously opted out of the setting allowing GitHub to collect this data for product improvements, your preference has been retained—your choice is preserved, and your data will not be used for training unless you opt in."
        • verdverm
          19 days ago
          Is this the new name for the setting? I cannot find one that sounds like the previous one you mention

          Notable that they have no "privacy" section in account settings

      • gpm
        19 days ago
        Me too, which is making me wonder if they're planning on silently flipping this setting on April 24th (making it impossible to opt out in advance).
        • martinwoodward
          19 days ago
          We are not. The reason we wanted to announce early was so that folks had plenty of time to opt out now. We've also added the opt-out setting even if you don't use Copilot, so that you can opt out now before you forget; if you decide to use Copilot in the future, it will remember your preference.
          • pred_
            19 days ago
            Would you be able to comment on https://news.ycombinator.com/item?id=47522876, i.e. explain the legal basis for this change for EU based users? If there is none, you may have to expect that people will exercise their right to lodge a complaint with a supervisory authority.
            • ThrowawayB7
              19 days ago
              Why would you expect an engineer to be able to comment on legal affairs? Presumably it was cleared with Microsoft's legal department or whatever GitHub's divisional equivalent is.
              • 1718627440
                18 days ago
                That's precisely what the term 'engineer' signifies. (I know it gets used incorrectly for software developers.) Workers in general need to decide whether something is legal independently of their company, because the company's lawyers have the company's interests in mind, which might conflict with the workers' interest in not doing illegal things.

                Big Tech is known for clearing illegal things by their legal departments all the time.

        • spiderfarmer
          19 days ago
          Is it because I'm in the EU?
          • paularmstrong
            19 days ago
            I'm in the US and it's off for me. I believe I've previously opted out of everything copilot related in the past if there was anything.
          • gpm
            19 days ago
            I'm in Canada, so not only the EU at least.
      • xgdgsc
        19 days ago
        I guess we have to check again on April 24?
    • pjmlp
      19 days ago
      We have a business account, and because of issues like this, access to anything CoPilot is blocked.
    • DavidSJ
      19 days ago
      > Do they have this set on business accounts also by default? If so, this is really shady.

      Looks like not, but would it actually have been shadier, or are we just used to individual users being fucked over?

  • QuadrupleA
    19 days ago
    Fun fact: Copilot gives you no way to ignore sensitive files with API keys, passwords, DB credentials, etc.: https://github.com/orgs/community/discussions/11254#discussi...

    So by default you send all this to Microsoft by opening your IDE.

    • 0xbadcafebee
      19 days ago
      Separate fun fact: Gemini CLI blocks env vars with strings like 'AUTH' in the name. They have two separate configuration options that both supposedly let you allow specific env vars. Neither works (bad vibe coding). I tried opening an issue and a PR, and two separate vibe-coding bots picked up my issue and wrote PRs, but nobody has looked at them. The bug's still there, so I can't do git commit signing via the ssh-agent socket. The only choice is to make less-secure, unsigned git commits.

      On top of that, Gemini 3 refuses to refactor open source code, even if you fork it, if Gemini thinks your changes would violate the spirit of the original developers' intent in a safety/security context. Even if you think you're actually making it more secure, if Gemini disagrees, it won't write your code.
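The 'AUTH' substring problem described above is easy to reproduce in a sketch. The function below is illustrative, not Gemini CLI's actual code; it shows both the false positive on SSH_AUTH_SOCK and what a working allowlist option would do:

```python
# Name-based env filtering of the kind described above (illustrative only).
BLOCKED_SUBSTRINGS = ("KEY", "TOKEN", "SECRET", "AUTH", "PASSWORD")

def filter_env(env, allowlist=()):
    """Drop variables with suspicious names unless explicitly allowlisted."""
    return {
        name: value
        for name, value in env.items()
        if name in allowlist
        or not any(s in name.upper() for s in BLOCKED_SUBSTRINGS)
    }

env = {"PATH": "/usr/bin", "HOME": "/home/me", "SSH_AUTH_SOCK": "/tmp/agent.sock"}

# "AUTH" matches inside SSH_AUTH_SOCK, so the agent socket silently vanishes,
# which is exactly what breaks SSH-based commit signing.
print(sorted(filter_env(env)))  # ['HOME', 'PATH']

# A functioning allowlist option would restore it.
print("SSH_AUTH_SOCK" in filter_env(env, allowlist={"SSH_AUTH_SOCK"}))  # True
```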

      • WatchDog
        19 days ago
        Gemini also won't help you with C++ if you are under 18, since it would be unsafe.

        https://news.ycombinator.com/item?id=39632959

        • verdverm
          19 days ago
          Is it still true? That's two years old
          • WatchDog
            19 days ago
            It's improved significantly in that time, but relative to the other frontier models, it is still the one that is the most condescending and coddling.
      • verdverm
        19 days ago
        I use Gemini 3 to edit multiple forks. Your statement is false based on stuff I actually do.
        • 0xbadcafebee
          19 days ago
          Well it's true based on my running into the issue 8 hours ago
          • verdverm
            18 days ago
            Maybe it's your prompts? I've never had Gemini refuse to write any code in any context. I use it with Claude prompts, edited down, in particular to remove guardrails.

            You shouldn't use Google AI products; they are inferior. Their models, though, are quite good. It's confusing when people use the model name when referring to a product. What's your setup?

    • sceptic123
      19 days ago
      Fun fact: you shouldn't have sensitive files with API keys, passwords, DB credentials, etc. in your repo
      • wzdd
        19 days ago
        “In your repo” and “in the directory you are running copilot” are two separate things.
      • nunez
        17 days ago
        It’s fine to have them in your repo if they’re encrypted and the private key isn’t in there as well!
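A minimal sketch of the point above: keep credentials out of the repository entirely and read them from the environment at startup, failing fast when they are missing. `DB_PASSWORD` here is an illustrative name, not a convention any tool mandates.

```python
import os

def require_env(name: str) -> str:
    """Read a required secret from the environment; fail fast if absent."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"missing required environment variable: {name}")
    return value

# Stand-in for a secret injected by the real deployment environment.
os.environ["DB_PASSWORD"] = "example-only"
print(require_env("DB_PASSWORD"))  # example-only
```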
    • nulld3v
      19 days ago
      Sadly, this issue is systemic: https://github.com/openai/codex/issues/2847
      • stavros
        19 days ago
        OpenCode has a plugin that lets you add an .ignore file (though I think .agentignore would be a better name). The problem is that, even though the plugin makes it so the agent can't directly read the file, there's no guarantee the agent won't try to be "helpful" and do something like: "well, I can't read .envrc using my read tool, so let me cat .envrc and read it that way".
        • solaire_oa
          17 days ago
          This points out that agentic security flaws are worse than "systemic", they're the feature. Agents are literal backdoors.

          It's so bizarre to be discussing minor security concerns of backdoors, like trying to block env vars. Of course the maintainers don't care about blocking env vars. It's security theater.
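The bypass described in this subthread is easy to see in a sketch. The guard below is illustrative (not OpenCode's actual plugin API): it blocks the agent's read tool for ignored paths, but nothing routes a shell tool running `cat .envrc` through the same check.

```python
import fnmatch
from pathlib import Path

def load_patterns(text: str) -> list[str]:
    """Parse a hypothetical .agentignore: one glob per line, '#' comments."""
    return [line.strip() for line in text.splitlines()
            if line.strip() and not line.lstrip().startswith("#")]

def guarded_read(path: str, patterns: list[str]) -> str:
    """The agent's read tool: refuses agent-ignored paths."""
    if any(fnmatch.fnmatch(path, pat) for pat in patterns):
        raise PermissionError(f"{path} is agent-ignored")
    return Path(path).read_text()

patterns = load_patterns("# secrets\n.envrc\n*.pem\n")
print(patterns)  # ['.envrc', '*.pem']
# guarded_read(".envrc", patterns) raises PermissionError, but a shell tool
# running `cat .envrc` never passes through this check at all.
```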

    • malnourish
      19 days ago
      I swear I just set up enterprise and org level ignore paths.
      • veverkap
        19 days ago
        Yeah, it's a Copilot Business/Enterprise feature
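For reference, the repository-level content-exclusion setting on those plans takes a list of path patterns. The fragment below is a sketch based on GitHub's documented glob syntax; verify the exact format against the current docs before relying on it:

```yaml
# Paths Copilot should not read, configured under the repository's
# Copilot "content exclusion" settings (Business/Enterprise only).
- "secrets.json"      # a file with this name anywhere in the repo
- "secret*"           # any file whose name starts with "secret"
- "*.cfg"             # any .cfg file
- "/scripts/**"       # everything under /scripts at the repo root
```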
  • section_me
    19 days ago
    If I'm paying, which I am, I want to have to opt in, not opt out. Mario Rodriguez / @mariorod needs to give his head a wobble.

    What on earth are they thinking...

    • sph
      19 days ago
      > What on earth are they thinking...

      @mariorod's public README says one of his focuses is "shaping narratives and changing \"How we Work\"", so there you go.

      • fmjrey
        19 days ago
        Translation: more alignment with Microsoft practices
      • section_me
        19 days ago
        "shaping narratives", sounds like they follow the methodologies of a current president
        • okanat
          19 days ago
          It looks like the literal translation of "manipulation" to Linkedin-speak.
        • efilife
          18 days ago
          which one?
    • wenldev
      19 days ago
      [dead]
  • pred_
    19 days ago
    What is the legal basis of this in the EU? Ignoring the fact they could end up stealing IP, it seems like the collected information could easily contain PII, and consent would have to be

    > freely given, specific, informed and unambiguous. In order to obtain freely given consent, it must be given on a voluntary basis.

    • rennokki
      19 days ago
      It easily breaks GDPR: GDPR requires the default to be opted out; pre-ticking the box before the user hits submit is not a valid workaround.

      Some will say this only matters for personal data, and yes, it does. But it takes only one line of code to use my phone number for testing while I locally test a registration form in the application I'm developing.

      Once it gets sent to Copilot, I can threaten legal action if they don't take it down.

    • LadyCailin
      19 days ago
      I actually don’t seem to have this option on my GitHub settings page, which leads me to wonder if this only applies to Americans.
      • LauraMedia
        19 days ago
        I actually did have to manually disable this from Germany, so it might be a different reason you don't have it?
        • LadyCailin
          17 days ago
          Dunno! I would have expected Germany and Norway to be the same.
      • spartanatreyu
        19 days ago
        I have the setting in Australia.

        I'd be curious to see which countries are affected

  • sph
    19 days ago
    Thanks to Github and the AI apocalypse, all my software is now stored on a private git repository on my server.

    Why would I even spend time choosing a copyleft license if any bot will use my code as training data to be used in commercial applications? I'm not planning on creating any more opensource code, and what projects of mine still have users will be left on GH for posterity.

    If you're still serious about opensource, time to move to Codeberg.

    • heavyset_go
      19 days ago
      Made the same choice, my open source projects with users are in maintenance mode or archived. New projects are released via SaaS, compiled artifacts or not at all.

      I scratch my open source itch by contributing to existing language and OS projects where incremental change means eventually having to retrain models to get accurate inference :)

    • thesmart
      19 days ago
      Yeah, I'm guessing that's probably because in their TOS you grant them some license carve-out for running the service, which can mean anything.
    • midasz
      19 days ago
      I'm in my happy space selfhosting forgejo and having a runner on my own hardware
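For anyone tempted by the same route, a minimal Docker Compose sketch for Forgejo looks roughly like this; the image tag and port mapping are illustrative, so check the Forgejo docs for current versions and recommended settings:

```yaml
services:
  forgejo:
    image: codeberg.org/forgejo/forgejo:11   # pin a real current tag in practice
    restart: always
    ports:
      - "3000:3000"   # web UI
      - "2222:22"     # SSH for git push/pull
    volumes:
      - ./forgejo-data:/data
```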
  • diath
    19 days ago
    > This approach aligns with established industry practices

    "others are doing it too so it's ok"

    • theshrike79
      19 days ago
      Ackshually Anthropic is opt-in AND they give you discounts if you enable it
      • stingraycharles
        19 days ago
        It’s opt-out, not opt-in, at least for Claude Desktop and Claude Code, unless you use the API.
      • nodar86
        19 days ago
        What kind of discounts? I have never heard of this
      • cma
        19 days ago
        Anthropic puts up random prompts defaulting to enabled to trick you into accidentally enabling.
  • Deukhoofd
    19 days ago
    So basically they want to retain everyone's full codebases?

    > The data used in this program may be shared with GitHub affiliates, which are companies in our corporate family including Microsoft

    So every Microsoft owned company will have access to all data Copilot wants to store?

  • hoten
    19 days ago
    Why is there no cancel-Copilot-subscription option here? The docs say there should be...

    Mobile

    https://github.com/settings/billing/licensing

    EDIT:

    https://docs.github.com/en/copilot/how-tos/manage-your-accou...

    > If you have been granted a free access to Copilot as a verified student, teacher, or maintainer of a popular open source project, you won’t be able to cancel your plan.

    Oh. jeez.

  • hmate9
    19 days ago
    For what it's worth they're not trying to hide this change at all and are very upfront about it and made it quite simple to opt out.
    • matltc
      19 days ago
      They didn't even link the setting in their email. They didn't even name it specifically, just vaguely gestured toward it. Dark patterns, but that's Microslop for ya
      • hmate9
        19 days ago
        Going to GitHub, I was greeted with a banner and a link directly to the settings for changing it.
      • w4yai
        19 days ago
        I've seen worse dark patterns, to be honest... I don't think they're being malicious here.
    • ncr100
      19 days ago
      They do not make it very simple to opt out. That is false.

      On Android, for instance, I invite you to use the GitHub app and try to modify your opt-in or opt-out settings... You will find that nothing works on the settings page, once you actually find it after digging through a couple of layers and scrolling about 2 ft.

  • badthingfactory
    19 days ago
    I appreciated the notification at the top of the screen because it prompted me to disable every single copilot feature I possibly could from my account. I also appreciated Microsoft for making Windows 11 horrible so I could fall back in love with Linux again.
  • _pdp_
    19 days ago
    Microsoft doing dumb things once again.

    Who in their right mind will opt into sharing their code for training? Absolutely nobody. This is just a dark pattern.

    Btw, even if disabled, I have zero confidence they are not already training on our data.

    I would also recommend sprinkling copyright notices all over the place and changing the license of every file, just in case they run some sanity checks before your data gets consumed.

  • TZubiri
    19 days ago
    If this doesn't sound bad enough, it's possible that Copilot is already enabled. As we know, these kinds of features are pushed to users instead of being asked for.

    Maybe it's already active in our accounts and we don't realize it, so our code will be used to train the AI.

    We can't be sure whether this will happen or not, but a company like GitHub should stay miles away from this kind of policy. I personally wouldn't use GitHub for private corporate repositories, only as a public web interface for public repos.

  • TZubiri
    19 days ago
    Two issues with this:

    1. Vulnerabilities and secrets can be leaked to other users.
    2. Intellectual property can also be leaked to other users.

    Most smart clients won't opt out; they will just cut usage entirely.

    • matltc
      19 days ago
      That's me. Frankly, I'm looking at just uninstalling VSCode, because Copilot straight-up gets in the way of so much, and they've stopped even bothering with features unrelated to it (with the one exception of the native browser in v112, which, admittedly, is great).
  • stefanos82
    19 days ago
    Serious question: let's say I host my code on this platform, and it's proprietary, written for my various clients. Who can guarantee that AI won't replicate it for competitors who decide to create something similar to my product?
    • halfcat
      19 days ago
      If the code is ever visible to anyone else ever, you have no guarantee. If it’s actually valuable, you have to protect it the same way you’d protect a pile of gold bars.

      What does “my code...for my clients” mean (is it yours or theirs)? If it’s theirs let them house it and delegate access to you. If they want to risk it being, ahem...borrowed, that’s their business decision to make.

      If it’s yours, you can host it yourself and maintain privacy, but the long tail risk of maintaining it is not as trivial as it seems on the surface. You need to have backups, encrypted, at different locations, geographically distant, so either you need physical security, or you’re using the cloud and need monitoring and alerting, and then need something to monitor the monitor.

      It’s like life. Freedom means freedom from tyranny, not freedom from obligation. Choosing a community or living solo in the wilderness both come with different obligations. You can pay taxes (and hope you’re not getting screwed, too much), or you can fight off bears yourself, etc.

  • OtherShrezzing
    19 days ago
    It’s not clear to me how GitHub would enforce the “we don’t use enterprise repos” stuff alongside “we will use free tier copilot for training”.

    A user can be a contributor to a private repository without having a Copilot license from that repository's owning organisation. They can still use their personal free-tier Copilot on that repository.

    How can enterprises be confident that their IP isn’t being absorbed into the GH models in that scenario?

    • martinwoodward
      19 days ago
      We do not train on the contents from any paid organization’s repos, regardless of whether a user is working in that repo with a Copilot Free, Pro, or Pro+ subscription. If a user’s GitHub account is a member of or outside collaborator with a paid organization, we exclude their interaction data from model training.
      • 8cvor6j844qw_d6
        19 days ago
        For private repositories under a personal account, if the repo owner has opted out of model training but a collaborator has not, would the collaborator's Copilot interactions with that repo still be used for training?
      • lmc
        19 days ago
        Thank you for clarifying this.
    • danelski
      19 days ago
      Quite simply, that's just a matter of corporate internal policy and its (lack of) enforcement. This problem is a subset of the wider IP-leakage issue, with some people happily feeding their work documents into the free tier of ChatGPT.
  • pizzafeelsright
    19 days ago
    I am not certain this is that big of a deal outside of "making AI better".

    At this point, is there any magic in software development?

    If you have super-secret content, is a third party the best location for it?

    • danelski
      19 days ago
      They've had ample access to the final output - our code - but they still hope that with enough data on HOW we work, they can close the agentic gap and finally get those stinky, lazy humans who demand a salary out of the loop.
    • thesmart
      19 days ago
      How about "no." You may be okay giving away your individual rights, including to copyright, but I am not.
  • rectang
    19 days ago
    I just checked my Github settings, and found that sharing my data was "enabled".

    This setting does not represent my wishes and I definitely would not have set it that way on purpose. It was either defaulted that way, or when the option was presented to me I configured it the opposite of how I intended.

    Fortunately, none of the work I do these days with Copilot enabled is sensitive (if it was I would have been much more paranoid).

    I'm in the USA and pay for Copilot as an individual.

    Shit like this is why I pay for duck.ai where the main selling point is that the product is private by default.

  • david_allison
    19 days ago
    I have GitHub Copilot Pro. I don't believe I signed up for it. I neither use it nor want it.

    1. A lot of settings are 'Enabled' with no option to opt out. What can I do?

    2. How do I opt out of data collection? I see the message informing me to opt out, but 'Allow GitHub to use my data for AI model training' is already disabled for my account.

    • martinwoodward
      19 days ago
      Hey David - if you want to send me (martinwoodward at github.com) details of your GitHub account I can take a look. At a guess I suspect you are one of the many folks who qualified for GitHub Copilot Pro for free as a maintainer of a popular open source project.

      Sounds like you are already opted out because you'd previously opted out of the setting allowing GitHub to collect this data for product improvements. But I can check that.

      Note, it's only _usage_ data from when you use Copilot that is being trained on. Therefore, if you are not using Copilot, there is no usage data. We do not train on private data at rest in your repos, etc.

  • OtherShrezzing
    19 days ago
    So, how does this work with source-available code, that’s still licensed as proprietary - or released under a license which requires attribution?

    If someone takes that code and pokes around on it with a free tier copilot account, GitHub will just absorb it into their model - even if it’s explicitly against that code’s license to do so?

    • danelski
      19 days ago
      Most new cultural and website content is under full copyright. How much of an obstacle was that to these companies?
  • liquid_thyme
    19 days ago
    They use data from the poor student tier, but arguably large corporations and businesses hiring talented devs would produce higher-quality training data. Just looking at it logically; not that I like any of this...
  • cebert
    19 days ago
    I wish GitHub would focus on making their service reliable instead of Copilot and opting folks into their data being stolen for training.
  • kevcampb
    16 days ago
    This is terrifying. Github was the one provider I did not expect to make such an action. We're now playing whack-a-mole with vendors to try and ensure that our company IP doesn't end up being used to train a model.
    • AlexeyBelov
      16 days ago
      > Github was the one provider I did not expect to make such an action

      There is no Github anymore in any meaningful way. It's Microsoft. Github doesn't have a CEO anymore.

  • etothet
    19 days ago
    The fact that this is on by default, especially for paid accounts and even more especially for organizations, where certain types of privacy are sometimes mandated by the industry your business is in, is ridiculous.

    There should also be a much easier one-click opt-out, without having to scroll way down the settings page.

  • thesmart
    19 days ago
    I'm ready to abandon Github. Enshittification of the world's source infrastructure is just a matter of time.
  • robeym
    17 days ago
    There are several settings in my account relating to Copilot that are locked/enabled, with a shield-and-key icon next to them. Any idea how to disable these settings? It's on the same settings/copilot/features page.
  • jmhammond
    19 days ago
    Mine was defaulted to disabled. I’m on the Education pro plan (academic), so maybe that’s different than personal?
  • ncr100
    19 days ago
    On my Android phone I was able to change the setting using Firefox by logging into GitHub and not allowing it to launch the GitHub app.

    I was unable to change the setting when I used the GitHub app to open up the web page in a container.. button clicks weren't working. Quite frustrating.

  • greatgib
    18 days ago
    And something important that leaks through the phrasing of their blog post: it is not really "Github" that wants to suck up all your data ("prompts, code, context, documents", ...), but "Microsoft"!
  • phendrenad2
    19 days ago
    So I do all the work of thinking about how to do something, and as soon as I tell Copilot about it, now it's in the training data and anyone can ask the LLM and it'll tell them the solution I came up with? Great. I'm going to cancel.
  • sbinnee
    19 days ago
    Bold move. Who uses Copilot these days? Unless they have free credit I mean.
  • rvz
    19 days ago
    > From April 24 onward, interaction data—specifically inputs, outputs, code snippets, and associated context—from Copilot Free, Pro, and Pro+ users will be used to train and improve our AI models unless they opt out.

    Now is the time to run off of GitHub and consider Codeberg or self hosting like I said before. [0]

    [0] https://news.ycombinator.com/item?id=22867803

    • 0x3f
      19 days ago
      Codeberg doesn't support non-OSS, and I'd rather just have one 'git' thing I have to know for both OSS and private work, so it's not a great option, IMO. Self-hosting is also out, for other reasons.

      I'm not sure there are any good GitHub alternatives. I don't trust Gitlab either. Their landing page title currently starts with "Finally, AI". Eek.

      • eipi10_hn
        19 days ago
        Maybe sourcehut? https://sourcehut.org
        • 0x3f
          19 days ago
          It's an option but I can't really take the platform seriously when the owner removes content based on his personal whims. He currently removes crypto projects because of their 'social ills'. I don't work on crypto, but he might start deleting AI projects for the same reason, say.
  • Heliodex
    19 days ago
    Finally. The option for me to enable Copilot data sharing has been locked as disabled for some time, so until now I couldn't even enable it if I wanted to.
  • indigodaddy
    19 days ago
    Checked and mine was already on disabled. Don't remember if I previously toggled it or not..
    • martinwoodward
      19 days ago
      If you previously opted out of the setting allowing GitHub to collect data for product improvements, your preference has been retained here. We figured if you didn't want that then you definitely wouldn't want this..
  • dartf
    17 days ago
    I don't see an option to opt out. Is it a US-only thing?
  • djmashko2
    19 days ago
    > Content from your issues, discussions, or private repositories at rest. We use the phrase “at rest” deliberately because Copilot does process code from private repositories when you are actively using Copilot. This interaction data is required to run the service and could be used for model training unless you opt out.

    Sounds like it's even likely to train on content from private repositories. This feels like a bit of an overstep to me.

  • mt42or
    19 days ago
    Is it legal? Surely not in any EU country.
    • okanat
      19 days ago
      Does it even matter? They trained AI on obviously copyrighted and even pirated content. If this change is legally significant and a legal breach, then the existence of all models and all AI businesses is also illegal.
      • 0x3f
        19 days ago
        It might or might not be legal, but it seems materially worse to screw over your direct customers than to violate the social-contracty nature of copyright law. But hey ho if you're not paying then you're the product, as ever was.
    • mentalgear
      19 days ago
      At least in one instance, it was enabled in EU countries as well.
  • marak830
    19 days ago
    As it's enabled by default, does that mean everything has already been siphoned off and now I'm just closing the gate behind the animals escaping?

    Shit like this shouldn't be allowed.

  • explodes
    19 days ago
    We all knew Microsoft was going to destroy GitHub eventually when it was first bought.

    How much longer do you want to tolerate the enshittification? How much longer CAN you tolerate it?

  • tuananh
    19 days ago
    making this option enabled by default is a very shady choice, GitHub.
  • semiinfinitely
    19 days ago
    I'll be moving off GitHub now
  • baobabKoodaa
    19 days ago
    (oops)
  • latand6
    19 days ago
    Why wouldn't people want to make the models better? Aren't we all getting the benefit, after all?
    • danelski
      19 days ago
      That's akin to being grateful to your local shop owner for letting you sweep the floor for the other customers.
      • latand6
        19 days ago
        Please don’t strawman me; I asked a completely different question.

        It’s not about being grateful or something, but many people (devs) are too concerned about their code being stolen, as if they’d come up with something unique and the LLMs were some kind of database (which they aren't).

        At the end of the day we’re going to be using AI to write all the code; many of us are already doing that. And if some GitHub Copilot model were better, we’d get higher-quality code that is generally available for the next pretraining runs (for your models and others'). Some would even switch to Copilot if it’s good.

        What do you think about it?

        • sayamqazi
          18 days ago
          If something is mine by right, no matter how little or how much it's worth, no one should be allowed to force or trick me into donating it. It should just be my choice.
          • latand6
            13 days ago
            Yeah, but I’m just really curious why so many people are so against improving the models. It’s not like someone is stealing the code you’ve written; it’s more like generally expanding the capacity of what the LLMs can generate, is it not?
        • adi_kurian
          18 days ago
          People would have a different response if they did not perceive, in my view accurately, that the wool is being pulled over their eyes.