Microsoft plans to change how GitHub Copilot uses your data, and this update directly affects how the tool learns from real users. The company will start training its AI models using interaction data collected from Copilot users by default, unless you opt out before April 24. This move shifts Copilot from relying mainly on public code to learning from how developers actually use it in real projects.
GitHub Copilot already helps with code generation, pull request summaries, and code reviews, but Microsoft now wants to improve accuracy using real-world usage patterns. Interaction data includes prompts, generated code, file structures, comments, and even how you navigate within your project. Microsoft believes this type of data leads to better and more reliable AI outputs.
Microsoft will collect interaction data from Copilot Free, Pro, and Pro+ users. Copilot Business and Enterprise users, along with enterprise-owned repositories, remain excluded from this data collection. The company also says it will not use data at rest, which eases concerns about exposure of stored code.
At the same time, Microsoft confirms it will share this interaction data with GitHub affiliates, but not with third-party AI providers. That distinction matters for developers who want to know where their data goes after collection.
Opt-out deadline you should not ignore
You can still control this setting through your privacy options. However, if you take no action before April 24, Microsoft will automatically enable data sharing for training. This opt-out model puts the responsibility on users to review and update their settings before the deadline.
This update shows how Microsoft plans to improve Copilot, but it also raises clear questions about data control and transparency for everyday developers.