GitHub has announced a policy change for GitHub Copilot: starting April 24, interaction data from Copilot Free, Pro, and Pro+ users will be used to train and improve GitHub’s AI models unless those users opt out.

GitHub describes “interaction data” broadly as prompts, model outputs, code snippets, and related context. Examples it lists include accepted or modified suggestions, code context around the cursor, repository structure and navigation patterns, and feedback signals (e.g., thumbs up/down).

The company states that Copilot Business and Copilot Enterprise users are not affected by this update. GitHub also notes that if a user previously opted out of collection for product improvements, that preference remains.

Why developers should pay attention:

- Anything you send to Copilot (including the surrounding code context it reads while you’re actively using it) may contribute to training unless you opt out.

- The change may affect how individuals and small teams handle sensitive code, secrets, or proprietary logic during AI-assisted coding.

Recommended next steps:

- If you use Copilot Free/Pro/Pro+, review Copilot privacy settings and decide whether to opt out.

- For teams, update internal AI usage guidelines (what code can/can’t be pasted into assistants).

- Ensure secret scanning is enabled and avoid including credentials in prompts or surrounding code.
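As a complement to GitHub’s server-side secret scanning, teams sometimes add a lightweight local check before code is shared with an assistant. The sketch below is a hypothetical example of that idea: a few illustrative regex patterns (AWS access key IDs, private key headers, GitHub token prefixes, generic `password=`/`api_key=` assignments) applied to a snippet before it leaves your machine. The pattern set and function name are assumptions for illustration; real scanners such as GitHub secret scanning or gitleaks cover far more providers and formats.

```python
import re

# Illustrative (not exhaustive) patterns for credential-like strings.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "private_key_block": re.compile(
        r"-----BEGIN (?:RSA |EC |OPENSSH )?PRIVATE KEY-----"
    ),
    "github_token": re.compile(r"\bgh[pousr]_[A-Za-z0-9]{36,}\b"),
    "generic_assignment": re.compile(
        r"(?i)\b(?:api[_-]?key|secret|password|token)\s*[:=]\s*['\"][^'\"]{8,}['\"]"
    ),
}

def find_secrets(text: str) -> list[tuple[str, str]]:
    """Return (pattern_name, matched_text) pairs for likely secrets in text."""
    hits = []
    for name, pattern in SECRET_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((name, match.group(0)))
    return hits

# Example: check a snippet before pasting it into an assistant.
snippet = 'aws_key = "AKIAABCDEFGHIJKLMNOP"\npassword = "hunter2hunter2"'
for name, matched in find_secrets(snippet):
    print(f"possible {name}: {matched}")
```

A check like this catches only obvious patterns; it is a last line of defense, not a substitute for keeping credentials out of source files in the first place.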