ClearFeed Trust Analysis
Trust Score: 85 (Verified)
Model Assessment
Matthias Ott on Mastodon, 17h ago
Quick reminder, especially if you’re a freelancer or developer using Free/Pro/Pro+ plans for client work: Opt out of GitHub using your data for AI model training before April 24 (seriously, wtf that this isn’t opt-in!). https://github.com/settings/copilot/features#copilot-telemetry-policy
Trust Metrics
Claim Accuracy: 92%
Source Quality: 88%
Framing & Tone: 75%
Context: 80%
Analysis Summary
This is real: GitHub does allow AI training on repository code, and users can opt out through Copilot settings. The specific April 24 deadline isn't independently verified here, but the core claim is solid. The post's main point is also valid: it is frustrating that staying out of training requires active user steps rather than being the default. Developers using GitHub for client work should check this setting if they care about data privacy.
Claims Analysis (4)
GitHub is using user data for AI model training
GitHub's Copilot is trained on repository data. This is documented policy, not speculation.
Verified
Users can opt out of GitHub using their data for AI model training
GitHub provides opt-out mechanism in Copilot settings. Direct link to setting confirms this exists.
Verified
The opt-out deadline is April 24
Cannot independently verify this specific date from the linked settings page alone. Post is from March 30, 2026, making April 24 plausible but unconfirmed.
Unverifiable
Data opt-in should be default rather than opt-out
Implicit normative claim about privacy design philosophy. Reasonable perspective but not a verifiable fact.
Opinion