Vercel's ToS Update Opts Free-Tier Developers Into AI Training — Deadline to Opt Out Is Today (March 31, 2026)
Vercel quietly updated its Terms of Service to allow AI model training on Hobby plan users' code by default, sharing data with third-party AI providers. The opt-out deadline is March 31, 2026 at 11:59 PM PST — today. Paid Pro users are opted out by default; free users must manually opt out or their code enters the training set permanently.
Vercel has updated its Terms of Service to enable AI model training on source code, agent chats, and build telemetry from users on its Hobby (free) and Trial Pro plans — and the deadline to opt out is today, March 31, 2026. After this deadline, any data already ingested into Vercel's training pipelines cannot be retroactively removed, even if a user opts out afterward.
What Happened
Vercel announced the updates to its Terms of Service and Privacy Policy in March 2026, framing the changes as enabling "agentic" platform features — proactive incident investigation, performance analysis, and automated optimization suggestions. Under the new terms, Vercel may use content from Hobby and Trial Pro accounts to train AI and machine learning models, and may share that content with unnamed third-party AI providers for model development purposes.
The data collected includes anonymized code and Vercel agent chats, build and deployment telemetry data, aggregate traffic statistics, and build error patterns. Vercel states that personal information, account details, environment variables, and API keys are removed before any data sharing occurs. However, critics note that "anonymized" code still exposes architectural patterns, proprietary algorithms, and internal business logic to whatever third-party AI providers Vercel contracts with.
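Vercel has not published how its redaction works, but a purely illustrative sketch of pattern-based scrubbing (all patterns and names here are hypothetical, not Vercel's actual pipeline) shows the critics' point: obvious secrets can be stripped while function names, control flow, and proprietary logic pass through verbatim.

```python
import re

# Hypothetical redaction pass, NOT Vercel's actual pipeline: strip values
# that look like secrets, leave everything else untouched.
SECRET_PATTERNS = [
    # env-style assignments like API_KEY=... (multiline match per line)
    (re.compile(r'(?m)^([A-Z][A-Z0-9_]*)\s*=\s*.+$'), r'\1=<REDACTED>'),
    # long token-like strings prefixed sk-/pk-/api-
    (re.compile(r'(sk|pk|api)[-_][A-Za-z0-9]{16,}'), '<REDACTED_KEY>'),
]

def redact(source: str) -> str:
    for pattern, repl in SECRET_PATTERNS:
        source = pattern.sub(repl, source)
    return source

snippet = """API_KEY=sk_live_abcdef1234567890XYZA
def score_customer(order_history, churn_model):
    # proprietary ranking logic survives redaction untouched
    return churn_model.predict(order_history) * 0.7
"""

print(redact(snippet))
```

In this toy example the key is removed, but the architecture — the model name, the weighting, the business rule — remains fully readable, which is exactly the exposure critics describe.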
Key Details
- Opt-out deadline: March 31, 2026 at 11:59 PM PST — once passed, previously ingested data cannot be removed from training datasets
- Hobby and Trial Pro users: Opted in by default. Must manually navigate to Team Settings → Data Preferences to opt out
- Paid Pro users: Opted out by default. Can opt in voluntarily via Team Settings → Data Preferences
- Enterprise users: Always opted out — data is never used for AI training regardless of settings
- Data shared: Anonymized code, agent chats, build telemetry, traffic statistics, and error logs — shared with unspecified third-party AI providers
- Not shared: PII, account details, environment variables, and API keys (Vercel claims these are redacted)
What Developers and Users Are Saying
Developer reaction ranged from concern to alarm. Analysis on DEV Community pointed out that even without secrets, architectural patterns, proprietary algorithms, and internal logic are being routed into AI training pipelines — creating IP exposure risks that Vercel's redaction claims don't fully address. A widely shared post stated: "The underlying change to data handling is something every Principal Engineer and CTO needs to audit immediately."
Compliance professionals noted specific regulatory exposure: for companies operating under SOC 2, HIPAA, or GDPR data processing agreements, opting in — even by default — to share code with unnamed third-party AI trainers may directly violate existing DPAs. The concern is not just about what Vercel does with the data, but about what obligations companies have to their own customers regarding third-party data sharing.
The Vercel community forum thread on the update generated limited direct discussion, with the most visible complaint being a technical bug in the terms acceptance popup that temporarily prevented some users from accessing their dashboards at all.
What This Means for Developers
If you are a Vercel Hobby or Trial Pro user, you must take action before 11:59 PM PST today, March 31, 2026, to prevent your code from entering Vercel's AI training pipeline. To opt out: navigate to Team Settings → Data Preferences and toggle from opt-in to opt-out. The change takes effect immediately for future data.
If the opt-out deadline has already passed when you read this, opting out still protects data collected from that point forward, but data ingested before the deadline cannot be removed: Vercel does not offer a deletion mechanism for training data.
For teams with enterprise data processing agreements or compliance obligations under HIPAA, SOC 2, or GDPR, legal review of Vercel's updated AI Product Terms is advisable regardless of opt-out status, as the new terms introduce third-party data sharing that may require DPA amendments.
What's Next
Vercel has not announced which third-party AI providers it contracts with for model training. Developers who require a full list of data sub-processors for compliance purposes can contact Vercel's legal team directly. The company's Enterprise tier remains fully excluded from AI training data collection — teams with compliance requirements may consider upgrading, or migrating to alternatives such as Netlify, Render, or Fly.io, which have not announced similar data practices.
The full updated Terms of Service, AI Product Terms, and opt-out instructions are available at vercel.com/changelog.
Sources
- Vercel Changelog — Updates to Terms of Service (March 2026) — Official announcement
- Vercel AI Product Terms — Full legal text covering AI data use
- Vercel Community Forum — ToS Update Thread — Developer discussion
- DEV Community — Vercel's Agentic Shift — Technical and compliance analysis
- AI Navigate — Vercel Will Train Models on Your Code — Independent coverage