AI TECH

The clause that broke the internet, for a second

WeTransfer has confirmed it doesn’t use any files shared on its platform to train AI.

The company came under fire after updating its terms of service, which many users took to mean their content could be used for AI purposes.

WeTransfer says the clause was only meant to cover moderation tools that might use AI to flag harmful content, not to train models or sell user data.

After the backlash, it updated the wording to make things clearer.

The new terms, which take effect on 8 August, say users give WeTransfer permission to use content to operate and improve the service, in line with its privacy policy.

Several creatives said they were considering switching platforms, worried about how their work might be used.

Three things to know:

  • WeTransfer says it does not train AI models on your files or sell your data.

  • It updated the wording in its terms to clear up any confusion.

  • Legal experts warn vague policies can put users in a tight spot.

Same clause, new spin

A similar situation happened with Dropbox in late 2023, when users voiced concerns about AI, prompting the company to issue a clarification.

Legal experts say these kinds of terms can be risky, especially as tech firms push to use more data for AI.

For users, sudden changes like this can feel like they’re being left with little say.

Dropbox walked so WeTransfer could…walk back.
