The Problem
Major tech companies have trained their AI models on billions of images, videos, and audio files - often without the knowledge or consent of the creators who made them. Artists, photographers, musicians, and voice actors are finding their work replicated by AI systems they never agreed to participate in.
This isn't just about compensation. It's about respect, consent, and the fundamental right of creators to control how their work is used.
Our Principles for Responsible AI
Explicit Consent
Creators must actively agree before their work is used for AI training. No hidden clauses. No automatic opt-ins. Clear, informed consent.
Opt-Out Rights
Creators should always have the right to remove their work from AI training datasets. This right should be easy to exercise, and it should be honored without obstruction.
Fair Compensation
When creators' work contributes to AI systems, they deserve to share in the value created. AI companies should pay for what they use.
How Diversity Photos is Different
- 01. Every photographer in our collection has explicitly consented to how their work is used.
- 02. Our AI features (Adaptive Originals) extend existing photos - they don't generate from scratch using scraped data.
- 03. Creator attribution is maintained even on AI-extended images.
- 04. Photographers receive compensation when their work is licensed.
Our Story: How We Learned the Hard Way
As creators ourselves, we experienced firsthand how corporations use legal language to exploit creative work. Here's what we discovered about the contracts we signed in good faith - before AI changed everything.
The Friendly Title
“License We Need to Promote Your Work”
This sounds great, right? They need a license to promote our work. We put content on their platform, they sell it, we share the profits. A mutual benefit. This is what we signed up for.

The Reasonable Example
“...modify (so as to better showcase your Work, for example)”
They can modify our work “to better showcase it.” Makes sense - maybe they need to crop for a thumbnail or adjust for different display sizes. This is standard practice to help sell our content to customers.

The Hidden Dagger
“...developing new features and services”
Here's where everything changed. You would think this means new features to help promote your work - that's the context of the entire clause.
But corporations argued: “Nope. You agreed we can develop new features and services. It doesn't say they have to promote your work. AI training is a new service. Thanks for the free training data.”

This is Bad Faith Business
Contracts written before AI existed are now being interpreted to justify using creator content for AI training - without additional consent, compensation, or even basic transparency. The spirit of these agreements was mutual benefit. The letter is being weaponized against creators.
Tips for Creators: Protect Yourself
Read “New Features” Clauses
Any clause mentioning “new features,” “new services,” or “product development” could be used to justify AI training. Ask for explicit exclusions or limitations.
Look for “Perpetual” Language
“Perpetual, worldwide, royalty-free” licenses survive contract termination. Once granted, you may never get those rights back. Negotiate time limits or termination clauses.
Demand AI-Specific Terms
New contracts should explicitly address AI. Request language that requires separate consent for machine learning, model training, or generative AI uses.
Document Everything
Keep copies of every version of the terms of service you agree to. Companies change terms without notice, and a historical record of what you actually signed can be crucial in a legal dispute.
Remember: If a platform won't clarify how they'll use your work for AI, that silence is your answer. Choose platforms that are transparent about their AI practices and respect creator rights.