Your face is yours.
Your voice is yours.
Your script is yours.
AI can now generate you without asking. DIAP exists to make sure it has to.
The situation
Generation quality has crossed the threshold of being convincing. A GPU can render your face, clone your voice, mimic your expressions, and synthesize your motion — and nobody asked you first. Scripts get fed into models that produce summaries, rewrites, and derivatives without a single notification to the writer.
The industry has no shared infrastructure to request consent, enforce scope limits, prove authorization, or pay talent. Contracts exist, but they aren't machine-readable. They can't stop a render. The result: legal risk piling up, reputational harm spreading, fragmented deals that benefit platforms while creators opt out entirely.
Even being searchable inside an AI tool — before any project starts — can violate talent intent if platform access was never granted. Visibility itself is an act of exposure.
Why now
Studios and brands need provable compliance to reduce liability. Talent demands control, visibility, and compensation for digital use. Platforms need scalable verification to combat deepfakes. Regulators are moving on biometrics, consent, and AI disclosures. AI-assisted writing workflows are accelerating, and script leakage risk is growing faster than anyone wants to admit.
There is no "wait and see." The window to set the standard is now — before the defaults become "everything is fair game."
What DIAP is
DIAP — the Digital Identity Authorization Protocol — is the infrastructure layer that makes consent enforceable at the machine level. Not a marketplace. Not a talent agency. Not another rights management dashboard. It's the protocol underneath all of them.
One protocol that lets actors, creators, writers, and studios control exactly how AI uses their identity — voice, face, expression, motion, and now scripts — across every app and pipeline. Built on consent. Revocable by default. Every output auditable.
Human-rooted authority
The identity owner — or their estate, or their representative — is the source of truth. Not the platform. Not the model. Not the studio. The person.
Least exposure
Identity assets and scripts stay in the vault. They don't get shipped to untrusted apps. Access is scoped, time-limited, and revocable.
Training is a separate right
Using someone's face to render a poster is not the same as using it to train a model. DIAP enforces that distinction at the protocol level. Always.
Two layers of consent
First: can this app even see you? Second: can this project use you? Visibility and usage are separate decisions. Both require explicit authorization.
Every frame is auditable
Receipts, watermarks, and provenance chains. If it was generated with your identity, there's a cryptographic trail back to the authorization that allowed it.
Kill switch by default
Short-lived tokens. Push-based revocation. If something goes wrong, you can stop it — immediately, not after a legal review.
How it works, briefly
Actors and writers register their identity modules — voice, face, expression, motion, scripts — in a secure vault. They set visibility policies per app: who can even see them, who can't. That's Layer 1.
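A Layer 1 visibility policy can be sketched as a deny-by-default data structure. This is an illustrative shape only — the class, module names, and ID formats below are assumptions, not a published DIAP schema:

```python
from dataclasses import dataclass, field

# Hypothetical Layer 1 policy: which apps may even discover this identity.
# Deny by default: an app absent from allowed_apps cannot see the owner at all.
@dataclass
class VisibilityPolicy:
    owner_id: str
    modules: set[str]                          # e.g. {"voice", "face", "script"}
    allowed_apps: set[str] = field(default_factory=set)

    def is_visible_to(self, app_id: str, module: str) -> bool:
        # Both the app and the specific module must be explicitly granted.
        return app_id in self.allowed_apps and module in self.modules

policy = VisibilityPolicy(
    owner_id="talent:ava-chen",
    modules={"voice", "face"},
    allowed_apps={"app:dubbing-studio"},
)
print(policy.is_visible_to("app:dubbing-studio", "voice"))  # True
print(policy.is_visible_to("app:unknown-tool", "voice"))    # False
```

The key design point is that absence means invisibility: an app never granted access gets no search results, not a permission error it can probe.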
When an app wants to use someone's identity for a project, it sends a license request. The request specifies exactly what: which rights, which campaign, which territory, how many renders, for how long. The identity owner (or their rep) approves or denies. That's Layer 2.
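A Layer 2 license request, as described above, can be sketched as a machine-readable object where every scope dimension is explicit and bounded. Field names here are hypothetical, chosen to mirror the prose, not taken from any DIAP specification:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical Layer 2 request: nothing is implied, every limit is stated.
@dataclass(frozen=True)
class LicenseRequest:
    app_id: str
    owner_id: str
    rights: tuple[str, ...]   # e.g. ("render_poster",) -- never implies training
    campaign: str
    territory: str
    max_renders: int
    expires_at: datetime

request = LicenseRequest(
    app_id="app:poster-gen",
    owner_id="talent:ava-chen",
    rights=("render_poster",),
    campaign="spring-launch",
    territory="US",
    max_renders=50,
    expires_at=datetime.now(timezone.utc) + timedelta(days=30),
)
# The identity owner (or their rep) approves or denies this exact scope,
# nothing broader. A new use requires a new request.
```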
On approval, a short-lived, cryptographically bound token is issued. The app can only do what the token allows. Every output gets a signed receipt and a watermark. Platforms downstream can verify: "Was this authorized?" The answer is cryptographic, not contractual.
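The issue-then-verify loop can be illustrated in miniature. This sketch uses a shared-secret HMAC as a stand-in for whatever signature scheme DIAP would actually use (in practice an asymmetric key held by the issuer); the token shape and field names are assumptions:

```python
import hashlib
import hmac
import json
import time

SECRET = b"issuer-signing-key"  # stand-in for the issuer's real signing key

def issue_token(owner_id: str, rights: list[str], ttl_seconds: int = 900) -> dict:
    """Issue a short-lived token cryptographically bound to an approved scope."""
    claims = {"owner": owner_id, "rights": rights, "exp": time.time() + ttl_seconds}
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def verify_receipt(token: dict) -> bool:
    """Downstream check: was this authorized, and is the scope intact?"""
    payload = json.dumps(token["claims"], sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["claims"]["exp"] > time.time()

token = issue_token("talent:ava-chen", ["render_poster"])
print(verify_receipt(token))  # True

token["claims"]["rights"].append("train_model")  # tampering voids the signature
print(verify_receipt(token))  # False
```

This is why the answer is cryptographic rather than contractual: widening the scope after issuance breaks the signature, so an unauthorized use fails verification instead of waiting on a dispute.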
Scripts work the same way. A writer registers their screenplay and controls which apps can see it and which can generate summaries or breakdowns from it. Training is always a separate, explicit permission, never implied.
Who this is for
Actors and creators who want to participate in AI-powered production without giving up control. DIAP gives you granular, revocable authority over every use of your digital identity.
Writers whose scripts are being ingested by AI systems with no consent trail. ScriptModule treats your work as a protected asset — readable, derivable, or trainable only with your explicit permission.
Studios that need speed and flexibility without increasing legal exposure. DIAP turns identity and script usage into controlled, auditable workflows that mirror existing contract structures. Marketing, localization, dubbing, previs — all covered.
AI developers who want to build tools that studios and talent will actually trust. Integrate once via the SDK. Get DIAP-Certified. Access a growing registry of authorized identities and scripts.
Platforms that need to answer: "Was this content authorized?" DIAP's verification API and watermark scanning give you a machine-checkable answer at scale.
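The platform-side check can be sketched as a single lookup against the registry. The receipt IDs, record fields, and three-state result below are illustrative assumptions about what such an API might return:

```python
# Hypothetical platform-side check: given a receipt ID extracted from a
# content item's watermark, ask the registry whether it maps to a valid,
# still-live authorization.
def was_authorized(receipt_id: str, registry: dict[str, dict]) -> str:
    record = registry.get(receipt_id)
    if record is None:
        return "unauthorized"   # no receipt on file -> flag or block
    if record["revoked"]:
        return "revoked"        # the kill switch was pulled after issuance
    return "authorized"

registry = {
    "rcpt-9f2a": {"owner": "talent:ava-chen", "revoked": False},
    "rcpt-1c44": {"owner": "talent:ava-chen", "revoked": True},
}
print(was_authorized("rcpt-9f2a", registry))  # authorized
print(was_authorized("rcpt-1c44", registry))  # revoked
print(was_authorized("rcpt-0000", registry))  # unauthorized
```

Distinguishing "revoked" from "unauthorized" matters at scale: the first is content that was legitimate when published and may need takedown, the second never had consent at all.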
Where we're heading
We start centralized and focused. One Central Authority. One Trust Registry. Publish the spec, schemas, and conformance tests publicly. Launch the DIAP-Certified program. Pilot with an anchor talent and a studio marketing workflow — real posters, real approvals, real receipts.
Then we expand. Multiple certified issuers — studios, unions, agencies — listed in the registry. Localization and dubbing workflows. Writing pipelines with ScriptModule. Distribution verification partners who can check authorization at the platform level.
Eventually, federation. Multi-party governance. Studios, unions, and platforms steering the standard together. Transparent audits. Standardized key ceremonies. DIAP becomes the consent layer that every AI application checks before it renders a human.
The endgame is not a product. It's infrastructure. The way HTTPS made "is this connection secure?" a solved question, DIAP makes "was this person's identity used with their consent?" a solved question.
What we don't do
DIAP is not a replacement for copyright registration, chain-of-title, or legal guild processes. It's the technical enforcement and audit layer that makes those agreements machine-enforceable.
We don't take a percentage of talent pay. Pricing is subscription-based — annual fees and usage-based billing for studios and platforms. Talent access to the vault and visibility controls is free. We want maximum participation on the supply side, not a toll booth.
We don't generate content. We don't represent talent. We don't replace agents or managers. We build the protocol that everyone else plugs into.
AI is not going to slow down. The question is whether the people it depicts get a say in how it happens. DIAP is the infrastructure that guarantees they do.