Microsoft forms C2PA coalition with Intel, Adobe, more to address deepfakes

Devesh Beri


Several tech firms, including Adobe, Microsoft, and Intel, have joined forces in the Coalition for Content Provenance and Authenticity (C2PA), reports CryptoSlate. The coalition's goal is to tackle the growing challenge of deepfakes by developing an open standard that can verify the origin and authenticity of online content.

The C2PA aims to develop tools that attach provenance information to digital content, such as who created it and how it was edited, allowing users and content platforms to identify whether AI was used to create or alter it. This matters as AI techniques become increasingly advanced, enabling the creation of hyperrealistic fake images and videos.
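To give a rough sense of the idea, the sketch below shows a deliberately simplified version of content provenance: hash an asset's bytes, bind a manifest of claims (for example, which tool generated it and whether AI was involved) to that hash, sign the result, and let anyone verify that neither the asset nor the manifest has been altered. This is not the C2PA specification or its SDK; the function names and the shared-key HMAC signature are illustrative assumptions, whereas real C2PA manifests are embedded in the file and signed with certificate-based cryptography.

```python
# Simplified illustration of binding signed provenance claims to an asset.
# Not the C2PA format: real manifests use certificate-based (COSE) signatures
# embedded in the file. This stand-alone sketch uses an HMAC for brevity.
import hashlib
import hmac
import json

SECRET_KEY = b"demo-signing-key"  # hypothetical key, for illustration only


def create_manifest(asset_bytes: bytes, claims: dict) -> dict:
    """Hash the asset, bundle it with provenance claims, and sign the bundle."""
    payload = {
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "claims": claims,  # e.g. generator tool, whether AI was involved
    }
    serialized = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(SECRET_KEY, serialized, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}


def verify_manifest(asset_bytes: bytes, manifest: dict) -> bool:
    """Check that the manifest is untampered and still matches the asset's bytes."""
    payload = manifest["payload"]
    serialized = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, serialized, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, manifest["signature"]):
        return False  # manifest was altered or signed with a different key
    return payload["asset_sha256"] == hashlib.sha256(asset_bytes).hexdigest()


if __name__ == "__main__":
    image = b"\x89PNG...fake image bytes"
    manifest = create_manifest(image, {"generator": "ExampleAI v1", "ai_generated": True})
    print(verify_manifest(image, manifest))            # True: asset and manifest intact
    print(verify_manifest(image + b"edit", manifest))  # False: asset was modified
```

The key property, which the real standard provides far more robustly, is that any modification to the content or to its provenance record becomes detectable at verification time.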

Earlier this year, OpenAI shut down its AI classifier, a tool meant to detect whether text was written by a human or an AI, citing its low rate of accuracy.

The C2PA has released open-source tools, such as Content Credentials, that any organization can adopt, making the standard accessible to a wide range of entities, including media outlets, academics, nonprofits, and tech firms.

While the C2PA’s efforts do not amount to official regulation, they are viewed as a crucial first step in addressing the challenge of deepfakes and synthetic media. By promoting transparency and trust online, the coalition hopes to mitigate the spread of falsified content and misinformation.

Developing robust authentication methods remains challenging, as no perfect AI detection system currently exists. Voluntary initiatives like the C2PA are seen as valuable in the absence of comprehensive regulation.

As AI adoption accelerates across industries, the C2PA brings tech companies and other stakeholders together to combat synthetic media and deepfakes and to maintain trust in online content.