Balaji S. Srinivasan, a renowned American entrepreneur and investor, has brought attention to the growing challenge of distinguishing authentic videos from AI-generated fakes.
As the co-founder of Counsyl, the former CTO of Coinbase, and a former general partner at Andreessen Horowitz, Srinivasan’s insights carry weight in the tech and crypto communities. He recently proposed the concept of “verifiable video” as a solution to this pressing issue.
Srinivasan emphasizes the need for verifiable video to ensure that footage hasn’t been manipulated with AI. He elaborates:
“We can get there with cryptocameras. Suppose that when you take a video, you can optionally put its hash onchain for a small fee. This is like a digital notary public. It establishes that (a) the video existed at that timestamp and (b) you are the user who wrote that video file to the blockchain.”
The proposed system would allow users to hash their video recordings on a blockchain, providing a timestamp and authentication for the footage. However, Srinivasan acknowledges the limitations:
“Of course, it would still be possible to take a video, manipulate it with AI, and then put its hash onchain while claiming it’s real. But it could be made quite difficult, on par with spoofing Apple’s GPS or harder.”
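To make the mechanism concrete, here is a minimal sketch of the client-side step: hashing a video file and assembling the record a camera app might anchor on-chain. The file path, account identifier, and field names are illustrative assumptions, and the on-chain submission itself is left abstract, since Srinivasan does not name a particular chain or wallet.

```python
import hashlib
import time

def hash_video(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a video file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical record a camera app might sign and submit on-chain.
# The actual transaction format depends on the chain and wallet used.
record = {
    "video_sha256": hash_video("clip.mp4"),  # "clip.mp4" is a placeholder path
    "captured_at": int(time.time()),         # timestamp claimed by the device
    "uploader": "0xYourAddress",             # placeholder account identifier
}
print(record)
```

Anchoring only the hash keeps the footage itself private while still letting anyone with the original file verify that it existed, unaltered, at the recorded timestamp.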
Srinivasan suggests that major social networks could integrate verifiable video as a feature within their platforms:

“Any major social network could build verifiable video into their software right now. You just take the camera app and add ‘verifiability’ as another mode. It’d be similar to slow-mo or time-lapse, but require a small fee to write a verifiable video onchain.”
Additionally, he points out that phone manufacturers could embed this feature at the hardware level:

“And any phone vendor could also put verifiable video into hardware if sufficient demand existed. They could probably make it very hard to fake by streaming the video hash live to the blockchain as it was recorded.”
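One way to read “streaming the video hash live” is incremental hashing with periodic checkpoints, sketched below. The frame source, the checkpoint interval, and the idea of broadcasting each intermediate digest are assumptions made for illustration; the quote does not prescribe an implementation.

```python
import hashlib

def stream_hash(chunks, checkpoint_every: int = 30):
    """Incrementally hash video chunks as they are recorded.

    `chunks` is any iterable of byte strings from the capture pipeline
    (an assumption -- real camera APIs differ by platform). Publishing the
    intermediate digests live would make it hard to swap in substitute
    footage after the fact without contradicting what was already posted.
    """
    h = hashlib.sha256()
    checkpoints = []
    for i, chunk in enumerate(chunks, start=1):
        h.update(chunk)
        if i % checkpoint_every == 0:
            # In a live system this digest would be broadcast on-chain;
            # here we simply collect it.
            checkpoints.append(h.copy().hexdigest())
    return h.hexdigest(), checkpoints
```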
The potential applications of verifiable video extend far beyond combating misinformation in media. Srinivasan envisions a future where the concept evolves into “cryptoinstruments” for broader use cases:
“In fact, citizen journalists of the future might have to post verifiable videos, with an onchain checkmark next to them, or else people would consider them more likely to be fake.”

“PS: The use cases go way beyond media as well. If you generalize the concept of cryptocameras to cryptoinstruments, you could get a verifiable chain of custody for every important piece of scientific data, like DNA sequencing data or temperature measurements. That could go a long way towards reducing academic fakery and dealing with the replication crisis.”
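One hedged reading of “cryptoinstruments” is a hash-chained log in which each reading commits to the one before it, with the head of the chain periodically anchored on-chain. The field names and the temperature example below are illustrative; the post does not specify any data format.

```python
import hashlib
import json
import time

def append_measurement(prev_hash: str, payload: dict) -> dict:
    """Link a new measurement to the previous entry by hashing both together."""
    entry = {
        "prev_hash": prev_hash,
        "recorded_at": int(time.time()),
        "payload": payload,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return entry

# Hypothetical temperature log: each entry commits to everything before it,
# so altering any earlier reading invalidates every later hash.
genesis = append_measurement("0" * 64, {"instrument": "thermo-01", "celsius": 21.4})
second = append_measurement(genesis["hash"], {"instrument": "thermo-01", "celsius": 21.6})
```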
In a world where AI continues to blur the lines between real and fake, Srinivasan’s proposition offers a beacon of hope. By leveraging blockchain technology, verifiable video and cryptoinstruments could restore trust in media, science, and other critical domains. As Srinivasan aptly notes:
“AI makes everything easy to fake, but crypto makes it hard again.”