by nick

This is why we need blockchain-stamped videos.
If you haven’t seen the latest developments in AI video generation, they will blow your mind. It might seem like all fun and games, but the footage we rely on for security and authenticity is now more vulnerable than ever.
Here’s a generated video of Sam Altman shoplifting: https://www.youtube.com/shorts/PBrh1OaSl9M
The tells are still there: he’s stealing from the “Gratics Cards” section at Target. But fixing artifacts like that seems completely possible with current tech.
Using Sora 2, I could pretty easily insert anybody into the crowd at a crime scene; I could probably even make them appear to be the criminal. Historical footage isn’t safe either if we don’t start taking into account when that footage originated. File dates can be easily changed; the only truly secure solution is a timestamp that can’t be altered by anyone: the blockchain.
Using our blockchain-stamping API, we can cryptographically sign any video or image a camera produces and etch that video’s hash onto the blockchain. The stamp doesn’t prevent tampering, but it makes any later modification detectable: if even one bit of the footage changes, the hash no longer matches the one recorded on-chain at capture time.
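The hash-and-sign part of that flow can be sketched in a few lines. This is a minimal illustration using only standard-library primitives, not the actual Chainletter API: the function names are hypothetical, and a real camera would use an asymmetric keypair burned into hardware rather than the HMAC shown here. Submitting the result to a chain is represented only by the returned record.

```python
import hashlib
import hmac

def video_fingerprint(data: bytes) -> str:
    """SHA-256 digest of the raw video bytes; any later edit changes it."""
    return hashlib.sha256(data).hexdigest()

def sign_fingerprint(device_key: bytes, fingerprint: str) -> str:
    """Tie the fingerprint to a key held by the camera (HMAC as a stand-in
    for a hardware keypair)."""
    return hmac.new(device_key, fingerprint.encode(), hashlib.sha256).hexdigest()

def stamp(device_key: bytes, data: bytes) -> dict:
    """Build the record you'd etch on-chain: hash plus device signature.
    The chain's own block timestamp supplies the tamper-proof 'when'."""
    fp = video_fingerprint(data)
    return {"fingerprint": fp, "signature": sign_fingerprint(device_key, fp)}

def verify(device_key: bytes, data: bytes, record: dict) -> bool:
    """Recompute both values; a single flipped bit in the video fails."""
    if video_fingerprint(data) != record["fingerprint"]:
        return False
    expected = sign_fingerprint(device_key, record["fingerprint"])
    return hmac.compare_digest(expected, record["signature"])
```

The key point is the last step: anyone can recompute the hash of footage they’re shown and compare it against the on-chain record, so verification needs no trust in whoever is presenting the video.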
Of course, with access to the camera itself, an attacker could introduce fake images by compromising the physical device before stamping. But ask yourself which is more likely: somebody halfway around the world creating a deepfake and passing it off as real, or somebody climbing a pole to physically tamper with a security camera?
These are the tradeoffs we’re going to need to think about in this new world of AI slop footage.
Chainletter can help. We’re looking for people working on new solutions in security, wearables, archival footage, really anything where the sensor needs to produce trust.
Check out this AI For Humans video if you want a much deeper dive into Sora 2: https://www.youtube.com/watch?v=mYd8VgGtw5A