
Let's say, for the sake of argument, AI could generate absolutely perfect invented videos of arbitrary people doing literally anything. The consequence will be that video will no longer be taken seriously as evidence for crimes. People will also quickly not trust video calls without an extreme level of verification (e.g. asking about recent irl interactions, etc.)

Yes, some people will be scammed, as they always have been, such as in the recent Hong Kong financial deepfake. But no, millions of people will not keep falling for this. Just like the classic 419 advance-fee fraud, it will hit a very small percentage of people.



OK, but I did like living in a universe where I could watch video news of something happening in another country and treat it as reasonably strong evidence of what is happening in the world. Now I basically have to rely on only my own eyes, which are limited to my immediate surroundings, and eyewitness accounts from people I trust who live in those places. In that sense, I feel like my ability to be informed about the world has regressed to pre-20th-century levels.


I predict that we will have blockchain integration of media-creating devices, such that any picture or film that is taken will be assigned a blockchain transaction ID the moment it is generated. We will only trust media with verifiable blockchain attestation that allows us to screen for any tampering since the source.

Invest in web 3.0 now.
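The core idea doesn't even need a blockchain: the device signs a hash of the raw capture at creation time, and anyone can later check that the bytes were never altered. A minimal sketch in Python, assuming a hypothetical device-held secret key (a real system like C2PA would use public-key signatures and an appended ledger, not a shared HMAC key):

```python
import hashlib
import hmac

# Illustrative only: a capture device binds each frame to a secret key
# the moment the media is generated. A verifier holding the same key
# can detect any post-capture modification.
DEVICE_KEY = b"example-device-secret"  # hypothetical, not a real scheme

def sign_capture(media_bytes: bytes) -> str:
    """Return an HMAC-SHA256 tag over the raw media bytes."""
    return hmac.new(DEVICE_KEY, media_bytes, hashlib.sha256).hexdigest()

def verify_capture(media_bytes: bytes, tag: str) -> bool:
    """True only if the bytes match the tag recorded at capture time."""
    return hmac.compare_digest(sign_capture(media_bytes), tag)

frame = b"\x00\x01\x02 raw sensor data"
tag = sign_capture(frame)
print(verify_capture(frame, tag))           # untampered frame verifies
print(verify_capture(frame + b"x", tag))    # any edit breaks the tag
```

Note this only proves the bytes haven't changed since signing; it can't prove the sensor saw a real scene rather than a screen, which is the harder half of the provenance problem.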


Pray it doesn't regress any further


Video alone has never been considered evidence of a crime in a court of law (at least in the United States). A person needs to authenticate the evidence.





