
This seems to have a glaring flaw in that a fake AI generated person could have a real-world doppelganger, which a tyrannical government could mistakenly arrest, or worse.


A real person can also have a real-world doppelganger. In fact, isn't that just as likely as an AI-generated person having one, maybe even more so? So the swap doesn't really change anything when it comes to "innocent" people getting targeted.


It's a novel approach, but as you've said, I think the potential negatives far outweigh the positives. Perhaps they could use obviously fake faces with weird proportions and features?

Idk, this seems like a case of using cool flashy tech because it's cool and flashy and not because it's better than the alternative (using an actor to portray the subject, blurring or blacking out the subject's face).


With face recognition software and automated edits, they could run any protest footage from autocratic regimes like Iran or China through a filter that obscures everyone's face and replaces it with an AI-generated one. All it would cost is a few extra minutes of editing time; I'm not sure that would be critical even in the news business, where there's a lot of time pressure.

I think that would be a really cool feature that could help keep protesters safe. Dissidents could send footage to trustworthy outlets who would only make public the edited footage. Because of course people still want to share the event with the world, but they might not want to be identified in a place where if caught they get tortured, raped or even killed.

The individual face isn't really important most of the time; it's not like it makes a difference to the viewer. They could still give the unedited footage to police in places with legitimate rule of law, if the government has a court order and can prove a crime was committed by an individual they have on tape.
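For illustration only, a minimal sketch of the obscuring step described above, assuming the face bounding boxes come from some external face detector (the detector and the AI face replacement itself are out of scope here; this just mosaics each detected region):

```python
import numpy as np

def anonymize_faces(frame, boxes, block=16):
    """Pixelate each detected face region in a video frame.

    `frame` is an (H, W, 3) uint8 array; `boxes` are (x, y, w, h)
    rectangles assumed to come from an off-the-shelf face detector.
    """
    out = frame.copy()
    for x, y, w, h in boxes:
        region = out[y:y + h, x:x + w]
        # Downsample, then blow the coarse pixels back up (mosaic effect).
        small = region[::block, ::block]
        out[y:y + h, x:x + w] = np.repeat(
            np.repeat(small, block, axis=0), block, axis=1)[:h, :w]
    return out
```

In a real pipeline the pixelated region would instead be replaced by a generated face, but the per-frame detect-and-overwrite loop is the same shape.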


I think that would be a really helpful feature for convincing Regular Joe that there are protests today in *checks notes* Eastasia.


Seems like a publicity stunt.

A face covering would cost nothing and achieve the end goal of hiding their identity.


This serves multiple goals: hiding identity, showing that an anonymous source exists, and showing facial emotion to viewers.

The latter is a huge win for making media people want to watch. Same reason all those cable stations segment the screen into four with a face in each corner, or why streamers overlay their gaming with their face on a webcam. We like faces.


> The latter is a huge win for making media people want to watch.

I get that people like faces, but the news should be real. Maybe I'm expecting too much of the BBC, but it seems pretty short-sighted to start carrying fake news, even if they are only starting with a fake face.


I guess you're right, but if it's on the news, shouldn't we expect factual data rather than "a pageant" as De Niro repeated in "Wag the Dog"?


But since it's publicly known that the faces are fake, the government isn't going to try to use them to identify anyone.


A "glaring flaw" because of some extreme and unlikely scenario?

Why would governments be using AI-modified content to look for people to arrest? Is this really a serious enough risk to dissuade using it? It seems pretty unlikely that it would simultaneously a) be the exclusive source and b) actually match someone IRL.


The USA itself put out a nationwide manhunt, distributing video to news outlets, asking the public to assist in identifying protesters whose only known alleged crime was trespassing.


If you look at the J6 convictions it is for far more than trespassing. Interrupting the electoral college count turns out to be a pretty big deal.


China, North Korea, and Iran also have excuses for how they treat their dissidents.


This is more than just dissent, though. I don’t think there’s a country in the world which wouldn’t criminalize active interference with its political process.


Not many people heard about the situation in Phoenix after the Dobbs decision, but security forces found it necessary to barricade the State Capitol with concrete walls as pro-abortion protesters plotted to lay siege to the Legislature inside.

That battle hasn't ended, either; it's been taken to the Crisis Pregnancy Centers. But I simply remind people: if you're up in arms about J6, insurrection is a two-way street and knows neither Right nor Left.


> Why would governments be using this content to look for people to arrest?

Why wouldn't oppressive regimes that are arresting dissidents review footage of dissidents that makes them look bad? Harassing reporters and their sources is a very common way to suppress information.

Scientology does it, why wouldn't a government? https://www.forbes.com/sites/richardbehar/2020/08/05/sciento...


> Why wouldn't oppressive regimes that are arresting dissidents review footage of dissidents that makes them look bad?

Because the faces aren’t real? They want to arrest dissidents not random people.


> Because the faces aren’t real? They want to arrest dissidents not random people.

How can you tell? The entire point of the GP's comment is that they could be mistaken.

https://news.ycombinator.com/item?id=33735239


This ignores how information spreads. Given how unlikely it is that a randomly generated face matches a relevant person in the local area of the protest or government, the AI technology would have to be widespread for this to matter (otherwise it's statistically a non-problem, because a match is even less likely to happen IRL).

If it is widespread then the government workers doing facial recognition should be keenly aware of its existence and adapt by seeking photos from non-activist/protected sources… like the thousands of photos posted on social media after every protest.

The bad government doesn’t want to be arresting random people either; they want the real ones.
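That likelihood argument can be sketched as a hedged back-of-envelope calculation; the population size and false-match rate below are pure assumptions for illustration, not measured figures for any real recognition system:

```python
def p_any_match(population: int, false_match_rate: float) -> float:
    """Probability that one random synthetic face falsely 'matches'
    at least one of `population` real faces, assuming each comparison
    is independent with the given (assumed) false-match rate."""
    return 1 - (1 - false_match_rate) ** population

# Illustrative numbers only: a city of a million faces at a
# one-in-a-million false-match rate already gives roughly a
# 63% chance of some spurious match (1 - 1/e).
print(p_any_match(1_000_000, 1e-6))
```

So whether a spurious match is plausible hinges entirely on the false-match rate of whatever recognition system a regime runs against whatever population it searches, which is why the "widespread vs. statistically negligible" distinction above matters.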


FWIW I agree. We are dangerously close to living in a world where you cannot trust any videos, pictures, or audio.

I just think that this approach is perhaps more 'cool' than it is useful; it makes me think of Mr. Swirl[0], who got caught because of a fancy effect. Is face-swapping an actor's face really better than just using the actor outright?

[0] https://en.m.wikipedia.org/wiki/Christopher_Paul_Neil


> How can you tell?

They literally say so in the documentary.


That doesn't stop clips from this or future videos that use this technique from being taken out of context.


Are you serious?

This is standard practice by security forces of any dictatorship.


A standard practice of using... potentially AI-modified surveillance photography? Where exactly would they get these photos in this future scenario? From activists and journalists, once these photo apps become widespread (unbeknownst to the government)?

It's an interesting hypothetical for sure, but it's a stretch to call it a glaring flaw. It's not really any worse than people being misidentified in normal photos.


Are we sure this has a greater chance of happening than the alternative (you present a transcript or a robotic voice, and the government arrests a random person they thought it might be linked to anyway)?


Especially since these are generated based on training data from real humans…


It's not even AI generated faces!

They claim that they face-swapped real interviews of participants with actors' faces instead of just having the actors perform the piece...


What if you used the faces of that tyrannical government's leadership?



