My guess: Sam wanted to imitate the voice from Her, became aware of the Midler v. Ford case, and so reached out to SJ. He probably didn't expect her to decline. Anyway, that prior case tells us that you cannot mimic another's voice without their permission, and the overall timeline indicates OpenAI's "intention" of imitation. It doesn't matter whether they used SJ's voice in the training set or not. Their intention matters.
Please don't take this as me defending OpenAI's clearly sketchy process. I'm writing this to help myself think through it.
If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?
- It's fine to hire a voice actor.
- It's fine to train a system to sound like that voice actor.
- It's fine to hire a voice actor who sounds like someone else.
- It's probably fine to go out of your way to hire a voice actor who sounds like someone else.
- It's probably not fine to hire a voice actor and tell them to imitate someone else.
- It's very likely not fine to market your AI as "sounds like Jane Doe, who sounds like SJ".
- It's definitely not fine to market your AI as "sounds like SJ".
Say I wanted to make my AI voice sound like Patrick Stewart. Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising. If so, would it have been OK for OpenAI to do all this as long as they didn't mention SJ? Or is SJ so clearly identifiable with her role in "Her" that it's never OK to try to make a product like "Her" that sounds like SJ?
Most of those "rights of publicity" don't apply in the US though. If they did, half the actors in movies couldn't get work, because quite a few of them get work as the off-brand version of X in the first place.
Off the top of my head I can't think of a single example of an actor who comes across as an "off-brand" version of someone else. What are some of the examples you have in mind?
I don't think this has anything to do with this case, but there certainly are some "types" in Hollywood with doppelganger actors.
Isla Fischer / Amy Adams
Margot Robbie / Samara Weaving / Jaime Pressly
Justin Long / Ezra Miller
Not to mention all of the token/stereotype characters where it hardly matters who the actor is at all. Need to fill the funny fat lady trope? If Rebel Wilson wasn't available, maybe they can get Melissa McCarthy.
The voice from Her isn't even the first voice I'd think of for a female computer voice. That trope has been around for decades. I'm sure OpenAI just wanted SJ specifically because she's currently one of the most popular celebrities in the world.
Another one you didn't mention: until relatively recently, I thought actors Elias Koteas (Exotica) and Christopher Meloni (Law & Order: Special Victims Unit) were the same person!
This is where your post breaks down. Many people say they don't think the voice sounds like SJ. Others do. But it appears you've made up your mind that they deliberately emulated her voice?
There's no clear line for this. To reach a definitive conclusion, you'd need to bring this to court, with lots of investigation. I know this kind of ambiguity is frustrating, but the context and intention matter a lot here, and unfortunately we don't have a better way than a legal battle to figure it out.
Thanks to Sam, this OpenAI case is clearer than most, since he created a good deal of clear evidence against himself.
Sure. And I'm not necessarily concerned about the outcome of the seemingly inevitable lawsuit. I'm more interested in calibrating my own moral compass here. If I were asked to participate in this, where would my comfort zone be?
I'll defer to a judge and jury about the legalities. As you noted, Sam gave them a lot of help.
My own thought is: every artist in history has taken inspiration from previous artists. The British voice actor in the above example surely studied greats in his field like Sir Patrick. We don't typically mind that. Where I think the line falls, between standing on the shoulders of giants and devaluing someone else's art, is how well digested, for lack of a better term, the inspiration has been. Is the later artist breaking the components out, examining each, and assembling them with the insights they gained? Or are they seeking to resemble the end result as much as possible? I think that separates the cheap hack from the person who "steals like an artist".
When it comes to whether something is "wrong", in general intent matters a lot and what they did was communicate an obvious intent. There are certainly ways they could have avoided doing so, and I'm not sure I understand the value of trying to dissect it into a dozen tiny pieces and debate which particular detail pushes it over the line from ambiguous to hard-to-deny? Maybe I don't understand what kind of clarity you're trying to achieve here.
This particular area of law, or even just this type of "fairness", is by necessity very muddy. There isn't a set of well-defined rules you can follow that will guarantee an outcome where everyone is happy no matter what; sometimes you have to step back and evaluate how people feel about things at various steps along the way.
I'd speculate that OAI's attempts to reach out to SJ are probably the result of those evaluations - "this seems like it could make her people upset, so maybe we should pay her to not be mad?"
What if they hired 10 different voice actors with no intent to find someone like SJ, but one voice actor actually did sound like the voice from Her, so they liked it the most and decided to go with it? And what if only after the fact did they realize it was quite similar to SJ in general, decide to reach out to her, and go along with the Her idea due to the obvious similarities?
Such a situation strains credulity. Voice acting is a profession, and professionals by their nature are aware of the wider industry, including the competition. SJ was the world's highest-paid actress in 2018-2019. The film Her was nominated for 5 Academy Awards, including Best Picture, and won Best Original Screenplay.
Even if this did go down the way you suppose, once they realized the obvious similarities, the ethical thing to do was to not use the voice. It doesn’t matter if the intention was pure. It doesn’t matter if it was an accident.
>If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?
Taking the recent screwup out of the equation... it's tough. A commercial product shouldn't try to associate itself with another brand. But if we're being realistic: "Her" is nearly uncopyrightable.
Since Her isn't a tech brand, it would be hard for some future company to get in trouble based on that association alone. Kind of like how "Skynet" could in theory have been taken by a legitimate tech company, and the Terminator IP owner would struggle to seek reparations (to satisfy curiosity: Skynet is a US government program, so that's already taken care of).
As long as you don't leave a trail, you can probably get away with copying Stewart. But if you start making Star Trek references (even if you never contacted Stewart), you're stepping into hot water.
This is specifically covered in cases like Midler v. Ford, and legally it matters what the use is for. If it's for parody/etc it's completely different from attempting to impersonate and convince customers it is someone else.
Midler v. Ford is a bit different from the ChatGPT situation in that it was about a direct copy of a Midler song for a car ad, not just a similar-sounding voice saying different stuff.
Sure, you can't sell light sabers. Can't you use a Darth Vader voice impersonator to sell vacuums? What about a voice that sounds like generic background actor 12?
> It's fine to hire a voice actor who sounds like someone else.
Not necessarily, when you're hiring them because they sound like someone else, especially someone else who has said that they don't want to work with you. OpenAI took enough steps to show they wanted someone who sounded like SJ.
> Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising.
> If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?
No.
The fact that it sounds very much like her and it is for a virtual assistant that clearly draws a parallel to the virtual assistant voiced by SJ in the movie (and it was not a protected use like parody) makes it not OK and not legal.
> Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising.
Nope.
If you made it sound identical to Patrick Stewart that would also likely not be OK or legal since his voice and mannerisms are very distinctive.
If you made it sound kind of like Patrick Stewart that is where things get really grey and it is probably allowed (but if you're doing other things to draw parallels to Patrick Stewart / Star Trek / Picard then that'd make your case worse).
And the law deals with grey areas all the time. You can drag experts in voice and language into the court to testify as to how similar or dissimilar the two voices and mannerisms are. It doesn't nullify the law that there's a grey area that needs to get litigated over.
The things that make this case a slam dunk are that there's the relevant movie, plus there's the previous contact to SJ, plus there's the tweet with "Her" and supporting tweets clearly conflating the two together. You don't even really need the expert witnesses in this case because the behavior was so blatant.
And remember that you're not asking a computer program to analyze two recordings and determine similarity or dissimilarity in isolation. You're asking a judge to determine if someone was ripping off someone else's likeness for commercial purposes, and that judge will absolutely use everything they've learned about human behavior in their lifetime to weigh what they think was actually going on, including all the surrounding human context to the two voices in question.
A random person's normal speaking voice is nobody's intellectual property. The burden would have been on SJ to prove that the voice actor they hired was "impersonating" SJ. She was not: the Washington Post got her to record a voice sample to illustrate that she wasn't doing an impersonation.
Unless and until some other shoe drops, what we know now strongly (overwhelmingly, really) suggests that there was simply no story here. But we are all biased toward there being an interesting story behind everything, especially when it ratifies our casting of good guys and bad guys.
If "Her" weren't Sam's favorite movie, and if Sam hadn't tweeted "her" the day it launched, and if they hadn't asked SJ to do the voice, and if they hadn't tried to reach her again two days before the launch, and if half the people who first heard the voice said "Hey, isn't that SJ?" -
Then I'd say you have a point. But given all the other info, I'd have to say you're in denial.
> the Washington Post got her to record a voice sample
Actually it only says they reviewed "brief recordings of her initial voice test", which I assume refers to the voice test she did for OpenAI.
The "impersonating SJ" thing seems to be a straw man someone made up. The OpenAI talent call was for "warm, engaging, charismatic" voices sounding 25-45 years old (I assume SJ would have qualified, given that Altman specifically wanted her). They reviewed 400 applicants meeting those filtering criteria and, it seems, threw away the 395 that didn't remind Altman of SJ. It's a bit like natural selection and survival of the fittest: take 400 giraffes, kill the 395 shortest ones, and the rest will all be tall. Go figure.
You’re right that a random person’s voice is not IP, but SJ is not a random person. She’s much more like Mr. Waits or Ms. Midler than you or I.
I don’t believe the burden would be to prove that the voice actor was impersonating, but that she was misappropriating. Walking down the street sounding like Bette Midler isn’t a problem but covering her song with an approximation of her voice is.
You are dead right that the order of operations recently uncovered precludes misappropriation. But it’s an interesting situation otherwise, hypothetically, to wonder if using SJ’s voice to “cover” her performance as the AI in the movie would be misappropriation.
> You are dead right that the order of operations recently uncovered precludes misappropriation.
I don't think that follows. It's entirely possible that OpenAI wanted to get ScarJo, but believed that simply wasn't possible so went with a second choice. Later they decided they might as well try anyway.
This scenario does not seem implausible in the least.
Remember, Sam Altman has stated that "Her" is his favorite movie. It's inconceivable that he never considered marketing his very similar product using the film's IP.
"We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."
Ford having multiple ads would not have changed the determination.
The voice doesn’t sound like her and the article shows there’s a fair amount of proof to back up the claim that it wasn’t meant to and that there was no attempt to imitate her.
And yet so many people think it does. What a weird coincidence.
> there’s a fair amount of proof to back up the claim that it wasn’t meant to
Sam has said his favorite movie is "Her". Sam tweeted "her" the day it was released. Sam wrote to ScarJo to try to get her to do the voice. OpenAI wrote to her two days before the release to try to get her to change her mind. A large percentage of the people who hear the voice thinks it sounds like ScarJo.
I think we're just going to have to agree to disagree about what the evidence says. You take care now.
I interpret it completely differently given that the voice actor does not sound like SJ.
1. OpenAI wants to make a voice assistant.
2. They hire the voice actor.
3. Someone at OpenAI wonders why they would make a voice assistant that doesn’t sound like the boss’s favorite movie.
4. They reach out to SJ, who tells them to pound sand.
Accordingly, there is no misappropriation because there is no use.
The voice is different enough that anyone who listens to samples longer than 5 seconds side by side and says they can’t tell them apart is obviously lying.
All the reporting around this I’ve seen uses incredibly short clips. There are hours of recorded audio of SJ speaking and there are lots of examples of the Sky voice out there since it’s been around since September.
It doesn't even need to sound like the person. It's about the intent. Did OpenAI intend to imitate ScarJo?
Half the people of the world thinking it's ScarJo is strong evidence that it's not an accident.
Given that "Her" is Sam's favorite movie, and that he cryptically tweeted "her" the day it launched, and that he reached out to ScarJo to do the voice, and that the company reached out to her again to reconsider two days before the launch -
I personally think the situation is very obvious. I understand that some people strongly disagree - but then there are some people who think the Earth is flat. So.
I don't think the intent matters (though it's moot in this case because I think there is clear intent): If someone unknowingly used a celebrity's likeness I think they would still be able to prohibit its use since the idea is that they have a "right" to its use in general, not that they have a defence against being wronged by a person in particular.
For example, if someone anonymously used a celebrity's likeness to promote something, you wouldn't need to identify the person (which would be necessary to prove intent) in order to have the offending material removed or prevented from being distributed.
"We hold only that when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."
The passage you cited reads "we hold only that when" (compare with "we hold that only when") which I understand as that they are defining the judgment narrowly and punting on other questions (like whether the judgment would be different if there were no intent) as courts often do. In fact the preceding sentence is "We need not and do not go so far as to hold..."
It might make sense for intent to be required in order to receive damages, but it would surprise me if you couldn't stop an inadvertent use of someone's likeness. In fact the Midler opinion cites the Motschenbacher case: 'The defendants were held to have invaded a "proprietary interest" of Motschenbacher in his own identity.' I think you can invade someone's "proprietary interest" inadvertently just as you can take someone's property inadvertently; and courts can impose a corrective in both cases, in the first by ordering that the invasion of the proprietary interest be stopped and in the second by returning the taken property.
No. (I did cite the Midler v. Ford statement about "proprietary interest", which I think supports my argument.)
I'm not familiar with all the case law but I assume that no case has been brought that directly speaks to the issue but people can and do discuss cases that don't yet have specific precedent.
I don't think that's true. I can't cite them off the top of my head but when I read about supreme court cases often a big point of contention of the ruling is whether they decided to issue a narrow or broad ruling. Sometimes they decide to hear a case or not based on whether it would serve as a good basis for the type (narrow/broad) ruling they want to issue.
And the legal universe is vast with new precedent case law being made every year so I don't think the corpus of undecided law is confined to well known legal paradoxes.
As for this case it doesn't seem that odd to me that the issue of intent has never been at issue: I would expect that typically the intent would be obvious (as it is in the OpenAI case) so no one has ever had to decide whether it mattered.
That's a different standard: "Can you tell them apart side-by-side" vs. "does this sound like person X" or "is this voice exploiting the likeness of person X". It's the latter question that is legally relevant.
I cannot read the article because of its paywall. Is there actual proof OpenAI reached out to Johansson, or is it just being alleged by her lawyers?
It seems she has every reason to benefit from claiming Sky sounded like her even if it was a coincidence. "Go away" payments are very common, even for celebrities - and OpenAI has deep pockets...
Even so, if they got a voice actor to impersonate or sound similar to Johansson, is that something that's not allowed?
Johansson is a super successful actress and no doubt rejects 95% of roles offered to her, just as she rejected Altman's request to be the voice of ChatGPT.
She doesn't need "go away" payments, and in any case that is not what we're looking at here. OpenAI offered her money to take the part, and she said no.
According to celebrity net worth website, SJ is worth $165M.
> Johansson is a super successful actress and no doubt rejects 95% of roles offered to her
> She doesn't need "go away" payments
> According to celebrity net worth website, SJ is worth $165M.
I have no idea what Johansson's estimated net worth or her acting career has to do with this. Wealthy people sue all the time for all kinds of ridiculous things.
The voice is, in fact, not Johansson's. Yet it appears she will be suing them nonetheless...
It's not illegal to sound like someone else - despite what people might be claiming. If it turns out to be true that Sky's voice actor was recorded prior to the attempted engagement with Johansson, then all of this is extra absurd.
Also, Sky doesn't sound like Johansson anyway... but apparently that isn't going to matter in this situation.
Midler v. Ford Motor Co. would disagree. There is viable legal precedent, and it is very arguably illegal.
> The appellate court ruled that the voice of someone famous as a singer is distinctive to their person and image and therefore, as a part of their identity, it is unlawful to imitate their voice without express consent and approval. The appellate court reversed the district court's decision and ruled in favor of Midler, indicating her voice was protected against unauthorized use.
They’d have to prove the voice actor was imitating SJ. If OpenAI recorded the entire interview process, as well as the sessions where they recorded the actor for samples to steer the model, then it should be open and shut. There’s also the fact of the 4 other voices. Who are they meant to be?
If her lawyers are half competent, then they wouldn’t lie. They may not tell the whole truth, but we’re not discussing what wasn’t said here.
As for your second question, yes. Otherwise you have a perfect workaround that would mean a person likeness is free-for-all to use, but we already decided that is not acceptable.
> Otherwise you have a perfect workaround that would mean a person likeness is free-for-all to use, but we already decided that is not acceptable
That is not decided. There have been high-profile cases where someone's likeness was explicitly used without permission, and they still had no recourse. It was argued the person was notable enough that they could not protect their likeness.
Regardless, it appears debated if Sky even sounded like Johansson, which will make this very difficult for anyone to argue (being subjective and all). If the Sky voice actor was recorded prior to engaging with Johansson (which has been claimed by OpenAI), then it seems even more difficult to argue.
In the end, this will net Johansson a nice "go away" payday and then everyone will forget about it.