A bipartisan group of Representatives has introduced the “Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024” (NO FAKES Act) in the House of Representatives.
The bill would create a federal intellectual property (IP) right to a person’s voice and likeness.
A Senate version of the bill was introduced in July.
Numerous AI and entertainment-industry heavy-hitters, including OpenAI, The Walt Disney Company, Warner Music Group, the Authors Guild, the Recording Industry Association of America (RIAA), the Motion Picture Association (MPA), Universal Music Group, and SAG-AFTRA, came out in support of the Senate bill.
According to House bill sponsor Madeleine Dean (D-PA),
By granting everyone a clear, federal right to control digital replicas of their own voice and likeness, the NO FAKES Act will empower victims of deep fakes; safeguard human creativity and artistic expression; and defend against sexually explicit deepfakes.
The House and Senate bills come in the wake of a report from the US Copyright Office recommending a new federal law that would establish a kind of property right in a person’s digital replica, including their face, body, and voice.
The Report defines a “digital replica” as “a video, image, or audio recording that has been digitally created or manipulated to realistically but falsely depict an individual.”
Some existing state and federal laws address the issue, but there are gaps in coverage, as the Copyright Office Report notes:
State laws are both inconsistent and insufficient in various respects. As described above, some states currently do not provide rights of publicity and privacy, while others only protect certain categories of individuals. Multiple states require a showing that the individual’s identity has commercial value. Not all states’ laws protect an individual’s voice; those that do may limit protection to distinct and well-known voices, to voices with commercial value, or to use of actual voices without consent (rather than a digital replica).
Non-commercial uses, which many of these state laws do not reach, can include deepfake pornography.
For example, as the New York Times reported, Sabrina Javellana, who won a seat on the city commission in Hallandale Beach, Florida, in 2018 at age 21, was attacked by Internet trolls who used her face to make fake pornography. She had no legal tools to fight back.
Actor Tom Hanks recently took to Instagram to warn fans that deepfake videos appearing to show him were being used to promote miracle cures and wonder drugs. Last year, fake Hanks images were used to promote a dental plan.
As the New York Times reported in May,
Days before OpenAI demonstrated its new, flirty voice assistant last week, the actress Scarlett Johansson said, Sam Altman, the company’s chief executive, called her agent and asked that she consider licensing her voice for a virtual assistant.
Although she refused to give her permission, OpenAI used a voice that she claimed sounded “eerily similar” to hers.
Johansson voiced the unseen AI character in the 2013 Spike Jonze movie Her.
According to the Copyright Office,
A federal digital replica right should prioritize the protection of the livelihoods of working artists, the dignity of living persons, and the security of the public from fraud and misinformation regarding current events. For these purposes, a postmortem term is not necessary.
At the same time, we recognize that there is a reasonable argument for allowing heirs to control the use of and benefit from a deceased individual’s persona that had commercial value at the time of death. If postmortem rights are provided in a new federal law, we would recommend an initial term shorter than twenty years, perhaps with the option of extending it if the persona continues to be commercially exploited.
The NO FAKES Act goes further than the Copyright Office’s recommendation: it would create a “descendible and licensable property right that continues for 70 years after the individual’s death, even if it is not exploited during their lifetime.”
According to a poll from the Human Artistry Campaign, 85% of those surveyed believed “we need new guardrails to protect people from being exploited by AI.”