Last week Genna Bain, whose late husband was the game critic and YouTube personality John “Totalbiscuit” Bain, broke the news to fans that she’s contemplating removing all of his videos from the internet to prevent “AI” machine-learning technologies from storing and manipulating his voice to promote particular political and social views.
“Today was fun. Being faced with making a choice of scrubbing all of my late husband’s lifetime of content from the internet. Apparently people think it’s okay to use his library to train voice AIs to promote their social commentary and political views,” Bain wrote on Twitter.
Read More: AI Comic Art Gets Copyright Revoked After Office Learns How AI Works
A sympathetic and split community
John Bain, who died of cancer in 2018 at the age of 33, left behind a vast YouTube library of reviews, podcast interviews, and other videos that could conceivably serve as valuable source material for machine-learning systems trained on such data. While folks within the r/Cynicalbritofficial subreddit, an online community of Totalbiscuit’s fans, were sympathetic to the situation now faced by his widow, many felt that her effort to prevent AI voice-cloning tech from committing digital necromancy by purging Totalbiscuit’s videos from the internet would be in vain. Chief among their reasons is the age-old internet adage that once something is put online, for better or worse, it’s there forever. In other words, even if Bain deletes her late husband’s videos from his channel, they’ve likely already been archived and saved elsewhere, both by fans and, woefully, for AI training datasets.
“Technology is getting pretty scary. Like, this is identity theft on steroids, they don’t just steal your name but your face and voice too,” Reddit user iMogwai wrote.
“This is why we can’t have nice things,” fellow fan community member bers90 wrote. “These AI people are out of control. Her taking down everything will not stop these people as they probably have backed everything up already. It will remove TB from the internet and the ‘normal’ people with good intentions will not be able to get inspired by him, listen to his wisdom and generally chill out to his content.”
“I don’t think deleting the whole library would change much of this shitty situation,” wrote another member who goes by Existing_End6867. “TB’s voice is widely available in so many other places that the only effect scrubbing the YouTube channel would have…would be erasing his legacy to the detriment of all of us going there to remember him.”
Read More: Your Favorite Voice Actors Call Out AI Sites Copying Voices Without Consent
Although the U.S. Copyright Office recently ruled that AI-generated images are not eligible for copyright protection, that hasn’t stopped AI voice websites from copying and selling famous voice actors’ voices without consent. Last month, well-known video game and anime voice actors like Cowboy Bebop’s Steve Blum and the Mass Effect series’ Jennifer Hale posted a PSA to Twitter warning fans not to buy from any AI website that has copied and sold their voices, but to support real actors instead.
As AI voice technology becomes increasingly sophisticated, the ways people attempt to use it grow more varied, and sometimes more troubling. In a harassment campaign last month, for example, AI-generated sound clips posted to Twitter made actors’ voices utter racist and homophobic slurs and reveal private information, including the actors’ home addresses.
“Hey friends, I know AI technology is exciting, but if you see my voice, or any of the characters that I voice, offered on any of those sites, please know that I have not given my permission, and never will,” Blum tweeted. “This is highly unethical. We all appreciate your support. Thank you.”
Kotaku reached out to Bain for comment.