AI fuckery, part 1,323: Wherein Google shits on their golden goose
Google abuses their massive YouTube library to once again rat-fuck the content creators. News at eleven.
Look, I use some of the generative AI tools. In the right situation, for certain uses, they’re quite good.
Earlier this week, I got a demo of a service that translates video into many languages, and it did a flawless job, even syncing to lip motions.
I mean, so good that if you didn’t know it was dubbed with AI, you could not tell.
Alas, to do things like this requires models. Models trained with lots of data.
And that is where I want to go today. In the least-surprising news EVER, it turns out that Google’s new (also quite impressive) Veo 3 video model was trained on videos from the YouTube service.
“Creators say they didn’t know Google uses YouTube to train AI”
No shit, really? Google dipped into the exabytes of their video library, uploaded by people like me (yes, I have a handful of vids I made), and are using them to train their models?
Oh noes!
The tech company is turning to its catalog of 20 billion YouTube videos to train these new-age AI tools, according to a person who was not authorized to speak publicly about the matter. Google confirmed to CNBC that it relies on its vault of YouTube videos to train its AI models, but the company said it only uses a subset of its videos for the training and that it honors specific agreements with creators and media companies.
This is the least surprising thing ever, but also sleazy as fuck. It is also precisely what I expect out of Google; the impunity with which they treat customers and customer data knows no lower bound.
The company shared in a blog post published in September that YouTube content could be used to “improve the product experience … including through machine learning and AI applications.” Users who have uploaded content to the service have no way of opting out of letting Google train on their videos.
“It’s plausible that they’re taking data from a lot of creators that have spent a lot of time and energy and their own thought to put into these videos,” said Luke Arrigoni, CEO of Loti, a company that works to protect digital identity for creators. “It’s helping the Veo 3 model make a synthetic version, a poor facsimile, of these creators. That’s not necessarily fair to them.”
I mean, how could anyone complain about this?
Wait, we should all be upset about this. Alas, across the whole ZIRP era and the rise of the influencer economy (one of my co-workers has three high-school-aged kids, and two of them aspire to forgo college and just become influencers, because, c’mon, how hard could it be?), Google and others (cough - META - cough) have built an ecosystem of payments to condition influencers to build an audience and monetize it. Now they are sitting on this YUUUUUUGE pile of video-based content, in all languages, and they are itching to mine it to build yet another model, optimized and built off the sweat and labor of that same audience.
Lather, rinse, repeat.
Later in the article, there is a glimmer of hope:
“By providing Content to the Service, you grant to YouTube a worldwide, non-exclusive, royalty-free, sublicensable and transferable license to use that Content,” the terms of service read.
“We’ve seen a growing number of creators discover fake versions of themselves circulating across platforms — new tools like Veo 3 are only going to accelerate the trend,” said Dan Neely, CEO of Vermillio, which helps individuals protect their likeness from being misused and also facilitates secure licensing of authorized content.
Uh, just like with the proliferation of ChatGPT being used to cheat on writing assignments, this is going to be a monstrous game of whack-a-mole.
So, what about that EULA?
YouTube also allows creators to opt out of third party training from select AI companies including Amazon, Apple and Nvidia, but users are not able to stop Google from training for its own models.
Uh, no, you can’t opt out of Google ass-raping you for their own benefit. Neat, eh?
One day, either the courts will rule that what the trainers and purveyors of generative AI are doing (possible only because of massive amounts of copyright violation) is not OK, or we will end up in a place where intellectual property rights are pretty much null and void. Since the supremacy of the US economy largely rests on services and on respect for intellectual property, that will be the beginning of the end.
But at least our training videos will be available in English, French, Portuguese, Spanish and Mandarin.
Yay!
And, a late add: Fuck you, Google, for making me find common cause with Senator Josh Hawley:
“The people who are losing are the artists and the creators and the teenagers whose lives are upended,” said Sen. Josh Hawley, R-Mo., in May at a Senate hearing about the use of AI to replicate the likeness of humans. “We’ve got to give individuals powerful, enforceable rights in their images, in their property, in their lives back again, or this is just never going to stop.”
Yo, Sen. Hawley, you can strike that preemptive 10-year ban on any AI-related legislation from that homeless abortion of a bill, the OBBBA. K’thx-bye.
Yet another reason for me to feel no guilt for blocking ads on YouTube and using a VPN to opt out of their superfluous tracking cookies and other ad-related bullshit.
Intellectual property rights are worthless without distribution. No creator is owed distribution, any more than jackass right-wingers are entitled to not be banned from social media for spreading hate and lies.
Google owns the distribution. Or at least some chunk of it. I find it notable that plenty of content creators upload with zero hope of ever monetizing, just because they want to — and I suspect that’s because they find “distribution for my copyright” to be a fair trade.
Influencing was never a sustainable or socially positive business sector anyway. It was vapid, it clogged our society with a tidal wave of bullshit, and, as you mention, it has distracted our youth with a mirage of “easy” money for an absolutely pointless and absurd “career.” I’m not shedding a tear for that business model being eaten by its own distributor. People with successful influencer careers will get hurt, sure, but this looks less like an “ass-raping” to me than a snake eating its own tail.
Everyone else, meanwhile, will probably just keep using YouTube to upload the same hobby content they always have, except perhaps a little more responsibly and warily, and they might have to use a new generation of spam filters to ignore the slop. Plus ça change…