You may have seen recently that OpenAI, creator of ChatGPT, has been sued by The New York Times and multiple other media outlets, as well as authors such as George R.R. Martin, Jonathan Franzen, and Michael Connelly, to name a few. It turns out, shockingly enough, that all of these AI tech companies have been stealing from us. Congress has sided with the media industry: according to Wired, lawmakers are backing the call to force tech companies to pay for the data they skim off the internet to train their AI algorithms. Thank god. We can only hope that MidJourney and the rest of the tech-company thieves preying on hardworking artists go down with them, as hundreds of artists are suing MidJourney, DeviantArt, and Stability AI for wrongfully training their machines on artists' work.
“Waaah, waaah,” cried the tech companies. “This would surely destroy the AI industry for good! How could we possibly train our theft matrix without stealing?” wailed OpenAI and numerous other crybabies. The answer is surprisingly simple, OpenAI CEO Sam Altman (I know you’re reading this; you skimmed it with your machine): pay the fuck up. As we’ve noted in previous articles here at The Sentry, algorithmic data assemblage is exactly that; it isn’t, and never was, original “art” or “writing,” only approximations based on a pool of existing work. ChatGPT takes words from around the net, puts them in a blender, and looks unstolen until you realize it’s just rephrasing existing sentiments (including their biases, such as racist, made-up ideals) and puking them back up in a stew of human-looking patterns. The same goes for AI “art”: typing in “Mona Lisa” does not yield an original portrait; you can literally watch it stitch Google results into basically the exact same image. The same is true of any AI content generation.
This makes what AI art and writing is actually being used for even more unsettling: a list of hundreds of artists whose work MidJourney used without their consent has emerged, and what these machines store is nothing short of horrifying. Yes, the promised “Torment Nexus” has arrived. The machine can now wear the skin of deceased artists, recreate explicit photos of children (according to Futurist), churn out illegal facsimiles built from your health records and private photos, and even deep-fake parrot your voice back to you!
So yes, AI techbros, you had a chance to make this a thing that would benefit mankind, maybe even automate some of the most soul-crushing parts of our existence. Instead, you chose to mechanize the things that made us human. You know how a creator thinks deeply about what something means to them when they write, draw, or film? “No,” you thought, “we just need content. That’s all that matters.” Well, techbro, we’re here to put you back in your place. In the words of Richard Blumenthal, the Democrat who chairs the Judiciary Subcommittee on Privacy, Technology, and the Law that held the hearing: “It’s not only morally right, it’s legally required.” Data scrapers, prepare to be sued. Go hang out with the NFT losers in the corner with your dying techno-fads.