I heard about C2PA and I don’t believe for a second that it’s not going to be used for surveillance and all that other fun stuff. What’s worse is that they’re apparently trying to make it legally required. It also really annoys me when I see headlines along the lines of “Is AI the end of creativity?!1!” or “AI will help artists, not hurt them!1!!” or something to that effect. So, it got me thinking and I tried to come up with some answers that actually benefit artists and their audience rather than just you know who.
Unfortunately my train of thought keeps barreling out of control to things like, “AI should do the boring stuff, not the fun stuff” and “if people didn’t risk starvation in the first place…” So I thought I’d find out what other people think (search engines have become borderline useless, haven’t they?).
So what do you think would be the best way to satisfy everyone?
AI needs data to train up on. It can’t create art without first consuming existing art and spitting out parts of the originals. There’s a reasonable claim to be made that AI synthesis of prior art is itself original enough to count as having intrinsic worth, but if the only way to get it is stealing other people’s work to train up your model, the whole value proposition of AI art is probably net negative, entirely at the expense of artists whose work was used to feed the model.
Yes, there’s the argument that automation of new things is inevitable, but we do have choices about whether the automated violation of copyright to feed the model is tolerable or not. Sure, it’s a cool, sexy technology and who doesn’t love getting on the bandwagon of the future and all, but the ethics of modern AI development are trash. Despite promises that automated AI labor will save the owner class money by doing for free what the plebes demand to be paid for, it’s really as much a Ponzi scheme as all those cryptocurrencies that have no intrinsic value unless enough suckers can be convinced to feed the scheme.
And yet, it’s a powerful technology that has the potential to be a legitimate boon to humanity. I’d like to see it used to do things (like picking crops that are hard to automate with dumb machines, cleaning trash off of beaches or out of the ocean, refactoring boilerplate code to stop using deprecated packages, or reviewing boilerplate contract text for errors) that aren’t just ways for owners to cut labor out of the economy and pocket the difference.
Perhaps, if we are going to allow AI to be a great big machine that steals inputs (like art, or writing, or code) from others and uses them to do for-profit work for its owners, the proceeds attributable to AI ought to be taxed at a 90%+ rate and used to fund a Universal Basic Income as payment for the original work that went into the AI model.