Morning Light: Clarify

(I wrote enough words in this morning’s missive that I decided to crosspost them to ye olde blog. Not to mention that today’s topic is relevant to the public interest. Feel free to subscribe if you want a morning photo, a few musings about cats, art, writing, and fencing, and the occasional rant. A snippet of hospitality chez moi, as it says on the tin.)

Good morning!

I bought a rapier from Castille Armory a few years back, so I’m on the email list they put together for a new venture: a magazine called Quill & Quillon, covering HEMA, the SCA, academe, and other ways historical weapons are used and studied today. Anyway, yesterday a blast announcement for the upcoming first issue arrived in my mailbox, followed later that evening by a blank blast email titled “An Apology,” and then by a third email containing the apology the owner had actually intended to send.

The apology was for the banner image used in the announcement, which showed a group of young men in an old city square, dressed in colorful landsknecht outfits with swords, taking a selfie. It was done in that uncanny-valley style between line art and photography that you see a lot of nowadays. This was AI-generated art. The owner of Castille explained that he had hired a marketing team to do some of the promotion work, and was horrified when he opened the email and found AI art in it. He pulled the image from the blast, and in his apology reported that he had written personally to everyone who responded in dismay about the image’s inclusion. He finished by saying he would either set a boundary he hadn’t known he needed to set with the marketing team, or find another team altogether.

This came on the heels of the announcement by Automattic that they had signed a deal with certain AI companies to scrape WordPress and Tumblr blogs that don’t opt out, feeding the material to their machine learning systems for training. In his post about the news, John Scalzi linked this Vox video as a basic explainer of the various forms of AI and how ubiquitous they’re becoming. (It’s a decent explainer, but it’s also sponsored by Microsoft, so there’s a pitch midway through for their workplace machine learning tool.) The video doesn’t take up the question of the arts, but that is of course what most of us who make art are concerned about. These machine learning systems are being trained on words we wrote, pictures we painted, research we toiled at, and performances we gave with our own breath, so that they can produce the same for their owners without making any of those pesky demands for things like food and shelter and livelihood. An awful lot of money is being exchanged over the heads of us “creators” here, and we won’t see a red cent.

By the way, I’m really coming to loathe “creator” as a term almost as much as I hate the word “content” for what we produce. I know, we all want one term of few syllables to describe people who sell things they made on the internet. But [lowering my Jeff Goldblum sunglasses] have you considered whether we should have an umbrella term at all? As the Vox video points out, the internet isn’t just a casual place people go for discrete purposes. It is now organically part of the public square, where humans gather. Authors sell words; video essayists sell essays; musicians sell music; visual artists sell paintings &c. All humans enjoy art; some also make it; of those, some also sell it. But instead, we have this economy where “creators” make “content” and “consumers” consume it, and the “content” is bait for the “consumers” so that their eyeballs will be forced to touch ads. So far from being human, we are humus, an acre of dirt in which other people grow money.

This is a terrible thing to be teaching machines. Artists already didn’t make a proper salary; now the owners of the machines begrudge us even passing the hat.

My quotidian fare is a thing I provide for free, but it isn’t public domain. If money changes hands over it, one of those hands ought to be mine. I’m just saying.
