Generative AI — things like ChatGPT and Midjourney — is having a moment. A lot of people are infuriated at the way these systems take real people’s work, use it for training, and then turn around and spit out work in the style of the work they were trained upon. Or plagiarize the work of the people whose work they used for training. Magazines are being inundated with stories “written” by these generative AI tools. We have seen companies use AI art for book covers and other work, depriving people of paid work they would otherwise have. All of these are important issues, but they are also the least interesting questions surrounding these technologies. Because our society is so broken, however, they are the only questions that matter.
And these questions do matter. The companies that make these tools are making money based on the training material unwittingly supplied by writers and artists. These companies openly admit that they could not do what they do without this material. Yet they neither acknowledge nor compensate the people who created the material they depend on to make their products even as those products threaten the livelihood of the very people who make them possible.
Magazines so far haven’t purchased stories written by AI tools — the tools do a poor job of creating actual art — but the flood of AI-generated submissions has forced at least one magazine to stop accepting submissions temporarily and others to limit theirs. This, in turn, means that there are fewer opportunities for unknown authors to get known. And that makes it more likely that writing for a living will remain the domain of the privileged, even more than it already is. Magazines will shrink their slush piles or get rid of them entirely and rely instead on connections and MFAs or other expensive, hard-to-obtain credentials.
Artists and writers can already have their styles aped by these tools. At some point, possibly in the near future, they may find that an AI can produce their next work for them — meaning they don’t need to be paid for that work.
Chatbots are already being used in search engines, and they are already producing terribly wrong answers. What does it mean that companies can profit off these wrong answers? What does it mean when the chatbots can produce realistic-sounding disinformation that drives engagement and thus ad money, despite the social harm that disinformation does?
These are all very real problems, but they aren’t, as I said earlier, the interesting questions. Despite my disdain for the concept of artificial intelligence (it simply does not exist), individual tools can have interesting applications. Are there ways that generative AI can be leveraged by artists to create more art? Can these tools be like word processors in that way? Can they be like CGI — a tool that opens up the kinds of art that can be created, allowing artists to do new and interesting things that they couldn’t really do before, enriching the artistic commons? Can these chatbots be evolved to the point where they could help people who don’t have access to regular human contact be less lonely? Is that even a good idea? We don’t have enough doctors and medical professionals in general. Could chatbots, properly trained, live up to the old promise of expert systems and guide people through their initial intake into the medical system? (I should stress that these are long-term questions — we can’t even build soap dispensers that turn on for African American hands. I am not willing to turn over medical decisions to these systems just yet.) Can these tools make programming more accessible and secure, or at least more efficient? These tools could provide real benefits to society, but we cannot even begin to morally approach those areas because of the immoral system we live under.
Generative AI’s immediate future is almost certainly one of destruction. Given who owns these systems and given the current economic incentives, it is almost certain that people who write, draw, code, and interact with people for a living are going to find themselves under immense economic pressure in the near future. And because of that, instead of looking at the potential benefits of these tools for everyone, we will be forced to decide who these tools immiserate and who these tools grant economic and thus political power to. It will be a war, even if it is not acknowledged as such, and the result is likely to be far more people pushed to the margins of society, far more people pushed away from a comfortable life, and far more people pushed into either political despair and despondency or political radicalism and violence.
It doesn’t have to be this way. We are quite literally the richest society that humanity has ever produced, by quite literally every measure of human well-being. Literally, not one person in our society need go without education, food, housing, or medicine. That they do is a choice — a choice based on the false idea that if we do not hold the prospect of death and misery over people’s heads, they will not contribute meaningfully to society. That people are inherently selfish and lazy. This is, of course, not true. People help others in times of emergency. They contribute to their communities of their own free time and free will. They build and create and teach when given the space and the time. People, by and large, are social creatures who inherently, almost instinctively, create and nurture communities. Some few don’t, it is true, but building a punitive system based on the outliers serves only to retard progress for the majority of us in the service of elevating a fortunate few.
Generative AI is the perfect example of this unfortunate dynamic. The arguments around generative AI show how broken our society is. We pretend that people get what they deserve and that the people on top, of course, deserve to watch the people below suffer. It is counterproductive, counter to what we know of basic human nature, counter to the common good, and flatly immoral. We could be having a discussion about whether and how these tools could change our society for the better. Instead, we must fight about who these tools will grind into the dust.