AI is Inherently Evil
Parts 1 + 2
Preface:
I wrote this essay during the summer of 2023 and have been revising it incrementally in the months since. It is intended for a wide audience—beyond people who know me or subscribe to this newsletter—and isn’t a piece of creative writing or illustration work, so it’s a bit different from my usual essays and announcements here.
1. Introduction
Seriously? Another article about AI? Okay, this one looks pretty angry: alt-right extremism, neurotic liberal elites, plagiarism, copyright, automation stealing jobs, ethical guidelines and regulations, monopolies, poisoning, Google is making us stupid, etc.
Believe me, I sympathize. I, too, am sick of reading the same articles and having the same conversations over and over. The positions I hear on AI usually fall somewhere between "I've married my AI girlfriend"1 and "It's stealing my job." The more delusional will mention sentience, or divinity. Many older, established artists I've talked to believe AI is just a tool, like photography, a temporary but survivable shock to creative industries. The most negative articles about AI stick to mocking its errors2, worrying about the economy3, or pointing out dangerous use cases4.
This essay, which advances a much more polemical stance, draws on four precedents: apocalypticism5, accelerationism6, technique7, and enshittification8. While I think the original authors of these theories arrived at essentially the same conclusions as mine, their arguments are relatively opaque and abstract. This essay unifies their theses and extrapolates results. In other words, I'm not claiming to be some thoughtful thinker thinking thoughts no one has ever thought before. I just want to be able to communicate my position to a general audience that may not be familiar with these theories of technology.
I intend this essay to be highly procedural and systematic, starting with simple assumptions and working from there. I am not a semiotician or a philosopher; if you are one, you may find the next two sections of this essay tedious, for which I apologize and ask that you bear with me. I cite sources where necessary, but only to clarify abstract ideas, not to prove a point with evidence. The panic about “mental disorders in young people”9 or the "revelations" from tech insiders are just samples, easily refuted with other samples. Worse, they are old news. Even the most avid tech fans joke about how evil Big Tech is. What's important is why they are evil, and what the end result will be.
I am not interested in the killer AI theories espoused by the likes of Elon Musk or Eliezer Yudkowsky10. These people confuse machine learning with consciousness, suggesting that current AI development will somehow lead to a computer god; they also generally claim that AI can do great things, but "only with proper safeguards." They worry that AI will not like humans, but AI would have to know that humans exist in order to dislike us. My concern is not how AI feels about us, but how we feel about AI.
One last point: some sympathetic readers, especially those who actually know me, might wonder why I would spend time on this essay instead of, say, promoting the authors I name as inspirations, working on my art projects, or debating AI users one-by-one. The answer is that my particular theories about tech, which I see as commonsense and reasonable, strike most as childish or absolutist. This is why I have written a clear, exacting, point-by-point manifesto to lay out my reasoning. This is the most comprehensible I can make myself.
2. Traditional Information Architecture
Information consists of two parts: form and reference. "Form" is the medium used to make information intelligible to users, whether that be speech, printed images, or digitized files. "Reference" is the reality11 external to the user that form describes through abstracted metaphor. Science papers reference experimental data; novels reference the author's opinions, memories, and perceptions. Without form, information could not be stored or transferred; without reference, there would be no information to transfer.
Reference does not necessarily refer to reality accurately. Without falling into the favorite pastime of sophist adolescents and philosophy majors, that of "defining" "truth," let's say that information contains reference that represents reality, reference that fails to represent reality, or a combination of the two. For example, the novel Feed, though set in a fake world filled with fake people, references real patterns in technological, capitalist societies. The fake objects are not reference; they are form12. As a counterexample, social media platforms like 4chan, Gab, and X teem with conspiracy theories presented as fact, but without any reference to actual events.
Information can be created, stored, or deleted. It can only be created through conscious perception13. You can take automatic measurements with a thermometer or seismograph, but the numbers are just particles of mercury or ink until you look at them and preserve them. Bits14 only exist so long as they have storage locations. Data can only move between locations through a medium that exists independently of the locations, the user, and the data itself.
For the purposes of this essay, that medium is information technology, or, less specifically but more concisely, just technology15. Technology includes ink, video, and musical instruments, among many other things. It stores and transfers form, and form stores reference. A violin is technology, a sequence of notes is form, and the feeling conveyed by the music is reference.
Technology can imitate other technology, but the technology the user tangibly interacts with is the most limiting, and is therefore the determining factor in how the technology handles information. For example, the characters in Snow Crash communicate through virtual reality, spoken language, and a mind-virus, but these are just part of the form Snow Crash uses to reference Neal Stephenson's very interesting thoughts about religion and psychology16. The only material technology Snow Crash actually uses is printed text. Likewise, if you're reading Snow Crash as a digital PDF, the written prose still shapes how you receive Stephenson's ideas, but the only physical technology in play is the computer, and the ways you can use your computer limit how you can use the text, not the other way around.
This is the end of parts 1 and 2 of “AI is Inherently Evil.” I will publish each succeeding part every day from now until all are posted. Each part will link to each preceding part. I will also post a link to a complete-text version of this essay in the final post.
1. Because masculine AIs are apparently too threatening.
2. Nico Grant, "Google's A.I. Search Errors Cause a Furor Online," The New York Times, 24 May 2024, https://www.nytimes.com/2024/05/24/technology/google-ai-overview-search.html.
3. Brian Merchant, "Understanding the Real Threat AI Poses to Our Jobs," Blood in the Machine (Substack), 3 June 2024, https://www.bloodinthemachine.com/p/understanding-the-real-threat-generative.
4. David Gilbert, "Neo-Nazis Are All-In on AI," Wired, 20 June 2024, https://www.wired.com/story/neo-nazis-are-all-in-on-ai/.
5. Especially as espoused in M. T. Anderson's Feed, though he doesn't call it such.
6. A theory originating with Nick Land and taken up by his many successors.
7. A concept, unrelated to the English word "technique," from the French philosopher Jacques Ellul.
8. A word coined by Cory Doctorow, referring to what he calls "platform decay."
9. Which I call a “panic” not because it’s not real, but because the adults doing the panicking seem unconcerned by their own screen time and phone addictions.
10. Robert Evans, "The Cult of AI," Rolling Stone, 27 January 2024, https://www.rollingstone.com/culture/culture-features/ai-companies-advocates-cult-1234954528/.
11. I am going to use the word "reality" a lot; for this purpose, I define “reality” as the entire potentially observable material universe. "Potential" includes both known and unknown objects, "observable" means that, if a user is positioned to perceive the object, they must perceive it, and "material" means the object has observable effects on other objects. This isn't a perfect definition of "reality," but it's good enough for this usage.
12. When readers mistake form for reference, they are typically deemed either psychotic or religious.
13. Note that AI doesn't "create" information; it predicts it based on information people have created.
14. A bit is the conventional unit of information storage in computer science. I use the words "information," "bits," and "data" as interchangeable synonyms.
15. This is a non-normative definition that I make for simplicity's sake. For the purposes of this essay, hovercars, guns, and electric sheep do not count as technology, unless they collect and transmit data.
16. And his very xenophobic opinions about refugees and immigrants.