the machine's work is good enough
oops, we spent the last decade participating in a system that incentivizes robotic, low-quality "art"
here’s how to think about a hot new trend
One of the more formative, clarifying pieces I've ever read was a 2009 WIRED article about "The Good Enough Revolution," which made a compelling case that — when it comes to technology — we're always willing to compromise. The thesis was loosely based on the principle that 80% of a user's needs could be delivered with 20% of the effort or "input," as opposed to expending 100% effort to meet 100% of those needs. Basically, the Pareto Principle. “Cheap, fast, simple tools are suddenly everywhere,” Robert Capps wrote. “We get our breaking news from blogs, we make spotty long-distance calls on Skype, we watch video on small computer screens rather than TVs, and more and more of us are carrying around dinky, low-power netbook computers that are just good enough to meet our surfing and emailing needs.” He was mostly talking about consumer technology, but the idea now applies to the art and culture created with that technology as well.
Landing on "good enough" is usually done by compromising. We do this a bazillion times a day. If you use the internet in an even remotely normal capacity, you're committing mostly benign, occasionally serious violations of all shapes and sizes. All of these transgressions I am going to lump under the label of "crime" for simplicity's sake.
Screenshotting a JPEG? Crime. Using an ad blocker that might break a web page? Crime. Sandwiching a giant blockquote of someone else's writing between two sentences of your own? Crime. Reuploading someone else's video? Crime. Streaming a game on Twitch? Crime. Sharing, knowingly or not, information that's inaccurate? Crime. Banning someone? Crime. Circumventing a ban? Crime. Using a profile picture of yourself from 10 years ago? Crime. Posting a picture of someone without their permission? Crime. Embedding someone else's post from social media on another website? Crime. Violating copyright and then saying "no copyright intended"? Crime. Watching Tenet on a phone? Crime. Snitch-tagging? Crime. Sending a LinkedIn request to a guy you don't know just to get an internal referral? Crime. Deepfake? Crime. Sharing a Netflix password? Crime. Forwarding an email that contains the sentence "Confidential: Do not forward to unintended recipients." at the bottom? Crime. Dropshipping nebula lights at a 600% markup? Crime. Reblogging? Crime. Retweeting? Crime. Regramming? Crime. Screenshotting a big passage of someone's work for Twitter but not providing the link? Crime. Buying Steam keys on grey markets? Crime. Using one of those websites that lets you watch Instagram Stories without sending a read receipt? Crime. Selling shirts of someone else's art on Redbubble? Crime. Buying those shirts? Crime. Posting your private chats from a dating app for likes? Crime. Entering your birthday as January 1, 1900 on an age gate? Crime. Using a VPN to access content in another country? Crime. Listing your job on Facebook as "CEO at Me"? Crime. Softblocking? Crime. Starting a flame war? Crime. Using a retro game emulator even though none of those games are available via legal means? Crime. A barely functional MySpace profile that was a garish, unnavigable mess? Crime.
All of these are “crimes” to me because, conceivably, someone could raise a halfway-reasonable objection to them. They are actions that are deceptive, or unauthorized, or socially disagreeable, or make tasks ever so slightly more difficult for someone else, or may take earnings away from someone else who arguably deserves them. And yet all of this stuff is commonplace because the friction-light technology of the internet makes the ease of doing crimes outweigh the objections. Internet culture is born out of the fact that — regardless of what specific scale of right versus wrong you use — it's way, way easier to do the slightly wrong thing than it is to do the fully right thing. Every day, you wake up, and log on, and cut some corners in a way that would bug someone else.
Internet culture iterates rapidly because people rip each other off and violate boundaries every day in service of “good enough,” not in spite of that fact. It's constantly evolving because for every person with a novel, fun, great idea, there are a thousand others willing to copy that idea and do it 50% worse but 500% faster (stats approximate). Algorithmic ranking incentivizes this: “This was popular. Try doing it again.” Being creative on the internet means, if you have any sense of shame, constantly having to come up with new ideas because the low(er)-quality bootlegs are only a few steps behind.
The newest internet crime in service of a “good enough” product: generating text or images from one of the ever-increasing number of “AI” text and image generators out there.
Are there plenty of valid objections to machine-generated text and images? Absolutely. Can any of them overcome the fact that these works are already "good enough"? I don't think so. A couple of weeks ago, Ted Chiang wrote a piece for the New Yorker about the kind of text generated by large language models, comparing it to a blurry JPEG.
Think of ChatGPT as a blurry JPEG of all the text on the Web. It retains much of the information on the Web, in the same way that a JPEG retains much of the information of a higher-resolution image, but, if you’re looking for an exact sequence of bits, you won’t find it; all you will ever get is an approximation. But, because the approximation is presented in the form of grammatical text, which ChatGPT excels at creating, it’s usually acceptable. You’re still looking at a blurry JPEG, but the blurriness occurs in a way that doesn’t make the picture as a whole look less sharp.
It's a useful point of comparison. Even if a text generator puts out something partially or entirely false, it does so in a way that is grammatically correct and scans as highly authoritative, which is good enough to make people believe that it's true. It’s certainly good enough for SEO or algorithm chum.1 If I ask DALL·E to give me the Minions in the style of Van Gogh, it's probably going to get most of the way there.
This has led to a lot of handwringing. Some questions I've seen recently: What does this kind of text mean for the nature of truth? What does generated art mean for the future of creative fields? Will this change academia forever? Is Bing sentient? Will this technology eliminate the few journalism jobs left in the twenty-first century?
Here's my question: Who cares? I mean, I care in that the topic interests me, and I think writing is fun, and learning new stuff is easier for me when I read things that are distinctive (and accurate). But all of these questions are asking "Should we let the horse leave the barn?" while the people asking stand in front of an empty barn. If years of collective online activity are anything to go by, this stuff is good enough for everyone else. If ChatGPT is a blurry JPEG, it's worth pointing out that the vast majority of web users are totally fine with blurry JPEGs. I see them all over the place. (Sometimes, as I wrote in 2014, the blurriness is the point.) We love parlor tricks like SmarterChild and Akinator. Whether or not machine-generated text and art are "good" or “convincing” is not the relevant issue. The issue is whether they are “good enough.”
Clearly, to many people, it is good enough! Internet users are really skilled at convincing themselves that the convenient thing in front of them is the thing they want, whether that’s a barely coherent machine-generated article about what time the Super Bowl is or dubious footage of a preferred/reviled political candidate. As someone who used to get paid full-time to write whatever I wanted, I get how people can be exceedingly precious and protective of their role as writer or fact-purveyor. And I can similarly see how a computer popping out images of Donald Trump as Popeye the Sailor has sent a whole generation of boardwalk t-shirt hawkers into an existential crisis. At the same time, the pessimist in me doesn't think arts patronage is going to come back in a big way any time soon. The people claiming that everyone who uses machine-generated work would've otherwise hired a professional artist or writer sound a lot like the Hollywood execs who think every media pirate would've otherwise bought the movie or the album.
One thing I've been reminded of recently, in a line of work where the only writing I do is code comments, commit messages, emails, proposal documents, and slide decks, is that originality is not necessary. In fact, it might even be detrimental. Writing, for most people, is not a job; it's just a thing you have to do in the course of a different, actual job. Boilerplates exist for a reason2. Sure, I wince a little when I see a reply-all in the vein of, "Thanks for sharing these learnings! It's so important to amplify our impact when connecting with our partners and stakeholders" — but I also understand that coming up with unique text is almost always wasted effort. Exponentially more of this type of writing is done manually by people all over the world on a daily basis than is, say, a longform narrative investigation or a funny blog post.
In the same way that memes provide ready-made templates, shortcuts to sounding witty and relevant, machine-generated works provide a shortcut to competent writing for people who only write as a means to an end. Or who need art for a PowerPoint and not a gallery showing. Should we instead address the systemic and structural issues that have created a corporate culture of tedious or redundant work, rather than inventing robots to produce even more of said work? Yeah, probably3. There are few things more on-the-nose than a story like “School apologizes for using ChatGPT to write email about mass shooting.” But I’m only one dude with an occasional newsletter, and catalyzing broad societal shifts is not my wheelhouse.
Plus, it would be too easy to throw some terms like “late capitalism,” “billionaires,” “profits,” “clickbait,” and “engagement” into my bingo cage4, spin it around, and attribute the current wave of chatbot fever to whatever sentence falls out. Blaming a small cabal of powerful figures for forcing AI on the unwilling masses is, at the very least, a rhetorical cop-out, and at most, outright wrong. The Midjourney Discord server has 13 million members. I’d wager that relatively few are titans of industry or get-rich-quick guys, and that most just think it’s really funny to put “Naruto at Yalta” into a magic box and see it conjured into existence.
I just think it’s worth reiterating that the story of internet culture recently has not been one of austerity or moderation. It’s about taking the easy route and flooding the zone with the same meme templates and TikTok sounds everyone else is using at regular intervals — as opposed to things that are creative and unique and, well, good. This has been true for years: consistency over quality is a winning strategy in terms of audience growth. All of the stories I read about content creator burnout are about how exhausting and awful it is to have to post so often, rather than about what most artists have traditionally struggled with throughout most of human history: being in a creative rut. To me, that's extremely telling.5 A flywheel system that encourages this type of brainless output incentivizes the proliferation of automated tools that let people continue to pump out at-best-mediocre stuff while shirking responsibility for what's actually generated. So I see the twisted appeal of the shortcuts, and I'm no more aghast about them than about anything else I’ve seen over the last decade. The posters have been sleepwalking for a very long time.6
So this stuff bugs me, but in a steady, evolutionary way, and not a cataclysmic, revolutionary one. Professional writers aren't the only people who have to write; they teach everyone the three-point essay in elementary school. A machine-generated creation, even if it's a little blurry, even if it's a little awkward or formulaic, even if it’s inaccurate, has reached a fidelity good enough to get the larger point or feeling across. This stuff is “good enough” in a work context and a play context. I’d wager that for pretty much everyone who doesn’t write or paint for a living, that’s totally fine.
As for what we/you/I/society should do about this, if anything? Beats me.
with that out of the way, does anyone know where *my friend* can find a rip of the Neil Young bootleg tape, Catalytic Reaction?
1. A music site long thought dead has suddenly reappeared in my RSS reader with dozens of suspect articles every day for the past few weeks. They have titles like “100 Greatest Songs from 1948s.” (“The year 1948 was a pivotal year in history, marked by many significant events and cultural shifts. It was the year that the Universal Declaration of Human Rights was adopted, and it was also the year that marked the beginning of the Cold War.”) I can’t find the names in the byline anywhere else online and they don’t have avatars.
2. Some boilerplates even get reused so much that they can end up further enhancing the original message.
3. Yeah, I’ve been reading Graeber recently: “So the more automation proceeds, the more it should be obvious that actual value emerges from the caring element of work. Yet this leads to another problem. The caring value of work would appear to be precisely that element in labor that cannot be quantified.” (Bullshit Jobs: A Theory, 2018)
4. Little callback for fans of Brian canon.
5. A fun recent innovation in amassing and retaining an Instagram audience: posting gore. Definitely a sign of a healthy, worthwhile system that users are growth-hacking with footage such as “men getting run over by cars and trains,” “animals being shot, beaten and dismembered,” and “a video of a young child being shot in the head.” Increasingly difficult not to be an accelerationist these days.
6. There’s still plenty of good, cool stuff out there. It’s just relatively few and far between.