The who cares era
One of the latest generative-AI blunders, the recommendation of non-existent books in a “special supplement” of the US newspapers Chicago Sun-Times and Philadelphia Inquirer, generated yet another wave of criticism of the technology.
Dan Sinker dubbed the moment “the who cares era”:
The writer didn’t care. The supplement’s editors didn’t care. The biz people on both sides of the sale of the supplement didn’t care. The production people didn’t care. And, the fact that it took two days for anyone to discover this epic fuckup in print means that, ultimately, the reader didn’t care either.
It’s so emblematic of the moment we’re in, the Who Cares Era, where completely disposable things are shoddily produced for people to mostly ignore.
Dan focuses on AI, but I have to say the problem runs deeper and predates it. Supplements of this kind existed long before, and while slip-ups of this magnitude were rare, the fact that this one took two days to be noticed suggests that on the reader’s side nobody cares, and hasn’t cared for a long time, since well before the popularization of generative AI.
I find myself wondering how much stuff has already been printed not to be read, or at most to be read and ignored. Or, in the digital realm, how much content is published not to be read, to spark action, or to make people think, but simply to fill space and capture attention so it can be redirected toward ads and the like.
Rob Horning made this argument more thoroughly and elegantly, as he often does:
The fact that LLMs can generate endless amounts of explicitly “fake” copy with the traces of human intention and presence deeply diluted through countless layers of processing and concatenation could hopefully demystify not only that particular subject position that seeks safe harbor in “real texts” — i.e. an alibi in a “real supplement” for the dubious pleasures such supplements have always supplied — but also the fantasy of accessing perfect authenticity through media.