There’s nothing wrong with Large Language Models; I’m going to be combining one with a knowledge graph. The difference is that I’m fusing the LLM with curated information - things that were written by humans, and then collected by a human (me). I don’t need an all-knowing LLM, I just need one that can write tolerable English, which they all can do these days.
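The shape of that fusion can be sketched in a few lines. This is a minimal, illustrative sketch only - the graph contents, the triple format, and the prompt wording are all my own hypothetical stand-ins, not a description of any particular system. The point is that the facts are human-curated, and the LLM is only asked to phrase them:

```python
# A toy knowledge graph: human-written triples keyed by subject.
# Everything in here is illustrative, not real curated data.
KNOWLEDGE_GRAPH = {
    "LaBrea": [
        ("is", "a network tarpit"),
        ("was designed to", "slow down hostile scanners"),
    ],
}

def build_prompt(subject: str) -> str:
    """Assemble a prompt that hands curated facts to an LLM,
    asking it only to turn them into readable English -
    the model supplies prose, not knowledge."""
    triples = KNOWLEDGE_GRAPH.get(subject, [])
    facts = "\n".join(f"- {subject} {pred} {obj}" for pred, obj in triples)
    return (
        f"Using ONLY the facts below, write a short paragraph about {subject}:\n"
        f"{facts}"
    )

print(build_prompt("LaBrea"))
```

The prompt string would then go to whatever model you like; since the facts ride along in the prompt, even a modest model that merely writes tolerable English will do.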
I’ve run tarpits in the past - LaBrea back when I managed ISP infrastructure - and I’ve got some stuff now that poisons scraping. I haven’t done any of these AI-specific things; I’m headed in the direction of enabling humans, rather than disabling machines.
This nonsense, on the other hand, which I first mentioned in Ghost Jobs a couple weeks ago, is truly out of hand. I’m on the hunt for a new position or an anchor customer, and the only thing that is working is asking actual humans I know if they’ve seen anything that might fit me.
And if we, as a society, don’t get ahead of this, absolutely nothing means anything anymore. This is how you edit out everyone but the white males and eliminate episodes that don’t fit a white Christian nationalist narrative.
The Dead Internet began as a conspiracy theory, but it’s becoming an infuriating reality. I’m overdue to revisit Simulacra & Simulation. That book is now forty-four years old, and Jean Baudrillard departed this life eighteen years ago. I wonder what he’d make of the news of the day here in 2025.