Future Stupiding: That Is the Goal

February 17, 2026

Another dinobaby post. No AI unless it is an image. This dinobaby is not Grandma Moses, just Grandpa Arnold.

I think it was 1979 when I was in Ellen Shedlarz’s office adjacent to a blue chip consulting firm’s library. Ms. Shedlarz said, “Know what this is?” as she pointed to a flat gizmo with a keyboard.

“Yes, that’s a version of a teletype terminal. I saw a version at IT&T a couple of years ago.”

“This is different. This is the future,” said Ms. Shedlarz. “What are you working on?”

I described a project involving food fabrication. She then asked me a number of questions. After five or six questions about converting soy bean paste into something that would sell as a snack to hungry teens, she typed into the device. After a minute or so, paper began to spew out of the slot in the back of the machine.


Ms. Shedlarz had just compressed into a few minutes at a keyboard several days of work in libraries, a number of telephone calls, and chats with food scientists and engineers scattered across the consulting firm’s global operations.

She explained that the gizmo connected to an online service. That service contained bibliographic information and abstracts of journal articles, conference papers, and other types of textual data. The information displayed matched the query she fed into the gizmo.

I asked, “Is the information accurate?”

She said, “Yes, I select specific online databases I know to have rigorous editorial standards. I don’t need your colleagues standing in my door shouting and yelling at me.”

Several points:

  1. A consummate professional selected specific sources she knew would be acceptable to the often out-of-control Type As at the blue chip firm.
  2. She performed a reference interview in order to use her training and on-the-job experience to get on-point information.
  3. She would review the output before handing it to one of my often-intense colleagues.

Where are we today?

According to “How Generative and Agentic AI Shift Concern from Technical Debt to Cognitive Debt,” in 2026 we are standing in a pig pen filling with cognitive debt. Instead of mammals, we have smart software and people who believe they are experts at finding information themselves. The majority of people with whom I interact wouldn’t know a special librarian unless one stood at 39th and 3rd in Manhattan holding a sign that said, “Special librarian here.” Who needs the old-fashioned curated databases? Who needs a person to intermediate between the “give me everything about…” person and the high value content accessible online? Nope. Just let a black box output an answer. We live in a world where the “work” of creating knowledge value is unnecessary and not valued. Oh, the AI companies want old-fashioned professionals who can do knowledge work. But those not in the elite just take what’s output. We are in a “good enough” society in the US.

The article says:

Even if AI agents produce code that could be easy to understand, the humans involved may have simply lost the plot and may not understand what the program is supposed to do, how their intentions were implemented, or how to possibly change it.

To my way of thinking, “lost the plot” means stupid. Making a “change” to the output is now beyond the ken of many professionals and most of the people I see at my Planet Fitness. You know these folks: they sit on the machine and doom scroll. Yeah, big thinkers.

The write up predictably tosses out some bait for those who need a consultant to fix up the problem or who want to attend a training class chock full of glittering panaceas.

The concept of “cognitive debt” is a good one. However, the write up does not nail the issues that the blunting of knowledge work delivers by the garbage truck load:

  1. Learning is not easy. Without effort, learning is superficial and not valued.
  2. An inability to learn and think critically means that decisions will probably be ill considered, half formed, incorrect, or disastrous.
  3. A society without an educated citizenry becomes one that can be shaped.
  4. Information will be weaponized.

When that human clerk cannot make change, that’s a person who will believe everything that fits into whatever their uninformed world view accepts. If the world view is expansive (the purpose of a college education for some people), there is a chance that weaponized or shaped information can be recognized, evaluated, and processed in a context of information believed by people like Ms. Shedlarz to be useful. The people who processed the print outs she delivered, in theory, would then continue to curate and process the information. The goal was to convert soy paste into a snack that met the terms of the client engagement.

Without people who are educated, the baseline is not excellence for most people. The baseline is good enough. Do you want to stand in front of a Waymo-type self driving car? Do you have confidence your hospital will not infect you with a forever virus? Do you believe that the word “organic” on a bunch of vegetables means they are free from forever chemicals?

The main shift of the big technology companies is to create knowledge dependence. With that dependence, “facts” are whatever the online systems and the black boxes output. Control that information flow and one has a social construct that has the capability to make people into puppets.

Cognitive debt means Punch and Judy shows. Ads, entertainment, and loss of knowledge control. No Ms. Shedlarz needed. Do some US big technology companies want this type of control? You bet your life. Say the secret word and get a free month on an AI system.

Stephen E Arnold, February 17, 2026
