Another Duh! Moment: AI Cannot Read Social Situations

May 12, 2025

No AI. Just a dinobaby who gets revved up with buzzwords and baloney.

I promise I won’t write “Duh!” in this blog post again. I read Science Daily’s story “Awkward. Humans Are Still Better Than AI at Reading the Room.” The write up says, apparently without total awareness:

Humans, it turns out, are better than current AI models at describing and interpreting social interactions in a moving scene — a skill necessary for self-driving cars, assistive robots, and other technologies that rely on AI systems to navigate the real world.

Yeah, what about smart weapons, deciding about health care for an elderly patient, or figuring out whether the obstacle is a painted barrier designed to demonstrate that full self-driving is a work in progress? (I won’t position myself in front of a car with auto-sensing and automatic braking. You can have at it.)

The write up adds:

Video models were unable to accurately describe what people were doing in the videos. Even image models that were given a series of still frames to analyze could not reliably predict whether people were communicating. Language models were better at predicting human behavior, while video models were better at predicting neural activity in the brain.

Do these findings say to you, “Not ready for prime time”? They do to me.

One of the researchers who was in the weeds with the data points out:

“I think there’s something fundamental about the way humans are processing scenes that these models are missing.”

Okay, I prevaricated. Duh! (Do marketers care? Duh!)

Stephen E Arnold, May 12, 2025
