Mixed Messages about AI: Why?
July 23, 2025
Just a dinobaby working the old-fashioned way, no smart software.
I learned that Meta is going to spend hundreds of billions on smart software. I assume that selling ads to Facebook users will pay the bill.
If one pokes around, articles like “Enterprise Tech Executives Cool on the Value of AI” turn up. This write up in BetaNews says:
The research from Akkodis, looking at the views of 500 global Chief Technology Officers (CTOs) among a wider group of 2,000 executives, finds that overall C-suite confidence in AI strategy dropped from 69 percent in 2024 to just 58 percent in 2025. The sharpest declines are reported by CTOs and CEOs, down 20 and 33 percentage points respectively. CTOs also point to a leadership gap in AI understanding, with only 55 percent believing their executive teams have the fluency needed to fully grasp the risks and opportunities associated with AI adoption. Among employees, that figure falls to 46 percent, signaling a wider AI trust gap that could hinder successful AI implementation and long-term success.
Okay. I know that smart software can do certain things with reasonable reliability. However, when I look for information, I do my own data gathering. I then pluck items which seem useful to me. Then I push these into smart AI services and ask for error identification and information “color.”
The result is that I have more work to do, but I would estimate that I find one or two useful items or comments from the five smart software systems to which I subscribe.
Is that good or bad? I think that for my purpose, smart software is okay. However, I don’t ask a question unless I already have an answer. I want additional inputs or commentary. I am not going to ask a smart software system a question to which I do not think I know the answer. Sorry. My trust in high-flying Google-type Silicon Valley outfits is nonexistent.
The write up points out:
The report also highlights that human skills are key to AI success. Although technical skills are vital, with 51 percent of CTOs citing specialist IT skills as the top capability gap, other abilities are important too, including creativity (44 percent), leadership (39 percent) and critical thinking (36 percent). These skills are increasingly useful for interpreting AI outputs, driving innovation and adapting AI systems to diverse business contexts.
I don’t agree with the weasel word “useful.” Knowing the answer before firing off a prompt is absolutely essential.
Thus, we have a potential problem. If the smart software crowd can attract people who do not know the answers to the questions they ask, those individuals will provide the boost necessary to keep this technical fire balloon (balão de fogo) up in the air. If not, up in flames.
Stephen E Arnold, July 23, 2025