Microsoft and OpenAI: An Expensive Sitcom

July 1, 2025

No smart software involved. Just an addled dinobaby.

I remember how clever I thought the book title “Who Says Elephants Can’t Dance?: Leading a Great Enterprise Through Dramatic Change” was. I find the break dancing between Microsoft and OpenAI even more amusing. Bloomberg “real” news reported that Microsoft is “struggling to sell its Copilot solutions.” Why? Those Microsoft customers want OpenAI’s ChatGPT. That’s a hoot.

Computerworld adds more Monty Python twists to this side show: “Microsoft and OpenAI: Will They Opt for the Nuclear Option?” (I am not too keen on the use of the word “nuclear.” People bandy it about without understanding exactly what the actual consequences of such an option mean. Please, do a bit of homework before suggesting that two enterprises are doing anything remotely similar.)

The estimable Computerworld reports:

Microsoft needs access to OpenAI technologies to keep its worldwide lead in AI and grow its valuation beyond its current more than $3.5 trillion. OpenAI needs Microsoft to sign a deal so the company can go public via an IPO. Without an IPO, the company isn’t likely to keep its highly valued AI researchers — they’ll probably be poached by companies willing to pay hundreds of millions of dollars for the talent.

The problem seems to be that Microsoft is trying to sell its version of smart software. The enterprise customers and even dinobabies like myself prefer the hallucinatory and unpredictable ChatGPT to the downright weirdness of Copilot in Notepad. The Computerworld story says:

Hovering over it all is an even bigger wildcard. Microsoft’s and OpenAI’s existing agreement dramatically curtails Microsoft’s rights to OpenAI technologies if the technologies reach what is called artificial general intelligence (AGI) — the point at which AI becomes capable of human reasoning. AGI wasn’t defined in that agreement. But Altman has said he believes AGI might be reached as early as this year.

People cannot agree over beach rights and school taxes. The smart software (which may remain without regulation for a decade) is a much bigger deal. The dollars at stake are huge. Most people do not know that a Board of Directors for a Fortune 1000 company will spend more time arguing about parking spaces than about a $300 million acquisition. The reason? Most humans cannot conceive of the number of dollars associated with artificial intelligence. If the AI next big thing does not work, quite a few outfits are going to be selling snake oil from tables at flea markets.

Here’s the humorous twist from my vantage point. Microsoft itself kicked off the AI boom with its announcements a couple of years ago. Google, already wondering how it can keep the money gushing to pay the costs of simply being Google, short-circuited and hit the switch for Code Red, Yellow, Orange, and probably the color only five people on earth have ever seen.

And what’s happened? The Google-spawned methods aren’t eliminating hallucinations. The OpenAI methods are not eliminating hallucinations. The improvements are more and more difficult to explain. Meanwhile, start-ups are doing interesting things with AI systems that are good enough for certain use cases. I particularly like consulting and investment firms using AI to get rid of MBAs.

The punch line for this joke is that OpenAI’s ChatGPT seems to have more brand deliciousness than Microsoft’s own version. Microsoft linked with OpenAI, created its own “line of AI,” and now finds that the frisky money burner OpenAI is more popular and can just define artificial general intelligence to its liking and enjoy the philosophical discussions among AI experts and lawyers.

One cannot make this sequence up. Jack Benny’s radio scripts came close, but I think the Microsoft–OpenAI program is a prize winner.

Stephen E Arnold, July 1, 2025

Publishing for Cash: What Is Here Is Bad. What Is Coming May Be Worse

July 1, 2025

Smart software involved in the graphic; otherwise just an addled dinobaby.

Shocker. Pew Research discovers that most “Americans” do not pay for news. Amazing. Is it possible that the Pew professionals were unaware of the reason newspapers, radio, and television included comic strips, horoscopes, sports scores, and popular music in their “real” news content? In the middle of 2025 I read the research report “Few Americans Pay for News When They Encounter Paywalls.” For a number of years I worked for a large publishing company in Manhattan. I also worked at a privately owned publishing company in flyover country.


The sky looks threatening. Is it clouds, locusts, or the specter of the new Dark Ages? Thanks, you.com. Good enough.

I learned several things. Please, keep in mind that I am a dinobaby and I have zero in common with GenX, Y, Z, or the horrific GenAI. The learnings:

  • Publishing companies spend time and money trying to figure out how to convert information into cash. This “problem” extended from the time I took my first real job in 1972 to yesterday, when I received an email from a former publisher who is thinking about batteries as the future.
  • Information loses its value as it diffuses; that is, if I know something, I can generate money IF I can find the one person who recognizes the value of that information. For anyone else, the information is worthless and probably nonsense because that individual does not have the context to understand the “value” of an item of information.
  • Information has a tendency to diffuse. It is a bit like something with a very short half-life. Time makes information even more tricky. If the context changes exogenously, the information I have may be rendered valueless without warning.

So what’s the solution? Here are the answers I have encountered in my professional life:

  1. Convert the “information” into magic and the result of a secret process. This is popular with consultants, certain government entities, and banker types. Believe me, people love the incantations, the jargon talk, and the scent of spontaneous ozone creation.
  2. Talk about “ideals,” and deliver lowest common denominator content. The idea is that the comix and sports scores will “sell” and the revenue can be used to pursue ideals. (I worked at an outfit like this, and I liked its simple, direct approach to money.)
  3. Make the information “exclusive” and charge a very few people a whole lot of money to access this “special” information. I am not going to explain how lobbying, insider talk, and trade show receptions facilitate this type of information wheeling and dealing. Just get a LexisNexis-type of account, run some queries, and check out the bill. The approach works for certain scientific and engineering information, financial data, and information people have no idea is available for big bucks.
  4. Embrace the “if it bleeds, it leads” approach. Believe me this works. Look at YouTube thumbnails. The graphics and word choice make clear that sensationalism, titillation, and jazzification are the order of the day.

Now back to the Pew research. Here’s a passage I noted:

The survey also asked anyone who said they ever come across paywalls what they typically do first when that happens. Just 1% say they pay for access when they come across an article that requires payment. The most common reaction is that people seek the information somewhere else (53%). About a third (32%) say they typically give up on accessing the information.

Stop. That’s the key finding: one percent pay.

Let me suggest:

  1. Humans will take the easiest path; that is, they will accept what is output or what they hear from their “sources”
  2. Humans will take “facts” and glue them together to come up with more “facts.” Without context (that is, what used to be viewed as a traditional education and a commitment to lifelong learning), these people will lose the ability to think. Some like this result, of course.
  3. Humans face a sharper divide between the information “haves” and the information “have nots.”

Net net: The new dark ages are on the horizon. How’s that for a speculative conclusion from the Pew research?

Stephen E Arnold, July 1, 2025
