Again Footnotes. Hello, AI.

July 17, 2025

No smart software involved with this blog post. (An anomaly, I know.)

Footnotes. These are slippery fish in our online world. I am finishing work on my new monograph “The Telegram Labyrinth.” Due to the volatility of online citations, I am not using traditional footnotes, endnotes, or interlinear notes. Most of the information in the research comes from sources in the Russian Federation. We learned, while doing routine chapter updates each month, that documents disappeared from the Web. Some were viewable if we used a virtual private network routed through a country “friendly” to the producer of the article. Others were just gone. Poof. We do capture images of pages when these puppies are first viewed.

My new monograph is intended for those who attend my lectures about Telegram Messenger-type platforms. My current approach is to just give it away to the law enforcement professionals, cyber investigators, and lawyers who try to figure out money laundering and other digital scams. I will explain my approach in the accompanying monograph. I will tell them, “It’s notes. You are on your own when probing the criminal world.” Good luck.

I read “Springer Nature Book on Machine Learning Is Full of Made-Up Citations.” Based on my recent writing effort, I think the problem of citing online resources is not just confined to my team’s experience. The flip side of online research is that some authors or content creation teams (to use today’s jargon) rely on smart software to help out.

The cited article says:

Based on a tip from a reader [of Mastering Machine Learning], we checked 18 of the 46 citations in the book. Two-thirds of them either did not exist or had substantial errors. And three researchers cited in the book confirmed the works they supposedly authored were fake or the citation contained substantial errors.

A version of this “problem” has appeared in the ethics department of Harvard University (where Jeffrey Epstein allegedly had an office), at Stanford University, and at assorted law firms. Just let smart software do the work and assume that its output is accurate.

It is not.

What’s the fix? Answer: There is none.

Publishers either lack the money to do their “work” or they have people who doomscroll in online meetings. Authors don’t care because one can “publish” anything as an Amazon book with mostly zero oversight. (This, by the way, is the approach and defense of the Pavel Durov-designed Telegram operation.) Motivated individuals can slap up a free post and publish a book as a series of standalone articles. Bear Blog, Substack, and similar outfits enable this approach. I think Yahoo has something similar, but, really, Yahoo?

I am going to stick with my approach. I will assume the reader knows everything we describe. I wonder what future researchers will think about the information voids appearing in unexpected places. If these researchers emulate what some authors are doing today, they will let AI do the work. No one will know the difference. If something online can’t be found, it doesn’t exist.

Just make stuff up. Good enough.

Stephen E Arnold, July 17, 2025
