Amazon: You Are Lovable… to Some I Guess

August 21, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Three “real news” giants have published articles about the dearly beloved outfit Amazon. My hunch is that the publishers were trepidatious when the “real” reporters turned in their stories. I can hear it now: “Oh, my goodness. A negative Amazon story.” Not to worry. It is unlikely that the company will buy ad space in the publications.

A young individual finds that the giant who runs an alleged monopoly is truly lovable. Doesn’t everyone? MidJourney, after three tries I received an original image somewhat close to my instructions.

My thought is that executives at the companies publishing negative information about the lovable Amazon could hear this upon coming home from work: “You published this about Amazon. What if our Prime membership is cancelled? What if our Ring doorbell is taken offline? And did you think about the loss of Amazon videos? Of course not, you are just so superior. Fix your own dinner tonight. I am sleeping in the back bedroom.”

The first story is “How Amazon’s In-House First Aid Clinics Push Injured Employees to Keep Working.” Imagine. Amazon creating a welcoming work environment in which injured employees are supposed to work. Amazon is pushing into healthcare. The article states:

“What some companies are doing, and I think Amazon is one of them, is using their own clinics to ‘treat people’ and send them right back to the job, so that their injury doesn’t have to be recordable,” says Jordan Barab, a former deputy assistant secretary at OSHA who writes a workplace safety newsletter.

Will Amazon’s other health care units operate in a similar way? Of course not.

The second story is “Authors and Booksellers Urge Justice Dept. to Investigate Amazon.” Imagine. Amazon exploiting its modest online bookstore and its instant print business to take sales away from the “real” publishers. The article states:

On Wednesday [August 16, 2023], the Open Markets Institute, an antitrust think tank, along with the Authors Guild and the American Booksellers Association, sent a letter to the Justice Department and the Federal Trade Commission, calling on the government to curb Amazon’s “monopoly in its role as a seller of books to the public.”

Wow. Unfair? Some deliveries arrive in a day. A Kindle book pops up in the incredibly cluttered and reader-hostile interface in seconds. What’s not to like?

The third story is from the “real news outfit” MSN, which recycles the estimable CNBC “talking heads.” This story is “Amazon Adds a New Fee for Sellers Who Ship Their Own Packages.” The happy family of MSN and CNBC reports:

Beginning Oct. 1, members of Amazon’s Seller Fulfilled Prime program will pay the company a 2% fee on each product sold, according to a notice sent to merchants … The e-commerce giant also charges sellers a referral fee between 8% and 15% on each sale. Sellers may also pay for things like warehouse storage, packing and shipping, as well as advertising fees.
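
A rough, hypothetical illustration may help put the stacked percentages in perspective. The 2% Seller Fulfilled Prime fee and the 8% to 15% referral range come from the article; the $100 item price and the 15% referral rate below are assumptions, and storage, packing, shipping, and advertising costs are ignored. A minimal sketch:

```python
# Hypothetical fee stack for a Seller Fulfilled Prime merchant.
# The 2% SFP fee and the 8%-15% referral range are from the cited report;
# the $100 price and the 15% referral rate are illustrative assumptions.
item_price = 100.00                    # hypothetical sale price in USD
referral_fee = item_price * 0.15       # top of the 8%-15% referral range
sfp_fee = item_price * 0.02            # new 2% Seller Fulfilled Prime fee
net = item_price - referral_fee - sfp_fee
print(f"${net:.2f} left before storage, shipping, and ad fees")  # $83.00
```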

What’s the big deal?

To an admirer who grew up relying on a giant company, no problem.

Stephen E Arnold, August 21, 2023

The ISP Ploy: Heck, No, Mom. I Cannot Find My Other Sock?

August 16, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Before I retired, my team and I were doing a job for the US Senate. One day at lunch we learned that Google could not provide employment and salary information to a government agency housed in the building in which we were working. The talk, as I recall, was tinged with skepticism. If a large company issues paychecks and presumably files forms with the Internal Revenue Service, records about who was paid and how much should have been available. Google allowed many people to find answers, but the company could not find its employment data. The way things work in Washington, DC, to the best of my recollection, a large company with considerable lobbying help and a flock of legal eagles can make certain processes slow. As staff rotate, certain issues get pushed down the priority pile and some — not all, of course — fade away.

A young teen who will mature into a savvy ISP tells his mom, “I can’t find my other sock. It is too hard for me to move stuff and find it. If it turns up, I will put it in the laundry.” This basic play is one of the keys to the success of the Internet Service Provider the bright young lad runs today. Thanks, MidJourney. You were back online and demonstrating gradient malfunctioning. Perhaps you need a bit of the old gain of function moxie?

I thought about this “inability” to deliver information when I read “ISPs Complain That Listing Every Fee Is Too Hard, Urge FCC to Scrap New Rule.” I want to focus on one passage in the article and suggest that you read the original report. Keep in mind my anecdote about how a certain big tech outfit handles some US government requests.

Here’s the snippet from the long source document:

…FCC order said the requirement to list “all charges that providers impose at their discretion” is meant to help broadband users “understand which charges are part of the provider’s rate structure, and which derive from government assessments or programs.” These fees must have “simple, accurate, [and] easy-to-understand name[s],” the FCC order said. “Further, the requirement will allow consumers to more meaningfully compare providers’ rates and service packages, and to make more informed decisions when purchasing broadband services. Providers must list fees such as monthly charges associated with regulatory programs and fees for the rental or leasing of modem and other network connection equipment,” the FCC said.

Three observations about the information in the passage:

  1. The argument is identical to that illustrated by the teen in the room filled with detritus. Crap everywhere makes finding easy for the occupant and hard for anyone else. Check out Albert Einstein’s desk on the day he died. Crap piled everywhere. Could he find what he needed? According to his biographers, the answer is, “Yes.”
  2. The idea that a commercial entity which bills its customers does not have the capacity to print out the little row entries in an accounting system is lame in my opinion. The expenses have to be labeled and reported. Even if they are chunked like some of the financial statements crafted by the estimable outfits Amazon and Microsoft, someone has the notes or paper for these items. I know some people who could find these scraps of information; don’t you?
  3. The wild and crazy government agencies invite this type of corporate laissez-faire behavior. Who is in charge? Probably not the government agency, if some recent antitrust cases are considered as proof of performance.

Net net: Companies want to be able to fiddle the bills. Period. Printing out comprehensive product and service prices reduces the gamesmanship endemic in the online sector.

Stephen E Arnold, August 16, 2023

Sam AI-Man: A Big Spender with Trouble Ahead?

August 15, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

$700,000 per day. That’s an interesting number if it is accurate. “ChatGPT In Trouble: OpenAI May Go Bankrupt by 2024, AI Bot Costs Company $700,000 Every Day” states that the number is the number. What’s that mean? First, forget salaries, general and administrative costs, the much-loved health care for humans, and the oddments one finds on balance sheets. (What was that private executive flight to Tampa Bay?)

A young entrepreneur realizes he cannot pay his employees. Thanks, MidJourney, whom did you have in your digital mind?

I am a dinobaby, but I can multiply. The annual total is $255,500,000. I want to ask about money (an investment, of course) from Microsoft, how the monthly subscription fees are floating the good ship ChatGPT, and the wisdom of hauling an orb to scan eyeballs from place to place. (Doesn’t that take away from watching the bourbon caramel cookies reach their peak of perfection? My hunch is, “For sure.”)
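
For anyone who wants to check the dinobaby’s multiplication, here is a minimal back-of-the-envelope sketch. The $700,000 per day figure comes from the cited article; the annualization below is simple arithmetic, not OpenAI’s actual accounting.

```python
# Annualize the reported daily compute cost for ChatGPT.
# The $700,000/day figure is from the cited article; the rest is arithmetic,
# not OpenAI's books (salaries, G&A, and the oddments are excluded, as noted above).
daily_cost_usd = 700_000
annual_cost_usd = daily_cost_usd * 365
print(f"${annual_cost_usd:,}")  # -> $255,500,000
```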

The write up reports:

…the shift from non-profit to profit-oriented, along with CEO Sam Altman’s lack of equity ownership, indicates OpenAI’s interest in profitability. Although Altman might not prioritize profits, the company does. Despite this, OpenAI hasn’t achieved profitability; its losses reached $540 million since the development of ChatGPT.

The write up points out that Microsoft’s interest in ChatGPT continues. However, the article observes:

Complicating matters further is the ongoing shortage of GPUs. Altman mentioned that the scarcity of GPUs in the market is hindering the company’s ability to enhance and train new models. OpenAI’s recent filing for a trademark on ‘GPT-5’ indicates their intention to continue training models. However, this pursuit has led to a notable drop in ChatGPT’s output quality.

Another minor issue facing Sam AI-Man is that legal eagles are circling. The Zuck dumped his pet Llama as open source. The Google, ever Googley, chugs along, and Anthropic has “clawed” its way into visibility.

Net net: Sam AI-Man may find that he will have an opportunity to explain how the dial on the garage heater got flipped from Hot to Fan Only.

Stephen E Arnold, August 15, 2023

Killing Horses? Okay. Killing Digital Information? The Best Idea Ever!

August 14, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Fans at the 2023 Kentucky Derby were able to watch horses killed. True, the sport of kings parks vehicles and has people stand around so the termination does not spoil a good day at the races. It seems logical to me that killing information is okay too. Personally, I want horses to thrive, not be brutalized while the crowd sips mint juleps, and in my opinion, information deserves preservation. Without some type of intentional or unintentional information preservation, what would those YouTuber videos about ancient technology have to display and describe?

Reading “In the Age of Culling,” an article in the online publication tedium.co, I noted a number of ideas which resonated with me. The first is one of the subheads in the write up; to wit:

CNet pruning its content is a harbinger of something bigger.

The basic idea in the essay is that killing content is okay, just like killing horses.

The article states:

I am going to tell you right now that CNET is not the first website that has removed or pruned its archives, or decided to underplay them, or make them hard to access. Far from it.

The idea is that eliminating content creates an information loss. If one cannot find some item of content, that item of content does not exist for many people.

I urge you to read the entire article.

I want to shift the focus from the tedium.co essay slightly.

When digital information is “disappeared,” the culling cuts away research, some types of evidence, and collective memory. But what happens when a handful of large US companies effectively shape the information used to train smart software? Checking facts becomes more difficult because people “believe” a machine more than a human in many situations.

Two girls looking at a museum exhibit in 2028. The taller girl says, “I think this is what people used to call a library.” The shorter girl asks, “Who needs this stuff? I get what I need to know online. Besides, this looks like a funeral to me.” The taller girl replies, “Yes, let’s go look at the plastic dinosaurs. When you put on the headset, the animals are real.” Thanks, MidJourney, for not including the word “library” or depicting the image I requested. You are so darned intelligent!

Consider the power that information filtering and weaponization conveys to those relying on digital information. The statement “harbinger of something bigger” is correct. But if one looks forward, the potential for selective information may be the flip side of forgetting.

Trying to figure out “truth” or “accuracy” is getting more difficult each day. How does one talk about a subject when those in conversation have learned about Julius Caesar from a TikTok video and perceive a problem with tools created to sell online advertising?

This dinobaby understands that cars are speeding down the information highway, and their riders are in a reality defined by what is online. I am reluctant to name the changes which suggest this somewhat negative view of learning. One believes what one experiences. If those experiences are designed to generate clicks, reduce operating costs, and shape behavior — what does the information landscape look like?

No digital archives? No past. No awareness of information weaponization? No future. Were those horses really killed? Were those archives deleted? Were those Shakespeare plays removed from the curriculum? Were the tweets deleted?

Let’s ask smart software. No thanks, I will do dinobaby stuff despite the efforts to redefine the past and weaponize the future.

Stephen E Arnold, August 14, 2023

MBAs, Lawyers, and Sociology Majors Lose Another Employment Avenue

August 4, 2023

Note: Dinobaby here: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid. Services are now ejecting my cute little dinosaur gif. Like my posts related to the Dark Web, the MidJourney art appears to offend someone’s sensibilities in the datasphere. If I were not 78, I might look into these interesting actions. But I am, and I don’t really care.

Some days I find MBAs, lawyers, and sociology majors delightful. On others I fear for their future. One promising avenue of employment has now been cut off. What’s the job? Avocado peeler in an ethnic restaurant. Some hearty souls channeling Euell Gibbons may eat these as nature delivers them. Others prefer a toast delivery vehicle or maybe a dip to accompany a meal in an ethnic restaurant or while making a personal vlog about the stresses of modern life.

“Chipotle’s Autocado Robot Can Prep Avocados Twice as Fast as Humans” reports:

The robot is capable of peeling, seeding, and halving a case of avocados significantly faster than humans, and the company estimates it could cut its typical 50-minute guacamole prep time in half…

When an efficiency expert from a McKinsey-type firm or a second tier thinker from a mid-tier consulting firm reads this article, there is one obvious line of thought the wizard will follow: Replace some of the human avocado peelers with a robot. Projecting into the future while under the influence of spreadsheet fever, an upgrade to the robot’s software will enable it to perform other jobs in the restaurant or food preparation center; for example, taco filler or dip crafter.

Based on this actual factual write up, I have concluded that some MBAs, lawyers, and sociology majors will have to seek another pathway to their future. Yard sale organizer, pet sitter, and possibly the life of a hermit remain viable options. Oh, the hermit will have GoFundMe and BuyMeaCoffee pages. Perhaps a T-shirt or a hat?

Stephen E Arnold, August 4, 2023

Netflix Has a Job Opening. One Job Opening to Replace Many Humanoids

July 27, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I read “As Actors Strike for AI Protections, Netflix Lists $900,000 AI Job.” Obviously the headline is about AI, money, and entertainment. Is the job “real”? Like so much of the output of big companies, it is difficult to determine how much is clickbait, how much is surfing on “real” journalists’ thirst for juicy info, and how much is trolling. Yep, trolling. Netflix drives a story about AI’s coming to Hollywood.

The write up offers Hollywood verbiage and makes an interesting point:

The [Netflix job] listing points to AI’s uses for content creation: “Artificial Intelligence is powering innovation in all areas of the business,” including by helping them to “create great content.” Netflix’s AI product manager posting alludes to a sprawling effort by the business to embrace AI, referring to its “Machine Learning Platform” involving AI specialists “across Netflix.”

The machine learning platform or MLP is an exercise in cost control, profit maximization, and presaging the future. If smart software can generate new versions of old content, whip up acceptable facsimiles, and eliminate insofar as possible the need for non-elite humans — what’s not clear?

The $900,000 may be code for “Smart software can crank out good enough content at lower cost than traditional Hollywood methods.” Even the TikTok and YouTube “stars” face an interesting choice: [a] Figure out how to offload work to smart software or [b] learn to cope with burnout, endless squabbles with gatekeepers about money, and the anxiety of becoming a has-been.

Will humans, even talented ones, be able to cope with the pressure smart software will exert on the production of digital content? As with the junior attorney and the cannon fodder at blue chip consulting companies, AI is moving from spitting out high school essays to more impactful outputs.

One example is the integration of smart software into workflows. The jargon about this enabling use of smart software is fluid. The $900,000 job focuses on something that those likely to be affected can understand: A good enough script and facsimile actors and actresses with a mouse click.

But the embedded AI promises to rework the back office processes and the unseen functions of humans just doing their jobs. My view is that there will be $900K per year jobs but far fewer of them than there are regular workers. What is the future for those displaced?

Crafting? Running yard sales? Creating fine art?

Stephen E Arnold, July 27, 2023

Ethics Are in the News — Now a Daily Feature?

July 27, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

It is déjà vu all over again, or it seems like it. I read “Judge Finds Forensic Scientist Henry Lee Liable for Fabricating Evidence in a Murder Case.” Yep, that is the story. Scientist Lee allegedly has a knack for non-fiction; that is, making up stuff or arranging items in a special way. One of my relatives founded Hartford, Connecticut, in 1635. I am not sure he would have been on board with this make-stuff-up approach to data. (According to our family lore, John Arnold was into beating people with a stick.) Dr. Lee is a big wheel because he worked on the 1995 running-through-airports trial. The cited article includes this interesting sentence:

[Scientist] Lee’s work in several other cases has come under scrutiny…

No one is watching. A noted scientist helps himself to the cookies in the lab’s cookie jar. He is heard mumbling, “Cookies. I love cookies. I am going to eat as many of these suckers as I can because I am alone. And who cares about anyone else in this lab? Not me.” Chomp chomp chomp. Thanks, MidJourney. You depicted an okay scientist but refused to create an image of a great leader whom I identified by proper name. For this I paid money?

Let me mention three ethics incidents which for one reason or another hit my radar:

  1. MIT accepting cash from every young person’s friend Jeffrey Epstein. He allegedly killed himself. He’s off the table.
  2. The Harvard ethics professor who made up data. She’s probably doing consulting work now. I don’t know if she will get back into the classroom. If she does it might be in the Harvard Business School. Those students have a hunger for information about ethics.
  3. The soon-to-be-departed president of Stanford University. He may find a future using ChatGPT or an equivalent to write technical articles and to angle for a gig on cable TV.

What do these allegedly true incidents tell us about the moral fiber of some people in positions of influence? I have a few ideas. Now the task is remediation. When John Arnold chopped wood in Hartford, justice involved ostracism, possibly a public shaming, or rough justice played out to the theme from Hang ‘Em High.

Harvard, MIT, and Stanford: Aren’t universities supposed to set an example for impressionable young minds? What are the students learning? Anything goes? Prevaricate? Cut corners? Grub money?

Imagine sweatshirts with the college logo and these words on the front and back of the garment. Winner. Some at Amazon, Apple, Facebook, Google, Microsoft, and OpenAI might wear them to the next off-site. I would wager that one turns up in the Rayburn House Office Building wellness room.

Stephen E Arnold, July 27, 2023

Will Smart Software Take Customer Service Jobs? Do Grocery Stores Raise Prices? Well, Yeah, But

July 26, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

I have suggested that smart software will eliminate some jobs. Who will be the replacements? Workers one finds on Fiverr.com? Interns who will pay to learn something which may be more useful than a degree in art history? RIF’ed former employees who are desperate for cash and will work for a fraction of their original salary?

“Believe it or not, I am here to help you. However, I strongly suggest you learn more about the technology used to create software robots and helpers like me. I also think you have beautiful eyes. Mine are just blue LEDs, but the Terminator finds them quite attractive,” says the robot who is learning from her human sidekick. Thanks, MidJourney, you have the robot human art nailed.

The fact is that smart software will perform many tasks once handled by humans. Don’t believe me? Visit a local body shop. Then take a tour of the Toyota factory not too distant from Tokyo’s airport. See the difference? The local body shop is swarming with folks who do stuff with their hands, spray guns, and machines which have been around for decades. The Toyota factory is not like that.

Machines — hardware, software, or combos — do not take breaks. They do not require vacations. They do not complain about hard work and long days. They, in fact, are lousy machines.

Therefore, the New York Times’s article “Training My Replacement: Inside a Call Center Worker’s Battle with AI” provides a human-interest glimpse of the terrors of a humanoid who sees the writing on the wall. My hunch is that the New York Times’s “real news” team will do more stories like this.

However, it would be helpful if stories like this included a reference or a subtle nod to information like “There Are 4 Reasons Why Jobs Are Disappearing — But AI Isn’t One of Them.” What are these reasons? Here’s a snapshot:

  • Poor economic growth
  • Higher costs
  • Supply chain issues (real, convenient excuse, or imaginary)
  • That old chestnut: Covid. Boo.

Do I buy the report? I think identification of other factors is a useful exercise. In the short term, many organizations are experimenting with smart software. Few are blessed with senior executives who trust technology when those creating the technology are not exactly sure what’s going on with their digital whiz kids.

The Gray Lady’s “real news” teams should be nervous. The wonderful, trusted, reliable Google is allegedly showing how a human can use Google AI to help humans with creating news.

Even art history majors should be suspicious because once a leader in carpetland hears about the savings generated by deleting humanoids and their costs, the bean counters will allow an MBA to install software. Remember, please, that the mantra of modern management is money and good enough.

Stephen E Arnold, July 26, 2023

Hedge Funds and AI: Lovers at First Sight

July 26, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

One promise of AI is that it will eliminate tedious tasks (and the jobs that go with them). That promise is beginning to be fulfilled in the investment arena, we learn from the piece “Hedge Funds Are Deploying ChatGPT to Handle All the Grunt Work,” shared by Yahoo Finance. What could go wrong?

Two youthful hedge fund managers are so pleased with their AI-infused hedge fund tactics that they jumped into a swimming pool which is starting to fill with money. Thanks, MidJourney. You have nailed the happy bankers and their enjoyment of money raining down.

Bloomberg’s Justina Lee and Saijel Kishan write:

“AI on Wall Street is a broad church that includes everything from machine-learning algorithms used to compute credit risks to natural language processing tools that scan the news for trading. Generative AI, the latest buzzword exemplified by OpenAI’s chatbot, can follow instructions and create new text, images or other content after being trained on massive amounts of inputs. The idea is that if the machine reads enough finance, it could plausibly price an option, build a portfolio or parse a corporate news headline.”

Parse the headlines for investment direction. Interesting. We also learn:

“Fed researchers found [ChatGPT] beats existing models such as Google’s BERT in classifying sentences in the central bank’s statements as dovish or hawkish. A paper from the University of Chicago showed ChatGPT can distill bloated corporate disclosures into their essence in a way that explains the subsequent stock reaction. Academics have also suggested it can come up with research ideas, design studies and possibly even decide what to invest in.”
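
Bloomberg’s description stays abstract, so here is a minimal sketch of what “classifying sentences as dovish or hawkish” might look like in practice. The sample Fed-style sentence, the prompt, and the model name are illustrative assumptions, not the researchers’ actual method; the snippet assumes the official openai Python client and an API key in the environment.

```python
# Sketch: ask an LLM to label a central-bank sentence as dovish or hawkish.
# The sentence, prompt, and model name are illustrative, not the setup used
# in the research cited above.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

statement = ("The Committee judges that inflation risks remain elevated and "
             "anticipates that further policy firming may be appropriate.")

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; substitute whatever is available
    messages=[
        {"role": "system",
         "content": "Classify the following central bank sentence as dovish, "
                    "hawkish, or neutral. Reply with one word."},
        {"role": "user", "content": statement},
    ],
)
print(response.choices[0].message.content)  # e.g. "hawkish"
```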

Sounds good in theory, but there is just one small problem (several, really, but let’s focus on just the one): These algorithms make mistakes. Often. (Scroll down in this GitHub list for the ChatGPT examples.) It may be wise to limit one’s investments to firms patient enough to wait for AI to become more reliable.

Cynthia Murrell, July 26, 2023

Silicon Valley and Its Busy, Busy Beavers

July 21, 2023

Note: This essay is the work of a real and still-alive dinobaby. No smart software involved, just a dumb humanoid.

Several stories about Google caught my attention: AI tools for newspapers, a YouTube price increase, a quantum supremacy claim, and a visit to the White House.

Google’s busy beavers have been active: AI, pricing tactics, quantum goodness, and team building. Thanks, MidJourney, but you left out the computing devices which no high value beaver goes without.

Google has allowed its beavers to gnaw on some organic material to build some dams. Specifically, the newspapers which have been affected by Google’s online advertising (no, I am not forgetting Craigslist.com; I am just focusing on the Google at the moment) can avail themselves of AI. The idea is… cost cutting. Could there be some learnings for the Google? What I mean is that such a series of tests or trials provides the Google with telemetry. Such telemetry allows the Google to refine its news writing capabilities. The trajectory of such knowledge may allow the Google to embark on its own newspaper experiment. Where will that lead? I don’t know, but it does not bode well for real journalists or some other entities.

The YouTube price increase is positioned as a better experience. Could the sharp increase in ads before, during, and after a YouTube video be part of a strategy? What I am hypothesizing is that more ads will force users to pay to watch YouTube videos without being driven crazy by ads for cheap mobile, health products, and gun belts. Degrading the experience allows a customer to buy a better experience. Could that be semi-accurate?

The quantum supremacy thing strikes me as 100 percent PR with a dash of high school braggadocio. The write up speaks to me this way: “I got a higher score on the SAT.” Snort snort snort. The snorts are a sound track to putting down those whose machines just don’t have the right stuff. I wonder if this is how others perceive the article.

And the busy beavers turned up at the White House. The beavers say, “We will be responsible with this AI stuff. We AI promise.” Okay, I believe this because I don’t know what these creatures mean when the word “responsible” is used. I can guess, however.

Net net: The ethicist from Harvard and the soon-to-be-former president of Stanford are available to provide advisory services. Silicon Valley is a metaphor for many good things, especially for the companies and their senior executives. Life will get better and better with certain high technology outfits running the show, pulling the strings, and controlling information, won’t it?

Stephen E Arnold, July 21, 2023
