Online Anonymity: Maybe a Less Than Stellar Idea

July 20, 2021

On one hand, there is a veritable industrial revolution in identifying, tracking, and pinpointing online users. On the other hand, there is the confection of online anonymity. The idea is that by obfuscation, by using a fake name, or by hijacking an account set up for one’s 75-year-old spinster aunt, a person can be anonymous. And what fun some can have when their online actions are obscured by cleverness, Tor cartwheels, or more sophisticated methods involving free email and “trial” cloud accounts. I am not a big fan of online anonymity for three reasons:

  1. Being online makes it easy for a person to listen to internal demons’ chatter and do incredibly inappropriate things. Anonymity online, in my opinion, is a bit like reverting to 11-year-old thinking, often combined with an adult’s suppressed perceptions and assumptions about what is okay and what is not.
  2. Having a verified identity linked to an online action imposes social constraints. The method may not be the same as a small town watching the actions of frisky teens and intervening, or telling a parent at the grocery that their progeny was making life tough for the small kid with glasses who was studying Lepidoptera, but the effect is similar.
  3. Individuals doing inappropriate things are often exposed, discovered, or revealed by friends, spouses angry about a failure to take out the garbage, or a small investigative team trying to figure out who spray painted the doors of a religious institution.

When I read “Abolishing Online Anonymity Won’t Tackle the Underlying Problems of Racist Abuse,” I found myself agreeing. The write up states:

There is an argument that by forcing people to reveal themselves publicly, or giving the platforms access to their identities, they will be “held accountable” for what they write and say on the internet. Though the intentions behind this are understandable, I believe that ID verification proposals are shortsighted. They will give more power to tech companies who already don’t do enough to enforce their existing community guidelines to protect vulnerable users, and, crucially, do little to address the underlying issues that render racial harassment and abuse so ubiquitous.

The observation is on the money.

I would push back a little. Limiting online use to those who verify their identity may curtail some of the crazier behaviors online. At this time, fractious behavior is the norm, and the continuous erosion of cultural norms, common courtesies, and routine interactions is destructive.

My thought is that replacing anonymity with verified identity might curtail some of the behavior online systems enable.

Stephen E Arnold, July 20, 2021

A Good Question and an Obvious Answer: Maybe Traffic and Money?

July 19, 2021

I read “Euro 2020: Why Is It So Difficult to Track Down Racist Trolls and Remove Hateful Messages on Social Media?” The write up expresses understandable concern about the use of social media to criticize athletes. Some athletes have magnetism, and sponsors want to use that “pull” to sell products and services. I remember a technology conference which featured a former football quarterback who explained how to succeed. He did not reference the athletic expertise of a former high school science club member and officer. As I recall, the pitch was working hard, fighting (!), and overcoming a coach calling a certain athlete (me, for example) a “fat slug.” Relevant to innovating in online databases? Yes, truly inspirational and an anecdote from the mists of time.

The write up frames its concern about derogatory social media “posts” this way:

Over a quarter of the comments were sent from anonymous private accounts with no posts of their own. But identifying perpetrators of online hate is just one part of the problem.

And the real “problem”? The article states:

It’s impossible to discover through open-source techniques that an account is being operated from a particular country.

Maybe.

Referencing Instagram (a Facebook property), the Sky story notes:

Other users may anonymise their existing accounts so that the comments they post are not traceable to them in the offline world.

Okay, automated systems with smart software don’t do the job. Will another government bill in the UK help?

The write up does everything but comment on the obvious; for example, my view is that online accounts must be linked to a human and verified before posts are permitted.

The smart software thing, the government law thing, and the humans making decisions thing are not particularly efficacious. Why? Do the online systems permit, if not encourage, anonymity because of money, maybe? That’s a question for the Sky Data and Forensics team. It is:

a multi-skilled unit dedicated to providing transparent journalism from Sky News. We gather, analyse and visualise data to tell data-driven stories. We combine traditional reporting skills with advanced analysis of satellite images, social media and other open source information. Through multimedia storytelling we aim to better explain the world while also showing how our journalism is done.

Okay.

Stephen E Arnold, July 19, 2021

Zuckin and Duckin: Socialmania at Facebook

July 19, 2021

I read “Zuck Is a Lightweight, and 4 More Things We Learned about Facebook from ‘An Ugly Truth’.” My initial response was, “No Mashable professionals will be invited to the social Zuckerberg’s Hawaii compound.” Bummer. I had a few other thoughts as well, but, first, here’s a couple of snippets from what it is possible to characterize as a review of a new book by Sheera Frenkel and Cecilia Kang. I assume any publicity is good publicity.

Here’s an item I circled in Facebook social blue:

Frenkel and Kang’s careful reporting shows a company whose leadership is institutionally ill-equipped to handle the Frankenstein’s monster they built.

Snappy. To the point.

Another? Of course, gentle reader:

Zuckerberg designed the platform for mindless scrolling: “I kind of want to be the new MTV,” he told friends.

Insightful but TikTok, which may have some links to the sensitive Chinese power plant, aced out the F’Book.

And how about this?

[The Zuck] was explicitly dismissive of what she said.” Indeed, the book provides examples where Sandberg was afraid of getting fired, or being labeled as politically biased, and didn’t even try to push back…

Okay, and one more:

Employees are fighting the good fight.

Will I buy the book? Nah, this review is close enough. What do I think will happen to Facebook? In the short term, not much. The company is big and generating big payoffs in power and cash. Longer term? The wind down will continue. Google, for example, is dealing with stuck disc brakes on its super car. Facebook may be popping in and out of view in that outstanding vehicle’s rear view mirrors. One doesn’t change an outfit with many years of momentum.

Are the book’s revelations on the money? Probably reasonably accurate, but disenchantment can lead to some interesting shaping of nonfiction writing. And the Mashable review? Don’t buy a new Hawaiian-themed cabana outfit yet. What about Facebook’s management method? Why change? It worked in high school. It worked when testifying before Congress. It worked until a couple of reporters shifted into interview mode, and reporters are unlikely to rack up the likes on Facebook.

Stephen E Arnold, July 19, 2021

Social Media and News Diversity

July 16, 2021

Remember the filter bubble phrase? The idea is that in an online world one can stumble into, be forced into, or be seduced into an info flow which reinforces what one already believes to be accurate. The impetus for filter bubbling is assumed to be social media. Not so fast, pilgrim.

“Study: Social Media Contributes to a More Diverse News Diet — Wait, What?!” provides rock-solid, dead-on proof that social media is not the bad actor here. I must admit that the assertion is one I do not hear too often. I noted this passage:

The study found that people who use search engines, social media, and aggregators to access news can actually have more diverse information diets.

The study is “More Diverse, More Politically Varied: How Social Media, Search Engines and Aggregators Shape News Repertoires in the United Kingdom.” With a word like “repertoire” in the title, one can almost leap to the assumption that the work was from Britain’s most objectively wonderful institutions of higher learning. None of that Cambridge Analytica fluff. These are Oxfordians and Liverpudlians. Liverpool is a hotbed of “repertoire,” I have heard. You can download the document from Sage, a fine professional publisher, at https://journals.sagepub.com/doi/pdf/10.1177/14614448211027393.

The original study states:

There is still much to learn about how the rise of new, ‘distributed’, forms of news access through search engines, social media and aggregators are shaping people’s news use.

That lines up with my understanding of what is known about the upsides and downsides of social media technology, content, its use, and its creators. There’s a handy list of tracked articles read:

[Image: the study’s list of tracked news outlets]

The Canary.co is interesting because it runs headlines which are probably intuitively logical in Oxford and Liverpool pubs. Here’s a headline from July 11, 2021:

Boris Johnson Toys with Herd Immunity Despite Evidence Linking Long Covid to Brain Damage.

I am not sure about Mr. Johnson’s toying, herd immunity, and brain damage. But I live in rural Kentucky, not Oxford or Liverpool.

The Sage write up includes obligatory math; for example:

[Image: equations from the Sage study]
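
The paper’s own formulas are in the PDF; as a rough, hypothetical illustration of how a “news diet diversity” score is often quantified (a Shannon-entropy style measure over outlet visits — my sketch, not necessarily the study’s actual method, with invented outlet names), consider:

    # Hypothetical sketch: a Shannon-entropy style diversity score for a news diet.
    # This illustrates the general idea of a "more diverse news repertoire";
    # it is not the formula used in the Sage paper.
    import math
    from collections import Counter

    def news_diet_diversity(outlet_visits):
        """Return the Shannon entropy (in bits) of a list of outlet visits."""
        counts = Counter(outlet_visits)
        total = sum(counts.values())
        entropy = 0.0
        for count in counts.values():
            share = count / total
            entropy -= share * math.log2(share)
        return entropy

    # A reader who visits only one outlet scores 0.0; a reader spread evenly
    # across four outlets scores 2.0 bits.
    print(news_diet_diversity(["BBC"] * 10))
    print(news_diet_diversity(["BBC", "Sky", "Guardian", "Canary"] * 3))

The point of any such measure is the same: a reader glued to one outlet scores low, a reader spread across several outlets scores higher, which is the “more diverse repertoire” the authors report for search, social, and aggregator users.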

And then the authors charge forward into the discussion of this breakthrough research.

Social media exposes its users to more diverse opinions. I will pass that along to the folks who hang out in the tavern in Harrod’s Creek. Some of those individuals get their info from quite interesting groups on Telegram. SocialClu, anyone? Brain damage? TikTok?

Stephen E Arnold, July 16, 2021

Facebook Has Channeled Tacit Software, Just without the Software

July 14, 2021

I would wager a free copy of my book CyberOSINT that anyone reading this blog post remembers Tacit Software, founded in the late 1990s. The company wrote a script which determined which employee in an organization was “consulted” most frequently. I recall enhancements which “indexed” content to make it easier for a user to identify content which may have been overlooked. But the killer feature was allowing a person with appropriate access to identify individuals with particular expertise. Oracle, the number one in databases, purchased Tacit Software and integrated the function into Oracle Beehive. If you want to read marketing collateral about Beehive, navigate to this link. Oh, good luck with pinpointing the information about Tacit. If you dig a bit, you will come across information which suggests that the IBM Clever method was stumbled upon and implemented about the same time that Backrub went online. Small community in Silicon Valley? Yes, it is.
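
As I remember the pitch, the core trick was straightforward: mine message traffic to see who gets asked about what, then rank people by how often they are consulted on a topic. Here is a minimal Python sketch of that kind of analysis; it is my reconstruction of the general idea, not Tacit Software’s actual code, and the message data, names, and topic keywords are invented for illustration.

    # Hypothetical reconstruction of a Tacit-style "who gets consulted" analysis.
    # The message data, names, and topics below are invented for illustration.
    from collections import Counter, defaultdict

    messages = [
        # (sender, recipient, topic keywords extracted from the message)
        ("alice", "bob",  ["oracle", "beehive"]),
        ("carol", "bob",  ["oracle"]),
        ("dave",  "erin", ["lepidoptera"]),
        ("alice", "erin", ["lepidoptera", "taxonomy"]),
        ("frank", "bob",  ["oracle", "licensing"]),
    ]

    consulted = Counter()             # how often each person is asked anything
    expertise = defaultdict(Counter)  # topic -> who gets asked about that topic

    for sender, recipient, topics in messages:
        consulted[recipient] += 1
        for topic in topics:
            expertise[topic][recipient] += 1

    print(consulted.most_common(1))            # [('bob', 3)]
    print(expertise["oracle"].most_common(1))  # [('bob', 3)]

Swap in real email or chat metadata and a proper topic extractor, and you have the skeleton of an expertise locator, which is roughly what Facebook now seems to be approximating with human group admins.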

So what?

I thought about this 1997 innovation in Silicon Valley when I read “Facebook’s Groups to Highlight Experts.” With billions of users, I wonder why it took Facebook years to figure out that it could identify individuals who “knew” something. Progress never stops in me-too land, of course. Is Facebook using its estimable smart software to identify those in the know?

The article reports:

There are more than 70 million administrators and moderators running active groups, Facebook says. When asked how they’re vetting the qualifications of designated experts, a Facebook spokesperson said it’s “all up the discretion of the admin to designate experts who they believe are knowledgeable on certain topics.”

I think this means that humans identify experts. What if the human doing the identifying does not know anything about the “expertise” within another Facebooker?

Yeah, maybe give Oracle Beehive a jingle. Just a thought.

Stephen E Arnold, July 14, 2021

Facebook and Milestones for the Zuck

July 2, 2021

I read “Facebook Reaches $1 Trillion Valuation Faster Than Any Other Company.” The write up reports that the highly esteemed Facebook (WhatsApp and Instagram) has outperformed Apple, the Google, the spaceship outfits, and Microsoft. The article states:

No stranger to breaking records, Facebook has just achieved another: the social media giant’s market cap has exceeded $1 trillion for the first time, reaching the milestone faster than any other company in history.

What has caused this surge in “valuation”? The answer is revealed in “Judge Dismisses Gov’t Antitrust Lawsuits against Facebook.” This separate write up, from a news service oh so close to Microsoft’s headquarters, states:

A federal judge on Monday dismissed antitrust lawsuits brought against Facebook by the Federal Trade Commission and a coalition of state attorneys general, dealing a significant blow to attempts by regulators to rein in tech giants. U.S. District Judge James Boasberg ruled Monday that the lawsuits were “legally insufficient” and didn’t provide enough evidence to prove that Facebook was a monopoly.

Concentration is good, it seems. Social media start-ups, have you thought about issuing T-shirts with the Zuck’s smiling face printed on them? You can update your Facebook page and do some Instagram posts. Don’t overlook WhatsApp as a way to spread the good news. About the T-shirts, I mean.

Stephen E Arnold, July 2, 2021

Cheap, Convenient, and Much Too Easy: Fabricated Twitter Trends

July 1, 2021

Here is some news out of Turkey that perked up our ears. EPFL News reports, “Mass Scale Manipulation of Twitter Trends Discovered.” Is Turkey an outlier when it comes to digital baloney? Perhaps a little, for now, but the problem recently uncovered by researchers appears to occur the world over.

Twitter Trends supposedly uses an algorithm to calculate the most popular topics at any given moment and then deliver this information to its users. However, these calculations can be manipulated. Writer Tanya Petersen explains:

“Now, new EPFL research focused on Turkey, from the Distributed Information Systems Laboratory, part of the School of Computer and Communication Sciences has found a vulnerability in the algorithm that decides Twitter Trending Topics: it does not take deletions into account. This allows attackers to push the trends they want to the top of Twitter Trends despite deleting their tweets which contain the candidate trend shortly afterwards. ‘We found that attackers employ both fake and compromised accounts, which are the accounts of regular people with stolen credentials, or who installed a malicious app on their phones. Generally, they are not aware that their account is being used as a bot to manipulate trending topics, sometimes they are but don’t know what to do about it and in both cases they keep using Twitter,’ said Tuğrulcan Elmas, one of the authors of the research, accepted by the IEEE European Symposium of Security and Privacy 2021, a top cybersecurity conference. ‘We found that 47% of local trends in Turkey and 20% of global trends are fake, created from scratch by bots. Between June 2015 and September 2019, we uncovered 108,000 bot accounts involved, the biggest bot dataset reported in a single paper.’”
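
The flaw, as described, boils down to a counter that only goes up. A toy Python sketch (my own illustration, not Twitter’s actual trending code; the class name and hashtags are invented) shows why ignoring deletions lets a burst of tweeted-then-deleted posts leave a phantom trend behind:

    # Toy illustration of the vulnerability described above: a trend counter
    # that ignores deletions. My own sketch, not Twitter's actual algorithm.
    from collections import Counter

    class NaiveTrendCounter:
        def __init__(self):
            self.counts = Counter()

        def on_tweet(self, hashtag):
            self.counts[hashtag] += 1

        def on_delete(self, hashtag):
            pass  # deletions are not taken into account: the flaw

        def top_trend(self):
            return self.counts.most_common(1)[0]

    trends = NaiveTrendCounter()

    # Bot accounts tweet a candidate trend en masse, then delete the tweets.
    for _ in range(10_000):
        trends.on_tweet("#fabricated")
        trends.on_delete("#fabricated")

    # Ordinary users tweet about something real.
    for _ in range(500):
        trends.on_tweet("#football")

    print(trends.top_trend())  # ('#fabricated', 10000): the deleted tweets still count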

Those bot-created trends cover topics from gambling promotions to disinformation campaigns to hate speech against vulnerable populations. Another of the paper’s co-authors, Rebekah Overdorf, points out that fake Twitter trends are magnified by media outlets, which regularly seize on them as examples of what people are talking about. Well, if they weren’t before, they are now. Effective. When contacted by the researchers, Twitter acknowledged the vulnerability but showed no interest in doing anything about it. We are sensing a (real) trend here.

Cynthia Murrell, July 1, 2021

Real Silicon Valley News Predicts the Future

July 1, 2021

I read “Why Some Biologists and Ecologists Think Social Media Is a Risk to Humanity.” I thought this was an amusing essay because the company publishing it is very much a social media thing. Clicks equal fame, money, and influence. These are potent motivators, and the essay is cheerfully ignorant of the irony of the Apocalypse foretold in the write up.

I learned:

One of the real challenges that we’re facing is that we don’t have a lot of information

But who is “we”? I can name several entities which have quite comprehensive information. Obviously these entities are not part of the royal “we”. I have plenty of information and some of it is proprietary. There are areas about which I would like to know more, but overall, I think I have what I need to critique thumbtyper-infused portents of doom.

Here’s another passage:

Seventeen researchers who specialize in widely different fields, from climate science to philosophy, make the case that academics should treat the study of technology’s large-scale impact on society as a “crisis discipline.” A crisis discipline is a field in which scientists across different fields work quickly to address an urgent societal problem — like how conservation biology tries to protect endangered species or climate science research aims to stop global warming. The paper argues that our lack of understanding about the collective behavioral effects of new technology is a danger to democracy and scientific progress.

I assume the Silicon Valley “real” news outfit and the experts cited in the write up are familiar with the work of J. Ellul. If not, some time invested in reading it might be helpful. As a side note, Google Books thinks that his prescient and insightful analysis of technology is about “religion.” Because Google, of course.

The write up adds:

Most major social media companies work with academics who research their platforms’ effects on society, but the companies restrict and control how much information researchers can use.

Remarkable insight. Why, pray tell?

Several observations:

  • Technology is not well understood.
  • Flows of information are destructive in many situations.
  • Access to information spawns false conclusions.
  • Bias distorts logic even among the informed.

Well, this is a pickle barrel and “we” are in it. What is making my sides ache from laughter is that advocates of social media in particular and technology in general are now asking, “Now what?”

Few like China’s approach or that of other authoritarian entities that want to preserve the way it was.

Cue Barbra’s “The Way We Were.” Oh, right. Blocked by YouTube. Do ecologists and others understand cancer?

Stephen E Arnold, July 1, 2021

Fixing Social Media: Will $100 Million Win the Pennant?

June 22, 2021

I read “A Real Estate Mogul Has a $100 Million Plan to Save the Internet.” Note that you may have to pay to read this story on the Bloomberg “news is not free” service.

The main point is:

Project Liberty would use blockchain to construct a new internet infrastructure called the Decentralized Social Networking Protocol. With crypto currencies, blockchain stores information about the tokens in everyone’s digital wallets; the DSNP would do the same for social connections. Facebook owns the data about the social connections between its users, giving it an enormous advantage over competitors. If all social media companies drew from a common social graph, the theory goes, they’d have to compete by offering better services, and the chance of any single company becoming so dominant would plummet.
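
Stripped of the blockchain vocabulary, the pitch is that the follow graph becomes a shared data structure keyed to portable, user-owned identifiers rather than to any one company’s accounts. Here is a hypothetical Python sketch of what reading and writing such a common graph might look like; the identifiers, classes, and methods are my illustration, not the actual DSNP specification.

    # Hypothetical sketch of a shared, portable social graph of the sort the
    # DSNP pitch describes. The identifiers, classes, and methods are my
    # illustration, not the actual DSNP specification.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FollowEdge:
        follower_id: str  # a portable, user-owned identifier, not a platform account
        followee_id: str

    @dataclass
    class SharedSocialGraph:
        edges: List[FollowEdge] = field(default_factory=list)

        def follow(self, follower_id: str, followee_id: str) -> None:
            self.edges.append(FollowEdge(follower_id, followee_id))

        def followers_of(self, user_id: str) -> List[str]:
            return [e.follower_id for e in self.edges if e.followee_id == user_id]

    # Any app reading the common graph sees the same relationships, so no single
    # platform owns the connection data.
    graph = SharedSocialGraph()
    graph.follow("did:example:alice", "did:example:bob")
    graph.follow("did:example:carol", "did:example:bob")
    print(graph.followers_of("did:example:bob"))  # ['did:example:alice', 'did:example:carol']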

Interesting. Isn’t major league baseball a club, a very select group of everyman owners?

I noted this fascinating assumption, which is a variation on the old Google saw that changing search habits is just a one-click choice:

Project Liberty is proposing that the entire internet start doing things drastically differently.

There are a number of individuals who want to decentralize “the Internet.” Does anyone hear this echo:

Saddle up the horses, Sancho. We’re going after that blue windmill.

Yep, vamos.

Stephen E Arnold, June 22, 2021

Amazon Burgoo: A Recipe from the Baedeker of Zuckland

June 17, 2021

“Amazon Blames Social Media for Struggle with Fake Reviews” sparked a thought I had not entertained previously. Amazon is taking a page from the Zuck Baedeker to Disingenuousness. This is a collection of aphorisms, precepts, and management tips which I imagine is provided to each Facebook employee. Whether it is a Facebook senior manager explaining how Facebook is a contributor to cohesiveness or another top puppy leaning in on Cambridge Analytica-type matters, I visualize this top-secret compendium as the Book. A Facebooker’s success depends on learning by rote the hows and whys of Facebooking.

[Image: a pot of Kentucky burgoo]

This image is from a Kentucky-inspired cook who knows about burgoo. The dark meat in the mishmash of what’s in the fridge is squirrel and maybe other critters. Reviews of burgoo suggest it is the best possible meal for a hungry person with a pile of dead squirrels.

Now, it is possible that this Baedeker has fallen into the hands of Amazon’s senior managers. The write up “Amazon Blames Social Media” reports:

Amazon has blamed social media companies for its failure to remove fake reviews from its website, arguing that “bad actors” turn to social networks to buy and sell fake product reviews outside the reach of its own technology.

I interpret this as meaning “not our fault.” It is a variation on the type of thinking which allegedly sparked this observation by the social media top dog:

A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.

The write up “Amazon Blames Social Media” includes this passage, allegedly from the Bezos bulldozer’s exhaust pipes:

Amazon says the blame for those organizations should lie with social media companies, who it says are slow to act when warned that fake reviews are being solicited on their platforms. “In the first three months of 2020, we reported more than 300 groups to social media companies, who then took a median time of 45 days to shut down those groups from using their service to perpetrate abuse,” an unsigned Amazon blog post said. “In the first three months of 2021 we reported more than 1,000 such groups, with social media services taking a median time of five days to take them down. “While we appreciate that some social media companies have become much faster at responding, to address this problem at scale it is imperative for social media companies to invest adequately in proactive controls to detect and enforce fake reviews ahead of our reporting the issue to them.”

Delicious. One possible monopoly blaming another possible monopoly using the type of logic employed by other monopolies.

Okay, who is to blame? Obviously not Amazon. Those reviews, however, can be tomfoolery, but they are indexable. And in the quest to grow one’s share of the product search market, words are needed. Bulkage is good.

Trimming the wordage benefits not the bulldozer. Facebook-type outfits seek engagement. Remember the dying squirrel? Ponder the squirrel as a creature who wants truth, accuracy, and integrity to prevail in the forest. How’s that working out for the squirrel and modern business practices? Just great for some. For others, burgoo. Now try to take the carrots, beans, and dead squirrel out of the pot and uncook them. Tough job, right?

Stephen E Arnold, June 17, 2021
