Facebook: Controlling Behavior Underscores Facebook Insecurity

August 30, 2021

Misinformation was running rampant long before the pandemic hit its stride, and no one knows whether the misinformation wave that currently plagues the United States and the world has hit its peak. Experts like social media researcher Laura Edelson are investigating how misinformation spreads, but social media platforms do not like it, says Vox Recode in "'People Do Not Trust That Facebook Is Healthy Ecosystem.'" Edelson works at the NYU Ad Observatory and focuses her current research on Facebook's role in spreading misinformation.

She believes that misinformation encourages COVID anti-vaxxers and erodes democracy. Unfortunately, Facebook blocked Edelson's and her colleagues' Facebook accounts, which they use to study political advertisements and misinformation. Facebook stated that the Ad Observatory violated users' privacy through its Ad Observer tool; Edelson replied that only volunteers download the tool.

Lawmakers, free speech advocates, and the FTC condemned Facebook. Edelson states that Facebook wants to bury her research because it exposes the company's part in spreading misinformation. On Facebook, users share misinformation more than any other content, and the company refuses to disclose who pays for political ads. This suggests that Facebook does not like Edelson's research and wants to stop it because it hurts the bottom line.

Facebook, of course, denies the allegation, and the write up points to larger problems:

“But Facebook’s effective shutdown of the Ad Observatory raises larger questions about whether the company is trying to limit outside interrogation of the company’s business practices in the name of protecting its users’ privacy. At the same time, the social media network has good reason to be worried about privacy as it faces intense regulatory scrutiny for past missteps that led to it having to pay the largest penalty ever imposed by the Federal Trade Commission.”

Edelson states that Facebook is an unhealthy misinformation ecosystem. Facebook and other misinformation platforms could be doing irreparable damage to society. Because this is a current problem, Facebook should be working with Edelson and other researchers who want to combat the misinformation plague.

Facebook and other companies, however, are more concerned about losing control and revenue. The good news is … Wait. There isn’t any for those researching the world’s most stabilizing and refreshing social media system.

Whitney Grace, August 30, 2021

Remember Who May Have Wanted to License Pegasus?

August 20, 2021

Cyber intelligence firm NSO, makers of Pegasus spyware, knows no bounds when it comes to enabling government clients to spy on citizens. Apparently, however, it draws the line at helping Facebook spy on its users. At his Daring Fireball blog, computer scientist John Gruber reports that “Facebook Wanted NSO Spyware to Monitor iOS Users.” We learn that NSO CEO Shalev Hulio has made a legal declaration stating he was approached in 2017 by Facebook reps looking to purchase certain Pegasus capabilities. Gruber quotes Motherboard’s Joseph Cox, who wrote:

“At the time, Facebook was in the early stages of deploying a VPN product called Onavo Protect, which, unbeknownst to some users, analyzed the web traffic of users who downloaded it to see what other apps they were using. According to the court documents, it seems the Facebook representatives were not interested in buying parts of Pegasus as a hacking tool to remotely break into phones, but more as a way to more effectively monitor phones of users who had already installed Onavo. ‘The Facebook representatives stated that Facebook was concerned that its method for gathering user data through Onavo Protect was less effective on Apple devices than on Android devices,’ the court filing reads. ‘The Facebook representatives also stated that Facebook wanted to use purported capabilities of Pegasus to monitor users on Apple devices and were willing to pay for the ability to monitor Onavo Protect users.’”

We are glad to learn NSO has boundaries of any sort. And score one for Apple security. As for Facebook, Gruber says this news supports his oft-stated assertion that Facebook is a criminal operation. He bluntly concludes:

“Facebook’s stated intention for this software was to use it for mass surveillance of its own honest users. That is profoundly [messed] up — sociopathic.”

Perhaps.

Cynthia Murrell, August 20, 2021

Facebook Keeps E2EE Goodness Flowing

August 18, 2021

Facebook is a wonderful outfit. One possible example is the helpful company's end-to-end encryption for Facebook Messenger. "Facebook Messenger now have End-to-End Encryption for Voice and Video Calls" reports:

The social media giant said that end-to-end encryption for group voice and video calls will soon be part of Messenger. Encryption is already available in Messenger as Secret Conversations, but that mode disables many features and works only for one-to-one chats. Facebook plans to change this in the coming weeks. Users will be able to control who can reach their chat lists, whose messages land in the requests folder, and who cannot message them at all. In the blog post, Facebook also said that Instagram is likely to get end-to-end encryption for one-to-one conversations.

Should Facebook be subject to special oversight?

Stephen E Arnold, August 18, 2021

Facebook: A Force for Good. Now What Does Good Mean?

August 17, 2021

I read Preston Byrne’s essay about the Taliban’s use of WhatsApp. You can find that very good write up at this link. Mr. Byrne asks an important question: Did America just lose Afghanistan because of WhatsApp?

I also read "WhatsApp Can't Ban the Taliban Because It Can't Read Their Texts." The main point of the write up is that Facebook's encrypted message system makes blocking users really difficult, almost impossible.

I noted this statement:

the Taliban used Facebook-owned chat app WhatsApp to spread its message and gain favor among local citizens…

Seems obvious, right? Free service. Widely available. Encrypted. Why the heck not?

Here’s a statement in the Vice write up which caught my attention:

The company spokesperson said that WhatsApp complies with U.S. sanctions law, so if it encounters any sanctioned people or organizations using the app, it will take action, including banning the accounts. This obviously depends on identifying who uses WhatsApp without having access to any of the messages sent through the platform, given that the app uses end-to-end encryption. This would explain why WhatsApp hasn't taken action against some accounts spreading the Taliban's message in Afghanistan.

Let me ask a pointed question: Is it time to shut down Facebook, WhatsApp, and Instagram? Failing that, why not use existing laws to bring a measure of control over access, message content, and service availability?

Purposeful action is needed. If Facebook cannot figure out how to contain and blunt the corrosive effects of the "free" service, outsource the task to an entity which will make an effort. That approach seems to be what is looming for the NSO Group. Perhaps purposeful action is motivating Apple to try to control the less salubrious uses of the iPhone ecosystem?

Dancing around the Facebook earnings report is fine entertainment. Is it time to add some supervision to this largely unregulated, uncontrolled frat boy bash? One can serve a treat like Bore Palaw too.

Stephen E Arnold, August 17, 2021

Facebook, Booze, Youngsters, and Australia: Shaken and Stirred

August 6, 2021

Quite a mixologist’s concoction: Facebook, booze, young people, and the Australian government. The country seems to be uncomfortable with some of Facebook’s alleged practices. I would assume that some Australian citizens who hold shares in the social media outfit are pleased as punch with the company’s financial results.

Others are not amused. "Facebook to Limit Ads Children See after Revelations Australian Alcohol Companies Can Reach Teens" reports:

Facebook will impose more control on the types of ads that children as young as 13 are exposed to on Instagram and other platforms, as new research finds Australian alcohol companies are not restricting their social media content from reaching younger users.

How many companies targeted the youngsters down under? The write up asserts:

The paper examined the use of social media age-restriction controls by 195 leading alcohol brands on Instagram and Facebook, and found large numbers were not shielding their content from children. The 195 brands were owned by nine companies, and the research identified 153 Facebook accounts, including 84 based in Australia, and 151 Instagram accounts, of which 77 were Australian-based. The authors found 28% of the Instagram accounts and 5% of Facebook accounts had not activated age-restriction controls.

I did spot a quote attributed to one of the experts doing the research about Facebook, Booze, Youngsters, and Australia; to wit:

it was clear that companies were not complying with the code. “The alcohol industry has demonstrated that it is unable to effectively control its own marketing…

Shocking, that, about self-regulation. Has anyone alerted the US financial sector?

Stephen E Arnold, August 6, 2021

Facebook Lets Group Admins Designate Experts. Okay!

August 2, 2021

Facebook once again enlists the aid of humans to impede the spread of misinformation, only this time it has found a way to avoid paying anyone for the service. Tech Times reports, “Facebook Adds Feature to Let Admin in Groups Chose ‘Experts’ to Curb Misinformation.” The move also has the handy benefit of shifting responsibility for bad info away from the company. We wonder—what happened to that smart Facebook software? The article does not say. Citing an article from Business Insider, writer Alec G. does tell us:

"The people who run the communities on Facebook now have the authority to promote individuals within a group to the title of 'expert.' The individuals dubbed experts can then be the voices on which the public can base their questions and concerns. This is meant to curb the misinformation that has plagued online communities for a while now."

But will leaving the designation of “expert” up to admins make the problem worse instead of better? The write-up continues:

"The social platform now empowers specific individuals inside groups devoted solely to spreading misinformation-related topics. The 'Stop the Steal' group, for example, was created in November 2020 and grew to over 365,000 members. Its members were convinced that the presidential election was a fraud. Had Facebook not removed the group two days later, it would have continued to have negative effects. Facebook explained that the organization talked about 'the delegitimization of the election process' and called for violence, as reported by the BBC. Even before that, other groups within Facebook promoted violence and calls to action that would harm the civility of governments."

Very true. We are reminded of the company’s outsourced Oversight Board created in 2018, a similar shift-the-blame approach that has not worked out so well. Facebook’s continued efforts to transfer responsibility for bad content to others fail to shield it from blame. They also do little to solve the problem and may even make it worse. Perhaps it is time for a different (real) solution.

Cynthia Murrell, August 2, 2021

Facebook and NSO Group: An Odd Couple or Squabbling Neighbors?

July 28, 2021

Late in 2019, The Adware Guru published “Facebook Sues NSO Group Spyware Maker Due to Exploitation of WhatsApp Vulnerability.” That write up stated:

The cause of [Facebook's] lawsuit was WhatsApp's zero-day vulnerability, which Facebook claims was sold to the NSO Group, which then helped use the flaw to attack human rights defenders, journalists, political dissidents, diplomats, and governmental officials. According to court documents, more than 1,400 people in Bahrain, the United Arab Emirates, and Mexico suffered attacks over a total of 11 days. Facebook has already sent special WhatsApp messages to everyone affected.

In April 2020, Technadu published “The NSO Group Is Accusing Facebook of Having Tried to License Their Spyware.” That write up stated:

The 'NSO Group' is now turning the tables, claiming that it rejected Facebook's proposal to license Pegasus because it sells only to governments, not private companies. In addition, it describes Facebook's accusations as baseless and even accuses the social media company of failing to prepare the legal paperwork properly, which resulted in procedural problems. NSO says Facebook didn't have powerful methods to spy on iOS devices the way it did with Android, and felt that Pegasus could solve this problem for them. Facebook, on the other hand, completely dismissed these statements, saying the allegations had the sole purpose of distracting the court from the real facts.

Technadu added:

even if Facebook wasn't trying to add Pegasus to Onavo for iOS, it has given the NSO Group something to hold on to and a basis for allegations that are at least seemingly realistic. At the very least, this development will greatly complicate the legal process.

Jump to the present. The Guardian’s story “Officials Who Are US Allies Among Targets of NSO Malware, Says WhatsApp Chief” reported on July 24, 2021:

Cathcart said that he saw parallels between the attack against WhatsApp users in 2019 – which is now the subject of a lawsuit brought by WhatsApp against NSO – and reports about a massive data leak that are at the centre of the Pegasus project… When WhatsApp says it believes its users were “targeted”, it means the company has evidence that an NSO server attempted to install malware on a user’s device.

The Guardian story includes this statement from the PR savvy NSO Group:

An NSO spokesperson said: “We are doing our best to help creating a safer world. Does Mr Cathcart have other alternatives that enable law enforcement and intelligence agencies to legally detect and prevent malicious acts of pedophiles, terrorists and criminals using end-to-end encryption platforms? If so, we would be happy to hear.”

Are Facebook’s statements credible? Is NSO Group’s version believable? Are these two behaving like the characters in Neil Simon’s “Odd Couple” or like the characters in the 1981 film “Neighbors”? Does each firm have something the other needs?

Stephen E Arnold, July 28, 2021

Does Facebook Kill?

July 22, 2021

I found it interesting that the US government suggested that Facebook information kills. You can refresh your knowledge of this assertion in “Biden: COVID Misinformation on Platforms Like Facebook Is ‘Killing People’”. The statement is an attention grabber. Facebook responded, according to Neowin in “Facebook Refutes Biden’s Blame That It’s “Killing People” with COVID Fake News”:

Facebook clearly took issue with these statements and a company spokesperson responded by saying, “We will not be distracted by accusations which aren’t supported by the facts”.

The US government asserts one thing; Facebook another. Which is the correct interpretation of Facebook: An instrument of death or a really great helper of humanity?

The US is a country, and it has legal tools at its disposal. Facebook is a commercial enterprise operating in the US with a single person controlling what the company does.

Facebook wants to use the laws of the country to advantage itself; for example, Facebook is not too keen on Lina Khan. The company filed a legal document to keep her from getting involved in matters related to Facebook's commercial behaviors.

I find the situation amusing. Facebook's assertions are not going to get a like from me. The US, on the other hand, is a country. When countries take action — as China did with regard to Jack Ma — consequences can be significant.

The phrase “Facebook kills” is meme-able. That may be a persistent problem for the Zuck and the Zuckers in my opinion.

Stephen E Arnold, July 22, 2021

Zuckin and Duckin: Socialmania at Facebook

July 19, 2021

I read "Zuck Is a Lightweight, and 4 More Things We Learned about Facebook from 'An Ugly Truth'." My initial response was, "No Mashable professionals will be invited to the social Zuckerberg's Hawaii compound." Bummer. I had a few other thoughts as well, but first, here are a couple of snippets from what one might characterize as a review of the new book by Sheera Frenkel and Cecilia Kang. I assume any publicity is good publicity.

Here's one I circled in Facebook social blue:

Frenkel and Kang’s careful reporting shows a company whose leadership is institutionally ill-equipped to handle the Frankenstein’s monster they built.

Snappy. To the point.

Another? Of course, gentle reader:

Zuckerberg designed the platform for mindless scrolling: “I kind of want to be the new MTV,” he told friends.

Insightful, but TikTok, which may have some links to the sensitive Chinese power plant, aced out the F'Book.

And how about this?

[The Zuck] was explicitly dismissive of what she said. Indeed, the book provides examples where Sandberg was afraid of getting fired, or of being labeled as politically biased, and didn't even try to push back…

Okay, and one more:

Employees are fighting the good fight.

Will I buy the book? Nah, this review is close enough. What do I think will happen to Facebook? In the short term, not much. The company is big and generating big payoffs in power and cash. Longer term? The wind down will continue. Google, for example, is dealing with stuck disc brakes on its super car. Facebook may be popping in and out of view in that outstanding vehicle’s rear view mirrors. One doesn’t change an outfit with many years of momentum.

Are the book's revelations on the money? Probably reasonably accurate, but disenchantment can lead to some interesting shaping of non-fiction writing. And the Mashable review? Don't buy a new Hawaiian-themed cabana outfit yet. What about Facebook's management method? Why change? It worked in high school. It worked when testifying before Congress. It worked until a couple of reporters shifted into interview mode, and reporters are unlikely to rack up likes on Facebook.

Stephen E Arnold, July 19, 2021

Facebook Has Channeled Tacit Software, Just without the Software

July 14, 2021

I would wager a free copy of my book CyberOSINT that anyone reading this blog post remembers Tacit Software, founded in the late 1990s. The company wrote a script which determined what employee in an organization was “consulted” most frequently. I recall enhancements which “indexed” content to make it easier for a user to identify content which may have been overlooked. But the killer feature was allowing a person with appropriate access to identify individuals with particular expertise. Oracle, the number one in databases, purchased Tacit Software and integrated the function into Oracle Beehive. If you want to read marketing collateral about Beehive, navigate to this link. Oh, good luck with pinpointing the information about Tacit. If you dig a bit, you will come across information which suggests that the IBM Clever method was stumbled upon and implemented about the same time that Backrub went online. Small community in Silicon Valley? Yes, it is.
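Tacit's actual code was never public; a minimal, purely illustrative Python sketch of the core idea — ranking employees by how often colleagues consult them — might look like the following (all names and data here are hypothetical, not from Tacit or Oracle):

```python
from collections import Counter

# Hypothetical log of consultations: (asker, person_consulted) pairs,
# e.g. harvested from internal email or messaging metadata.
consultations = [
    ("alice", "bob"), ("carol", "bob"),
    ("dave", "erin"), ("alice", "bob"),
]

# Count how often each employee is consulted; the most-consulted
# people surface as likely subject-matter experts.
counts = Counter(consulted for _, consulted in consultations)
experts = [name for name, _ in counts.most_common(2)]
print(experts)  # → ['bob', 'erin']
```

In practice, a system like the one described would weight the counts by topic (the "indexing" enhancement mentioned above) and restrict the lookup to users with appropriate access, but frequency counting is the kernel of the approach.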

So what?

I thought about this 1997 innovation in Silicon Valley when I read "Facebook's Groups to Highlight Experts." With billions of users, I wonder why it took Facebook years to figure out that it could identify individuals who "knew" something. Progress never stops in me-too land, of course. Is Facebook using its estimable smart software to identify those in the know?

The article reports:

There are more than 70 million administrators and moderators running active groups, Facebook says. When asked how they're vetting the qualifications of designated experts, a Facebook spokesperson said it's "all up to the discretion of the admin to designate experts who they believe are knowledgeable on certain topics."

I think this means that humans identify experts. What if the human doing the identifying does not know anything about the “expertise” within another Facebooker?

Yeah, maybe give Oracle Beehive a jingle. Just a thought.

Stephen E Arnold, July 14, 2021
