Facebook: Interesting Real News Filtering
September 29, 2018
Here in Harrod’s Creek, it is difficult to determine what is accurate and what is not. For example, allegedly a university president fiddled his pay. Then we had rumors of a novel way to recruit basketball players. News about these events was filtered because, hey, basketball is a big deal, along with interesting real estate deals in River City.
We read “Facebook Users Unable to Post Story about Huge Facebook Hack on Facebook.” A real news outfit in London noticed that stories about Facebook’s most recent security lapse were not appearing on Facebook.
Another real news outfit reported that some Facebook users saw this message:
“Action Blocked: Our security systems have detected that a lot of people are posting the same content, which could mean that it’s spam. Please try a different post.”
Facebook fans suggested that Facebook was not blocking a story which might put Facebook in a bad light.
Here in rural Kentucky we know that no Silicon Valley company would filter news about its own security problems.
Facebook is a fine outfit. Obviously the news about the security lapse was fake; otherwise, why would the information be blocked?
Just a misunderstanding which the 50 million plus people affected are certain to understand. What’s the big deal with regaining access to one’s account?
The Facebook service is free and just wonderful. Really wonderful.
Stephen E Arnold, September 29, 2018
Academic Sees Facebook Chasing Amazon
September 17, 2018
I assume the one trillion dollar Amazon poobah and the world’s richest hombre will not friend Facebook. The assumption is that the information in “How Facebook AI May Help to Change the Way We Shop Online in the Future” is accurate. The author is an accounting instructor at Villanova University. We do not have that type of expert in Harrod’s Creek. We do have some fast money guys from the health care outfits down the road, however.
The main point of the write up is that Facebook has some smart software which will change the way people shop. Maybe not in Harrod’s Creek, but certainly in a big city where the economic action is. (Keep in mind that insurance fraud is a core competency of some in the Commonwealth of Kentucky.)
I learned that:
The most powerful algorithm is called FBLearner Flow: Facebook could use its massive data on user preferences to anticipate the products that consumers want before consumers even realize it, and could work with retailers on predictive shipping.
Facebook also has DeepText and DeepFace. The trio of smart software adds up to a potential threat to Amazon.
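The “anticipate the products that consumers want” idea can be sketched in a few lines. To be clear, this is an illustrative toy overlap-based recommender, not Facebook’s FBLearner Flow pipeline; the function name and product lists are invented for this post.

```python
from collections import Counter

# Illustrative-only sketch of "anticipating what consumers want":
# rank products by how often users with overlapping tastes engaged
# with them. Invented names; not Facebook's actual system.
def predict_products(user_likes, all_users_likes, top_n=2):
    """Recommend products liked by users who share at least one like."""
    scores = Counter()
    for other in all_users_likes:
        if set(user_likes) & set(other):          # overlapping taste
            for product in other:
                if product not in user_likes:
                    scores[product] += 1
    return [p for p, _ in scores.most_common(top_n)]

history = [
    ["camera", "tripod", "lens"],
    ["camera", "lens", "bag"],
    ["guitar", "amp"],
]
recommendations = predict_products(["camera"], history)
```

At Facebook’s scale the “likes” would be billions of behavioral signals and the scoring a trained model, but the shape of the prediction problem is the same.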
The dismal performance of some facial recognition and image analysis systems is not a problem for the Facebook wizards. I learned:
DeepFace is used to identify people in photos and suggest that users tag people they know. In reality, DeepFace can recognize any face in any photograph on its own. This facial recognition algorithm is actually 97 percent accurate, incredibly even higher than humans who fall a close second at 96 percent accuracy, and the FBI at 85 percent.
The write up suggests that Facebook’s technology could, maybe, possibly edge toward mind control.
Whatever.
My thought is that Facebook can snag more ad revenue. I think that Facebook ad gains might come at the expense of the Google. Google, unlike Amazon, seems to be drifting with Loon balloons, employee push back, and electric scooter investments. Amazon’s ads are just flywheeling up and trying to build Mr. Bezos’ much beloved momentum.
Our research suggests that Amazon is implementing a game plan that once was associated with the pre-2006 Google; that is, a number of large scale plays for core business expansions. These range from policeware to back office financial services to replacing existing retail infrastructure with the Amazon equivalent of old school retail.
Ads, therefore, will be a billion dollar plus business at Amazon. Are those product listings ads or objective product summaries? That’s a question to ponder.
But ads may not become much more than just another Amazon revenue stream.
Facebook has to find revenue. Amazon, thanks to the cleverness of the happy Amazonians, is wallowing in revenue streams. Some employees may be unhappy, but most customers are thrilled with Amazon’s gentle approach to vendor lock in.
Net net: Facebook will do ads. Facebook will do smart software. Facebook will also have to figure out how to dodge the bullets regulators are now loading into their regulatory weapons to deal with its “we’re sorry, we’ll do better” approach to business.
Stephen E Arnold, September 17, 2018
Facebook: The Old Is Newish Again
September 14, 2018
Social media giant Facebook has been making a very public effort to clean up its act and establish a greater sense of security for users. Even as this campaign is underway, more troubling news has come out regarding the platform. We learned the disturbing information from a recent article in The Verge, “How Autocratic Governments Use Facebook Against Their Own Citizens.”
According to the story:
“Armed groups use Facebook to find opponents and critics, some of whom have later been detained, killed or forced into exile, according to human rights groups and Libyan activists…Swaggering commanders boast of their battlefield exploits and fancy vacations, or rally supporters by sowing division and ethnic hatred. Forged documents circulate widely, often with the goal of undermining Libya’s few surviving national institutions.”
While this is indeed interesting news, we’d say the behavior is not limited to autocratic regimes. Take, for example, the news that the US government would like to start wiretapping Facebook’s Messenger app. Clearly, some governments are using social media for more overt evil. However, we can’t imagine a nation in the world that would overlook this powerful tool or fail to consider ways to use it for its own purposes.
Patrick Roland, September 14, 2018
High School Science Club Management Methods: August 30, 2018
August 30, 2018
Years ago, I learned that Google was worried about government regulation. President Trump seems to be making moves in that direction. But my topic today is high school science club management methods or HSSCMM.
The first example is news about a group of Facebook staff who are concerned about the intolerant liberal culture within Facebook. Okay, Facebook is about friends and people who share interests or likes. The notion of a political faction within an online company is one more example of a potential weakness in HSSCMM. The idea that an employee worked for a company, had a job description, and received money strikes me as inoperative. The problem is that the needs of the Science Club are not the needs of the people on the football team or the field hockey team. Will the lunchroom have tables for the Science Club folks and other tables for the sports teams? In my high school, the Science Club was different from the band and the student council. Snort, snort, we said, when asked to coordinate with the booster club to celebrate a big win. Snort, snort.
The second example is the story “14 Powerful Human-Rights Groups Write to Google Demanding It Kill Plans to Launch a China Search Engine.” The issue for Google and China is revenue. How will HSSCMM address a group of human rights organizations? I assume that these entities can issue news releases, pump out Twitter messages, and update their Facebook pages. If that sounds like a recipe for information warfare, I am not suggesting such an aggressive approach. What’s important to me is that Google will have to dip into its management methods to deal with this mini protest.
The question is, “Are high school science club management methods up to these two challenges?”
My view is, “Sure, really smart people can find clever solutions.”
On the other hand, the very management methods which made Facebook and Google the business home runs they are will themselves have to innovate. Business school curricula may not cover how to manage revolts from unexpected sources.
Stephen E Arnold, August 30, 2018
More Administrative Action from Facebook
August 20, 2018
Rarely do we get a report from the front lines of the war on social spying and fake news. However, recently a story appeared that showcased Facebook’s heavy-handed tactics up close and personal. The article appeared in Gizmodo, titled: “Facebook Wanted to Kill This Investigative Tool.”
The story is about how one designer at Gizmodo created a program that collected data on Facebook, trying to determine what the company uses its data farms for. It did not go well; the social media giant attempted to gain access to the offending account almost instantly.
“We argued that we weren’t seeking access to users’ accounts or collecting any information from them; we had just given users a tool to log into their own accounts on their own behalf, to collect information they wanted collected, which was then stored on their own computers. Facebook disagreed and escalated the conversation to their head of policy for Facebook’s Platform…”
News such as this has been slowly leaking its way into the mainstream. In short, Facebook has been attempting to crack down on offenders, but in the process might be going a little too far—this is not unlike overcorrecting a car while skidding on ice. Wall Street is more than a little worried they won’t pull out of this wreck, but some experts say it’s all just growing pains.
We think this could be another example of management decisions fueled by high school science club thinking.
Patrick Roland, August 20, 2018
Facebook: A New York City-Sized PR Problem
July 20, 2018
I read “Once nimble Facebook Trips Over Calls to Control Content.” If you are looking for this write up online, the story’s headline was changed to “What Stays on Facebook and What Goes? The Social Network Cannot Answer.” You may be able to locate the online version at this link. (No promises.) The dead tree version is on Page A1 of the July 20, 2018, edition which comes out on Thursday night. Got the timeline square?
I wanted to highlight a handful of comments in the “real” news story. Here we go with direct statements from the NYT article in red:
- The print version headline uses the phrase “once nimble.” Here in Harrod’s Creek that means stumbling bobolyne. In Manhattan, the phrase may mean something like “advertise more in the New York Times.” I am, of course, speculating.
- I marked in weird greenish yellow this statement: “Facebook still seems paralyzed over how to respond.” So much for nimble.
- Another: “Comically tripped up”. Yep, a clown’s smile on the front page of the NYT.
- My favorite: the context for being a bit out of his depth. Whatever does “yet lucidity remained elusive” mean? Does this mean stupid, duplicitous, or something else?
- I thought Silicon Valley wunderkind were sharp as tacks. In the NYT, I read “Facebook executives’ tortured musings.” Not Saturday Night Live deep thoughts, just musings and tortured ones at that.
- How does Facebook perceive “real” journalism? Well, not the way the NYT does. I circled this phrase about Alex Jones, a luminary with some avid believers one mine drainage ditch down the road a piece which is Kentucky talk for “some”: “Just being false doesn’t violate community standards” and “Infowars was a publisher with a ‘different point of view.’”
- This is a nifty sequence crafted to recycle another “real” journalist’s scoop interview with Mark Zuckerberg: “what Facebook would or would not allow on its site became even more confusing.” So, a possible paralyzed clown who lacks lucidity is confusing.
- The “bizarre idea” word pair makes sure I understand that the NYT perceives a lack of clear thinking.
But these brief rhetorical flourishes set up this statement:
A Facebook spokeswoman [who is not identified] explained that it would be possible, theoretically, to deny the Holocaust without triggering Facebook’s hate-speech clause.
Those pesky algorithms are at work. But the person at Facebook who offered this information is not identified. Why not?
Here’s another longer statement from the NYT write up:
And what exactly constitutes imminent violence is a shifting line, the company said— it is still ‘iterating on’ its policy, and the rules may change.
I don’t want to be too dumb, but I would like to know who at the company offered the statement. A company, to my knowledge, cannot talk unless one considers firing a question at Amazon’s Alexa.
I put an exclamation point on this statement in the NYT article:
All of this fails a basic test: It’s not even coherent. It is a hodge podge of declarations and exceptions and exceptions to the exceptions.
Net net: Facebook has a public relations problem with the New York Times. Because of the influence of the “real” newspaper and its “real” journalists, Facebook has a PR problem of considerable magnitude. Perhaps the point of the story is to create an opportunity for a NYT ad sales professional to explain the benefits of a full page ad across the print and online versions of the New York Times?
Stephen E Arnold, July 20, 2018
Facebook: A Fan of Infowars
July 13, 2018
I don’t know much about Infowars. I do know that the host has an interesting verbal style. The stories, however, don’t make much sense to me. I just ignore the host and the program.
However, if the information in “Facebook Proves It Isn’t Ready To Handle Fake News” is accurate, Facebook is okay with the host and the Infowars’ approach to information.
The write up reports a Facebook news expert as saying:
“I guess just for being false that doesn’t violate the community standards. I think part of the fundamental thing here is that we created Facebook to be a place where different people can have a voice. And different publishers have very different points of view.”
The Buzzfeed story makes this statement:
Despite investing considerable money into national ad campaigns and expensive mini documentaries, Facebook is not yet up to the challenge of vanquishing misinformation from its platform. As its videos and reporter Q&As take pains to note, Facebook knows the truth is messy and hard, but it’s still not clear if the company is ready to make the difficult choices to protect it.
Hey, it’s difficult for some people to deal with responsibility. Ease off. Facebook is trying hard to be better. Every day. Better.
Stephen E Arnold, July 13, 2018
Facebook: Information Governance?
July 9, 2018
Anyone else annoyed by the large number of privacy disclosures filling your inbox and slowing down your favorite Web sites? User data privacy, and how companies collect and/or sell that information, is a big issue.
Facebook is one of the more notorious data management case studies. Despite the hand waving, it may be easy for Facebook data to be appropriated.
Josip Franjković writes how user data can be stolen in the post, “Getting Any Facebook User’s Friend List And Partial Payment Card Details.”
There are black hat and white hat hackers, the latter being the “good guys.” It is important for social media Web sites to hack themselves so they can discover weaknesses in their own structures. Franjković points out that Facebook uses a GraphQL endpoint that is only accessible to its first-party applications. He kept trying to break into the endpoint, even sending persisted queries in a loop. The same error message kept returning, but eventually the endpoint did return information: data already available to the public, plus users’ privately held friend lists.
The scarier hack was about credit card information:
“A bug existed in Facebook’s Graph API that allowed querying for any user’s payment cards details using a field named payment_modules_options. I found out about this field by intercepting all the requests made by Facebook’s Android application during registration and login flow.”
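For readers unfamiliar with this style of probing, the general shape of such a request can be sketched as follows. This is a hypothetical illustration only: the query structure and the `build_probe` helper are invented for this post, and the snippet builds a request body without contacting any real service.

```python
import json

# Hypothetical sketch of probing a GraphQL-style endpoint for a single
# over-exposed field, in the spirit of the payment_modules_options bug
# described above. The query shape and helper name are invented; this
# only constructs a request body and contacts no real endpoint.
def build_probe(node_id: str, field: str) -> str:
    """Return a JSON request body asking for one field of one user node."""
    query = "query Probe($id: ID!) { user(id: $id) { %s } }" % field
    return json.dumps({"query": query, "variables": {"id": node_id}})

body = build_probe("12345", "payment_modules_options")
```

Intercepting an application’s real traffic, as Franjković did with the Android app, is what reveals which field names exist; the security question is whether the server checks authorization for each field it serves.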
Thankfully, Franjković discovered this error, and within four hours and thirteen minutes the issue was resolved. No credit card information was stolen this time around, but how much longer until it is? We await Franjković’s analysis of Google email being available to certain third parties.
Whitney Grace, July 9, 2018
Phrase of the Day: Collateral Damage
June 14, 2018
The phrase “collateral damage” means, according to the Cambridge Dictionary:
during a war, the unintentional deaths and injuries of people who are not soldiers, and damage that is caused to their homes, hospitals, schools, etc.
Cambridge University itself may be touched by blowback from the antics of one of its professors and a company which shares the name of the town on the River Cam. Twitch the mantle blue, of course.
The Cambridge Analytica/Facebook data scandal has rightly been scrutinized by everyone from individual users to entire government bodies. As could be expected when the players are this large, what people are finding links together unlikely suspects and victims in this data breach. One such surprise popped up this week when we read a Gizmodo report, “Facebook ‘Looking Into’ Palantir’s Access to User Data.”
According to the story:
“The inquiry was led by Damian Collins, chair of Parliament’s Digital, Culture, Media, and Sport Committee. According to CNBC, Collins asked if Palantir was part of Facebook’s “review work…. While it’s unclear if it gained access to the Facebook user data that Cambridge Analytica harvested, Palantir’s connection to the social network extends beyond any potential collaboration with Cambridge Analytica. Peter Thiel, a Facebook board member, is a Palantir co-founder.”
We aren’t sure what the big data powerhouse Palantir knew or didn’t know, but so far the company has been outside the blast zone.
Take, for example, the recent news that Cambridge Analytica seems to be out of business, or in business under a different name.
Keep that ceramic plate on. The dominoes may continue to fall.
Patrick Roland, June 14, 2018
Is Real News Synthetic?
June 13, 2018
There are new artificial intelligence algorithms being designed to support new security measures. AI algorithms “learn” when they are fed large datasets from which they discover patterns, inconsistencies, and other factors. It is harder than one thinks to generate large datasets, so Facebook has turned to fake…er…synthetic data over real. Valuewalk wrote more about synthetic data in “Why Facebook Now Uses Synthetic (‘Fake’) Data.”
Facebook recently announced plans to open two new AI labs to develop user security tools and the algorithms would be built on synthetic data. Sergey Nikolenko, a data scientist, complimented the adoption of synthetic data, especially since it would enable progress without hindering user privacy.
“ ‘While fake news has caused problems for Facebook, fake data will help fix those problems,’ said Nikolenko. ‘In a computing powerhouse like Facebook, where reams of data are generated every day, you want a solution in place that will help you quickly train different AI algorithms to perform different tasks, even if all the training data is [synthetic]. That’s where synthetic data gets the job done!’ ”
One of the biggest difficulties AI developers face is a lack of usable data; that is, data which is high quality, task specific, and does not compromise user privacy. Companies like Neuromation spotted this niche and started creating data that qualifies.
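The core idea, generating labeled training examples from a known statistical model rather than harvesting real user records, can be sketched in plain Python. The feature model and threshold classifier below are invented for illustration; they are not Facebook’s or Neuromation’s tooling.

```python
import random

# Minimal sketch of the synthetic-data idea: fabricate labeled
# examples with known statistics instead of collecting real user
# records, then train and evaluate on the fabricated set.
def make_synthetic_dataset(n, seed=0):
    """Return n (feature, label) pairs: label-1 points cluster near 1.0,
    label-0 points cluster near 0.0, with Gaussian noise."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        label = rng.randint(0, 1)
        feature = label + rng.gauss(0, 0.2)  # noisy cluster around the label
        data.append((feature, label))
    return data

def threshold_classifier(feature):
    """Trivial model: predict 1 when the feature crosses the midpoint."""
    return 1 if feature > 0.5 else 0

samples = make_synthetic_dataset(1000)
accuracy = sum(threshold_classifier(f) == y for f, y in samples) / len(samples)
```

Because the data generator is fully specified, no user privacy is at stake, and the developer can manufacture as many training examples as the algorithm needs.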
Facebook will use the AI tools to fight online harassment, political propaganda from foreign governments, fake news, and various networking tools and apps. This might be the start of better safety protocols protecting users and preventing online bullies.
Perhaps “real news” is synthetic?
Whitney Grace, June 13, 2018