Tumblr Tumbles, Marking yet Another Poor Investment Decision by Yahoo
April 14, 2016
The article on VentureBeat titled As Tumblr’s Value Heads to Zero, a Look at Where It Ranks Among Yahoo’s 5 Worst Acquisition Deals pokes fun at Yahoo’s tendency to spend huge amounts of cash on companies only to watch them immediately fizzle. In the number one slot is Broadcast.com. Remember that? Me neither. But apparently Yahoo doled out almost $6B in 1999 to wade into the online content streaming game, only to shut the company down after a few years. And thus we have Mark Cuban. Thanks, Yahoo. The article goes on with the ranking,
“2. GeoCities: Yahoo paid $3.6 billion for this dandy that let people who knew nothing about the Web make web pages. Fortunately, this was also mostly shut down, and nearly all of its content vanished, saving most of us from a lot of GIF-induced embarrassment.

3. Overture: Yahoo paid $1.63 billion in 2003 for this search engine firm after belatedly realizing that some upstart called Google was eating its lunch. Spoiler alert: Google won.”
The article suggests that Tumblr would slide into fourth place, given the $1.1B price tag and two-year crash and burn. It also concedes that there are other ways of ranking the list, such as how painful each deal was to watch. By that metric, cheaper deals with more obvious mismanagement, like the social sites Flickr and Delicious, might take the cake.
Chelsea Kerwin, April 14, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Algorithmic Bias and Unintentional Discrimination in Results
October 21, 2015
The article titled When Big Data Becomes Bad Data on Tech In America discusses the legal ramifications for companies that rely on algorithms. The “disparate impact” theory has been used in the courtroom for some time to ensure that discriminatory policies are struck down whether or not they were created with the intent to discriminate. Algorithmic bias occurs all the time, and by the spirit of the law it discriminates, even if unintentionally. The article states,
“It’s troubling enough when Flickr’s auto-tagging of online photos labels pictures of black men as “animal” or “ape,” or when researchers determine that Google search results for black-sounding names are more likely to be accompanied by ads about criminal activity than search results for white-sounding names. But what about when big data is used to determine a person’s credit score, ability to get hired, or even the length of a prison sentence?”
The article also reminds us that data can often reflect “historical or institutional discrimination.” Under disparate impact theory, the question of human intent is irrelevant; the only thing that matters is whether the results are biased. Legal scholars and researchers are accordingly arguing for ethical machine learning design that roots out algorithmic bias. Stronger regulations and better oversight of the algorithms themselves may be the only way to keep companies out of court.
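For concreteness, the quantitative test most often cited in disparate impact cases is the “four-fifths rule”: if a protected group’s selection rate falls below 80 percent of the most-favored group’s rate, that is treated as evidence of adverse impact. Here is a minimal Python sketch of that check, using made-up approval data of our own rather than anything from the article:

# Hypothetical illustration of the "four-fifths rule" used in
# disparate-impact analysis: compare selection rates between groups.

def selection_rate(outcomes):
    """Fraction of positive outcomes (e.g., approved, hired) in a group."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher one.
    A value below 0.8 is the conventional red flag."""
    rate_a = selection_rate(group_a)
    rate_b = selection_rate(group_b)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Made-up outcomes: 1 = approved, 0 = denied.
protected = [1, 0, 0, 1, 0, 0, 1, 0]   # 3/8 = 37.5% approved
favored   = [1, 1, 1, 0, 1, 1, 0, 1]   # 6/8 = 75% approved

ratio = disparate_impact_ratio(protected, favored)
print(f"Selection-rate ratio: {ratio:.2f}")   # 0.50, well below 0.8
if ratio < 0.8:
    print("Potential disparate impact under the four-fifths rule")

Note that a check like this looks only at outcomes, which is exactly the point of disparate impact theory: no one has to prove the algorithm’s designers intended to discriminate.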
Chelsea Kerwin, October 21, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

