Twitter Fingers May Be Sharing News Story Links but That Does Not Mean Anyone Read the Article

December 14, 2016

The article on ScienceDaily titled "New Study Highlights Power of Crowd to Transmit News on Twitter" shows that Twitter is, in fact, good at something: driving recommendations of news stories. A study conducted by Columbia University and the French National Institute found that the majority of clicks on news stories are driven by reader referrals. The article details the findings:

Though far more readers viewed the links news outlets promoted directly on Twitter… most of what readers shared and read was crowd-curated. Eighty-two percent of shares, and 61 percent of clicks, of the tweets in the study sample referred to content readers found on their own. But the crowd’s relative influence varied by outlet; 85 percent of clicks on tweets tied to a BBC story came from reader recommendations while only 10 percent of tweets tied to a Fox story did.

It will come as no shock that people are getting much more of their news through social media, but the study also suggests that people often share stories without reading them at all. Indeed, one of the scientists stated that the correlation between likes, shares, and actual reads is very low. The problem inherent in this system is that readers inevitably see only content they already agree with, a feedback loop that leaves the public less informed even as more information than ever sits at its fingertips. Thanks, Twitter.

Chelsea Kerwin, December 14, 2016

Content Cannot Be Searched If It Is Not There

August 16, 2016

Google Europe is already dealing with a slew of “right to be forgotten” requests, but Twitter recently had its own fight over a deletion-related issue.  TechCrunch shares the story in “Deleted Tweet Archive PostGhost Shut Down After Twitter Cease And Desist.”  PostGhost was a Web site that archived tweets from famous public figures and gained its own fame for recording deleted tweets.

The idea behind PostGhost was to maintain a transparent and accurate record.  The Library of Congress already does something similar, as it archives every tweet.  Twitter, however, did not like PostGhost and sent it a cease-and-desist letter threatening to remove its API access.  Apparently, it is illegal to post deleted tweets, something that evolved from the European “right to be forgotten” laws.
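PostGhost’s approach can be approximated in a few lines: periodically snapshot the tweet IDs visible on an account’s timeline, archive every ID ever seen, and flag any archived ID that later vanishes. The sketch below is hypothetical and uses plain sets in place of a real Twitter API client; note that in practice an ID’s absence only implies deletion if it was recent enough to still fall within the live timeline window.

```python
# Sketch of a PostGhost-style deletion detector (hypothetical; a real
# implementation would fetch tweet IDs through the Twitter API).

def update_archive(archive: set[str], live_ids: set[str]) -> set[str]:
    """Add newly seen tweet IDs to the running archive."""
    return archive | live_ids

def detect_deletions(archive: set[str], live_ids: set[str]) -> set[str]:
    """Return archived tweet IDs that no longer appear in the live timeline."""
    return archive - live_ids

# Example: two successive polls of a fictional account.
poll_1 = {"1001", "1002", "1003"}
poll_2 = {"1002", "1003", "1004"}   # 1001 vanished, 1004 is new

archive = update_archive(set(), poll_1)
print(detect_deletions(archive, poll_2))  # {'1001'}
archive = update_archive(archive, poll_2)
```

Repeating the poll-compare-update cycle on a schedule yields exactly the kind of deleted-tweet record that drew Twitter’s cease-and-desist.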

So is PostGhost or Twitter wrong?

“There are two schools of thought when something like this happens. The first is that it’s Twitter’s prerogative to censor anything and all the things. It’s their sandbox and we just play in it.  The second school of thought says that Twitter is free-riding on our time and attention and in exchange for that they should work with their readers and users in a sane way.”

Twitter is a platform where a small percentage of users, the famous and public figures, instantly have access to millions of people when they voice their thoughts.  When these figures put their thoughts on the Internet, their words carry more weight than the average tweet.  Other Web sites archive deleted posts all the same, but it looks like public figures are exempt from this rule.  Why?  I am guessing money is changing hands.

Whitney Grace, August 16, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

There is a Louisville, Kentucky Hidden /Dark Web meet up on August 23, 2016.
Information is at this link: https://www.meetup.com/Louisville-Hidden-Dark-Web-Meetup/events/233019199/

The Missing Twitter Manual Located

April 7, 2016

Once more we turn to the Fuzzy Notepad’s advice and its Pokémon mascot, Eevee.  This time we visited the fuzz pad for tips on Twitter.  The 140-character social media platform has a slew of hidden features that do not have a button on the user interface.  Check out “Twitter’s Missing Manual” to read more about these tricks.

It is inconceivable for every feature to have a shortcut on the user interface.  Twitter relies on its users to understand the basic features, while the experienced user will have picked up tricks that only come with practice or from reading tips on the Internet.  The problem is:

“The hard part is striking a balance. On one end of the spectrum you have tools like Notepad, where the only easter egg is that pressing F5 inserts the current time. On the other end you have tools like vim, which consist exclusively of easter eggs.

One of Twitter’s problems is that it’s tilted a little too far towards the vim end of the scale. It looks like a dead-simple service, but those humble 140 characters have been crammed full of features over the years, and the ways they interact aren’t always obvious. There are rules, and the rules generally make sense once you know them, but it’s also really easy to overlook them.”

Twitter is a great social media platform, but a headache to use because it never came with an owner’s manual.  Fuzzy Notepad has lined up hints for every conceivable problem, including the elusive advanced search page.

Whitney Grace, April 7, 2016

Bing Clocks Search Speed

February 4, 2016

Despite attempts to improve Bing, it remains the laughingstock of search engines.  Google has run it over with its self-driving cars multiple times.  DuckDuckGo tagged it as the “goose,” outran it, and forced Bing to sit in the proverbial pot.  Facebook has even unfriended Bing.  Microsoft has not given up on its search engine, but while there has been a list of novelty improvements (features Google already had or copied not long after their release), Bing still has a ways to go.

Windows Central tells us about the most recent Bing development, a bandwidth speed test, in “Bing May Be Building A Speed Test Widget Within Search Results.”  Now that might be a game changer for a day, until Google releases its own version.  Usually, to test bandwidth, you have to search for a Web site that provides the service.  Bing might do it on command within every search results page.  Not a bad idea, especially if you want to see how quickly your Internet runs, how long it takes to process your query, or whether your connection needs troubleshooting.
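The arithmetic behind any such speed test is simple: download a payload of known size, time it, and convert bytes per second into megabits per second. Here is a minimal sketch; the URL passed to `measure_download` is a placeholder you would point at any large, fixed-size test file, not Bing’s service.

```python
import time
import urllib.request

def to_mbps(num_bytes: int, seconds: float) -> float:
    """Convert a measured transfer (bytes over seconds) to megabits per second."""
    return (num_bytes * 8) / (seconds * 1_000_000)

def measure_download(url: str) -> float:
    """Time the download of a known payload and report the rate in Mbps."""
    start = time.monotonic()
    data = urllib.request.urlopen(url).read()
    elapsed = time.monotonic() - start
    return to_mbps(len(data), elapsed)

# Sanity check of the conversion: 1,000,000 bytes in one second is 8 Mbps.
print(to_mbps(1_000_000, 1.0))  # 8.0
```

Real speed-test services refine this with multiple parallel connections and by discarding the ramp-up period, but the core measurement is the same division.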

The bandwidth test widget is not available just yet:

“A reader of the site Kabir tweeted a few images displaying widget like speed test app within Bing both on the web and their phone (in this case an iPhone). We were unable to reproduce the results on our devices when typing ‘speed test’ into Bing. However, like many new features, this could be either rolling out or simply A/B testing by Microsoft.”

Keep your fingers crossed that Microsoft releases a useful and practical widget.  If not, just go to Google and search for “bandwidth test.”

Whitney Grace, February 4, 2016

Data Companies Poised to Leverage Open Data

July 27, 2015

Support for open data, government datasets freely available to the public, has taken off in recent years; the federal government’s launch of Data.gov in 2009 is a prominent example. Naturally, some companies have sprung up to monetize this valuable resource. The New York Times reports, “Data Mining Start-Up Enigma to Expand Commercial Business.”
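Data.gov’s catalog is served by CKAN, whose JSON search API is how companies like Enigma can pull government datasets programmatically. The sketch below is an illustration under that assumption; the endpoint URL reflects CKAN’s commonly documented `package_search` action, and the offline demo uses a canned response in the same shape.

```python
import json
import urllib.request

# CKAN's package_search action returns JSON describing matching datasets.
SEARCH_URL = "https://catalog.data.gov/api/3/action/package_search?q={query}"

def dataset_titles(response: dict) -> list[str]:
    """Extract dataset titles from a CKAN package_search response."""
    return [pkg["title"] for pkg in response["result"]["results"]]

def search_datasets(query: str) -> list[str]:
    """Query the live catalog (network access required)."""
    with urllib.request.urlopen(SEARCH_URL.format(query=query)) as resp:
        return dataset_titles(json.load(resp))

# Offline demo with a canned response in CKAN's shape:
sample = {"result": {"results": [{"title": "Fire Incidents"},
                                 {"title": "Smoke Detector Requests"}]}}
print(dataset_titles(sample))  # ['Fire Incidents', 'Smoke Detector Requests']
```

The gap between this kind of raw catalog query and an application like the New Orleans fire-risk model is exactly the cleaning and preparation work that open data companies sell.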

The article leads with a pro bono example of Enigma’s work: a project in New Orleans that uses that city’s open data to identify households most at risk for fire, so the city can give those folks free smoke detectors. The project illustrates the potential for good lurking in sets of open data. But make no mistake, the potential for profits is big, too.  Reporter Steve Lohr explains:

“This new breed of open data companies represents the next step, pushing the applications into the commercial mainstream. Already, Enigma is working on projects with a handful of large corporations for analyzing business risks and fine-tuning supply chains — business that Enigma says generates millions of dollars in revenue.

“The four-year-old company has built up gradually, gathering and preparing thousands of government data sets to be searched, sifted and deployed in software applications. But Enigma is embarking on a sizable expansion, planning to nearly double its staff to 60 people by the end of the year. The growth will be fueled by a $28.2 million round of venture funding….

“The expansion will be mainly to pursue corporate business. Drew Conway, co-founder of DataKind, an organization that puts together volunteer teams of data scientists for humanitarian purposes, called Enigma ‘a first version of the potential commercialization of public data.’”

Other companies are getting into the game, too, leveraging open data in different ways. There’s Reonomy, which supplies research to the commercial real estate market. Seattle-based Socrata makes data-driven applications for government agencies. Information discovery company Dataminr uses open data in addition to Twitter’s stream to inform its clients’ decisions. Not surprisingly, Google is a contender with its Sidewalk Labs, which plumbs open data to improve city living through technology. Lohr insists, though, that Enigma is unique in the comprehensiveness of its data services. See the article for more on this innovative company.

Cynthia Murrell, July 27, 2015

Quality Peer Reviews Are More Subjective Than Real Science

July 16, 2015

Peer-reviewed journals are supposed to carry an extra degree of authority, because a team of experts has read and critiqued each academic work.  Science 2.0 points out in the article “Peer Review Is Subjective And The Quality Is Highly Variable” that peer-reviewed journals might not be worth their weight in opinions.

Peer reviews are supposed to be objective criticisms of a work, but personal beliefs and political views have been working their way into the process for some time.  This should not come as a surprise, since academia has been plagued by the problem for decades.  The issue has also been discussed, but peer review problems are brushed under the rug.  In true academic fashion, someone is conducting a test to determine how reliable peer review comments are:

“A new paper on peer review discusses the weaknesses we all see – it is easy to hijack peer review when it is a volunteer effort that can drive out anyone who does not meet the political or cultural litmus test. Wikipedia is dominated by angry white men and climate science is dominated by different angry white men, but in both cases they were caught conspiring to block out anyone who dissented from their beliefs.  Then there is the fluctuating nature of guidelines. Some peer review is lax if you are a member, like at the National Academy of Sciences, while the most prominent open access journal is really editorial review, where they check off four boxes and it may never go to peer review or require any data, especially if it matches the aesthetic self-identification of the editor or they don’t want to be yelled at on Twitter.”

The peer review problem is getting worse in the digital landscape.  There are suggested solutions, such as banning all fees associated with academic journals and databases and homogenizing review criteria across fields, but even then the problems would be far from corrected.  Reviewers are paid to review works, which likely involves kickbacks of some kind.  Also, getting different academic journals, much less different fields, to standardize their criteria will take a huge amount of effort, if they can come to any sort of agreement at all.

Fixing the review system will not be done quickly, and any time money is involved, the process slows even further.  In short, academic journals are far from objective, which is why it pays to do your own research and take everything with a grain of salt.

Whitney Grace, July 16, 2015

Twitter Gets a Search Facelift

June 25, 2015

Twitter has been experimenting with improving its search results, and according to TechCrunch the upgrade comes via a new search results interface: “Twitter’s New Search Results Interface Expands To All Users.”  The new search results interface is one of the largest updates Twitter has made in 2015.  It is supposed to increase ease of use with a cleaner look and better filtering options.  Users will now be able to filter search results by live tweets, photos, videos, news, accounts, and more.

Twitter made the update to help people better understand how to use the message service and to take a more active approach to using it, rather than passively reading other people’s tweets.  The update is specifically targeted at new Twitter users.

The tweaked search interface will return tweets related to the search phrase or keyword, but that does not mean that the most popular tweets are returned:

“In some cases, the top search result isn’t necessarily the one with the higher metrics associated with it – but one that better matches what Twitter believes to be the searcher’s “intent.” For example, a search for “Steve Jobs” first displays a heavily-retweeted article about the movie’s trailer, but a search for “Mad Men” instead first displays a more relevant tweet ahead of the heavily-favorited “Mad Men” mention by singer Lorde.”
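The behavior described above, where a heavily retweeted item loses the top slot to a better intent match, implies a blended score rather than a pure popularity sort. The following toy illustration is invented for this post, not Twitter’s actual formula: relevance dominates, while engagement is log-damped so a viral near-miss cannot automatically outrank a close topical match.

```python
import math

def blended_score(relevance: float, retweets: int, favorites: int,
                  relevance_weight: float = 0.7) -> float:
    """Combine topical relevance (0-1) with log-damped engagement.

    The log keeps a heavily retweeted but off-topic tweet from
    automatically outranking a closely matching one.
    """
    engagement = math.log1p(retweets + favorites) / 10.0
    return relevance_weight * relevance + (1 - relevance_weight) * engagement

# An on-topic tweet with modest engagement beats a viral near-miss.
on_topic = blended_score(relevance=0.9, retweets=50, favorites=100)
viral    = blended_score(relevance=0.3, retweets=5000, favorites=9000)
print(on_topic > viral)  # True
```

Any real ranker would fold in many more signals (recency, account quality, searcher history), but the weighting idea is the same: engagement metrics inform the ordering without dictating it.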

The new interface proves to be simpler and better lists trends, related users, and news.  It does take a little while to finesse Twitter, which is a daunting task for new users.  Twitter is not the most popular social network these days, and it is using these updates to increase its appeal.

Whitney Grace, June 25, 2015

Search Improvements at Twitter

June 18, 2015

Search hasn’t exactly been Twitter’s strong point in the past. Now we learn that the site is rolling out its new and improved search functionality to all (logged-in) users in TechCrunch’s article, “Twitter’s New Search Results Interface Expands to All Web Users.” Reporter Sarah Perez tells us:

“Twitter is now rolling out a new search results interface to all logged-in users on the web, introducing a cleaner look-and-feel and more filtering options that let you sort results by top tweets, ‘live’ tweets, accounts, photos, videos, news and more. The rollout follows tests that began in April which then made the new interface available to a ‘small group’ of Twitter users the company had said at the time. The updated interface is one of the larger updates Twitter’s search engine has seen in recent months, and it’s meant to make the search interface itself easier to use in terms of switching between tweets, accounts, photos and videos.”

Twitter has been working on other features meant to make the site easier to use. For example, the revamped landing page will track news stories in specified categories. Users can also access the latest updates through the “instant timeline” or “while you were away” features. The article supplies a few search-interface before-and-after screenshots. Naturally, Twitter promises to continue improving the feature.

Cynthia Murrell, June 18, 2015

