Google Gives Third Day Keynote at Pubcon

November 1, 2016

Technology conferences are the thing to do when you want to launch a product, advertise a new business, network, or get a sense of where the tech industry is headed. Multiple conferences covering different corners of the industry are held each month. In October 2016, Pubcon took place in Las Vegas, Nevada, and it drew a strong turnout. The thing that makes a convention, though, is the guests. Pubcon did not disappoint: on the third day, Google’s search expert Gary Illyes delivered the morning keynote. (Apparently, Illyes also holds the title Chief of Sunshine and Happiness at Google.) Outbrain summed up the highlights of Pubcon 2016’s third day in “Pubcon 2016 Las Vegas: Day 3.”

Illyes spoke about search infrastructure and urged people to switch to HTTPS. His biggest argument for HTTPS was that it protects users from “annoying scenarios” and is good for UX. Google is also pushing for more mobile-friendly Web sites: it will remove the “mobile friendly” label from search results, and AMP can be used to build a user-friendly site. There is even bigger news about page ranking in the Google algorithm:

Our systems weren’t designed to get two versions of the same content, so Google determines your ranking by the Desktop version only. Google is now switching to a mobile version first index. Gary explained that there are still a lot of issues with this change as they are losing a lot of signals (good ones) from desktop pages that don’t exist on mobile. Google created a separate mobile index, which will be its primary index. Desktop will be a secondary index that is less up to date.

As for ranking and spam, Illyes explained that Google is using human evaluators to better understand modified searches; RankBrain was not mentioned much; he wants to release the Panda algorithm; and Penguin will demote bad links in search results. Google will also release “Google O” for voice search.

It looks like Google is trying to clean up search results and adapt to the growing mobile market, old news and new at the same time.

Whitney Grace, November 1, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Big Brother Now in Corporate Avatar

October 31, 2016

Companies in the US are now tracking employee movements and interactions to determine how productive their assets are. Badges created by Humanyze and embedded in employee IDs track these key indicators and suggest appropriate measures to help improve employee productivity.

An article published on Business Insider titled “Employees at a dozen Fortune 500 companies wear digital badges that watch and listen to their every move” reveals:

Humanyze visualizes the data as webs of social interaction that reveal who’s talking to whom on a by-the-second basis. The goal: Revolutionize how companies think about how they organize themselves.

Though the badges only track employees who have explicitly given permission, and only during working hours, imagination is the only factor limiting how the metadata can be used. For instance, because the badges are embedded into employee IDs (which already have chips), someone with the right tools could also use them to track an employee’s movements beyond working hours.

Social engineering has been used in the past to breach IT security at large organizations. With Humanyze badges, hackers will now have one more weapon in their arsenal.

One worrisome aspect of these badges becomes apparent here:

But the badges are already around the necks of more than 10,000 employees in the US, Waber says. They’ve led to wild insights. One client moves the coffee machine around each night, so the next morning employees in nearby departments naturally talk more.

The ironic part is that companies are exposing themselves to this threat. Google, Facebook, and Amazon are already tracking people online. With services like Humanyze, Big Brother has also entered the corporate domain. The question is not whether the data will be hacked; it is just a matter of when.

Vishal Ingole, October 31, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Facebook Still Having Trouble with Trending Topics

October 28, 2016

Despite taking action to fix its problems with Trending Topics, Facebook is still receiving criticism on the issue. A post at Slashdot tells us, “The Washington Post Tracked Facebook’s Trending Topics for 3 Weeks, Found 5 Fake Stories and 3 Inaccurate Articles.” The Slashdot post by msmash cites a Washington Post article. (There’s a paywall if, like me, you’ve read your five free WP articles for this month.) The Post monitored Facebook’s Trending Topics for three weeks and found the issue far from resolved. Msmash quotes the report:

The Megyn Kelly incident was supposed to be an anomaly. An unfortunate one-off. A bit of (very public, embarrassing) bad luck. But in the six weeks since Facebook revamped its Trending system — and a hoax about the Fox News Channel star subsequently trended — the site has repeatedly promoted ‘news’ stories that are actually works of fiction. As part of a larger audit of Facebook’s Trending topics, the Intersect logged every news story that trended across four accounts during the workdays from Aug. 31 to Sept. 22. During that time, we uncovered five trending stories that were indisputably fake and three that were profoundly inaccurate. On top of that, we found that news releases, blog posts from sites such as Medium and links to online stores such as iTunes regularly trended. Facebook declined to comment about Trending on the record.

It is worth noting that the team may not have caught every fake story, since it only checked in with Trending Topics once every hour. Quite the quandary. We wonder—would a tool like Google’s new fact-checking feature help? And, if so, will Facebook admit its rival is on to something?

Cynthia Murrell, October 28, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

How to Find an Email Address

October 27, 2016

Like all marketers, search engine optimizers must reach out to potential clients, and valid email addresses are important resources. Now, Search Engine Journal explains “How to Find Anyone’s Email Address in 60 Seconds or Less.” Anyone’s, really? Perhaps that’s a bit of an exaggeration.

SEO pro Joshua Daniels discusses six techniques to quickly find an email address. He writes:

If you’re a specialist in SEO or link acquisition, then you’ll know that generic email addresses are as much use as a chocolate fireguard when it comes to outreach. You need to develop personal connections with influencers, regardless of whether you work in PR or SEO, it’s always the same. But finding the right person’s email address can be a draining, time-consuming task. Who has time for that?

Well, actually, it’s not so difficult, or time-consuming. In this post, I’m going to walk you through the exact step-by-step process our agency uses to find (almost) anyone’s email address, in 60 seconds or less!

For each suggestion, Daniels provides instructions, most with screen shots. First, he recommends LinkedIn’s search function paired with Email Hunter, a tool which integrates with the career site. If that doesn’t work, he says, try a combination of the Twitter analyzer Followerwonk and corporate-email-finder Voila Norbert.

The article also suggests leveraging Google’s search operators with one of these formats: [site:companywebsite.com + “name” + contact] or [site:companywebsite.com + “name” + email]. To test whether an email address is correct, verify it with MailTester, and to target someone who posts on Twitter, search the results of All My Tweets for keywords like “email” or “@companyname.com”. If all else fails, Daniels advises, go old school—“… pick up the phone and just ask.”
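Those two operator formats are easy to script. Below is a minimal sketch, in Python, that simply assembles the queries the article describes and turns them into Google search URLs; the domain and name are placeholders, and nothing here guarantees a hit; it just saves typing.

    from urllib.parse import quote_plus

    def email_hunt_queries(domain, name):
        # The two operator patterns suggested in the article.
        return [
            f'site:{domain} "{name}" contact',
            f'site:{domain} "{name}" email',
        ]

    def google_search_url(query):
        # Plain Google search link; only the q parameter is used.
        return "https://www.google.com/search?q=" + quote_plus(query)

    # Placeholder values for illustration only.
    for q in email_hunt_queries("companywebsite.com", "Jane Doe"):
        print(google_search_url(q))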

Cynthia Murrell, October 27, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Google Introduces Fact Checking Tool

October 26, 2016

If it works as advertised, a new Google feature will be welcomed by many users. World News Report tells us, “Google Introduced Fact Checking Feature Intended to Help Readers See Whether News Is Actually True—Just in Time for US Elections.” The move is part of a trend among websites, which seem to have recognized that savvy readers don’t just believe everything they read. Writer Peter Woodford reports:

Through an algorithmic process from schema.org known as ClaimReview, live stories will be linked to fact checking articles and websites. This will allow readers to quickly validate or debunk stories they read online. Related fact-checking stories will appear onscreen underneath the main headline. The example Google uses shows a headline over passport checks for pregnant women, with a link to Full Fact’s analysis of the issue. Readers will be able to see if stories are fake or if claims in the headline are false or being exaggerated. Fact check will initially be available in the UK and US through the Google News site as well as the News & Weather apps for both Android and iOS. Publishers who wish to become part of the new service can apply to have their sites included.
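For publishers curious what that hook looks like, ClaimReview is structured-data markup defined at schema.org. Here is a minimal sketch of such markup as JSON-LD, written as a Python dictionary; the claim, organization, URL, and rating values are invented for illustration, and schema.org/ClaimReview remains the authoritative reference for the properties Google actually reads.

    import json

    # Hypothetical ClaimReview record a fact-checking site might embed as JSON-LD.
    # Property names follow schema.org/ClaimReview; the values are made up.
    claim_review = {
        "@context": "https://schema.org",
        "@type": "ClaimReview",
        "url": "https://example-factchecker.org/passport-checks",
        "datePublished": "2016-10-14",
        "claimReviewed": "Pregnant women must pass special passport checks",
        "author": {"@type": "Organization", "name": "Example Fact Checker"},
        "reviewRating": {
            "@type": "Rating",
            "ratingValue": 1,
            "bestRating": 5,
            "worstRating": 1,
            "alternateName": "False",
        },
    }

    print(json.dumps(claim_review, indent=2))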

Woodford points to Facebook’s recent trouble with the truth within its Trending Topics feature and observes that many people are concerned about the lack of honesty on display this particular election cycle. Google, wisely, did not mention any candidates, but Woodford notes that PolitiFact rates 71% of Trump’s statements as false (and, I would add, 27% of Secretary Clinton’s statements as false; everything is relative). If the trend continues, it will be prudent for all citizens to rely on (unbiased) fact-checking tools on a regular basis.

Cynthia Murrell, October 26, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Trending Topics: Google and Twitter Compared

October 25, 2016

For those with no time to browse through the headlines, tools that aggregate trending topics can provide a cursory way to keep up with the news. The blog post from communications firm Cision, “How to Find Trending Topics Like an Expert,” examines the two leading trending topic tools—Google’s and Twitter’s. Each approaches its tasks differently, so the best choice depends on the user’s needs.

Though the Google Trends homepage is limited, according to writer Jim Dougherty, one can get further with its extension, Google Explore. He elaborates:

If we go to the Google Trends Explore page (google.com/trends/explore), our sorting options become more robust. We can sort by the following criteria:

*By country (or worldwide)

*By time (search within a customized date range – minimum: past hour, maximum: since 2004)

*By category (arts and entertainment, sports, health, et cetera)

*By Google Property (web search, image search, news search, Google Shopping, YouTube)

You can also use the search feature via the trends page or explore the page to search the popularity of a search term over a period (custom date ranges are permitted), and you can compare the popularity of search terms using this feature as well. The Explore page also allows you to download any chart to a .csv file, or to embed the table directly to a website.
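Those filters surface in the Explore page’s own address bar, so a comparison link can be assembled in a few lines. This is only a convenience sketch: the q, geo, and date parameter names are assumptions drawn from how the page builds its URLs, not a documented interface.

    from urllib.parse import urlencode

    def trends_explore_url(terms, geo="", date="today 12-m"):
        # Assemble a Google Trends Explore link for one or more terms.
        # Omit geo for worldwide results; date accepts custom ranges too.
        params = {"q": ",".join(terms), "date": date}
        if geo:
            params["geo"] = geo  # e.g. "US"
        return "https://www.google.com/trends/explore?" + urlencode(params)

    print(trends_explore_url(["enterprise search", "open source search"], geo="US"))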

The write-up goes on to note that there are no robust third-party tools to parse data found with Google Trends/Explore, because the company has not made the API publicly available.

Unlike Google, we’re told, Twitter does not make it intuitive to find and analyze trending topics. However, its inclusion of location data can make Twitter a valuable source for this information, if you know how to find it. Dougherty suggests a work-around:

To ‘analyze’ current trends on the native Twitter app, you have to go to the ‘home’ page. In the lower left of the home page you’ll see ‘trending topics’ and immediately below that a ‘change’ button which allows you to modify the location of your search.

Location is a huge advantage of Twitter trends compared to Google: Although Google’s data is more robust and accessible in general, it can only be parsed by country. Twitter uses Yahoo’s GeoPlanet infrastructure for its location data so that it can be exercised at a much more granular level than Google Trends.

Since Twitter does publicly share its trending-topics API, there are third-party tools one can use with Twitter Trends, like TrendoGate, TrendsMap, and ttHistory. The post concludes with a reminder to maximize the usefulness of data with tools that “go beyond trends,” like (unsurprisingly) the monitoring software offered by Dougherty’s company. Paid add-ons may be worth it for some enterprises, but we recommend you check out what is freely available first.
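For the do-it-yourself crowd, a minimal sketch of hitting that public endpoint follows. It uses the GET trends/place call from Twitter’s REST API v1.1, which takes a Yahoo GeoPlanet WOEID for the location; the bearer token is a placeholder you would replace with your own app credentials.

    import requests

    BEARER_TOKEN = "YOUR_APP_BEARER_TOKEN"  # placeholder; create your own app credentials
    WOEID = 1  # Yahoo GeoPlanet "where on earth" ID; 1 means worldwide

    resp = requests.get(
        "https://api.twitter.com/1.1/trends/place.json",
        params={"id": WOEID},
        headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
    )
    resp.raise_for_status()

    # The response is a one-element list whose "trends" field holds the topics.
    for trend in resp.json()[0]["trends"]:
        print(trend["name"], trend.get("tweet_volume"))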

Cynthia Murrell, October 25, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Pattern of Life Analysis to Help Decrypt Dark Web Actors

October 18, 2016

Google-funded Recorded Future plans to use technologies like natural language processing, social network analysis, and temporal pattern analysis to track Dark Web actors. This, in turn, will help security professionals detect patterns and thwart security breaches well in advance.

An article that appeared on DarkReading, “Decrypting the Dark Web: Patterns Inside Hacker Forum Activity,” points out:

Most companies conducting threat intelligence employ experts who navigate the Dark Web and untangle threats. However, it’s possible to perform data analysis without requiring workers to analyze individual messages and posts.

Recorded Future, which deploys around 500-700 servers across the globe, monitors Dark Web forums to identify and categorize participants based on their language and geography. Using advanced algorithms, it then identifies individuals, and their aliases, who are involved in various fraudulent activities online. This is a type of automation where AI is deployed rather than relying on human intelligence.

The major flaw in this method is that bad actors do not necessarily use the same or even similar aliases or handles across different Dark Web forums. Christopher Ahlberg, CEO of Recorded Future, who is leading the project, says:

A process called mathematical clustering can address this issue. By observing handle activity over time, researchers can determine if two handles belong to the same person without running into many complications.
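Ahlberg does not spell out the math, but one toy way to picture such clustering is sketched below: each handle becomes a vector of posting counts by hour of day, and handles with very similar activity profiles end up in the same cluster. The handles, numbers, cosine distance metric, and threshold are all invented for illustration and are not Recorded Future’s actual method.

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import pdist

    # Toy activity profiles: posts per hour of day (24 bins) for four handles.
    handles = ["crow77", "cr0w_77", "nightowl", "daytrader"]
    profiles = np.array([
        [0]*8 + [1, 3, 5, 4, 2, 1, 0, 0, 0, 0, 2, 4, 5, 3, 1, 0],
        [0]*8 + [0, 2, 6, 5, 3, 1, 0, 0, 0, 0, 1, 5, 4, 2, 1, 0],
        [4, 5, 6, 3, 1, 0, 0, 0] + [0]*12 + [1, 2, 3, 4],
        [0]*6 + [2, 4, 6, 5, 4, 3, 3, 4, 5, 4, 2, 1] + [0]*6,
    ], dtype=float)

    # Hierarchical clustering on cosine distance between activity profiles;
    # handles landing in the same cluster plausibly share an operator.
    dist = pdist(profiles, metric="cosine")
    clusters = fcluster(linkage(dist, method="average"), t=0.3, criterion="distance")
    for handle, label in zip(handles, clusters):
        print(handle, "-> cluster", label)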

Again, researchers, and not AI or intelligent algorithms, will have to play a crucial role in identifying the bad actors. It is interesting to note that Google, which pretty much dominates information on the Open Web, is trying to make inroads into the Dark Web through many of its fronts. The question is: will it succeed?

Vishal Ingole, October 18, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Labor Shortage of Cyber Security Professionals

October 13, 2016

It’s no surprise that hackers may be any age, but a teenager causing £60 million worth of damage to a corporation is newsworthy, regardless of age. The Telegraph reported on this in the article “From GCHQ to Google: The Battle to Outpace Hackers in the Cyber Race.” A 15-year-old boy hacked the TalkTalk computer network and stole personal data, including financial information, belonging to 157,000 customers. This comes at a time when the UK government has announced plans to invest £1.9 billion in cyber security over the next five years. We also learned:

No amount of money will help overcome one of the greatest difficulties in the security industry though: the lack of skilled people. By 2019 there will be a global shortfall of 1.5 million security professionals, according to ISC Squared, a security certification and industry education body. And the numbers could in fact be significantly higher, given that there are already more than 1 million cybersecurity positions unfilled worldwide, according to a 2015 Cisco report. Heading up the government’s move to train more cyber defenders is spook agency GCHQ, which sponsors academic bursaries, runs summer camps and training days, holds competitions and has created a cyber excellence accreditation for top universities and masters programmes. The intention is to spot talent in children and nurture them through their education, with the end goal being a career in the industry.

The problem for any rocketing industry ready to blast off always seems to boil down to people. We have seen it with big data in all of its forms, from electronic medical records to business analytics to cyber security. It seems an industry is most fertile when people and technology advance stride for stride.

Megan Feil, October 13, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Bing Finally Turned a Profit

October 7, 2016

Bing is the redheaded stepchild of search engines, but according to the Motley Fool the Microsoft-owned search engine started to earn a profit during its last fiscal year. The Motley Fool shares the story in “Bing Became Profitable Last Year. Can It Keep Up?” Bing’s search advertising generated $5.5 billion in estimated revenue, which is more than what Twitter and Tencent earned. Into 2016, Bing has continued to turn a profit.

Bing’s revenue grew in the last quarter of Microsoft’s fiscal year, and in June 40% of the search revenue came from Windows 10 devices. The free Windows 10 upgrade ends soon, however, which will slow that growth, as Bing will no longer see such a high adoption rate. Even so, Microsoft will continue to grow Bing, and profit is predicted to continue to rise:

One important factor is that Microsoft outsourced its display advertising business at the beginning of fiscal 2016. That has allowed the company to focus its sales team on its search advertisements, which generally carry higher prices and margins than display ads. That makes the sales team more cost-efficient for Microsoft to run while it collects high-margin revenue from outsourcing its display ads.

This means Microsoft will raise its ad prices and focus on selling more ads to appear with search results. Bing will never compete with Google’s massive revenue, but it has proven that it is less of a copycat and more of a stable, profit-generating search engine.

Whitney Grace, October 7, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Reverse Image Searching Is Easier Than You Think

October 6, 2016

One of the newest forms of search uses actual images. All the major search engines, from Google to Bing to DuckDuckGo, have an image search option where keywords help you find an image to your specifications. Using an actual image to power a search seemed like a thing of the future, but it has actually been around for a while. The only problem was that reverse image searching used to be terrible and returned poor results.

Now the technology has improved, but very few people actually know how to use it. ZDNet explains how to use this search feature in the article “Reverse Image Searching Made Easy…”. It explains that Google and TinEye are the best places to begin a reverse image search. Google has the larger image database, but TinEye has the better photo expertise. TinEye is better because:

TinEye’s results often show a variety of closely related images, because some versions have been edited or adapted. Sometimes you find your searched-for picture is a small part of a larger image, which is very useful: you can switch to searching for the whole thing. TinEye is also good at finding versions of images that haven’t had logos added, which is another step closer to the original.

TinEye does have its disadvantages, such as outdated results that can no longer be found on the Web. In some cases Google is the better choice, as one can filter by usage rights. Browser extensions for image searching are another option. Lastly, if you are a Reddit user, Karma Decay is a useful image search tool, and users often post comments about an image’s origin.
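Both services will also accept an image URL directly in the query string, which makes them easy to script. A minimal sketch follows, assuming the long-standing searchbyimage endpoint on Google and the search?url pattern on TinEye; these are conveniences rather than documented APIs and could change, and the image URL shown is a placeholder.

    from urllib.parse import quote_plus

    def reverse_image_links(image_url):
        # Build reverse-image-search links for a publicly reachable image URL.
        encoded = quote_plus(image_url)
        return {
            "google": f"https://www.google.com/searchbyimage?image_url={encoded}",
            "tineye": f"https://tineye.com/search?url={encoded}",
        }

    # Placeholder image URL for illustration only.
    for engine, link in reverse_image_links("https://example.com/photo.jpg").items():
        print(engine, link)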

The future of image searching is now.

Whitney Grace, October 6, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 
