Google and the Future of Search Engine Optimization
September 30, 2016
Regular readers know that we are not big fans of SEO (search engine optimization) or its champions, so you will understand our tentative glee at the Fox News headline, “Is Google Trying to Kill SEO?” The article centers on a Florida court case whose plaintiff, e.ventures Worldwide LLC, stands accused by Google of engaging in “search-engine manipulation.” As it turns out, that term is a little murky. That did not stop Google from unilaterally de-indexing “hundreds” of e.ventures’ websites. Writer Dan Blacharski observes:
The larger question here is chilling to virtually any small business which seeks a higher ranking, since Google’s own definition of search engine manipulation is vague and unpredictable. According to a brief filed by e-ventures’ attorney Alexis Arena at Flaster Greenberg PC, ‘Under Google’s definition, any website owner that attempts to cause its website to rank higher, in any manner, could be guilty of “pure spam” and blocked from Google’s search results, without explanation or redress.’
We cannot share Blacharski’s alarm at this turn of events. In our humble opinion, if websites focus on providing quality content, the rest will follow. The article goes on to examine Google’s First Amendment-based stance and to consider whether SEO is even a legitimate strategy. See the article for its take on these questions.
Cynthia Murrell, September 30, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
SEO Is a Dirty Web Trick
August 17, 2016
Search engine optimization is the bane of Web experts. Why? If you know how to use it, you can increase your rankings in search engines and drive more traffic to your pages, but if you are a novice at SEO, you are screwed. Search Engine Land shares some bad SEO stories in “SEO Is As Dirty As Ever.”
SEO has a bad reputation in many people’s eyes because it is viewed as a surreptitious way to increase traffic. Used correctly, however, SEO is not only a nifty trick but a genuinely useful tool. As with anything, it can go wrong. One bad practice is relying on outdated techniques like keyword stuffing, copied-and-pasted text, and hidden text. Other common mistakes include missing noindex tags, accidentally blocking robots, and JavaScript frameworks whose content never gets indexed.
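The noindex mistake in particular is easy to check for programmatically. Here is a minimal sketch, not a tool from the article: the function name and sample pages are invented for illustration, and it uses only Python’s standard library.

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Scan an HTML document for a robots meta tag containing 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def page_is_noindexed(html: str) -> bool:
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

# A page carrying this tag asks search engines not to index it:
blocked = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
open_page = '<html><head><title>Hello</title></head></html>'
```

A site owner who ships the first page by accident vanishes from search results no matter how good the content is, which is exactly the kind of self-inflicted wound the article describes.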
Do not forget other shady techniques: the always famous shady sales, paid links, spam, link networks, removing links, building another Web site on a different domain, abusing review sites, and reusing content. One thing to remember is that:
“It’s not just local or niche companies that are doing bad things; in fact, enterprise and large websites can get away with murder compared to smaller sites. This encourages some of the worst practices I’ve ever seen, and some of these companies do practically everything search engines tell them not to do.”
Ugh! The pot is calling the kettle black and complaining about its color and cleanliness.
Whitney Grace, August 17, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
There is a Louisville, Kentucky Hidden /Dark Web meet up on August 23, 2016.
Information is at this link: https://www.meetup.com/Louisville-Hidden-Dark-Web-Meetup/events/233019199/
Behind the Google Search Algorithm
June 16, 2016
Trying to reveal the secrets behind Google’s search algorithm is almost harder than breaking into Fort Knox. Google keeps its 200 ranking factors a secret. What we do know is that keywords do not play the same role they used to, and social media plays some sort of undisclosed part. Search Engine Journal’s “Google Released the Top 3 Ranking Factors” offers a little information to help with SEO.
Google Search Quality Senior Strategist Andrey Lipattsev shared that the three factors are links, content, and RankBrain, in no particular order. RankBrain is an artificial intelligence system that relies on machine learning to help Google process search results and push the most relevant ones to the top of the list. SEO experts are trying to figure out how this will affect their jobs, but the article notes:
“We’ve known for a long time that content and links matter, though the importance of links has come into question in recent years. For most SEOs, this should not change anything about their day-to-day strategies. It does give us another piece of the ranking factor puzzle and provides content marketers with more ammo to defend their practice and push for growth.”
In reality, there is not much difference, except that few will be able to explain how artificial intelligence ranks particular sites. Nifty play, Google.
Whitney Grace, June 15, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Google Changes Its Algorithm Again
May 26, 2016
As soon as we think we have figured out how to get our content to the top of Google’s search rankings, the search engine goes and changes its algorithms. The Digital Journal offers some insight in “Op-Ed: How Will The Google 2016 Algorithm Change Affect Our Content?”
In early 2016, Google announced an update to its “truth algorithm,” one that carries forward many of the goals the company has been pushing. Quality content over quantity is still very important. Keyword-heavy content is downgraded in favor of Web sites that offer relevant, in-depth content and better answer a user’s intent.
SEO took a dramatic turn with a Penguin update and changes to the core algorithm. The biggest game changer involves mobile technologies:
“The rapid advancement of mobile technologies is deeply affecting the entire web scenario. Software developers are shifting towards the development of new apps and mobile websites, which clearly represent the future of information technology. Even the content for mobile websites and apps is now different, and Google had to account for that with the new ranking system changes. The average mobile user is very task oriented and checks his phones just to quickly accomplish a specific task, like finding a nearby café or cinema. Mobile-oriented content must be much shorter and concise than web-oriented one. The average web surfer wants to know, learn and explore things in a much more relaxed setting.”
Google wants to clear its search results of what it deems unviable information and offer users a better quality search experience on both mobile devices and standard desktop computers. Good to know that someone wants to deliver a decent product.
Whitney Grace, May 26, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Mastering SEO Is Mastering the Internet
May 5, 2016
Search engine optimization, better known as SEO, is one of the prime tools Web site owners must master in order for their site to appear in search results. A common predicament most site owners find themselves in is that they may have a fantastic page, but if a search engine has not crawled it, the site might as well not exist. There are many aspects to mastering SEO and it can be daunting to attempt to make a site SEO friendly. While there are many guides that explain SEO, we recommend Mattias Geniar’s “A Technical Guide To SEO.”
Some SEO guides delve too deep into technical jargon, but Geniar’s approach uses plain language, so it is helpful even to those with the most novice SEO skills. Here is how Geniar explains it:
“If you’re the owner or maintainer of a website, you know SEO matters. A lot. This guide is meant to be an accurate list of all technical aspects of search engine optimisation. There’s a lot more to being “SEO friendly” than just the technical part. Content is, as always, still king. It doesn’t matter how technically OK your site is, if the content isn’t up to snuff, it won’t do you much good.”
Understanding the code behind SEO can be challenging, but thank goodness content remains the most important part of being picked up by Web crawlers. These tricks will only augment your content so it is picked up more quickly and your site receives more hits.
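One concrete piece of the crawling picture a technical guide like Geniar’s covers is robots.txt. As an illustrative sketch (the rules and URLs below are invented, not taken from his guide), Python’s standard library can show how a crawler decides whether it may fetch a page:

```python
from urllib import robotparser

# An invented robots.txt for illustration: everything is crawlable
# except the /private/ directory.
robots_txt = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(robots_txt)

# A polite crawler may fetch ordinary pages...
allowed = rp.can_fetch("*", "https://example.com/articles/seo-guide.html")
# ...but not anything under /private/.
blocked = rp.can_fetch("*", "https://example.com/private/draft.html")
```

A page a crawler is told to skip never enters the index, so getting these few lines right matters more than any amount of clever copy.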
Whitney Grace, May 5, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Scientific Research Has Turned into a Safe Space
December 31, 2015
The Internet is a cold, cruel place, especially if you hang out in the comments section on YouTube, eBay forums, social media, and 4chan. If you practice restraint and limit your social media circles to trusted individuals, you can surf the Internet without encountering trolls and haters. Some people do not practice common sense, so they encounter many hateful situations on the Internet and as a result they demand “safe spaces.” Safe spaces are where people do not encounter anything negative.
Safe spaces are stupid. Period. What is disappointing is that the “safe space,” “only positive things” mentality has made its way into the scientific community, according to Nature in the article “‘Novel, Amazing, Innovative’: Positive Words On The Rise In Science Papers.”
The University Medical Center in the Netherlands studied the use of positive and negative words in the titles and abstracts of scientific papers published from 1974 to 2014 in the medical database PubMed. The researchers discovered that positive words in titles grew from 2% in 1974 to 17.5% in 2014. Negative word usage increased from 1.3% to 2.4%, while neutral words saw no change. The trend applies only to research papers; the same test run on published books showed little change.
“The most obvious interpretation of the results is that they reflect an increase in hype and exaggeration, rather than a real improvement in the incidence or quality of discoveries… The findings “fit our own observations that in order to get published, you need to emphasize what is special and unique about your study,” he says. Researchers may be tempted to make their findings stand out from thousands of others — a tendency that might also explain the more modest rise in usage of negative words.”
There is some doubt associated with the findings because the analysis covered only PubMed, but the original research team thinks it points to a much larger problem: not all research can be “innovative” or “novel.” The overuse of positive words is polluting the social, psychological, and biomedical sciences.
Reading between the lines, this really points to how scientists and researchers are fighting for tenure. What would it mean for search engine optimization if all searches and descriptions had to wear a smile? Will someone even invent a safe space filter?
Whitney Grace, December 31, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
SEO Tips Based on Recent Google Search Quality Guidelines
December 30, 2015
Google has recently given search-engine optimization pros a lot to consider, we learn from “Top 5 Takeaways from Google’s Search Quality Guidelines and What They Mean for SEO” at Merkle’s RKG Blog. Writer Melody Pettula presents five recommendations based on Google’s guidelines. She writes:
“A few weeks ago, Google released their newest Search Quality Evaluator Guidelines, which teach Google’s search quality raters how to determine whether or not a search result is high quality. This is the first time Google has released the guidelines in their entirety, though versions of the guidelines have been leaked in the past and an abridged version was released by Google in 2013. Why is this necessary? ‘Quality’ is no longer simply a function of text on a page; it differs by device, location, search query, and everything we know about the user. By understanding how Google sees quality we can improve websites and organic performance. Here’s a countdown of our top 5 takeaways from Google’s newest guidelines and how they can improve your SEO strategy.”
We recommend any readers interested in SEO check out the whole article, but here are the five considerations Pettula lists, from least to most important: consider user intent; supply supplementary content; guard your reputation well; consider how location affects user searches; and, finally, “mobile is the future.” On that final point, the article notes that Google is now almost entirely focused on making things work for mobile devices. SEO pros would do well to keep that new reality in mind.
Cynthia Murrell, December 30, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
RankBrain, the Latest AI from Google, Improves Search Through Understanding and Learning
December 23, 2015
The article on Entrepreneur titled “Meet RankBrain, the New AI Behind Google’s Search Results” introduces the AI that Google believes will aid the search engine in better understanding the queries it receives. RankBrain is capable of connecting related words to search terms based on context and relevance. The article explains:
“The real intention of this AI wasn’t to change visitors’ search engine results pages (SERPs) — rather, it was to predict them. As a machine-learning system, RankBrain actually teaches itself how to do something instead of needing a human to program it…According to Jack Clark, writing for Bloomberg on the topic: “[Rankbrain] uses artificial intelligence to embed vast amounts of written language into mathematical entities — called vectors — that the computer can understand.”
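Clark’s image of words as vectors can be made concrete with a toy example. The three-dimensional “embeddings” below are hand-invented for illustration; a real system like RankBrain learns vectors with hundreds of dimensions from text. Cosine similarity is a standard way to compare such vectors:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: closer to 1.0 means more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hand-made toy vectors -- real embeddings are learned from data, not assigned.
vectors = {
    "cafe":   [0.9, 0.1, 0.0],
    "coffee": [0.8, 0.2, 0.1],
    "cinema": [0.1, 0.9, 0.2],
}
```

By this measure, “cafe” sits much closer to “coffee” than to “cinema,” which is how a vector model can relate a query to pages that never use its exact words.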
Google scientist Greg Corrado said RankBrain actually exceeded his expectations. In one experiment, RankBrain beat a team of search engineers at predicting which pages would rank highest. (The engineers were right 70% of the time, RankBrain 80%.) The article also addresses concerns that many vulnerable brands relying on SEO may have. It ventures to guess that mainly newer brands and services will see a ranking shift. But of course, with impending updates, that may change.
Chelsea Kerwin, December 23, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Redundant Dark Data
September 21, 2015
Have you heard the one about how dark data hides within an organization’s servers and holds potential business insights? Wait, you have not? Then where have you been for the past three years? Datameer posted an SEO-heavy post on its blog called “Shine Light On Dark Data.” The post features the same redundant song and dance about how dark data retained on servers holds valuable customer trends and business patterns that can put companies ahead of the competition.
One new fact is presented: IDC reports that 90% of digital data is dark. That is a very interesting figure, one that spurs information specialists to get a big data plan in place, but then we are fed this tired explanation:
“This dark data may come in the form of machine or sensor logs that when analyzed help predict vacated real estate or customer time zones that may help businesses pinpoint when customers in a specific region prefer to engage with brands. While the value of these insights are very significant, setting foot into the world of dark data that is unstructured, untagged and untapped is daunting for both IT and business users.”
The post ends with some less-than-thorough advice to create an implementation plan. There are other guides on the Internet that better prepare a person to create a big data action plan. The post’s only purpose is to serve as a search engine bumper for Datameer. While Datameer is one of the leading big data software providers, one would think it would not post a “dark data definition” piece this late in the game.
Whitney Grace, September 21, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Advice for Smart SEO Choices
August 11, 2015
We’ve come across a well-penned article about the intersection of language and search engine optimization by The SEO Guy. Self-proclaimed word-aficionado Ben Kemp helps website writers use their words wisely in, “Language, Linguistics, Semantics, & Search.” He begins by discrediting the practice of keyword stuffing, noting that search-ranking algorithms are more sophisticated than some give them credit for. He writes:
“Search engine algorithms assess all the words within the site. These algorithms may be bereft of direct human interpretation but are based on mathematics, knowledge, experience and intelligence. They deliver very accurate relevance analysis. In the context of using related words or variations within your website, it is one good way of reinforcing the primary keyword phrase you wish to rank for, without over-use of exact-match keywords and phrases. By using synonyms, and a range of relevant nouns, verbs and adjectives, you may eliminate excessive repetition and more accurately describe your topic or theme and at the same time, increase the range of word associations your website will rank for.”
Kemp goes on to lament the dumbing down of English-language education around the world, blaming the trend for a dearth of deft wordsmiths online. Besides recommending that his readers open a thesaurus now and then, he also advises them to make sure they spell words correctly, not because algorithms can’t figure out what they meant to say (they can), but because misspelled words look unprofessional. He even supplies a handy list of the most often misspelled words.
The development of more and more refined search algorithms, it seems, presents the opportunity for websites to craft better copy. See the article for more of Kemp’s language, and SEO, guidance.
Cynthia Murrell, August 11, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

