Google Study Finds Web Banners Ineffective

August 31, 2011

On Saturday, one reader sent us a link to this story: “Is Google’s Search for Quality Content a Ruse for a Massive Diversion of Cash to Its Own Sites?” We are not sure if the points in the write up are spot on, but the theme of the article connected to another story we noticed.

According to a 2010 survey by Google, the average click-through rate for banner ads this past year was 0.09 percent, down from 0.1 percent in 2009. This decrease leads me to believe that attempts to make banner ads more inviting to potential customers are failing miserably. However, the article Google: Click-Through Rates Fell in 2010 [Study] states:

[The study] found that the format of a display ad can make a difference. A 250×250 pixel ad using Flash got the highest CTR of any format — 0.26%. The worst performers were vertical 120×240 banners with Flash and a full (468×60) banner with Flash, which both got rates of 0.05%.
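The arithmetic behind these figures is simple: click-through rate is clicks divided by impressions. A minimal sketch in Python, using hypothetical impression volumes chosen only to reproduce the rates reported in the study:

```python
def ctr(clicks: int, impressions: int) -> float:
    """Return click-through rate as a percentage."""
    return 100.0 * clicks / impressions

# Reported overall rates: 0.09% in 2010, down from 0.10% in 2009.
# The impression and click counts below are illustrative only.
ctr_2010 = ctr(clicks=900, impressions=1_000_000)
ctr_2009 = ctr(clicks=1_000, impressions=1_000_000)

print(f"2010 CTR: {ctr_2010:.2f}%")
print(f"Year-over-year change: {ctr_2010 - ctr_2009:+.2f} points")
```

At these volumes, the much-discussed decline amounts to one lost click per hundred thousand impressions.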

As with television ads, it is difficult to determine the effectiveness of digital advertising by looking only at click-through. It is important to recognize that banner ads are not created in a vacuum; they are one small part of a larger, more complex advertising strategy. Needless to say, if studies continue to show aspects of this strategy failing, there could be major implications for Google.

At lunch on Sunday, I discussed these two items with two people immersed in Web advertising. Three observations stuck in my mind:

First, if there is a softening in click-through or online ad revenue, Google will have little choice but to find ways to pump up its revenue.

Second, the notion of social media fatigue seems germane. People may be tired of online ads. The result is a shift to a more low-profile “pay to play” model. Overt ads may be on the downside after a long run-up.

Third, the urgency for organizations like Google and Flipboard to find a way to inject rich media is an indication that the ad revenues flowing to television advertisers are the next Klondike.

I am not sure what to think, but this notion that online ad revenue may need some exoskeletal supports is fascinating. There are significant implications for objective search results as well.

Jasmine Ashton, August 31, 2011

Sponsored by Pandia.com

Google and Its Algorithm: Persistence and Change

August 23, 2011

The pot of gold at the end of the internet’s rainbow is the algorithm Google uses to determine PageRank. Google, being the most used search engine on the planet, can make or break a website just by where the website falls in search results. The article, Google Algorithm Change History, on SEOmoz, provides major dates of Google algorithm changes.

PageRank is a patented system of algorithms at the core of Google’s search engine. Named after Google co-founder Larry Page, the system is patented by Stanford University, as Page was a student there when he and Sergey Brin began creating Google. PageRank takes several things into consideration when ranking a webpage, such as the number of links other sites have to it, the language used within the page, and the age of the page.
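The core link-counting idea can be sketched in a few lines of Python. This is a toy power-iteration version of the original Page/Brin formulation, not Google’s production system, which weighs many more signals; the three-page link graph is hypothetical, and the 0.85 damping factor matches the figure used in the original PageRank paper.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank by power iteration.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict of page -> score; scores sum to 1.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform score
    for _ in range(iterations):
        # Every page keeps a small baseline share (the "random surfer").
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            # A dangling page (no outlinks) spreads its score evenly.
            targets = outlinks if outlinks else pages
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# Tiny hypothetical web: A and C both link to B, so B ranks highest.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["A", "B"]})
```

The key property, visible even in this sketch, is that a page’s score depends on the scores of the pages linking to it, which is exactly what SEO link-building schemes try to exploit.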

Because most people click on links found on the first or second page of search results, those 20 to 40 slots are prime real estate. As a result, an industry, search engine optimization (SEO), has popped up, promising clients that it can land websites in the top slots. SEO firms manipulate search engines to get those results.

Keeping tabs on Google algorithm updates becomes vital to the survival of SEOs. As the article explains,

Each year, Google changes its search algorithm up to 500-600 times. While most of these changes are minor, every few months Google rolls out a “major” algorithmic update that affect search results in significant ways. For search marketers, knowing the dates of these Google updates can help explain changes in rankings and organic website traffic.

While these ‘major’ changes the article speaks of may have an impact on how SEOs conduct their business, the truth of the matter is that Google still relies primarily on the same system it has always used. The impact of Google’s method hits search engine optimization experts hard. The SEO pros, some of whom were former art history majors, must convince their clients that SEO magic can improve a Google result for a particular client. Wow. That is hard since Google is making changes frequently. SEO experts have some sort of answer, but is it the right one?

Catherine Lamsfuss, August 23, 2011

Sponsored by Pandia.com

Some Reverse Engineering Baloney about Google

August 18, 2011

This search engine optimization baloney is really getting on my nerves. The idea behind an index is to point to content about something. I don’t want a book index to point me to a page with information that does not match the index. In the free Web search world, the online services depending on advertising have created a mess, and now that mess has undermined the notion of relevance. Now I know what’s relevant. Relevance means those who paid to make a connection between a word or concept and the link. Precision? Recall? Works perfectly when you think about how money delivers eyeballs.

Now that I have explained my disdain for ad-supported services, let me defend one of the outfits which has driven a Hummer over the tidy algorithms for measuring precision and recall. I saw in one of my newsfeeds the spider bait title “How to Reverse Engineer Google Algorithms.”

I knew a trick when I saw one.

Here’s a memorable comment:

These web crawlers are trying to get smarter at natural language processing, a sibling of computational linguistics, and they do get smarter, every day – and not just when Google or another search engine company makes major announcements on things like the Panda algorithm. Perhaps in the real world these web crawler “smarts” are increasingly dumb and blind to global business-to-business (B2B) companies’ online marketing efforts, but that’s another story. If your company is presented with a static list of target “phrase depth” or “title counts” or other on-page or off-page factors as a to-do list for search engine optimization that are purported to affect every site on the web the same way, you need to know that this list is pure fiction.

Yep, search engine optimization experts are going to generate traffic with insights like these. Cut to the chase. Fire the PR firm. Just buy Adwords.

Stephen E Arnold, August 18, 2011

Sponsored by Pandia.com, publishers of The New Landscape of Enterprise Search

Google Conspiracy or Poorly Designed Web Sites?

August 5, 2011

It’s got to be tough being alpha dog. At least, it seems that way for Google, which runs one of the largest, most used search engines in the world. With a slew of patent infringement lawsuits pending and several states looking into anti-trust issues associated with the top dog, yet another company is complaining about Google’s business practices, as explained in the article, Local Business Site Challenges Google Ranking, on SiliconValley.com.

How search engines determine ranking is a closely guarded secret: a series of algorithms that can make or break websites, depending on where they fall in the rankings. This is precisely what ShopCity is complaining about. According to the small company, Google is “manually monkeying” with the rankings so that ShopCity sites appear lower than Google-owned competing sites.

Google asserts that ShopCity sites are low in the ranking because…well, they are basically bad sites. While ShopCity admits they are still working on building several of their sites (meaning they know their sites are rotten), many of the sites in the Bay Area, like ShopPaloAlto and ShopPleasanton, are alive and stuffed full of helpful and legitimate information. They believe those sites should be higher up in the rankings, as they are on Yahoo!

“Search industry expert Danny Sullivan, editor in chief of Search Engine Land, said such suspicions about a site as small as ShopPaloAlto.com are “ludicrous. If that was what (Google) was worried about, you would never find Yelp,” a formidable competitor for Google that offers restaurant reviews and business listings, Sullivan said. But Sullivan said Google should be able to differentiate between higher-quality ShopCity sites such as the Bay Area sites, and placeholder sites waiting until ShopCity makes partnerships with local groups for listings.”

Is ShopCity going to be just another flea on Google’s back, or will something come from their claims? Coincidentally, after the FTC inquiry was announced, ShopCity’s Bay Area sites jumped in Google rankings, causing a 400% increase in traffic, but then plummeted back to page seven of search results after only three weeks. A Google-imposed penalty for outside complaints is the official explanation.

Catherine Lamsfuss, August 5, 2011

Sponsored by Quasar CA, your source for informed financial advisory services

Search Engine Optimization Thrashing

July 20, 2011

The addled goose is not into clicks. The goose is near retirement. The content he posts is primarily an aide-mémoire. The topics he covers are of interest to a few believers in precision and recall, not 20-something and faux consultant court jesters.

Adam Audette’s article on searchengineland.com, “Weighing In-House vs. Agency SEO Enterprise Search Strategies,” is a two-handed grasp at finding something that will work for the SEO crowd. Audette makes valid arguments for utilizing in-house SEOs as well as agencies.

The primary weakness of the in-house SEO role is that of myopia. Not in the sense of a lack of imagination, but in a pervading nearsightedness that’s almost inescapable. The in-house is so deeply immersed in her industry, her company, and her sites, that she can’t see the forest for the trees. Even worse, she becomes out of touch with where the industry is trending.

 

This conundrum is where an outside agency is beneficial. Their workload is often much more diversified, they have large pools of people and resources to contact and brainstorm with should the need arise, and SEO is their daily business.

Audette’s solution to the problem is a “dream team” built of in-house, agency, and consultant staff working together in harmony. I’m not entirely sure that it will play out that way.

Our view is that SEO can chew up a lot of cash for iffy results in the post-Panda world. In order to control costs, organizations clinging to the icons of the SEO faith will want to do the work in the cubes of the organization’s marketing department. For outfits flush with bucks to pump into third-party experts’ pockets, performance, not promises, is going to be needed.

Can the SEO industry survive and thrive? Absolutely. PT Barnum had it nailed.

Stephen E Arnold, July 20, 2011

Sponsored by Stephen E Arnold, author of The New Landscape of Enterprise Search

Belgium Google Dust Up

July 18, 2011

Short honk: The goose was incorrect. The goose believed that Google would not manually intervene in search results. The goose is shattered. Navigate to “After Copiepresse “Boycott,” Google Restores Search of News Sites”. If the story is accurate, there seems to be an allegation that Google had imposed a “so called boycott” of the Copiepresse newspapers. I thought Google was an algorithm baby. Now it seems that humans do shape search results. Implications? Lots. What about sites affected by Panda? Algorithm or manual intervention? What about relevance? Algorithm or human? What about big advertisers’ position in results sets? Algorithm or human?

Stephen E Arnold, July 18, 2011

Freebie

SEP: Bitten by Search

July 15, 2011

“Search Engine Poisoning: One More Thing To Worry About,” declares Network Computing. Though Search Engine Poisoning (SEP) has been around for a while, it is now the primary online threat, according to a report from security firm Blue Coat Systems.

For those unfamiliar with the concept, SEP works by creating links that masquerade as legitimate answers to search queries. Many of these queries are ones that workers commonly use in the course of their job, so the schemes affect enterprises as well as home users.

Network Computing’s Robert Mullins elaborates:

The way SEP works is that distributors of malware maintain large ‘link farms’ where they create malicious links that represent all sorts of things people would search for online. [Tom Clare of Blue Coat] gave the example of Keen Footwear, a brand of hiking shoes. If someone searches for that brand in a search engine, as many as half of the top 10 results could be links to malware. SEP is particularly devious in that it doesn’t actually have to infect the Web site of Keen Footwear but can still trick end users.

The malefactors’ job is made easier by URLs that are vulnerable to cross-site scripting (XSS). That vulnerability allows the injection of malicious code.
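A minimal sketch of the standard defense against that injection vector: HTML-escape anything user-supplied before echoing it back into a page. The `render_search_results` function and the attack string below are hypothetical; the escaping itself uses Python’s standard `html` module.

```python
import html

def render_search_results(query: str) -> str:
    """Build a results header, escaping the user-supplied query string."""
    # html.escape neutralizes <, >, &, and quotes, so injected markup
    # is displayed as text instead of being executed by the browser.
    safe_query = html.escape(query)
    return f"<h1>Results for: {safe_query}</h1>"

# A malicious query embedded in a crafted link is rendered inert:
attack = '<script>alert("pwned")</script>'
page = render_search_results(attack)
# page now contains &lt;script&gt;... rather than an executable tag
```

Sites that skip this one step and reflect raw query parameters into their pages are exactly the URLs the SEP link farms target.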

We continue to look with skepticism on the search engine optimization business. We think that Google wants SEO professionals to optimize their pages and then, if traffic falters, feel really good about herding the traffic-thirsty Web masters toward Adwords.

Stephen E Arnold, July 14, 2011

Sponsored by Pandia.com, publishers of The New Landscape of Enterprise Search

Search and Why No One Knows How to Find Stuff

July 15, 2011

I received a call early this morning from an addled prospect in a distant time zone. After the call, I grabbed my trusty iPad and took a look at the news scooped up by my Pulse and Flipboard apps. Why search when I can let an anonymous system “tell” me what I need to know. One of the “must know” things snagged my eye with this fetching headline: “My Top 10 Insights from 10 Years in Search”. After dismissing an annoying ad about something called “search marketing”, my expectations were at snail level. I was not disappointed.

Search, I learned, was not about findability, information retrieval, semantics, or text mining. I learned that search is about:

  1. Pumping content without regard for quality to the top of a brute force search result
  2. Not resting on one’s laurels when a content trick actually works and keeps the Facebook-obsessed Google from delivering high precision and recall
  3. Being “great” at Excel.

There were seven other “learnings” which shined a very weak light on the topic of search as I understand the concept.

Search has been usurped—maybe the proper word is devalued—by search engine marketing. The idea is simple. Traffic means revenues. The world in which this addled goose paddles uses a different denotation and connotation. Search is about answering questions. Search is NOT about polluting a relevance method, getting clicks, and making money. I make money because I find information, absorb it, and frame my own ideas, products, and services. Information is one type of high value input. Fiddle with the input and the output is probably flawed or misleading.

Search is difficult, and it is getting problematic. Those with access to log files know that the majority of users’ actions can be converted to quite tidy items of data. These items can be used to deliver exactly what the majority of people want a search system to be: An input system for that which is consciously or unconsciously needed by an individual user.
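The log-file point above, that user actions reduce to tidy items of data, can be illustrated with a trivial standard-library sketch. The tab-separated log format and the sample records are hypothetical.

```python
from collections import Counter

# Hypothetical query log: one "user_id<TAB>query" record per line.
log_lines = [
    "u1\tenterprise search",
    "u2\tpanda update",
    "u1\tenterprise search",
    "u3\tenterprise search",
]

# Tally queries across all users; a real system would also bucket by
# time, session, and click-through to build the predictive profile.
query_counts = Counter(line.split("\t")[1] for line in log_lines)
top_query, hits = query_counts.most_common(1)[0]
print(top_query, hits)
```

From counts like these it is a short step to caching, suggesting, or pushing the majority answer before anyone types a query.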

In my odd little goose pond, I run a query on a phrase like “confluent opportunity space.” I want to know who said it, where, when, and why. Answering that question is something that few search systems can do. My hunch is that an individual operating with the foundation upon which the “10 insights” are erected may use approaches that may not work for my context. Heck, maybe the approach won’t work at all.

What’s this say about “search”?

First, I don’t think most people know what search is. The failure to define the overused and much abused term is leading to confusion. I think people use the term and talk about systems and methods that are more fractured than rocks in the Allegheny orogeny.

Second, heaven help the vendor of text mining if the word search slips into a conversation with an online marketer. The dust up in “When Worlds Collide” will look like two puppies pushing into a bowl of kibble.

Third, I am convinced that people are not interested in understanding search. The entire sector is confused and increasingly indifferent to useful communication about information retrieval.

We are on the path to a super saturated world of advertising. Calling this search is downright amazing. I am delighted to be 66, indifferent to such linguistic jumping jacks, and an addled goose paddling in a rural backwater.

The consequences of this devaluation and distortion of a once useful word may have more to do with today’s crises in decision making, innovating, and revenue generating than meets the eye.

Stephen E Arnold, July 15, 2011

Freebie by golly!

SEO, Curation, and Algorithms

July 13, 2011

I read an unusual blog post “Sometimes I Really Get It Wrong; My Apology to SEO Industry.” The sentence that caught my attention (albeit briefly) was:

I thought more human-oriented approaches, like Mahalo, would get better results than algorithmic approaches, like Google.

The write up points out a mea culpa:

it’s 2011 now and it’s clear that the Google way of doing things is still better for most people.

Fascinating. Google has a fan. The paragraph I tucked into my “Online Touchstones” was:

I went for cheap SEO tricks. Truth is, if you bash the SEO world they will all link to you, argue with you, etc. (Bloggers even have a name for this: “link bait”). Folks who do SEO as a profession love fighting about that stuff and it almost always works. But, does it really help you get the traffic you want? The reputation you want? No way. Putting up great content, like when I interviewed Mike McCue and told the world about Flipboard is a far more effective way to get good Google Juice. Taking shortcuts just tarnishes your reputation. Anyway, just wanted to say I’m sorry to the SEO industry.

Several observations on this sunny morning in Harrod’s Creek, far from the roiling popularity fish tanks on the left and right coasts.

First, I recall reading in the paper edition of the New York Times about Google’s apparent inability to filter certain types of content. My recollection is addled, but it seems finding a locksmith is allegedly a scam. I just look in the Yellow Pages, but I am in the intellectual dead zone. Use the Google and you may not get the old fashioned service still available in a rural backwater. I am not sure if the locksmith issue, if true, is search engine optimization or a slightly more sophisticated content operation. Doesn’t matter. Humans are doing these alleged actions and the Google algorithms are either on vacation or watching “Lizard Lick” reruns on Tru TV.

Second, the Google+ service is Google’s most recent attempt to get involved with human centric content generation. The social part is nice, and it is alluring to those looking for “connections”, but there is the content part. Humans are generating lots of data. The “lots of data” part translates to money because algorithms and scripts can generate ad revenue. The algorithm part makes money. I am not so sure about the relevance part anymore.

Third, my view of search engine optimization is that traffic makes jobs. When traffic to a Web site declines, search engine optimization kicks into gear. Adwords and Google love become an “organic” and logical response when organic methods no longer work.

Net net: information originates with humans via intent or as a consequence of an action. Machines can generate meta information. Now the trajectory of the Internet is moving toward broadly based human functions: talking. Finding is important, but it is a sub function. SEO is going to have to work overtime to recapture the glorious years of BP 2009. “BP” is before Panda. Brute force search is not where it is at. AltaVista-style finding will remain, but the datasphere is more human centric than algorithmic. HAL? HAL? What’s with the nursery rhyme.

Stephen E Arnold, July 13, 2011

Sponsored by ArticleOnePartners.com, the source for legal research.

The Wages of SEO: Content Free Content

June 28, 2011

In the last two weeks, I have participated in a number of calls about the wrath of Panda. The idea is that sites which produce questionable content, like Beyond Search, suck. I agree that Beyond Search sucks. The site provides me with a running diary of what I find important in search and content processing. Some search vendors have complained that I cover Autonomy and not other engines. I find Autonomy interesting. It held an IPO, buys companies, manages reasonably well, and is close to generating an annual turnover of $1 billion. I don’t pay much attention to Dieselpoint and a number of other vendors because these companies do not strike me as disruptive or interesting.

I paddle away in Harrod’s Creek, oblivious to the machinations of “content farms.” I have some people helping me because I have a number of projects underway, and once I find an article I want to capture, I enlist the help of librarians and other specialists. Other folks are doing similar things but rely on ads for revenue, which I do not do. I have some Google ads, but these allow me to look at Google reports and keep tabs on various Googley functions. The money buys a tank of gas every month. Yippy.

I read “Google’s War on Nonsense.” You should too while I go out to clean the pasture spring. The main point is that a number of outfits pay people to write content that is of questionable value. No big surprise. I noted this passage in the write up:

The insultingly vacuous and frankly bizarre prose of the content farms — it seems ripped from Wikipedia and translated from the Romanian — cheapens all online information. A few months ago, tired of coming across creepy, commodified content where I expected ordinary language, I resolved to turn to mobile apps for e-books, social media, ecommerce and news, and use the open Web only sparingly. I had grown confused by the weird articles I often stumbled on. These prose-widgets are not hammered out by robots, surprisingly. But they are written by writers who work like robots. As recent accounts of life in these words-are-money mills make clear, some content-farm writers have deadlines as frequently as every 25 minutes. Others are expected to turn around reported pieces, containing interviews with several experts, in an hour. Some compose, edit, format and publish 10 articles in a single shift. Many with decades of experience in journalism work 70-hour weeks for salaries of $40,000 with no vacation time. The content farms have taken journalism hackwork to a whole new level.

My take on this approach to information—what I call content free content—is that we are in the midst of a casserole created by Google and its search engine optimization zealots. Each time Google closes a loophole for metatag stuffing or putting white text on a white background, another corner cutter cooks up some other way to confuse and dilute Google’s relevance recipe.

The content free content revolution has been with us for a long time. A Web searcher’s ability to recognize baloney is roughly in line with the Web searcher’s ability to invest the time and effort to fact check, ferret out the provenance of a source, and think critically. Google bakes this flaw into its ad machine’s approach with its emphasis on “speed” and “predictive methods.” Speed means that Google is not doing much, if any, old fashioned index look up. The popular stuff is cached and updated when it suits the Google. No search required, thank you. Speed, just like original NASCAR drivers, is a trick. And that trick works. Maybe not for queries like mine, but I don’t count, literally. Predictive means that Google uses inputs to create a query, generate good enough results, and have them ready or pushed to the user. Look, it’s magic. Just not to me.

With shortcuts in evidence at Google and in the world of search engine optimization, with Web users who are in a hurry and unwilling or unable to check facts, with ad revenue and client billing more important than meeting user needs—we have entered the era of content free content. As lousy as Beyond Search is, at least I use the information in my for-fee articles, my client reports, and my monographs.

The problem, however, is that for many people what looks authoritative is authoritative. A Google page that puts a particular company or item at the top of the results list is the equivalent of a Harvard PhD for some. Unfortunately the Math Club folks are not too good with content. Algorithms are flawless, particularly when algorithms generate big ad revenue.

Can we roll back the clock on relevance, reading skills, critical thinking, and the pursuit of knowledge for its own sake? Nope, search is knowledge. SEO is the gateway into content free content. In my opinion, Google likes this situation just fine.

Stephen E Arnold, June 28, 2011

You can read more about enterprise search and retrieval in The New Landscape of Enterprise Search, published by Pandia in Oslo, Norway, in June 2011.
