Now Business Intelligence Is Dead
July 18, 2012
I received a “news item” from Information Enterprise Software, an HTML email distributed by InformationWeek Software. The story was labeled “Commentary.” I did not think that “real” journalists engaged in “commentary.” Isn’t there “real” news out there to “cover” or “make”?
Read the article. Navigate to “If BI Is Dead, What’s Next?” The “commentary” is hooked to an azure chip consultant report called “BI Is Dead! Long Live BI” which costs a modest $250. You can buy this document from Constellation Research here. First, let’s look at the summary of the report and then consider the commentary. I want to wrap up with some blunt talk about analytic baloney which is winging through the air.
Here’s the abstract, so get your credit card ready:
We [Constellation Research] suggest a dozen best practices needed to move Business Intelligence (BI) software products into the next decade. While five “elephants” occupy the lion’s share of the market, the real innovation in BI appears to be coming from smaller companies. What is missing from BI today is the ability for business analysts to create their own models in an expressive way. Spreadsheet tools exposed this deficiency in BI a long time ago, but their inherent weakness in data quality, governance and collaboration make them a poor candidate to fill this need. BI is well-positioned to add these features, but must first shed its reliance on fixed-schema data warehouses and read-only reporting modes. Instead, it must provide businesspeople with the tools to quickly and fully develop their models for decision-making.
I like the animal metaphors. I must admit I thought more in terms of baloney, but that’s just an addled goose’s reaction to “real” journalism.
The point is that business intelligence (I really dislike the BI acronym) can do a heck of a lot more. So what’s dead? Excel? Nah. Business intelligence? Nah. A clean break with the past which involved SAS, SPSS, and Cognos-type systems? Nah.

Information about point and click business intelligence should be delivered in this type of vehicle. A happy quack to the marketing wizard at Oscar Mayer for the image at http://brentbrown98.hubpages.com/hub/12-of-the-Worst-Sports-Logos-Ever
So what?
Answer: Actually not a darned thing. What this report has going for it is a shocking headline. Sigh.
Now to the “commentary.” Look, a pay-to-play report is okay. The report is a joint work of InformationWeek and Constellation Research. Yep, IDC is one of the outfits involved in the study. The “commentary” is pretty much a commercial. Is this “real” journalism? Nah, it is a reaction to a lousy market for consulting studies and an attempt to breathe controversy into a well-known practice area.
Here’s the passage I noted:
We all saw the hand wringing in recent years over BI not living up to its promise, with adoption rates below 20% or even 10% of potential users at many enterprises. But that’s “probably the right level” given the limitations of legacy BI tools, says Raden. I couldn’t agree more, and I’ve previously called for better ease of use, ease of deployment, affordability, and ease of administration. What’s largely missing from the BI landscape, says Raden, is the ability for business users to create their own data models. Modeling is a common practice, used to do what-if simulation and scenario planning. Pricing models, for instance, are used to predict sales and profits if X low-margin product is eliminated in hopes of retaining customers with products A, B, and C.
So what we are learning is that business intelligence systems have to become easier to use. I find this type of dumbing down a little disturbing. Nothing can get a person into business trouble faster than fiddling around with numbers without understanding the implications of a decision. Whether it is the fancy footwork of a Peregrine or just the crazy US government data about unemployment, a failure to be numerically literate can have big consequences.
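To make the pricing-model idea concrete, here is a minimal what-if sketch in Python. Every product name, margin, and retention figure below is invented for illustration, and the model is deliberately crude: the answer swings entirely on the retention guess, which is exactly the kind of implication a point-and-click user can miss.

```python
# Hypothetical what-if: drop low-margin product X and hope its customers keep
# buying A, B, and C. All figures are invented; the model crudely assumes
# every A, B, and C buyer also bought X.
products = {
    "X": {"units": 10_000, "margin": 0.50},   # low-margin product under review
    "A": {"units": 4_000,  "margin": 4.00},
    "B": {"units": 3_000,  "margin": 6.00},
    "C": {"units": 2_000,  "margin": 8.00},
}

def annual_profit(portfolio):
    return sum(p["units"] * p["margin"] for p in portfolio.values())

baseline = annual_profit(products)

# Key assumption: what share of X's customers stay and keep buying A, B, and C?
for retention in (1.00, 0.90, 0.75):
    scenario = {
        name: {"units": 0 if name == "X" else int(p["units"] * retention),
               "margin": p["margin"]}
        for name, p in products.items()
    }
    delta = annual_profit(scenario) - baseline
    print(f"retention {retention:.0%}: profit change {delta:+,.0f}")
```

Run it and the cost of dropping product X more than triples as the retention assumption slides from 100 percent to 75 percent, which is the sort of sensitivity a business user needs to see before trusting a dashboard.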
Inteltrax: Top Stories, July 16 to July 20
July 16, 2012
Inteltrax, the data fusion and business intelligence information service, captured three key stories germane to search this week, specifically, some breaking news in the industry.
Our story: “Data Mining and Other Issues on Slate at 2012 Joint Statistical Meetings” showed that analytics is rightly on statistics experts’ radar.
“Mike Miller Joins Digital Reasoning as VP of Sales” provided a glimpse into the wisest hiring minds in the business.
“Florida Community Benefits Medically and Financially from Analytics” gave a glimpse at the immediate impact analytics is making at the community level.
News crops up in all areas of analytics, so it’s helpful to have stories wrangled up that might otherwise slip through the cracks. We’re here every day, monitoring just such stories so you don’t have to.
Follow the Inteltrax news stream by visiting www.inteltrax.com
Patrick Roland, Editor, Inteltrax.
July 16, 2012
Study Purports to Show Google Knowledge Graph Results Often Outdated
July 16, 2012
Google’s Knowledge Graph may need some spiffing up if this write-up is accurate. Search Engine Watch declares, “Google Knowledge Graph Shows Outdated Search Results for Trending Topics [Study].” Writer Miranda Miller cites a recent study from SEO firm Conductor, which found many of Google’s search engine results pages (SERPs) to be behind the curve, especially on trending topics. Twenty percent of trending-topic results were found to be out of date, while only four percent of the more static subjects were dusty. The report specifies Conductor’s methods:
“For each query we compared the Knowledge Graph result on the SERP to its Wikipedia entry and noted whether it was or was not an exact match. When they did not match, we measured the lag distribution of the mismatched queries by using WikiBlame to determine when the change occurred and, subsequently, the number of days the Knowledge Graph was behind.”
Why Wikipedia? Conductor has found that high-activity queries are very likely to have their Wikipedia entries updated promptly. By comparison, Google’s much-anticipated Knowledge Graph seems to be falling short. The report commented:
“While a real time Wikipedia update may ultimately not be practical, if Google is indeed positioning Knowledge Graph as the future of search, we have to believe that they can do better than the 2-4 day lag many of their mismatched keywords currently reflect.”
We’d all like to believe that, I think. Miller cautions that users are ultimately responsible for verifying anything they find online. Wise words, considering how influential Web search results, through Google and other engines, have become.
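As a rough illustration of the lag arithmetic Conductor describes (not its actual tooling, which leans on WikiBlame), the sketch below matches a Knowledge Graph snippet against a list of timestamped Wikipedia revisions and reports how many days the displayed text trails the newest revision. The query and revision data are invented.

```python
from datetime import datetime

# Invented sample: timestamped Wikipedia revisions for one trending query,
# newest last, plus the snippet the Knowledge Graph showed on the SERP.
revisions = [
    ("2012-07-10", "Jane Doe is an American actress."),
    ("2012-07-14", "Jane Doe is an American actress and producer."),
    ("2012-07-16", "Jane Doe is an American actress, producer, and author."),
]
knowledge_graph_text = "Jane Doe is an American actress and producer."

def lag_in_days(revisions, kg_text):
    """Days between the revision matching the KG text and the latest revision."""
    latest = datetime.strptime(revisions[-1][0], "%Y-%m-%d")
    for stamp, text in revisions:
        if text == kg_text:
            return (latest - datetime.strptime(stamp, "%Y-%m-%d")).days
    return None  # no exact match; treat as a mismatched query

print(lag_in_days(revisions, knowledge_graph_text))  # prints 2, i.e. two days behind
```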
Cynthia Murrell, July 16, 2012
Sponsored by PolySpot
New eDiscovery Platform Is kCura Contender
July 16, 2012
When a judge recently ruled in favor of predictive coding, it gave many litigation support technology ISVs the chance to display their wares on a bigger scale. Law.com reports that “FTI Consulting Unveils Ringtail 8.2 with Attenex Document Mapper.” FTI Consulting is a global business advisory firm, and it has caught on to the eDiscovery trend with its Ringtail 8.2 software, which integrates graphical views to facilitate document analysis and review. The newest version of Ringtail makes FTI a contender against LexisNexis Concordance and Content Analyst partner kCura’s Relativity platform.
FTI Consulting’s director of product marketing, JR Jenkins, believes in predictive coding’s power:
“ ‘Asked where’s the “predictive coding,” he said “it’s built in.” The Document Mapper provides a high-level view of a collection, which results from machine learning processes that combine both “supervised and unsupervised” learning. Jenkins believes that the “workflow around predictive coding is more important than the technology.’ The ‘people, processes, and technology from FTI,’ said Jenkins, that results in a Document Mapper view of a collection aims to open the black box covering predictive coding.”
FTI Consulting is on the right track, but what they lack is the smaller, more personal solutions the legal community needs.
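For readers wondering what combining “supervised and unsupervised” learning looks like in a review workflow, here is a generic scikit-learn sketch: cluster the collection for a high-level map of themes, then train a classifier on a small reviewed seed set to score the remaining documents for relevance. This illustrates the general technique, not FTI’s Document Mapper, and the tiny corpus and labels are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Invented mini-collection standing in for a document review set.
documents = [
    "invoice payment schedule for vendor contract",
    "quarterly earnings call transcript and guidance",
    "email regarding litigation hold and document preservation",
    "memo on deposition scheduling and custodian interviews",
    "lunch menu for the office holiday party",
    "settlement negotiation notes with opposing counsel",
]
# Reviewer decisions on a seed set (1 = responsive, 0 = not responsive).
seed_labels = {0: 0, 2: 1, 3: 1, 4: 0}

vectors = TfidfVectorizer().fit_transform(documents)

# Unsupervised: group the collection to give reviewers a thematic overview.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

# Supervised: learn from the reviewed seed set, then score unreviewed documents.
seed_idx = list(seed_labels)
model = LogisticRegression().fit(vectors[seed_idx], [seed_labels[i] for i in seed_idx])
for i, doc in enumerate(documents):
    if i not in seed_labels:
        relevance = model.predict_proba(vectors[i])[0, 1]
        print(f"cluster {clusters[i]}  relevance {relevance:.2f}  {doc}")
```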
Whitney Grace, July 16, 2012
Pros and Cons of Palantir?
July 12, 2012
It seems as though writer Dave Kellogg would love to hate analytics firm Palantir, if only he could. That’s my summary of his Kellblog entry, “Why Palantir Makes my Head Hurt.” Though he admits there are several things about the company he is forced to admire, his sense of fair play compels him to slam it in print.
Kellogg believes that Palantir is playing fast and loose with definitions for accounting reasons. For example, they claim to have no marketing, sales, or services. He also asserts that their positioning as a billion-dollar company is a stretch. (See the piece for his explanations.)
Kellogg admits to some bias born of his earlier years as a MicroStrategy naysayer. He writes:
“It turns out being a naysayer isn’t fun work: for three years you sound like a whining, doubting-Thomas constantly on the back foot, constantly playing defense and then one day you’re proven right. But there’s no joy in it. And the naysaying doesn’t help sell newspapers so you don’t get much press coverage. And, in the end, all people remember is that ‘MicroStrategy was pretty cool back in the day’ and ‘Dave’s a grump.'”
Okay, so now we know where his head is at. Kellogg is conflicted, because he still thinks Palantir does several things very well; he calls them “the first SI to figure out how to build a world-class software business.” Sounds like he really admires the company. If only he didn’t despise them so.
Cynthia Murrell, July 12, 2012
Sponsored by PolySpot
Food for Your Big Data Files
July 12, 2012
A couple of recent articles give us some observations about the field of big data. Karmasphere shares their research in “Karmasphere Unveils ‘Trends and Insights into Big Data Analytics’ Survey Results.” Meanwhile, Sys-Con Media answers the question, “Big Data & Analytics—What’s New?”
For the results of the Karmasphere study, click here (though you’ll have to register first). The survey, performed this past May, assembled responses from 376 North American data analytics pros. One of the key findings: a lack of data experts at companies of all sizes is driving a need for self-service Hadoop access. Not surprisingly, SQL is seen as the primary skill set for data analytics. Also, big data team members are widely being called upon to sport multiple (metaphorical) hats. Much, much more information is included; I think the full report is worth surrendering your email address.
In the Sys-Con article, Jnan Dash extols the progress of Hadoop. He writes:
“A friend of mine from my IBM days (an expert in Data Warehousing, BI, etc.) told me about the Hadoop conference he attended in San Jose few weeks back. When he attended the same conference two years ago in New York, there were hardly 200 attendees whereas this time, the number exceeded 2000 and it was a sold out event. This just proves how fast Hadoop has generated interest. He said that one theme in every presentation was the need for Hadoop skills as almost every presentation had a slide, ‘we are hiring’.”
Hiring is good. Very good. Make a note of it.
Dash shares his thoughts on three specific players in the Hadoop arena, Cloudera, Hortonworks, and MapR. He also plugs a couple of start-ups in the Hadoop-fueled business intelligence (BI) space, Datameer and Karmasphere. See the write up for more details.
He also notes that, because companies will not be eager to waste the existing investments in BI and analytics, integrating Hadoop with current technology will be a must going forward. Good observation.
Cynthia Murrell, July 12, 2012
Sponsored by PolySpot
Recorded Future Suggested for Cyber Attack Prediction
July 12, 2012
Oh, oh, scary marketing. Careful, the goose is easily startled. Sys-Con Media claims our attention with “Recorded Future for Forecasting Cyber Attacks.” Blogger Bob Gourley does a good job, though, of explaining why Recorded Future would be a good tool for predicting cyber attacks.
Already employed by agencies such as the US Southern Command, Recorded Future has been successfully used to anticipate citizen unrest and to analyze intelligence stored on a private cloud (the Bin Laden Letters, no less). The software automates the aggregation and organization of data, leaving more time for human analysts to focus on assessment. The application presents the information collected from articles, blog posts, and/or tweets chronologically, including (this is the best part) a prediction of future events. The software also helps with the analysis stage by mapping relationships and tracking buzz.
Gourley asserts that the company’s technology can also be used in the struggle against international hackers:
“All together, these capabilities allow an organization to forecast more accurately whether they will be the target of a major cyber attack and what threat vectors they should most worry about. Within minutes, analysts could see if there has been a trend of attacks against similar organizations, any threats reported online, or events likely to trigger attacks coming up. They can drill down into coverage by blogs or trade journals if they find the mainstream media insufficient or misleading, and map out the interactions and relationships between hacking groups, companies, government agencies, and law enforcement. While Recorded Future can’t tell you who will attack you and when, it makes open source intelligence analysis for cybersecurity easier, faster, and more effective.”
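The capability described, watching open source chatter over time and flagging when attack talk against similar organizations spikes, can be illustrated in a few lines of Python. This is a toy trend counter over invented data, not Recorded Future’s temporal analytics engine.

```python
from collections import Counter
from datetime import date

# Invented open source mentions: (date, source, text) tuples of the kind a
# collector might pull from blogs, trade journals, or tweets.
mentions = [
    (date(2012, 7, 2), "blog",     "rumors of DDoS planning against regional banks"),
    (date(2012, 7, 9), "tweet",    "bank breach chatter on an underground forum"),
    (date(2012, 7, 10), "tweet",   "new exploit kit said to target banking portals"),
    (date(2012, 7, 10), "journal", "regulators warn banks of a phishing wave"),
    (date(2012, 7, 11), "tweet",   "more talk of attacks on mid-size banks"),
]

# Count mentions per ISO week and flag any week well above the average.
weekly = Counter(d.isocalendar()[1] for d, _source, _text in mentions)
average = sum(weekly.values()) / len(weekly)
for week, count in sorted(weekly.items()):
    flag = "  <-- spike" if count > 1.5 * average else ""
    print(f"week {week}: {count} mention(s){flag}")
```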
Still in the start-up phase, Recorded Future has headquarters in Cambridge, MA, and Göteborg, Sweden. Staffed with statisticians, linguists, and technical business pros as well as computer scientists, the company seems well-equipped to deliver what they call “the world’s first temporal analytics engine.”
Cynthia Murrell, July 12, 2012
Sponsored by PolySpot
Is Mixpanel Tracking You?
July 11, 2012
A pair of articles highlight the ways in which Mixpanel is taking tracking software to new levels. Whether those levels are highs or lows depends entirely upon your perspective.
VentureBeat exclaims, “Mixpanel Now Lets Apps Target You—Yeah YOU—on a Deeply Personal Level.” Reporter Jolie O’Dell notes that Mixpanel is regarded as a provider of quality analytics. Now the company is working on tying data to individual users, a departure from the don’t-worry-this-data-isn’t-tied-to-you-personally convention we’ve grown used to. The aim, of course, is to target advertising with ever better accuracy. Reminds me of that scene from “Minority Report.”
Search Engine Marketing and Website Optimization’s blog is a bit more matter-of-fact than VentureBeat, stating simply, “Mixpanel is Tracking More Than Actions Now, Introduces User Analytics.” That piece also mentions that Mixpanel ties to individual users, and discusses related analysis:
“Specifically, when customers open up their Mixpanel dashboard, they’ll see a new menu under the ‘actions’ section called ‘people’, where they can get data about all of their visitors, such as gender, age, and country, and then correlate that data with user activity. . . .
“[Mixpanel’s co-founder Suhail Doshi] says the new features should be useful to companies of all sizes. If you’ve got a brand new website and only a few hundred visitors, you can look at the individual profiles. If you’re more established, with millions of users, you can still look for patterns among those users, and also target messages to specific groups.”
Great. I can’t say I’m surprised things are progressing in this direction. I also can’t say I don’t appreciate not seeing ads about things I’m not interested in. Still, the whole trend leaves me a bit unsettled. Better get used to it, I suppose.
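The “correlate that data with user activity” idea boils down to a group-by once profile properties and event counts sit side by side. The sketch below is a generic pandas illustration, not Mixpanel’s API or dashboard, and the user records are invented.

```python
import pandas as pd

# Invented user profiles joined with per-user event counts, roughly the shape
# of data a "people" view would expose alongside tracked actions.
users = pd.DataFrame([
    {"user_id": 1, "country": "US", "age": 24, "purchases": 3, "sessions": 19},
    {"user_id": 2, "country": "US", "age": 31, "purchases": 0, "sessions": 4},
    {"user_id": 3, "country": "DE", "age": 27, "purchases": 5, "sessions": 25},
    {"user_id": 4, "country": "DE", "age": 45, "purchases": 1, "sessions": 7},
    {"user_id": 5, "country": "FR", "age": 35, "purchases": 2, "sessions": 12},
])

# Correlate profile properties with activity: average purchases by country and
# the linear relationship between session counts and purchases.
print(users.groupby("country")["purchases"].mean())
print("sessions vs purchases corr:", round(users["sessions"].corr(users["purchases"]), 2))

# Target a segment: active users in one country, the kind of group a product
# team might message directly.
segment = users[(users["country"] == "US") & (users["sessions"] > 10)]
print(segment["user_id"].tolist())
```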
Cynthia Murrell, July 11, 2012
Sponsored by PolySpot
Unstructured Data Becomes the New Gold Rush
July 11, 2012
Knowledge is like gold, and today’s internet prospectors are digging deep for the ‘golden’ data vein. Big data offers a gold rush of information that may exceed the shiny nuggets of 1848. The big data cavern glimmers with possibilities, according to Sys-Con’s article “Actuate Unveils ‘Big Data’ Research Results,” and data engineers are creating better tools. Data miners are applying sophisticated analytics to unstructured content for pattern analysis, keyword correlation, incident prediction, and fraud prevention. The purpose is to put data into active hands, where it can be used for customer communications, management, profit, and the organization’s overall evolution.
A study was conducted to determine the challenges companies face when they dig below the surface:
“The findings highlight the contrast between organizations’ internal experience of accessing relevant, up-to-date decision-supporting information and their ability to pinpoint a wealth of data on almost any other subject via the public Internet.”
“As many as 80% of respondents admitted that they had no extended search capability across multiple repositories, while 70% said they found it “harder” or “much harder” (23%) to access key information held on internal systems versus that available on the Web, even though the content exists within their organization.”
Unstructured content was the golden glimmer that caught the organizational eye; now organizations have to access and deploy it efficiently. Rumors about this technology and its benefits will spread quickly, so it will not take long for this new gold rush to get underway.
Jennifer Shockley, July 11, 2012

