New Visual Analytics Tool Arrives Right on Time for Midsized Businesses

February 19, 2013

SAS has announced a new product geared toward work groups and midsized businesses. The new SAS Visual Analytics brings enterprise-level computing ability to scaled-down systems and works with the Greenplum and Teradata database appliances.

According to “SAS Rolls Out Visual Analytics for Work Groups and Midsized Businesses,” by 2015 more than 30 percent of analytics projects will deliver insights based on structured and unstructured data. This is important because small-business intelligence tools will have to scale up to meet that demand. That’s where SAS comes in.

“Designed as a starting point for organizations wanting to add analytics to their business strategies, SAS Visual Analytics’ self-service option lets business users explore their data without having to seek assistance from their IT departments.”

While SAS touts its new software as more than just a simple business intelligence product and promises it will be fast and easy to use, there is still the question of whether the software can live up to its billing as useful to businesses ranging from a handful of users all the way up to global deployments. Seems ambitious.

Be that as it may, there is no doubt that it is a step up from the small business platforms currently on the market. It will be an interesting system to watch as it evolves.

Leslie Radcliff, February 19, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

A Big Data Partnership

February 17, 2013

Big data inspires companies to partner up and pool their research and products. Datameer, the Hadoop big data analytics leader, and Caserta Concepts, a consulting and technology business specializing in big data analytics, BI, and data warehousing, have formed a partnership. Virtual Strategy runs through the details in the article, “Caserta Concepts Announces Partnership With Datameer For Big Data Analytics On Hadoop.”

The companies have paired up because of a Ventana Research study entitled “The Challenge of Big Data,” by Mark Smith. The research explains that average users find it hard to make sense of the data Hadoop captures because they are more used to working with Excel or other BI dashboards. Datameer’s software allows the everyday user to read and harness the power of Hadoop with familiar dashboards and tools.

“‘We are very pleased to partner with Datameer, the only provider of big data analytics built natively on Hadoop,’ said Joe Caserta, founder and CEO of Caserta Concepts. ‘As organizations struggle to make sense of all their available data, Datameer’s big data analytics and discovery solution makes Hadoop’s power and flexibility instantly accessible to business analysts and data scientists alike.’”

What does this partnership teach us? It teaches that while big data is desirable, many users lack experience with analytics tools. Big data tools need to be more user-friendly if anything is to be gleaned from the data.

Whitney Grace, February 17, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Whitepaper Sheds Light on Predicting the Future

February 16, 2013

A new white paper from Microsoft Research indicates that search is now future-oriented. The detailed paper, “Mining the Web to Predict Future Events” (PDF) was composed by Kira Radinsky of the Technion-Israel Institute of Technology and Microsoft Research’s Eric Horvitz. The introductory Abstract specifies:

“We describe and evaluate methods for learning to forecast forthcoming events of interest from a corpus containing 22 years of news stories. We consider the examples of identifying significant increases in the likelihood of disease outbreaks, deaths, and riots in advance of the occurrence of these events in the world. We provide details of methods and studies, including the automated extraction and generalization of sequences of events from news, corporate, and multiple web resources. We evaluate the predictive power of the approach on real-world events withheld from the system.”

The researchers delve into the management of large amounts of information from a variety of sources (including bits of news that may seem insignificant at first) and patterns that can be extracted from such data. They illustrate their points with a series of evaluations and representative examples, and conclude that automated analysis can, indeed, discover helpful new relationships and context-sensitive outcome probabilities.

Radinsky and Horvitz conclude with the hope that other researchers will take up this topic, and that such research will lead to “valuable predictions about future events and interventions of importance.” See the paper for the thorough details behind this optimistic stance.

Cynthia Murrell, February 16, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

Autonomy Improves its eDiscovery Software

February 15, 2013

HP is on the move, leveraging its Autonomy investment with new features, we learn in the company’s announcement, “HP Autonomy Strengthens eDiscovery Solution with New Information Governance Capabilities.”

The crucial early case assessment (ECA) phase occurs at the onset of a legal procedure, when large volumes of data must be assessed quickly, thoroughly, and carefully. The press release informs us:

“Autonomy has extended its Meaning Based Coding (MBC) capability to its ECA module, further enhancing its in-depth eDiscovery analysis capabilities. Autonomy’s MBC capabilities enable organizations to automate analysis based on the Autonomy Intelligent Data Operating Layer (IDOL), which quickly categorizes data by concepts, ideas and patterns in information. Unlike traditional predictive coding technologies, MBC classifications are carried through to the review and production phase without new processing or indexing. As a result, Autonomy ECA can perform an analysis of the data faster, more accurately and at a lower cost.”

Also new is the software’s integration with HP’s Application Information Optimizer, which automates data migration and retirement. Furthermore, Autonomy has added native discovery functionality to the on-premise version of their archiving solution, Autonomy Consolidated Archive. They say these improvements streamline the eDiscovery process, saving money, time, and frustration.

Autonomy, founded in 1996, offers solutions that use IDOL to tame mind-boggling amounts of unstructured data. The technology grew from research originally performed at Cambridge University, and now serves prominent public and private organizations around the world. HP acquired Autonomy in 2011.

Cynthia Murrell, February 15, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

Enterprise Organizations Search for Solutions to Deliver Insights

February 14, 2013

While ETL technologies were once good enough on their own, the era of big data has created demand for augmenting technologies. However, Smart Data Collective points out that it is not just big data but also the need for predictive analytics that has caused the paradigm shift. Their article “Data Integration Ecosystem for Big Data Analytics” defines common enterprise-software terminology for a world inundated with big data, framed in business contexts.

The author identifies six components of the integrated data ecosystem in a typical enterprise organization: sources, big data storage, data discovery platform, enterprise data warehouse, business intelligence portfolio, and data analytics portfolio.

We learned the following from the article regarding which processes integrated data can facilitate with greater ease and efficiency:

While the business intelligence deals with what has happened, business analytics deal with what is expected to happen. The statistical methods and tools that predict the process outputs in the manufacturing industry have been there for several decades, but only recently they are being experimented with the organizational data assets for a potential to do a much broader application of predictive analytics.

This was a useful write-up, as it sheds light on one of the most important topics for enterprise organizations right now: getting a grip on big data. Organizations are looking for solutions that can deliver enterprise information in real time and across various departments and applications.

Megan Feil, February 14, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search.

Change Comes to Attensity

February 14, 2013

Just as the demand for analytics is ascending, Attensity makes a management change. We learn the company recently named J. Kirsten Bay their head honcho in “Attensity Names New President/CEO,” posted at Destination CRM. The press release stresses the new CEO’s considerable credentials:

“Bay brings to Attensity nearly 20 years of strategic process and organizational policy experience derived from the information management, finance, and consumer product industries. She is an expert in advising both the public and private sector on the development of econometric policy models. Most recently, as vice president of commercial business with iSIGHT Partners, Bay provided strategic counsel to Fortune 500 companies on managing intelligence requirements and implementing customer and development programs to integrate intelligence into decision programs.”

The company’s flagship product Attensity Pipeline collects and semantically annotates data from social media and other online sources. From there, it passes to Attensity Analyze for text analytics and customer engagement suggestions.

Headquartered in Palo Alto, California, folks at Attensity pride themselves on the accuracy of their analytic engines and their intuitive reports. Rooted in their development of tools that serve the intelligence community, the company now provides semantic solutions to many Global 2000 companies and government agencies.

Cynthia Murrell, February 14, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

A Case Study for the Use of Onalytica Software

February 7, 2013

In order to tout their product, Onalytica presents a case study in “‘The Onalytica Way’—Onalytica Insight Used in Jefferies Equity Research.” The global investment bank used Onalytica’s solution specifically to research companies whose business models tap into online networks. The write-up states:

“The research explores the growth dynamics and business models of UK comparison sites MoneySuperMarket and Rightmove, as examples of businesses which exploit the many-to-many dynamic in different ways. Our previous work with Jefferies includes research analysing the global Fashion debate and retail markets, which eventually led to a buy recommendation on ASOS based on our insights. Using our InfluenceMonitor platform, we draw out brand insights from the online debate to see how these sites’ share of influence has developed over the quarters in comparison to their market competitors.”

For more information on weighting for influence in online analysis, the post points us to another of the blog’s articles titled, not surprisingly, “Weighting for Influence.” It might be worth checking out if you are curious.

I hate to be the one to point this out, but. . . if this worked as they claim, wouldn’t Jefferies be the dominant investment firm? Just asking.

The marketing consultancy was founded in 2004, and is based in London. Onalytica provides clients with “near real-time” analysis with the aim of better positioning themselves amidst day-to-day market changes.

Cynthia Murrell, February 07, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

Ai-One Touts Intelligent Agent Advantage

February 6, 2013

Is it another breakthrough in the analysis of unstructured text? Ai-one provides a detailed account of its data-analysis platform ai-BrainDocs in, “Big Data Solutions: Intelligent Agents Find Meaning of Text.” The write-up begins with a summary of the familiar problems many organizations face when trying to make the most of the vast amounts of data they have collected, particularly the limitations of the keyword approach. Ai-one describes how they have moved beyond those limitations:

“Our approach generates an ‘ai-Fingerprint’ that is a representational model of a document using keywords and association words. The ‘ai-Fingerprint‘ is similar to a graph G[V,E] where G is the knowledge representation, V (vertices) are keywords, and E (edges) are associations. This can also be thought of as a topic model. . . .

“The magic is that ai-one’s API automatically detects keywords and associations – so it learns faster, with fewer documents and provides a more precise solution than mainstream machine learning methods using latent semantic analysis. Moreover, using ai-one’s approach makes it relatively easy for almost any developer to build intelligent agents.”
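The quoted description amounts to a graph-style topic model: keywords as vertices, associations as edges. The sketch below is a toy illustration of that idea, not ai-one’s actual API; the choice of sentence-level co-occurrence as the “association” signal and Jaccard overlap for comparing fingerprints are assumptions for demonstration only.

```python
from collections import Counter
from itertools import combinations

STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "are", "by"}

def fingerprint(text, top_k=5):
    """Build a toy 'fingerprint': a set of (keyword, keyword) edges.

    Keywords are the most frequent non-stopwords in each sentence; an
    edge links two keywords appearing in the same sentence, standing in
    for the 'association words' the article describes.
    """
    edges = set()
    for sentence in text.lower().split("."):
        words = [w.strip(",;:") for w in sentence.split()]
        words = [w for w in words if w and w not in STOPWORDS]
        keywords = [w for w, _ in Counter(words).most_common(top_k)]
        for a, b in combinations(sorted(set(keywords)), 2):
            edges.add((a, b))
    return edges

def similarity(fp_a, fp_b):
    """Jaccard overlap between two fingerprints (edge sets)."""
    if not fp_a or not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)
```

Comparing documents by shared graph edges, rather than by raw keyword matches, is what lets this kind of representation rank documents that discuss the same concepts even when individual terms differ.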

The write-up tells us how to build such “intelligent agents,” delving into the perspectives of both humans and conventional machine learning (including natural language processing and latent analysis techniques). It concludes by describing the creation of their ai-BrainDocs prototype. The article is rich in detail—a worthwhile read for anyone interested in such mechanics.

Founded in Zurich in 2003, ai-one is now headquartered in La Jolla, California, with research in Zurich and European operations in Berlin. The company licenses their software to developers around the world, who embed it in their own products.

Cynthia Murrell, February 06, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

More Real Time BI Promises

February 5, 2013

Have we not heard this story before? Datamation reports on real-time intelligence being incorporated into businesses in “Real Time BI Is Right Time BI.” While the technology does exist, companies are reluctant to adopt real time because of the cost and complexity associated with it. Technology has changed, as have consumers’ requirements. Real-time solutions have become a desired if not necessary BI solution, as found in a series of surveys (though the article does not mention who conducted them).

We can assume this company conducted the surveys:

“Tony Cosentino, VP & Research Director of Ventana Research, noted that the survey found even stronger interest in implementing real time BI/analytics in the next two years via complex event processing.”

But the main question that keeps popping up is: what exactly is real time intelligence?

“‘There is still a lot of confusion over what right time and real time really are,’ noted Cosentino. ‘When we talk about real-time transaction processing, with data streaming into mobile devices right from transaction systems, rules-based analytics trigger actions. This complex event process is like when a car crashes and the air bags deploy. These rule-based systems are different than the complex algorithms that may be at work behind the scenes.’ In addition to the immediate reaction to an event, Cosentino and others warn IT that everybody expects everything right away. ‘If someone waits more than two seconds for something to appear on their screen, they will jump to another screen,’ Cosentino added.”
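Cosentino’s air-bag analogy describes rule-based complex event processing: events stream in, each is checked against simple conditions, and matching rules trigger actions immediately. The sketch below illustrates the pattern; the event fields (`decel_g`, `speed`, `t`) and thresholds are hypothetical, not from any vendor’s system.

```python
from typing import Callable, Dict, List, Tuple

# A rule pairs a condition on an event with an action to fire.
Rule = Tuple[Callable[[Dict], bool], Callable[[Dict], str]]

def process_stream(events: List[Dict], rules: List[Rule]) -> List[str]:
    """Check each incoming event against every rule; fire matching actions."""
    fired = []
    for event in events:
        for condition, action in rules:
            if condition(event):
                fired.append(action(event))
    return fired

# Hypothetical rules: deploy the air bag on a deceleration spike,
# raise an alert on excessive speed.
rules = [
    (lambda e: e.get("decel_g", 0) > 40, lambda e: f"deploy_airbag@{e['t']}"),
    (lambda e: e.get("speed", 0) > 120, lambda e: f"speed_alert@{e['t']}"),
]

stream = [
    {"t": 0, "speed": 60, "decel_g": 1},
    {"t": 1, "speed": 130, "decel_g": 2},
    {"t": 2, "speed": 0, "decel_g": 55},
]
```

The key design point Cosentino makes is that these fixed condition-action rules run in milliseconds on the live stream, unlike the heavier statistical models that run behind the scenes.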

Throw in people’s constant need for immediate access to everything, including analytics, and IT departments have their work cut out for them. Will real time finally be deployed? Maybe now more than ever, but we will have to see…again.

Whitney Grace, February 05, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

PolySpot Utilizes Data from both Past and Present to Offer Insights

February 1, 2013

A very interesting article came out of Wired recently. This piece, “Stop Hyping Big Data and Start Paying Attention to Long Data,” asserts that perhaps living in real-time and considering only current snapshots may be a flawed method.

Analytics should look at both slow changes over longer periods of time and fast changes happening in real time. The author refers to slow changes as “long data.”

The article states:

By “long” data, I mean datasets that have massive historical sweep — taking you from the dawn of civilization to the present day. The kinds of datasets you see in Michael Kremer’s “Population growth and technological change: one million BC to 1990,” which provides an economic model tied to the world’s population data for a million years; or in Tertius Chandler’s Four Thousand Years of Urban Growth, which contains an exhaustive dataset of city populations over millennia. These datasets can humble us and inspire wonder, but they also hold tremendous potential for learning about ourselves.

This is an article in which the angle made the story. No one who says “big data” implies that historical contexts and perspectives are excluded from that terminology. Big data solutions such as PolySpot utilize data from across every sector of the enterprise, both past and present, in order to deliver effective information about the future.

Megan Feil, February 1, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search.
