Semantria Adds Value to Unstructured Data With Sentiment Analysis

March 19, 2013

We are constantly on the lookout for movers and shakers in text analysis and sentiment analysis. So, I was intrigued when I recently came across the Web site of Semantria, a company claiming its software makes text and sentiment analysis fast and easy. With claims of lower costs and high-value insight capture, I had to research further.

The company was founded in 2011 as a software-as-a-service and services company specializing in cloud-based text and sentiment analysis. The team boasts roots in text analytics provider Lexalytics, software development firm Postindustria, and demand generation consultancy DemandGen.

The company page explains how its software can give insight into unstructured content:

“Semantria’s API helps organizations to extract meaning from large amounts of unstructured text. The value of the content can only be accessed if you see the trends and sentiments that are hidden within. Add sophisticated text analytics and sentiment analysis to your application: turn your unstructured content into actionable data.”

The Semantria API is powered by the Lexalytics Salience 5 analytics engine and is fully REST compliant. A processing demo is available at https://semantria.com/demo. We think it is well worth a look.
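For the curious, a call to a REST sentiment service of this sort boils down to posting text and reading back a score. The endpoint and response fields below are invented for illustration; the actual Semantria API requires authentication and has its own request and response schema.

```python
import requests

# Hypothetical endpoint and response shape; the real Semantria API
# requires API keys and defines its own schema.
API_URL = "https://api.example.com/sentiment"

def analyze(text: str) -> dict:
    """POST a document to a REST sentiment endpoint and return the parsed JSON."""
    response = requests.post(API_URL, json={"text": text}, timeout=10)
    response.raise_for_status()
    return response.json()  # e.g. {"polarity": "positive", "score": 0.72}

if __name__ == "__main__":
    print(analyze("The new release is fast and easy to integrate."))
```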

Andrea Hayden, March 19, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Tag Management Systems Use Governance to Improve Indexing

March 18, 2013

An SEO expert advocates better indexing in the recent article “Top 5 Arguments For Implementing a Tag Management Solution” on Search Engine Watch. The article argues that, because of the increased functionality and matured capabilities of such systems, tag management is set for a “blowout year” in 2013.

Citing reasons such as the ease of modifying tags and cost reduction, the author makes it easy to see how businesses will begin to adopt these systems if they haven’t already. I found the point on code portability and becoming vendor agnostic most appealing:

“As the analytics industry matures, many of us are faced with sharing information between different systems, which can be a huge challenge with respect to back-end integrations. Tag management effectively bridges the gap between several front-end tagging methodologies that can be used to leverage existing development work and easily port information from one script or beacon to another.”
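The vendor-agnostic idea is easier to picture with a toy sketch: one canonical event gets translated into whatever shape each analytics beacon expects. The event fields and vendor formats below are invented purely for illustration.

```python
# Toy illustration of a tag-management-style mapping layer: one canonical
# event is translated into the payloads two hypothetical analytics vendors expect.
CANONICAL_EVENT = {"action": "add_to_cart", "sku": "ABC-123", "value": 19.99}

def to_vendor_a(event: dict) -> dict:
    # Vendor A expects flat, abbreviated keys.
    return {"ev": event["action"], "prod": event["sku"], "rev": event["value"]}

def to_vendor_b(event: dict) -> dict:
    # Vendor B nests everything under an "ecommerce" object.
    return {
        "event": event["action"],
        "ecommerce": {"items": [{"id": event["sku"], "price": event["value"]}]},
    }

if __name__ == "__main__":
    # Switching vendors means changing a mapping, not re-instrumenting every page.
    print(to_vendor_a(CANONICAL_EVENT))
    print(to_vendor_b(CANONICAL_EVENT))
```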

I think this is a very interesting concept and I love the notion of governance as a way to improve indexing. I am reminded of the original method from the days of the library at Ephesus. Next month, the same author will tackle the most common arguments against implementing a tag management system. We will keep an eye out.

Andrea Hayden, March 18, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Oracle Has a Hand in Many Cookie Jars

March 17, 2013

Oracle has long been seen as the kingpin of the analytics industry. That could be tough for the competition to argue with, especially after a recent report from MarketWatch, “Oracle Positioned in Leaders Quadrant for Business Intelligence and Analytics Platforms.”

Gartner, Inc. identified Oracle as being in the Leaders Quadrant of its 2013 report, Magic Quadrant for Business Intelligence and Analytics Platforms. This is the seventh consecutive year Oracle has been given this distinction.

Here’s what the article had to say:

“Gartner’s Magic Quadrant reports position vendors within a particular quadrant based on their completeness of vision and ability to execute… Oracle’s positioning in the Leaders Quadrant showcases how Oracle Business Analytics portfolio provides customers with an end-to-end family of analytic solutions ranging from descriptive to prescriptive based on a comprehensive set of BI, advanced analytics and CPM functionality that is also integrated and optimized with the Oracle technology stack.”

That is another impressive accolade to add to the mantle. However, it raises the question: what about the many properties that comprise Oracle? Are those leaders too? This is a company with its hand in many, many cookie jars. While we do not have direct experience with them all, it is safe to say that not every product matches the mastery of, say, the analytics arm. In other words, the homework required when choosing a new product still needs to be done, even with a company like Oracle.

Patrick Roland, March 17, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Big Data Does A Lot Yet Not Everything

March 8, 2013

Big data is a wonder tool that is supposed to improve how organizations use their information and, in the process, make everything better. GCN took a look at “Four Ways Big Data Can Save Lives And Money.” The article breaks down a TechAmerica Foundation survey, “Big Data And the Public Sector.” Across the board, the study claims that big data is providing the veritable magic data wand people have been wishing for:

“Both federal and state IT officials believe big data analytics can have real and immediate impacts on how governments operate, from helping to predict crime to cutting waste and fraud, according to the survey of nearly 200 public sector IT professionals commissioned by SAP AG, and conducted by pollsters Penn Schoen and Berland.”

Big data has been deployed by NASA for airline safety, by Homeland Security for its bio-preparedness collective, by the National Weather Service for weather patterns, and by police departments to prevent crime. The analytics offer new and valuable insights. Saving lives was ranked as the number one reason big data is useful; it helps medical researchers aggregate data. Crime prevention, improving quality of life (from fixing potholes to repairing social welfare programs), and saving money were other ways big data helps.

Big data can do wonders for information, but it is not a magic wand that can be waved so that, POOF, the world is magically fixed. Big data provides the insights; people do the rest.

Whitney Grace, March 08, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Data Manipulation And Intent

March 8, 2013

Data supposedly tells us what happened in a project, but while data may record an action, it does not record the intent behind it. The Tow Center for Digital Journalism takes a look at “What The Tesla Affair Tells Us About Data Journalism.” The article points out that intent can shape data, but that context is lost once the data is treated as cold hard fact. The case in point is the recent Tesla test drive review: Tesla was very upset when New York Times reporter John Broder gave the new car a poor review, claiming the review did not factually represent the vehicle. Yet Tesla did not release the data from Broder’s drive, only the company’s interpretation of that data.

At this point, no one can really tell the truth about the vehicle. Broder could provide context, but his opinion has already been devalued. It is also important to remember that Tesla only wanted the review for publicity, and any negative findings amounted to bad PR.

What we can learn is:

“So, to recap. The Tesla Affair reinforces that: data does not equal fact; that context matters enormously to data journalism; that trust and documentation are even more important in a world of data journalism; and that companies will continue to prioritize positive PR over good journalism in reviews of their products.”

Great, more reason to doubt data, but people have been manipulating it since time began. Will this become a greater trend, though? Is this a caution for consumer-oriented analytics systems?

Whitney Grace, March 08, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Red Hat Ventures into Big Data Analytics

March 6, 2013

Red Hat is a well-known leader in open source technology, and has made a name for itself as one of the largest contributors to the Linux kernel. Red Hat has once again made headlines as it ventures into the world of Big Data and its newest trend, analytics. Datacenter Dynamics covers the story in their article, “Red Hat Brings Open-source to Big Data Analytics for Enterprises.”

“Red Hat announced its big data direction and solutions earlier this week, aiming at enterprise requirements for scalable and reliable infrastructure to run analytics workloads. The company also announced it would contribute the Red Hat Storage Hadoop plug-in to the Apache™ Hadoop open-source community to turn Red Hat Storage into a Hadoop-compatible file system for big data requirements.”

Analytics may be the new buzzword, in the same way Big Data made the rounds last year. However, the fundamentals remain the same. Enterprises need a way to give massive amounts of unstructured data some meaning, and many developers and companies are throwing their hats in that ring. LucidWorks is a leader that has been around for a long time, responding to shifts in the market with open source solutions. Check out LucidWorks Big Data for another alternative.

Emily Rae Aldridge, March 6, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Predictive Analysis Progress

March 6, 2013

Using data analysis to predict the future, a feat naturally called predictive analysis, is an intriguing facet of the analysis prism. GMA News takes a look at some progress on such software in, “New Software Can Predict Future News.” I suppose “new” is in the eye of the beholder; Recorded Future has been doing this for a couple of years now.

This article, though, covers research performed by Microsoft and the Technion-Israel Institute of Technology. Working with twenty-two years’ worth of New York Times articles and other information online, researchers have been testing ways of using this data to predict outbreaks of disease, violence, and other sources of significant mortality. Grim subject matter, to be sure, but imagine if we could take steps to deflect or minimize such occurrences before they happen. The article informs us:

“The system uses 22 years of New York Times archives, from 1986 to 2007, as well as data from the Internet to learn about what leads up to major news events. [Technion-Israel’s Kira] Radinsky said one useful source was DBpedia, a structured form of the information inside Wikipedia constructed using crowdsourcing. Other sources included WordNet, which helps software understand the meaning of words, and OpenCyc, a database of common knowledge. ‘We can understand, or see, the location of the places in the news articles, how much money people earn there, and even information about politics,’ Radinsky said. With all this information, researchers get valuable context not available in news articles, and which is necessary to figure out general rules for what events precede others.”
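The core idea, mining dated headlines for events that tend to precede other events, can be sketched in a few lines. The toy headlines and the simple one-year-window counting below are our own illustration, not the researchers’ actual method.

```python
from collections import Counter
from itertools import combinations

# Toy corpus of dated headline keywords; the real system mines decades of
# New York Times articles plus DBpedia, WordNet, and OpenCyc.
headlines = [
    (1994, "drought"), (1995, "famine"), (1997, "drought"), (1998, "famine"),
    (2001, "storm"), (2002, "flood"), (2003, "storm"), (2004, "flood"),
]

# Count how often event B follows event A within a one-year window.
precedence = Counter()
for (y1, a), (y2, b) in combinations(sorted(headlines), 2):
    if 0 < y2 - y1 <= 1 and a != b:
        precedence[(a, b)] += 1

# The most frequent pairs become candidate "A tends to precede B" rules.
for (a, b), count in precedence.most_common(3):
    print(f"{a} -> {b}: seen {count} times")
```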

It appears that this project is far from complete, and Microsoft has formed no plans to bring it to market, according to Eric Horvitz of the Microsoft team. The article does acknowledge Recorded Future’s work in this area, noting their strong customer base within the intelligence community.

Who can predict when the rest of us will get the chance to give this compelling technology a whirl?

Cynthia Murrell, March 06, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

LucidWorks Partners with MapR

March 4, 2013

MapR Technologies and LucidWorks have proven to be good partners in the past. They are again joining forces to offer the best Big Data analytics solution on the market. PR Newswire offers the press release for the joint venture in, “LucidWorks™ Teams with MapR™ Technologies to Offer Best-in-Class Big Data Analytics Solution.”

The release states:

“Existing business intelligence (BI) tools have simply not been designed to provide spontaneous search on multi-structured data in motion. Responding directly to this need, LucidWorks, the company transforming the way people access information, and MapR Technologies, the Hadoop technology leader, today announced the integration between LucidWorks Search™ and MapR.  Available now, the combined solution allows organizations to easily search their MapR Distributed File System (DFS) in a natural way to discover actionable insights from information maintained in Hadoop.”

LucidWorks builds upon the strong search infrastructure of Solr. Adding this to the power of Hadoop through the MapR distribution makes it a solution that is without equal. The partnership makes it easier to put Big Data analytics into motion while combining the security strengths of both technologies.
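Since LucidWorks Search is built on Solr, pulling answers out of indexed Hadoop data ultimately comes down to a standard Solr select request. Here is a minimal sketch, assuming a hypothetical Solr host and collection name; a real LucidWorks/MapR deployment would expose its own endpoints and authentication.

```python
import requests

# Hypothetical Solr host and collection name.
SOLR_URL = "http://localhost:8983/solr/logs/select"

def search(query: str, rows: int = 10) -> list:
    """Run a standard Solr select query and return the matching documents."""
    params = {"q": query, "rows": rows, "wt": "json"}
    response = requests.get(SOLR_URL, params=params, timeout=10)
    response.raise_for_status()
    return response.json()["response"]["docs"]

if __name__ == "__main__":
    for doc in search("error AND service:checkout"):
        print(doc.get("id"), doc.get("message"))
```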

Emily Rae Aldridge, March 4, 2013

Sponsored by ArnoldIT.com, developer of Beyond Search

Palantir Embraces Open Source

March 4, 2013

Palantir leaves its interesting legal past and affirms open-source goodness, we learn from Directions Magazine’s article, “Palantir: An Open Source Development Success Story.”

The tale begins with Gotham, an analytics platform with a nifty new geospatial component, in 2007. The product launched successfully, and Palantir looked to expand to different databases across a variety of industries. See the article for the details of their needs and their decision-making process; long story short, the company chose PostGIS as the springboard for their solution.

Yes, springboard. It took a lot of tinkering to make the software do just what they wanted, but the experience with the open source community was a positive one for the company. In fact, Palantir went on to work with OpenGeo to develop more PostGIS enhancements that would benefit more than just their own company. The article tells us:

“Gotham developers were happy to fund this open source development and were especially impressed with the network effects of community bug testing and further feature development. To them it seemed only fair that others would be able to benefit from their investment, since they had benefited so greatly from what was already built by others and would similarly benefit from what others built in the future. Their assumption has proved correct; since Palantir’s original investment, many users have funded or developed new functions and performance enhancements for geography calculations.”
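For readers who have not used PostGIS, the kind of geospatial query Gotham leans on looks roughly like the sketch below; the connection string, table, and columns are invented for illustration.

```python
import psycopg2

# Hypothetical connection string, table, and columns; any PostGIS-enabled
# PostgreSQL database would work the same way.
conn = psycopg2.connect("dbname=gis user=analyst")

with conn, conn.cursor() as cur:
    # Find points of interest within 5 km of a coordinate, using the
    # geography type so the distance is measured in metres.
    cur.execute(
        """
        SELECT name
        FROM points_of_interest
        WHERE ST_DWithin(
            geom::geography,
            ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
            5000
        )
        """,
        (-122.16, 37.44),
    )
    for (name,) in cur.fetchall():
        print(name)

conn.close()
```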

If you are interested in the development of geospatial applications, the details in this article are worth checking out. Palantir was so happy with their open source experience that they have not only continued to support others’ open source projects, but have also launched some open source ventures of their own.

Based in Palo Alto, California, Palantir Technologies focuses on improving the ways their client organizations analyze data. The company was founded in 2004 by some enterprising folks from PayPal and from Stanford University.

Cynthia Murrell, March 04, 2013

Sponsored by ArnoldIT.com, developer of Augmentext

JackBe Releases Presto 3.5 BI Solution

March 4, 2013

The Best Analytics Blog presents us with quite the string of buzzwords in, “JackBe Brings Metric-Driven Real-Time Operational Intelligence to Front-Line. . . .” The press release tells us that the business intelligence outfit JackBe has released the newest version of its flagship product, Presto. This version is said to improve the accessibility of the software’s operational measures. The write-up states:

“Presto 3.5 extends its user-friendly interface to include new options to create dashboards through drag-and-drop, to add custom visualizations as easily as plugging in the view, and to customize Presto with a customer’s own logo and colors. Once created, all Presto dashboards are portable with HTML5 apps that run anywhere, including SharePoint, portals, websites, tablets and mobile phones with the same look-and-feel of the native device. Presto 3.5 has enhanced security for mobile devices and a more secure single-sign-on experience for social media sites.”

JackBe emphasizes real-time intelligence tools and easy-to-use dashboards while promising tight security features. They also offer their own add-ons for use with mobile devices, portals, and SharePoint. The company is headquartered in Chevy Chase, Maryland, with offices in Mexico City and Fremont, California.

Cynthia Murrell, March 04, 2013

Sponsored by ArnoldIT.com, developer of Augmentext
