Solcara Is The Best!  Ra Ra Ra!

June 15, 2015

Thomson Reuters is a world-renowned news syndicate, but the company also has its own line of search software called Solcara Federated Search, also known as Solcara SolSearch.  In a cheerleading press release, Q-resolve highlights Solcara’s features and benefits: “Solcara Legal Search, Federated Search And Know How.”  Solcara allows users to search multiple information resources, including intranets, databases, knowledge management systems, and library and document management systems.  It returns accurate results for the entered search terms or keywords.  In other words, it acts like an RSS feed combined with Google.
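In rough terms, a federated search engine fans one query out to every connected source and merges the answers. The sketch below illustrates the idea only; the source names and per-source search functions are invented for illustration and have nothing to do with Solcara’s actual implementation.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-source search functions; a real deployment would call
# intranet, database, and document-management APIs here.
SOURCES = {
    "intranet": lambda q: [f"intranet hit for {q}"],
    "dms":      lambda q: [f"document hit for {q}"],
    "library":  lambda q: [f"catalog hit for {q}"],
}

def federated_search(query):
    """Fan the query out to every source in parallel and merge the results."""
    with ThreadPoolExecutor() as pool:
        result_lists = pool.map(lambda fn: fn(query), SOURCES.values())
    return [hit for hits in result_lists for hit in hits]

results = federated_search("contract law")
print(results)
```

Because the sources are queried live rather than pre-indexed, results are only as fresh as each underlying system at query time.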

Solcara also has a search product designed specifically for the legal profession, and the press release uses a smooth-reading product description to sell it:

“Solcara legal Search is as easy to use as your favorite search engine. With just one search you can reference internal documents and approved legal information resources simultaneously without the need for large scale content indexing, downloading or restructuring. What’s more, you can rely on up-to-date content because all searches are carried out in real time.”

The press release also mentions some other tools and case studies and references the semantic Web.  While Solcara does sound like a good product and comes from a reliable news aggregator like Thomson Reuters, the description and organization of the press release make it hard to understand all the features and who the target consumer group is.  Does the company want to sell to the legal profession and only that group, or does it want to demonstrate how Solcara can be adapted to any industry that digests huge amounts of information?  The point of advertising is to focus the potential buyer’s attention.  This one jumps all over the place.

Whitney Grace, June 15, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Lexalytics: GUI and Wizard

June 12, 2015

What is one way to improve a user’s software navigational experience?  One of the best ways is to add a graphical user interface (GUI).  Software Development @ IT Business Net shares a press release about “Lexalytics Unveils Industry’s First Wizard For Text Mining And Sentiment Analysis.”  Lexalytics is one of the leading providers of sentiment and analytics solutions, and as the article’s title explains, it has scored an industry first by releasing a GUI and wizard for its Semantria SaaS platform and Excel plug-in.  The wizard and GUI (SWIZ) are part of the Semantria Online Configurator, SWEB 1.3, which also includes functionality updates and layout changes.

” ‘In order to get the most value out of text and sentiment analysis technologies, customers need to be able to tune the service to match their content and business needs,’ said Jeff Catlin, CEO, Lexalytics. ‘Just like Apple changed the game for consumers with its first Macintosh in 1984, making personal computing easy and fun through an innovative GUI, we want to improve the job of data analysts by making it just as fun, easy and intuitive with SWIZ.’”

Lexalytics is dedicated to giving its clients an easier experience with data analytics.  The company wants its clients to get the answers they need by providing the right tools, without forcing them to overthink the retrieval process.  While Lexalytics already provides robust and flexible solutions, the SWIZ release continues to prove it has the most tunable and configurable text mining technology.

Whitney Grace, June 12, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Search Companies: Innovative or Not?

June 11, 2015

Forbes’ article “The 50 Most Innovative Companies Of 2014: Strong Innovators Are Three Times More Likely To Rely on Big Data Analytics” points out how innovation is strongly tied to big data analytics and data mining these days.  The Boston Consulting Group (BCG) studies the methodology of innovation.  The numbers are astounding when companies that use big data are placed against those that still have not figured out how to use their data: 57% vs. 19%.

Innovation, however, is not entirely defined by big data.  Most of the companies that cite big data as key to their innovation are software companies.  Forbes found that 53% see big data as having a huge impact in the future, while BCG found only 41% who saw big data as vital to their innovation.

Big data cannot be and should not be ignored.  Forbes and BCG found that big data analytics are useful and can have huge payoffs:

“BCG also found that big-data leaders generate 12% higher revenues than those who do not experiment and attempt to gain value from big data analytics.  Companies adopting big data analytics are twice as likely as their peers (81% versus 41%) to credit big data for making them more innovative.”

Measuring innovation proves to be subjective, but one cannot deny the positive effect big data analytics and data mining can have on a company.  Realize, though, that big data results are useless without a plan to implement and use the data.  Also note that none of the major search vendors is considered “innovative,” even though a huge part of big data involves searching for results.

Whitney Grace, June 11, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Data Darkness

May 28, 2015

According to Datameer, organizations do not use a large chunk of their data, commonly referred to as “dark data.”  “Shine Light On Dark Data” explains that organizations are trying to dig out this dark data and use it for business intelligence or, in more recent terms, big data.  Dark data is created by back-end business processes as well as regular business activities.  It is usually stored in a storage silo in a closet and kept only for compliance audits.

Dark data has a lot of hidden potential:

“Research firm IDC estimates that 90 percent of digital data is dark. This dark data may come in the form of machine or sensor logs that when analyzed help predict vacated real estate or customer time zones that may help businesses pinpoint when customers in a specific region prefer to engage with brands. While the value of these insights are very significant, setting foot into the world of dark data that is unstructured, untagged and untapped is daunting for both IT and business users.”

The article suggests making a plan to harness dark data, but it does not offer much guidance beyond tailoring a project specifically to dark data: identifying sources, using Hadoop to mine the data, and testing results against other data sets.
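Those three steps can be sketched in miniature. The log lines and the “known products” set below are invented for illustration; a real project would run the mining step at Hadoop scale, not in a short script.

```python
import re
from collections import Counter

# Step 1 (identify a source): hypothetical lines standing in for a
# "dark" web-server log archive sitting in a storage silo.
dark_logs = [
    "2015-05-01 10:02 GET /products/widget 200",
    "2015-05-01 10:05 GET /products/widget 200",
    "2015-05-01 10:07 GET /products/gadget 404",
]

def mine_paths(lines):
    """Step 2 (mine): extract and count request paths from raw log text."""
    counts = Counter()
    for line in lines:
        match = re.search(r"GET (\S+) (\d{3})", line)
        if match:
            counts[match.group(1)] += 1
    return counts

path_counts = mine_paths(dark_logs)

# Step 3 (test against another data set): compare mined paths with a
# known catalog to surface pages nobody thought customers were hitting.
known_products = {"/products/widget"}
surprises = [p for p in path_counts if p not in known_products]
print(path_counts, surprises)
```

The point is that the dark data only becomes useful at step 3, when it is checked against something the business already understands.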

This article is really a puff piece highlighting dark data without going into much detail about it.  It also forgets the biggest movement in IT of the past three years: big data!

Whitney Grace, May 28, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph


Welcome YottaSearch

May 26, 2015

There is another player in the world of enterprise search.  Yotta Data Technologies announced its newest product: “Yotta Data Technologies Announces Enterprise Search And Big Data Analytics Platform.”  Yotta Data Technologies is known for affordable, easy-to-use information management solutions.  Yotta has expanded its lineup by creating YottaSearch, a data analytics and search platform designed to be a data hub for organizations.

“YottaSearch brings together the most powerful and agile open source technologies available to enable today’s demanding users to easily collect data, search it, analyze it and create rich visualizations in real time.  From social media and email for Information Governance and eDiscovery to web and network server logs for Information Technology Operations Analytics (ITOA), YottaSearch™ provides the Big Data Analytics for users to derive information intelligence that may be critical to a project, case, business unit or market.”

YottaSearch uses the popular SaaS model and offers users not only data analytics and search but also knowledge management, information governance, eDiscovery, and IT operations analytics.  Yotta created YottaSearch to earn revenue from the burgeoning big data market, especially the enterprise search end.

The market is worth $1.7 billion, so Yotta has a lot of competition, but if it offers something different and better than its rivals, it stands a chance to rise to the top.

Whitney Grace, May 26, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Is Collaboration the Key to Big Data Progress?

May 22, 2015

The article titled “Big Data Must Haves: Capacity, Compute, Collaboration” on GCN offers insights into the best areas of focus for big data researchers. The Internet2 Global Summit is in D.C. this year with many exciting panelists who support the emphasis on collaboration in particular. The article mentions the work being presented by several people, including Clemson professor Alex Feltus:

“…his research team is leveraging the Internet2 infrastructure, including its Advanced Layer 2 Service high-speed connections and perfSONAR network monitoring, to substantially accelerate genomic big data transfers and transform researcher collaboration…Arizona State University, which recently got 100 gigabit/sec connections to Internet2, has developed the Next Generation Cyber Capability, or NGCC, to respond to big data challenges.  The NGCC integrates big data platforms and traditional supercomputing technologies with software-defined networking, high-speed interconnects and visualization for medical research.”

Arizona’s NGCC provides the essence of the article’s claims, stressing capacity with Internet2, several types of computing, and of course collaboration between everyone at work on the system. Feltus commented on the importance of cooperation in Arizona State’s work, suggesting that personal relationships outweigh individual successes. He claims his own teamwork with network and storage researchers helped him find new potential avenues of innovation that might not have occurred to him without thoughtful collaboration.

Chelsea Kerwin, May 22, 2015

Stephen E Arnold, Publisher of CyberOSINT at www.xenky.com

Explaining Big Data Mythology

May 14, 2015

Mythologies usually develop over the course of centuries, but big data has only been around for (arguably) a couple of decades, at least in its modern incarnation.  Recently big data has received a lot of media attention and product development, which was enough time for the Internet to create a big data mythology.  The Globe and Mail wanted to dispel some of the bigger myths in the article “Unearthing Big Myths About Big Data.”

The article draws on Prof. Joerg Niessing’s big data expertise as he explains the truth behind many of the biggest big data myths.  One of the biggest points Niessing wants people to understand is that gathering data does not automatically equal dollar signs; you have to be active with your data:

“You must take control, starting with developing a strategic outlook in which you will determine how to use the data at your disposal effectively. “That’s where a lot of companies struggle. They do not have a strategic approach. They don’t understand what they want to learn and get lost in the data,” he said in an interview. So before rushing into data mining, step back and figure out which customer segments and what aspects of their behavior you most want to learn about.”

Niessing says that big data is not really big but made up of many diverse data points.  Big data also does not have all the answers; instead it provides ambiguous results that need to be interpreted.  Have the questions you want answered ready before gathering data.  Also, not all of the data returned is useful; some of it is actually garbage and cannot be used for a project.  Several other myths are debunked, but the truth remains that having a strategic big data plan in place is the best way to make the most of big data.

Whitney Grace, May 14, 2015

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Hoping to End Enterprise Search Inaccuracies

May 1, 2015

Enterprise search is limited by how well users tag their content and by the preloaded taxonomies.  According to TechTarget’s Search Content Management blog, text analytics might be the key to turning around poor enterprise search performance: “How Analytics Engines Could Finally Relieve Enterprise Pain.”  Text analytics turns out to be only part of the solution.  Someone had the brilliant idea to apply text analytics to classification issues in enterprise search, making search proactive about queries rather than merely reactive to user input.

In general, analytics search engines work like this:

“The first is that analytics engines don’t create two buckets of content, where the goal is to identify documents that are deemed responsive. Instead, analytics engines identify documents that fall into each category and apply the respective metadata tags to the documents.  Second, people don’t use these engines to search for content. The engines apply metadata to documents to allow search engines to find the correct information when people search for it. Text analytics provides the correct metadata to finally make search work within the enterprise.”
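That proactive tagging step can be sketched in a few lines. The taxonomy and keyword lists below are invented for illustration; a real analytics engine would use trained classifiers over full document text, not hand-written keyword sets.

```python
# Toy keyword taxonomy standing in for an analytics engine's categories.
TAXONOMY = {
    "contracts": {"agreement", "clause", "party"},
    "hr":        {"employee", "leave", "payroll"},
}

def tag_document(text):
    """Apply every category whose keywords appear as metadata tags.

    Tagging happens at ingest time, before anyone searches, so the
    search engine later finds documents by clean metadata rather than
    by whatever tags a human remembered to add.
    """
    words = set(text.lower().split())
    return sorted(cat for cat, keywords in TAXONOMY.items() if words & keywords)

doc = "The agreement binds each party to the payroll clause"
tags = tag_document(doc)
print(tags)
```

The key design point from the quote survives even in this toy: the engine writes metadata, and humans never touch the tags.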

Supposedly, the tagging issue is being fixed by removing the biggest cause of error: humans. Microsoft caught on to how much profit this could generate, so it purchased Equivio in 2014 and integrated the FAST Search platform into SharePoint.  Since Microsoft is doing it, every other tech company will copy and paste its actions in time.  Enterprise search is full of faults, but it has improved greatly.  Big data trends have improved search quality, but tagging continues to be an issue.  Text analytics search engines will probably be the next big data field for development. Hint for developers: work on an analytics search product, launch it, and then it might be bought out.

Whitney Grace, May 1, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

AI May Give Edge to Small and Medium Businesses

April 7, 2015

Over at the B2B News Network, writer Rick Delgado shares some observations about the use of data-related AI in small and medium-sized businesses in his piece, “Building Business Intelligence Through Artificial Intelligence.” He asserts that using AI-enhanced data analysis can help such companies compete with the big players. He writes:

“Most smaller companies don’t have experienced IT technicians and data scientists familiar with the language required for proper data analysis. Having an AI feature allows employees to voice questions as they would normally talk, and even allows for simple-to-understand responses, as opposed to overly technical insights. The ability to understand a program is key to its functionality, and AI shortens the learning curve allowing organizations to get to work faster.”

The article observes that AI can help with sales and marketing by, for example, narrowing down leads to the most promising prospects. It can also make supply chains more efficient. Delgado notes that, though existing supply-chain tools are not very adaptable, he believes they will soon automatically adjust for changing factors like transportation costs or commodity prices around the world. The article concludes:

“Any attempt to predict how AI will evolve over the coming years is a fool’s errand, because every new discovery leads to countless possibilities. What we do know is that AI won’t remain restricted to just improving sales and organizational supply chain. Already we see its availability to everyday users with announcements like Microsoft combining AI with Windows. Experts are also exploring other possibilities, like using AI to improve network security, law enforcement and robotics. The important takeaway is that the combination of Big Data and AI will allow for rapid decisions that don’t require constant human oversight, improving both efficiency and productivity.”

Wonderful! We would caution our dear readers to look before they leap, however. To avoid wasting time and money, a company should know just what they need from their software before they go shopping.

Cynthia Murrell, April 7, 2015

Stephen E Arnold, Publisher of CyberOSINT at www.xenky.com

Apache Sparking Big Data

April 3, 2015

Apache Spark is an open source cluster computing framework that rivals MapReduce.  Venture Beat says that people did not pay much attention to Apache Spark when it was first invented at the University of California’s AMPLab in 2011.  The article “How An Early Bet On Apache Spark Paid Off Big” reports that big data open source supporters are adopting Apache Spark because of its superior capabilities.

People with big data plans want systems that process real-time information at a fast pace, and they want a whole lot of it done at once.  MapReduce can do this, but it was not designed for it.  It is all right for batch processing, but it is slow and much too complex to be a viable solution.

“When we saw Spark in action at the AMPLab, it was architecturally everything we hoped it would be: distributed, in-memory data processing speed at scale. We recognized we’d have to fill in holes and make it commercially viable for mainstream analytics use cases that demand fast time-to-insight on hordes of data. By partnering with AMPLab, we dug in, prototyped the solution, and added the second pillar needed for next-generation data analytics, a simple to use front-end application.”

ClearStory Data was built on Apache Spark to access data quickly, deliver key insights, and keep the UI user friendly.  People who use Apache Spark want immediate information from a variety of sources that can be turned to profit.  Apache Spark might ignite the fire for the next wave of big data analytics.
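The “in-memory data processing” advantage mentioned above can be sketched in plain Python, without Spark itself: load a data set once, keep it cached, and run several analyses over it with no intermediate disk writes between passes (a disk-bound batch system like classic MapReduce rereads and rewrites between jobs). The records are invented for illustration.

```python
# A cached data set, playing the role of a Spark RDD held in memory.
records = [("widget", 3), ("gadget", 7), ("widget", 2)]

def reduce_by_key(pairs):
    """Sum values per key, the classic reduce step of a word-count-style job."""
    totals = {}
    for key, value in pairs:
        totals[key] = totals.get(key, 0) + value
    return totals

# Two analyses over the same in-memory data, no reload between them.
totals = reduce_by_key(records)
top_item = max(totals, key=totals.get)
print(totals, top_item)
```

In real Spark the same reuse is what `RDD.cache()` buys you: iterative or interactive workloads hit memory on every pass instead of HDFS.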

Whitney Grace, April 3, 2015
Stephen E Arnold, Publisher of CyberOSINT at www.xenky.com
