Basho Releases Riak 1.2

August 8, 2012

Basho proclaims, “Riak 1.2 Is Official!” Riak is the powerful open source, distributed database behind many scalable, data-intensive Web, mobile, and e-commerce applications. The software’s newest version has creator Basho celebrating. There are several new features; the write up specifies:

  • “More efficiently add multiple Riak nodes to your cluster
  • Stage and review, then commit or abort cluster changes for easier operations; plus smoother handling of rolling upgrades
  • Better visibility into active handoffs
  • Repair Riak KV and Search partitions by attaching to the Riak Console and using a one-line command to recover from data corruption/loss
  • More performant stats for Riak; the addition of stats to Riak Search
  • 2i and Search usage thru the Protocol Buffers API
  • Official Support for Riak on FreeBSD
  • In Riak Enterprise: SSL encryption, better balancing and more granular control of replication across multiple data centers, NAT support”

The write up details Riak’s latest innovations in areas like cluster management, partition rebuilding, and LevelDB performance improvements. I highly recommend checking out the article for more information.
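Of particular interest to developers is the new secondary index (2i) and Search access over the Protocol Buffers API. The snippet below is a minimal sketch, assuming the Basho-maintained Python client; the bucket name, index field, and connection arguments are illustrative and vary by client version.

```python
import riak

# Connect over the Protocol Buffers interface (8087 is the conventional
# PB port; constructor arguments differ across client versions).
client = riak.RiakClient(protocol='pbc', pb_port=8087)

bucket = client.bucket('orders')  # hypothetical bucket

# Store an object with a secondary index (2i) entry.
obj = bucket.new('order-1001', data={'total': 42.50})
obj.add_index('customer_bin', 'acme-corp')
obj.store()

# Query the secondary index over the same Protocol Buffers connection --
# the access path Riak 1.2 adds to the PB API.
for key in bucket.get_index('customer_bin', 'acme-corp'):
    print(bucket.get(key).data)
```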

Basho ends their post with a thank-you to their open source community, and, naturally, a petition for feedback on the newest version of Riak. The company was founded in 2008, and is headquartered in Cambridge, Massachusetts. Customers, from start-ups to Fortune 500 companies, use Riak to implement global session stores and to manage large amounts of structured and unstructured data.

Cynthia Murrell, August 08, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

Open Source Wars

August 5, 2012

Is the open source community losing its civility? The Register declares, “Oracle Hurls MySQL at Microsoft Database Wobblers.” It seems Oracle hopes to win over Microsoft SQL Server users by touting the financial advantages of MySQL. The company also vows that migration will be a breeze. The article reveals:

“Oracle claimed the migration tool would also shift database tables and data to MySQL and ‘quickly’ convert existing apps.

“Pushing its case for conversion, Oracle claimed MySQL would reduce total cost of ownership for database customers by up to 90 per cent when compared to Microsoft’s SQL Server 2012.

“Oracle is also pushing its database as a back-end to Microsoft’s Excel.

“With the migration tool Oracle is also offering a MySQL for Excel plug-in, which it said would allow data analysts to play with data in Microsoft’s spreadsheet without needing to know MySQL.”

Will the promotion of MySQL result in a coup for Oracle? Perhaps, but currently Microsoft SQL Server 2012 is doing very well as one of Microsoft’s fastest-growing products. That platform only runs on Windows, though, which could be a big point in Oracle’s favor.

Cynthia Murrell, August 4, 2012

Sponsored by ArnoldIT.com, developer of Augmentext

Oracle Text Makes Search Scores Adjustable

July 29, 2012

Oracle Text lets you sort search results by score, according to IT Newscast’s article, “Adjusting the Score on Oracle Text search results.”

The article explains the process in layman’s terms:

“In theory, the more relevant the search term is to the document, the higher ranked Score it should receive. But in practice, the relevancy score can seem somewhat of a mystery. It’s not entirely clear how it ranks the importance of some documents over others based on the search term. And often times, once a word appears a certain number of times within a document, the Score simply maxes out at 100 and the top results can be difficult to discern from one another.”

Oracle Text uses standard SQL to index, search, and analyze text stored both in the Oracle database and on the Web. The software supports keyword search, context queries, Boolean operations, mixed thematic queries, HTML/XML section searching, and more.
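Because the interface is standard SQL, sorting results by relevance is simply a matter of selecting and ordering by SCORE. Here is a minimal sketch using the cx_Oracle driver; the connection details, table, and column names are placeholders.

```python
import cx_Oracle  # assumes the Oracle client libraries are installed

# Placeholder credentials and connect string.
conn = cx_Oracle.connect("scott", "tiger", "localhost/orclpdb")
cur = conn.cursor()

# CONTAINS() queries a CONTEXT index; SCORE(1) returns the relevance
# score (capped at 100) for the CONTAINS call labeled 1, so results
# can be ordered most-relevant-first.
cur.execute("""
    SELECT id, title, SCORE(1) AS relevance
      FROM documents
     WHERE CONTAINS(body, :term, 1) > 0
     ORDER BY SCORE(1) DESC
""", term="database")

for doc_id, title, relevance in cur:
    print(doc_id, title, relevance)
```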

It can also perform linguistic analysis and supports multiple languages with its advanced relevance-ranking technology. Additional features, such as clustering and classification, are available for those who need even more advanced search methods.

Oracle has been a leader in database software for more than three and a half decades, so its expertise in adjusting search results should not come as a shock. Oracle is one company that will probably remain on top in enterprise-grade applications and platform services.

Jennifer Shockley, July 29, 2012

Sponsored by IKANOW

How to use Oracle Full Text Search in an Entity Framework

July 25, 2012

Oracle has the solutions, but how do you use Oracle full text search in an entity framework? We are not sure what this means, but the info you need can be found in Devart’s article, “Using Oracle Full-Text Search in Entity Framework.”

Devart began with:

“We decided to meet the needs of our users willing to take advantage of the full-text search in Entity Framework and implemented the basic Oracle Text functionality in our Devart dotConnect for Oracle ADO.NET Entity Framework provider. For working with Oracle Text specific functions in LINQ to Entities queries, the new OracleTextFunctions class is used, which is located in the Devart.Data.Oracle.Entity.dll assembly.”

It enables working with such Oracle Text functions as:

  • CONTAINS
  • CATSEARCH
  • MATCHES
  • SCORE
  • MATCH_SCORE
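Devart’s OracleTextFunctions wraps these operators for LINQ to Entities, but they ultimately translate into ordinary Oracle Text SQL. As a rough, non-Devart sketch of the less familiar pair: MATCHES runs an incoming document against stored queries in a CTXRULE index, and MATCH_SCORE reports how strongly each rule matched. The table and column names below are made up.

```python
import cx_Oracle

# Placeholder credentials; routing_rules and query_text are hypothetical names.
conn = cx_Oracle.connect("scott", "tiger", "localhost/orclpdb")
cur = conn.cursor()

# MATCHES() tests a document against the stored queries in a CTXRULE
# index; MATCH_SCORE(1) reports the strength of each matching rule.
cur.execute("""
    SELECT rule_id, MATCH_SCORE(1) AS strength
      FROM routing_rules
     WHERE MATCHES(query_text, :doc, 1) > 0
""", doc="Using Oracle full-text search from Entity Framework")

for rule_id, strength in cur:
    print(rule_id, strength)
```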

Devart presents a very detailed sales pitch for OraDirect, or dotConnect as they are calling it now. Whatever the name, the gist is that the software offers native connectivity to the Oracle database, tools, and technology. Devart also offers a customized set of its own tools to increase Dataset productivity, such as the Dataset Wizard and Dataset Manager.

If you can decipher their article, then the wisdom of the Oracle is yours. For the most part this article reads like a coder handbook, and I am not a coder. If you happen to speak that very enlightened language, you will probably grasp Devart’s meaning a lot quicker than this gosling. If not, maybe the Oracle will see you some other day.

Jennifer Shockley, July 25, 2012

Sponsored by IKANOW

Attivio Casts Light on SQL Shadow

July 23, 2012

It is no longer structured versus unstructured data. It is now SQL versus NoSQL, and some are advising companies to stick with SQL. The good news is… you do not have to. There is a company that says we can do it all.

That company is Attivio, and they cover the basics in their article, “SQL versus NoSQL – Why Not Have the Best of Both Worlds?” Attivio can work with SQL, NoSQL, and prototypes, and they have a patented JOIN operator. They can search and utilize relationships between records of any type, and can do so on the fly.

The pièce de résistance is their AIE. This is where Attivio says they really look good:

“Underneath Attivio’s support for SQL is an index structure that supports massive, linear scalability. We recently helped a customer index 300 million documents; its 1.7 TB of data, sharded across three servers. The entire set is replicated to three additional machines to provide fault tolerance and additional query capacity. Our support for non-collocated JOINs means that you can STILL query the entire data set. You don’t have to manage the content on each shard, either. AIE takes care of all of that for you.”
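Attivio does not spell out the mechanics, but conceptually a query over a data set sharded across several servers is a fan-out-and-merge operation. The sketch below is purely illustrative; it is not AIE code, and the data and scores are invented.

```python
from concurrent.futures import ThreadPoolExecutor

def search_shard(shard, query):
    """Stand-in for a per-shard index lookup."""
    return [hit for hit in shard if query in hit["text"]]

def search_all(shards, query):
    # Fan the query out to every shard in parallel, then merge.
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(lambda s: search_shard(s, query), shards))
    merged = [hit for partial in partials for hit in partial]
    # A real engine would apply a global relevance sort across shards here.
    return sorted(merged, key=lambda h: h["score"], reverse=True)

shards = [
    [{"text": "sql and nosql in one index", "score": 3}],
    [{"text": "join across record types", "score": 1}],
    [{"text": "sql support with linear scalability", "score": 2}],
]
print(search_all(shards, "sql"))
```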

NoSQL users were lost in the shadows of multi-media doubt, but Attivio cast some light in their direction. There you go. Attivio offers a definitive solution.

Jennifer Shockley, July 23, 2012

Sponsored by Polyspot

An Ode to Databases

July 4, 2012

The Damien Katz blog recently published a piece on the importance of databases, not unlike a love letter, called “Why Database Technology Matters.”

After rambling on for several paragraphs about how he finds databases so fascinating, Katz narrows in on his thesis: databases are one of humanity’s most significant advancements, as important as telecommunications and the Internet, as well as libraries (the first non-digital databases).

After making this point, Katz goes on to discuss the databases created by IBM, Google, and Oracle. He writes:

“When IBM was at the absolute height of its power, they were the richest, most powerful company on the planet. They primarily sold mainframes for a lot of money, and at the core of those mainframes were big database engines, providing a big competitive advantage their customers gladly paid for.

Google has created a database indexing of the internet. They are a force because they found ways to find meaning in the massive amounts of information already available. They are a very visible example of changing the way humanity thinks.”

This piece makes some interesting points regarding the impact that databases have had on our society. I admire Katz’s passion for the subject.

Jasmine Ashton, July 4, 2012

Sponsored by IKANOW

Walmart Suffers Loss in Its Innovation and Data Arm

July 4, 2012

Gigaom’s Eliza Kern reported on a significant loss from Walmart’s technology arm in the article “WalmartLabs Loses Kosmix Founders.”

According to the article, after a year of working in WalmartLabs as the innovation and data arm of the retail giant, Kosmix co-founders Venky Harinarayan and Anand Rajaraman have announced that they are leaving to “take time off from the industry.”

Kosmix, a social media startup acquired by the retail giant in 2011, gave Walmart the ability to aggregate social media and information about a particular topic, allowing the company to analyze large sets of data and predict the buying habits of customers.

When explaining the goal behind WalmartLabs last November, Harinarayan said:

“At Walmart Labs, we’re building a big and fast data group to combine store data with social media data in some meaningful way. For example, a Wal-Mart buyer in Arkansas doesn’t know the optimal time to stock football merchandise in Wisconsin. That buyer can look to the social streams to see when people in that region are tweeting about football or their favorite teams. Monitoring social media can even help Wal-Mart find breakout products.”

It looks like Walmart has some big shoes to fill in its technical lineup. Hopefully, the next group will stay longer than a year.

Jasmine Ashton, July 4, 2012

Sponsored by PolySpot

Could Sepaton Have Duped the Deduping Competition?

June 23, 2012

Sepaton just called ‘game over’ on de-duplication competitors. Their newly released software will open doors for database de-duping the likes of which have never been seen, according to “Sepaton Update Tackles Large-Enterprise Database Deduplication.”

Additional storage options are always welcomed by customers, so Sepaton’s clients should be content. The DeltaStor DbeXtreme should provide the flexibility to make some interesting waves in the industry.

Jason Buffington of Enterprise Strategy Group stated:

“If you ask a DBA how to best back up large data sets, they will tell you to ‘turn ON multi-streaming.’ Customers don’t have to choose between multi-streaming, multiplexing and capacity reduction through higher de-dupe. Sepaton’s customers can set data reduction ratios and storage utilization by client and backup job.”

This is a software-only release for now, but storage and servers will become available within the next six months to a year. At that point, customers will see an extreme boost in performance and security. Sepaton has been testing for a while; based on initial trials, software performance increases by a factor of two and throughput by 20 percent, so there is room for improvement.

The DeltaStor DbeXtreme software is unique because it eliminates the tradeoff between backup performance and the de-duplication process. Their database de-duplication does not use hashing; instead, it analyzes the data after receipt, as it is gathered in the storage pool, eliminating redundant elements in a way many other solutions simply cannot. If this software functions up to expectations, then Sepaton has duped the competition.
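Sepaton does not detail its content-aware method, but for contrast, here is a minimal sketch of the conventional hash-based chunk deduplication that DeltaStor reportedly avoids; the chunk size and sample data are arbitrary.

```python
import hashlib

CHUNK_SIZE = 4096

def dedupe(stream):
    """Conventional hash-based chunk deduplication, shown for contrast."""
    store = {}   # digest -> chunk (the "storage pool")
    recipe = []  # ordered digests needed to rebuild the original stream
    for start in range(0, len(stream), CHUNK_SIZE):
        chunk = stream[start:start + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:  # only previously unseen chunks are stored
            store[digest] = chunk
        recipe.append(digest)
    return store, recipe

data = (b"A" * CHUNK_SIZE) * 20 + b"B" * CHUNK_SIZE  # highly redundant backup stream
store, recipe = dedupe(data)
print(len(data), "bytes in,", sum(len(c) for c in store.values()), "bytes actually stored")
```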

Jennifer Shockley, June 23, 2012

Sponsored by IKANOW

Big Thoughts on Big Data

May 26, 2012

CorrelSense recently reported on one of the hottest IT trends to date in the article, “Big Data is Truly Transforming the Enterprise.”

According to the MIT’s principal research scientist, Andrew McAfee, Big Data can be likened to the invention of the Microscope in the sense that it exposes information that we couldn’t have found before the way that the Microscope allows you to view things that previously could not be seen.

The article states:

“As IT Pros, you are going to have to learn to process this big data and find tools for the non-technical experts and suits in the C-Suite to mix and match the data. The big difference between this and traditional business intelligence is that with BI you were looking back where you were at a given point in time, whereas with Big Data, you can analyze data in real time and begin to make more intelligent decisions about where to put your resources at any given moment.”

Rather than reducing jobs, as many people fear technological progress may do, Big Data will create them. We are obviously going to need more people to sift through this growing pile of data.

Jasmine Ashton, May 26, 2012

Sponsored by PolySpot

SAP Big Blue Rides HANA

May 25, 2012

The University of Kentucky’s business intelligence team has had to make some adjustments after the school implemented SAP’s HANA system. ComputerWorld declares, “For Univ. of Kentucky, SAP’s HANA is ‘Disruptive’.” Writer Patrick Thibodeau, punning on the term “disruptive technology,” notes that the University is (purposely) using HANA to restructure its BI system to better analyze student retention.

The new in-memory systems like HANA pull data from RAM instead of from hard disks. Speed and relative simplicity are the advantages, but these systems do require a hardware investment. In this case, Dell provided the hardware and developed the school’s student retention data models.

HANA is only a year old, and questions about its longevity are still in the air. Part of the issue is the hardware question: should organizations deploy on the tried-and-true x86 platform or go with an engineered system, like IBM’s new PureSystems? Thibodeau writes:

“Engineered systems offer performance gains, meaning faster time to realize value and ‘less cumbersome’ management, said Alys Woodward, a research director at IDC. On the other hand, ‘software on commodity hardware reduces vendor lock-in and enables the use of cheaper components,’ said Woodward.

“How SAP HANA ‘will play in the broader marketplace — outside SAP’s core install base — against Oracle Exadata and IBM engineered systems, depends to some extent on how these two opposing concepts will play out,’ said Woodward.”

So, x86 or engineered, take your pick. If you are considering HANA, though, the write up notes that you should make sure it will do what you want before buying the pricey software. It will not, for example, make up for poor data quality. It is also more worth the cost and effort for an organization whose business requirements change frequently than for one with a more static environment.

Cynthia Murrell, May 25, 2012

Sponsored by PolySpot
