The Uncertain Fate of OpenOffice
September 27, 2016
We are in danger of losing a popular open-source alternative to the Microsoft Office suite, we learn from the piece, “Lack of Volunteer Contributors Could Mean the End for OpenOffice” at Neowin. Could this be the fate of open source search as well?
Writer William Burrows observes that few updates for OpenOffice have emerged of late, only three since 2013, and the last stable point revision was released about a year ago. More strikingly, it took a month to patch a major security flaw over the summer, reports Burrows. He goes on to summarize OpenOffice’s 14-year history, culminating in the project’s donation to Apache by Oracle in 2011. It appears to have been downhill from there. The article tells us:
It was at this point that a good portion of the volunteer developer base reportedly moved onto the forked LibreOffice project. Since becoming Apache OpenOffice, activity on the project has diminished significantly. In a statement by Dennis Hamilton, the project’s volunteer vice president, released in an email to the mailing list, it was suggested that “retirement of the project is a serious possibility,” citing concerns that the current team of around six volunteer developers who maintain the project may not have sufficient resources to eliminate security vulnerabilities. There is still some hope for OpenOffice, though, with some of the contributors suggesting that discussion about a shutdown may be a little premature, and that attracting new contributors is still possible.
In fact, OpenOffice was downloaded over 29 million times last year, so obviously it still has a following. LibreOffice is currently considered more successful, but that could change if OpenOffice manages to attract a resurgence of developers willing to contribute to the project. Any volunteers?
Cynthia Murrell, September 27, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
There is a Louisville, Kentucky Hidden Web/Dark Web meet up on September 27, 2016.
Information is at this link: https://www.meetup.com/Louisville-Hidden-Dark-Web-Meetup/events/233599645/
Neural Networks and Thought Commands
July 22, 2015
If you’ve been waiting for the day you can operate a computer by thinking at it, check out “When Machine Learning Meets the Mind: BBC and Google Get Brainy” at the Inquirer. Reporter Chris Merriman brings our attention to two projects, one about hardware and one about AI, that stand at the intersection of human thought and machines. Neither venture is anywhere near fruition, but a peek at their progress gives us clues about the future.
The internet-streaming platform iPlayer is a service the BBC provides to U.K. residents who wish to catch up on their favorite programmes. In pursuit of improved accessibility, the organization’s researchers are working on a device that allows users to operate the service with their thoughts. The article tells us:
“The electroencephalography wearable that powers the technology requires lucidity of thought, but is surprisingly light. It has a sensor on the forehead, and another in the ear. You can set the headset to respond to intense concentration or meditation as the ‘fire’ button when the cursor is over the option you want.”
Apparently this operation is easier for some subjects than for others, but all users were able to work the device to some degree. Creepy or cool? Perhaps it’s both, but there’s no escaping this technology now.
As for Google’s undertaking, we’ve examined this approach before: the development of artificial neural networks. This is some exciting work for those interested in AI. Merriman writes:
“Meanwhile, a team of Google researchers has been looking more closely at artificial neural networks. In other words, false brains. The team has been training systems to classify images and better recognise speech by bombarding them with input and then adjusting the parameters to get the result they want.
But once equipped with the information, the networks can be flipped the other way and create an impressive interpretation of objects based on learned parameters, such as ‘a screw has twisty bits’ or ‘a fly has six legs’.”
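The training process the quote describes, bombarding a system with input and adjusting parameters until it produces the desired result, can be sketched in miniature. The following is an illustrative example, not Google's actual code: a single artificial neuron learning the logical AND function by repeated parameter adjustment.

```python
from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

def train(samples, epochs=5000, lr=0.5):
    """Repeatedly adjust the weights toward the desired output,
    the 'bombarding with input' loop the article describes."""
    w1, w2, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            out = sigmoid(w1 * x1 + w2 * x2 + b)
            err = label - out              # how wrong the prediction is
            grad = err * out * (1 - out)   # gradient of the squared error
            w1 += lr * grad * x1           # adjust the parameters
            w2 += lr * grad * x2           # to get the result we want
            b += lr * grad
    return w1, w2, b

# Learn the logical AND function from labeled examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train(data)
predictions = [round(sigmoid(w1 * x1 + w2 * x2 + b)) for (x1, x2), _ in data]
```

Scale this single neuron up to millions of parameters across many layers and you have the kind of network Google's team can then run "the other way" to generate interpretations of what it has learned.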
This brain-in-progress still draws some chuckle-worthy and/or disturbing conclusions from images, but it is learning. No one knows what the end result of Google’s neural network research will be, but it’s sure to be significant. In a related note, the article points out that IBM is donating its machine learning platform to Apache Spark. Who knows where the open-source community will take it from here?
Cynthia Murrell, July 22, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Basho Enters Ring With New Data Platform
June 18, 2015
When it comes to enterprise technology these days, it is all about making software compliant with a variety of platforms and needs. Compliance is the name of the game for Basho, says Diginomica’s article, “Basho Aims For Enterprise Operational Simplicity With New Data Platform.” Basho’s upgrade to its Riak Data Platform improves integration with related tools and simplifies complex operational environments. Data management and automation tools are another big seller for NoSQL enterprise databases, and Basho added those to the Riak upgrade as well. Basho is not the only company trying to improve NoSQL enterprise platforms; MongoDB and DataStax are working on the same problem. Basho’s advantage is delivering a solution using the Riak data platform.
Basho’s data platform already automates, or nearly automates, a variety of functions that people typically struggle to get working with a NoSQL database: Riak Search with Apache Solr, orchestration services, an Apache Spark connector, integrated caching with Redis, and simplified development through data replication and synchronization.
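One of those bundled functions, integrated caching, follows a well-known pattern: check a fast cache first and fall back to the database on a miss. The sketch below illustrates that read-through pattern with plain dicts standing in for Redis and Riak; it is an illustration of the general pattern, not Basho's implementation.

```python
cache = {}                            # stands in for Redis
database = {"k1": "v1", "k2": "v2"}   # stands in for Riak
db_reads = 0                          # count trips to the slower store

def get(key):
    global db_reads
    if key in cache:          # cache hit: skip the database entirely
        return cache[key]
    db_reads += 1             # cache miss: fall through to the database
    value = database.get(key)
    cache[key] = value        # populate the cache for next time
    return value

first = get("k1")    # miss: reads the database once
second = get("k1")   # hit: served from the cache
```

The point of "integrated" caching is that the platform wires this plumbing up for you, so developers do not have to hand-roll the hit/miss logic for every application.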
CEO Adam Wray released a canned comment along with the announcement, which indicates that this is a big leap for Basho, but also just the start of a further broadening of the platform. He said:
“This is a true turning point for the database industry, consolidating a variety of critical but previously disparate services to greatly simplify the operational requirements for IT teams working to scale applications with active workloads. The impact it will have on our users, and on the use of integrated data services more broadly, will be significant. We look forward to working closely with our community and the broader industry to further develop the Basho Data Platform.”
The article explains that the NoSQL market continues to grow, and enterprises need management as well as automation to handle the growing number of tasks databases are used for. No one has yet delivered a complete solution for all NoSQL needs, but Basho comes fairly close.
Whitney Grace, June 18, 2015
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Apache Sparking Big Data
April 3, 2015
Apache Spark is an open source cluster computing framework that rivals MapReduce. Venture Beat says that people did not pay much attention to Apache Spark when it was first created at the University of California’s AMPLab in 2011. The article, “How An Early Bet On Apache Spark Paid Off Big,” reports that big data open source supporters are adopting Apache Spark because of its superior capabilities.
People with big data plans want systems that process real-time information at a fast pace, and they want a whole lot of it processed at once. MapReduce can do this, but it was not designed for it. It is all right for batch processing, but it is slow and much too complex to be a viable real-time solution.
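The core difference is worth making concrete. MapReduce materializes intermediate results to disk between stages, while Spark keeps intermediate data in memory and chains transformations in a single pass. The sketch below is a pure-Python illustration of that contrast with made-up data, not real Hadoop or Spark code.

```python
import json
import os
import tempfile

events = [("sales", 120), ("sales", 80), ("returns", 15), ("sales", 40)]

# MapReduce style: each stage writes its output (here, to a temp file)
# before the next stage can read it -- fine for batch, slow for real time.
with tempfile.TemporaryDirectory() as d:
    stage1 = os.path.join(d, "mapped.json")
    with open(stage1, "w") as f:
        json.dump([(k, v) for k, v in events if k == "sales"], f)
    with open(stage1) as f:
        batch_total = sum(v for _, v in json.load(f))

# Spark style: the filter and the sum compose in memory, no
# intermediate storage round-trip between stages.
in_memory_total = sum(v for k, v in events if k == "sales")
```

Both pipelines compute the same answer; the in-memory version simply skips the serialize-to-storage round-trip that dominates multi-stage MapReduce jobs.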
“When we saw Spark in action at the AMPLab, it was architecturally everything we hoped it would be: distributed, in-memory data processing speed at scale. We recognized we’d have to fill in holes and make it commercially viable for mainstream analytics use cases that demand fast time-to-insight on hordes of data. By partnering with AMPLab, we dug in, prototyped the solution, and added the second pillar needed for next-generation data analytics, a simple to use front-end application.”
ClearStory Data was built using Apache Spark to access data quickly, deliver key insights, and make the UI user friendly. People who use Apache Spark want immediate information, drawn from a variety of sources, that can be utilized for profit. Apache Spark might ignite the fire for the next wave of data analytics for big data.
Whitney Grace, April 3, 2015
Stephen E Arnold, Publisher of CyberOSINT at www.xenky.com
Apache Samza Revamps Databases
March 19, 2015
Databases have advanced far beyond the basic relational model. They need to be consistently managed and updated in real time to remain useful. The Apache Software Foundation developed the Apache Samza software to help maintain asynchronous stream processing networks. Samza was made in conjunction with Apache Kafka.
If you are interested in learning how to use Apache Samza, the Confluent blog posted “Turning The Database Inside-Out With Apache Samza” by Martin Kleppmann. Kleppmann recorded a seminar he gave at Strange Loop 2014 that explains how Samza can improve many features of a database:
“This talk introduces Apache Samza, a distributed stream processing framework developed at LinkedIn. At first it looks like yet another tool for computing real-time analytics, but it’s more than that. Really it’s a surreptitious attempt to take the database architecture we know, and turn it inside out. At its core is a distributed, durable commit log, implemented by Apache Kafka. Layered on top are simple but powerful tools for joining streams and managing large amounts of data reliably.”
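The "inside out" idea in the quote, a durable commit log at the core with derived state layered on top, can be sketched in a few lines. The following is an illustrative toy, not Kafka or Samza code: an append-only log of change events (Kafka's role) replayed by a small processor (Samza's role) to build a materialized view.

```python
log = []  # durable, ordered commit log of change events

def append(event):
    """Every write goes to the log first; the log is the source of truth."""
    log.append(event)

def materialize(log):
    """Replay the log in order to derive the current state,
    the way a stream processor maintains a view."""
    view = {}
    for op, key, value in log:
        if op == "set":
            view[key] = value
        elif op == "delete":
            view.pop(key, None)
    return view

append(("set", "user:1", "alice"))
append(("set", "user:2", "bob"))
append(("delete", "user:1", None))
state = materialize(log)
```

Because the log is the source of truth, the same history can be replayed into many different views, caches, search indexes, or analytics tables, which is the architectural payoff Kleppmann's talk argues for.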
Learning new ways to improve database features and functionality always improves your skill set. Apache software also forms the basis for many open source projects and startups. Kleppmann’s talk might give you a brand new idea or at least improve your database.
Whitney Grace, March 20, 2015
Stephen E Arnold, Publisher of CyberOSINT at www.xenky.com

