IBM’s Champion Human Resources Department Announces “Permanent” Layoff Tactics
August 16, 2016
The article on Business Insider titled Leaked IBM Email Says Cutting “Redundant” Jobs Is a “Permanent and Ongoing” Part of Its Business Model explores the language and overall human resources strategy of IBM. IBM personnel in the Netherlands learned in the email that layoffs are coming, and also that layoffs will be a regular aspect of how IBM “optimizes” its workforce. The article tells us,
“IBM isn’t new to layoffs, although these are the first to affect the Netherlands. IBM’s troubled business units, like its global technology services unit, are shrinking faster than its booming businesses, like its big data/analytics, machine learning (aka Watson), and digital advertising agency are growing…All told, IBM eliminated and gained jobs in about equal numbers last year, it said. It added about 70,000 jobs, CEO Rometty said, and cut about that number, too.”
IBM seems to be performing a balancing act that involves gaining personnel in areas like data analytics while shedding employees in other areas that are less successful, or “redundant.” This allows the company to break even, although the employees it fires might feel that Watson itself could have delivered the news more gracefully and with more tact than the IBM HR department did. At any rate, we assume that IBM’s senior management asked Watson what to do and that this permanent-layoff strategy was the informed answer provided by the supercomputer.
Chelsea Kerwin, August 16, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
There is a Louisville, Kentucky Hidden/Dark Web meet up on August 23, 2016.
Information is at this link: https://www.meetup.com/Louisville-Hidden-Dark-Web-Meetup/events/233019199/
IBM Cognitive Storage Creates a Hierarchy of Data Value
August 5, 2016
The article on eWeek titled IBM Introduces Cognitive Storage reveals advances in storage technology. Storage may sound less sexy than big data, but it is an integral part of our ability to sort and retrieve data based on the metric of data value. A computer able to determine a hierarchy of data value could also locate and archive unimportant data, freeing up space for data of more relevance. The article explains,
“In essence, the concept helps computers to learn what to remember and what to forget, IBM said… “With rising costs in energy and the explosion in big data, particularly from the Internet of Things, this is a critical challenge as it could lead to huge savings in storage capacity, which means less media costs and less energy consumption… if 1,000 employees are accessing the same files every day, the value of that data set should be very high.”
Frequency of use is a major factor in determining data value, so IBM created trackers to monitor this sort of metadata. Interestingly, the article states that IBM’s cognitive computing was inspired by astronomy. An astronomer would tag incoming data sets from another galaxy as “highly important” or less so. So what happens to the less important data? It isn’t destroyed, but rather relegated to what Charles King of Pund-IT calls a “deep freeze.”
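The mechanics are easy to picture. Here is a minimal sketch, in Python, of how frequency- and recency-based scoring might sort data sets into hot storage and a “deep freeze” archive; the class, scoring formula, and threshold are our own illustrative assumptions, not IBM’s implementation.

```python
from collections import Counter
from datetime import datetime, timedelta

# Illustrative sketch only (not IBM's system): score data sets by how often
# and how recently they are accessed, then tier low-value ones into a
# "deep freeze" archive.
class AccessTracker:
    def __init__(self):
        self.accesses = Counter()   # data_set_id -> access count
        self.last_seen = {}         # data_set_id -> last access time

    def record(self, data_set_id, when=None):
        when = when or datetime.utcnow()
        self.accesses[data_set_id] += 1
        self.last_seen[data_set_id] = when

    def value(self, data_set_id, now=None):
        """Higher value for data sets touched often and recently."""
        now = now or datetime.utcnow()
        count = self.accesses[data_set_id]
        age_days = (now - self.last_seen.get(data_set_id, now)).days
        return count / (1 + age_days)

def tier(tracker, data_set_ids, threshold=10.0):
    """Split data sets into hot storage and a 'deep freeze' archive."""
    hot, frozen = [], []
    for ds in data_set_ids:
        (hot if tracker.value(ds) >= threshold else frozen).append(ds)
    return hot, frozen

if __name__ == "__main__":
    tracker = AccessTracker()
    for _ in range(1000):   # 1,000 employees open the same file today
        tracker.record("quarterly_report.xlsx")
    tracker.record("old_galaxy_survey.fits",
                   when=datetime.utcnow() - timedelta(days=365))
    print(tier(tracker, ["quarterly_report.xlsx", "old_galaxy_survey.fits"]))
```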
Chelsea Kerwin, August 5, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
The Watson Update
July 15, 2016
IBM invested a lot of time, money, and resources into developing the powerful artificial intelligence computer Watson. The company has been trying for years to justify the expense as well as make money off its invention, mostly by having Watson try every conceivable industry that could benefit from big data, from cooking to medicine. We finally have an update on Watson, says ZDNet in the article “IBM Talks About Progress On Watson, OpenPower.”
Watson is a cognitive computer system that learns, supports natural user interfaces, values user expertise, and evolves with new information. Evolving is the most important step, because that is what allows Watson to keep gaining experience and learning. When Watson was first developed, IBM fed it general domain knowledge, then built Watson Discovery to find answers to specific questions. This has been used in the medical field to digest all the information being created and apply it to practice.
IBM also did this:
“Most recently IBM has been focused on making Watson available as a set of services for customers that want to build their own applications with natural question-and-answer capabilities. Today it has 32 services available on the Watson Developer Cloud hosted on its Bluemix platform-as-a-service… Now IBM is working on making Watson more human. This includes a Tone Analyzer (think of this as a sort spellchecker for tone before you send that e-mail to the boss), Emotion Analysis of text, and Personality Insights, which uses things you’ve written to assess your personality traits.”
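For a sense of how a developer might wire one of these hosted services into an application, here is a minimal Python sketch of a plain HTTP call to a tone-analysis endpoint. The URL, credentials, and payload fields are placeholders for illustration only; they are not IBM’s documented Watson API.

```python
import json
import urllib.request

# Minimal sketch of calling a hosted text-analysis service over HTTP.
# The endpoint, key, and response fields below are hypothetical placeholders,
# not IBM's documented Watson Developer Cloud API.
SERVICE_URL = "https://example.com/tone-analyzer/v1/analyze"  # hypothetical
API_KEY = "replace-with-your-key"                             # hypothetical

def analyze_tone(text):
    payload = json.dumps({"text": text}).encode("utf-8")
    request = urllib.request.Request(
        SERVICE_URL,
        data=payload,
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer " + API_KEY},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    # Check the tone of an e-mail draft before sending it to the boss.
    result = analyze_tone("I need that report on my desk by Friday. No excuses.")
    print(result)
```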
Cognitive computing has come very far since Watson won Jeopardy. Pretty soon the technology will be more integrated into our lives. The bigger question is how it will change society and how we live.
Whitney Grace, July 15, 2016
There is a Louisville, Kentucky Hidden Web/Dark Web meet up on July 26, 2016. Information is at this link: http://bit.ly/29tVKpx.
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
What Could Possibly Go Wrong?
July 13, 2016
The Atlantic’s article “Technology, The Faux Equalizer” describes how technology is limited to the very wealthy and does not level the playing field. In some ways, new technology can be a nuisance to the average person trying to scratch out a living in an unfriendly economy. Self-driving cars are one fear, but did you ever think bankers and financial advisors would have to compete with algorithms? The International Business Times shares, “Will Financial Analysts Lose Their Jobs To Intelligent Trading Machines?”
Machine learning software can crunch numbers faster and spot more patterns than a human can. Hedge fund companies have hired data scientists, physicists, and astronomers to remove noise from data and help program the artificial intelligence software. The article used Bridgewater Associates as an example of a financial institution making strides in automating its trading:
“Using Bridgewater as an example, Sutton told IBTimes UK: ‘If you look at their historic trading strategies, it’s been very much long-term bets around what’s happening at a macro level. They have built their entire business on having some of the best research and analytics in the industry and some of the smartest minds thinking on that. When you combine those two things, I would definitely expect artificial intelligence to be applied to identify large-scale trades that might not be evident to an individual researcher.’”
Developing artificial intelligence for the financial sector has already drawn the attention of private companies and could lead to a 30% loss of jobs due to digitization. It would allow financial companies a greater range of information with which to advise their clients on wise financial choices, but it could also mean these institutions lose talent, since part of the analyst’s role was to groom more talent.
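To make the “noise removal” idea concrete, here is a minimal Python sketch that smooths a synthetic price series with a moving average and flags a large-scale trend. The window, threshold, and data are illustrative assumptions; this is not any firm’s actual strategy.

```python
import random

# Illustrative sketch of smoothing a noisy price series and flagging a
# large-scale trend. Purely educational; no real trading strategy implied.
def moving_average(series, window):
    """Simple trailing moving average over the given window."""
    out = []
    for i in range(len(series)):
        lo = max(0, i - window + 1)
        out.append(sum(series[lo:i + 1]) / (i + 1 - lo))
    return out

def trend_signal(smoothed, lookback=20, threshold=0.02):
    """Return 'long', 'short', or 'flat' based on the smoothed series."""
    if len(smoothed) <= lookback:
        return "flat"
    change = (smoothed[-1] - smoothed[-lookback]) / smoothed[-lookback]
    if change > threshold:
        return "long"
    if change < -threshold:
        return "short"
    return "flat"

if __name__ == "__main__":
    random.seed(1)
    price, prices = 100.0, []
    for _ in range(250):                      # one synthetic trading year
        price += 0.05 + random.gauss(0, 1.0)  # slight upward drift plus noise
        prices.append(price)
    smoothed = moving_average(prices, window=20)
    print(trend_signal(smoothed))             # prints 'long', 'short', or 'flat'
```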
These financial firms will probably be more potential clients for IBM’s Watson. We should all just give up now and hail our robot overlords.
Whitney Grace, July 13, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
The Computer Chip Inspired by a Brain
July 6, 2016
Artificial intelligence is humanity’s attempt to replicate the complicated thought processes of our own brains through technology. IBM is trying to duplicate the human brain, and it has been successful in many ways with its supercomputer Watson. TechRepublic reports that IBM has another success under its belt, but to what end? Check out the article, “IBM’s Brain-Inspired Chip TrueNorth Changes How Computers ‘Think,’ But Experts Question Its Purpose.”
IBM’s TrueNorth is the first computer chip with a one million neuron architecture. The chip is a collaboration between Cornell University and IBM under the DARPA SyNAPSE program, using $100 million in public funding. Most computer chips use the Von Neumann architecture, but the TrueNorth chip better replicates the human brain. TrueNorth is also more energy efficient.
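To picture what a “neuron architecture” means in practice, here is a minimal Python sketch of a leaky integrate-and-fire neuron, the kind of spiking unit that neuromorphic designs loosely model. The parameters are illustrative assumptions; this is not TrueNorth’s actual design, which packs a million such units into silicon.

```python
# Illustrative sketch of a leaky integrate-and-fire neuron. It shows the
# event-driven idea behind spiking hardware; it is not TrueNorth's design.
def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Return spike times (indices) for a list of input current values."""
    potential, spikes = 0.0, []
    for t, current in enumerate(input_current):
        potential = potential * leak + current   # integrate input, leak charge
        if potential >= threshold:               # fire when threshold crossed
            spikes.append(t)
            potential = reset                    # reset after the spike
    return spikes

if __name__ == "__main__":
    # A burst of input followed by silence: the neuron only spikes while it
    # is being driven, which is the intuition behind low-power, "always-on"
    # event-driven chips.
    currents = [0.3] * 10 + [0.0] * 10
    print(simulate_lif(currents))   # prints the spike times, e.g. [3, 7]
```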
What is the purpose of the TrueNorth chip, however? IBM created an elaborate ecosystem that uses many state-of-the-art processes, but people are still wondering what the real-world applications are:
“ ‘…it provides ‘energy-efficient, always-on content generation for wearables, IoT devices, smartphones.’ It can also give ‘real-time contextual understanding in automobiles, robotics, medical imagers, and cameras.’ And, most importantly, he said, it can ‘provide volume-efficient, unprecedented neural network acceleration capability per unit volume for cloud-based streaming processing and provide volume, energy, and speed efficient multi-modal sensor fusion at an unprecedented neural network scale.’”
Other applications include cyber security, other defense goals, and large-scale computing and hardware running on the cloud. While there might be practical applications, people still want to know why IBM made the chip.
” ‘It would be as if Henry Ford decided in 1920 that since he had managed to efficiently build a car, we would try to design a car that would take us to the moon,’ [said Nir Shavit, a professor at MIT’s Computer Science and Artificial Intelligence Laboratory]. ‘We know how to fabricate really efficient computer chips. But is this going to move us towards Human quality neural computation?’ Shavit fears that its simply too early to try to build neuromorphic chips. We should instead try much harder to understand how real neural networks compute.’”
Why would a car need to go to the moon? It would be fun to go to the moon, but it doesn’t serve a practical purpose (unless we build a civilization on the moon, although we are a long way from that). The article continues:
” ‘The problem is,’ Shavit said, ‘that we don’t even know what the problem is. We don’t know what has to happen to a car to make the car go to the moon. It’s perhaps different technology that you need. But this is where neuromorphic computing is.’”
In other words, it is the theoretical physics of computer science.
Whitney Grace, July 6, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Supercomputers Have Individual Personalities
July 1, 2016
Supercomputers like Watson are more than a novelty. They were built to be another tool for humans rather than a replacement for humans altogether, or so read some comments from Watson’s chief technology officer Rob High. High was a keynote speaker at the Nvidia GPU Technology Conference in San Jose, California. The Inquirer shares the details in “Nvidia GTC: Why IBM Watson Dances Gangnam Style And Sings Like Taylor Swift.”
At the conference, High said that he did not want his computer to take over his thinking; instead, he wanted the computer to do his research for him. Research and keeping up with the latest trends in any industry consumes A LOT of time, and a supercomputer could potentially eliminate some of the hassle. This requires that supercomputers become more human:
“This leads on to the fact that the way we interact with computers needs to change. High believes that cognitive computers need four skills – to learn, to express themselves with human-style interaction, to provide expertise, and to continue to evolve – all at scale. People who claim not to be tech savvy, he explained, tend to be intimidated by the way we currently interact with computers, pushing the need for a further ‘humanising’ of the process.”
In order to humanize robots, developers are teaching them how to act human. A few robots have been programmed with Watson as their main processor, and they can interact with humans. By interacting with humans, the robots pick up on human spoken language as well as body language and vocal tone. This allows a robot to learn not how to be human, but rather how to be the best “artificial servant” it can be.
Robots and supercomputers are tools that can ease a person’s job, but the fact still remains that in some industries they can also replace human labor.
Whitney Grace, July 1, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
IBM Cloud Powers Comic-Con Channel
June 30, 2016
The San Diego Comic-Con is the biggest geek and pop culture convention in the country, and it needs to be experienced to be believed. Every year the San Diego Comic-Con gets bigger and more complex as attendees and guests demand more from the purveyors. If you are at Comic-Con, you need to think big. Thinking big requires thinking differently, which is why it would only seem that “IBM And Comic-Con HQ Make Strange Bedfellows,” as Fortune puts it.
IBM announced that it has teamed with Lionsgate to run a Comic-Con HQ video channel powered by IBM’s cloud. The on-demand channel will premiere during 2016’s Comic-Con. Comic-Con attendees, and those unfortunate enough not to land a ticket, have demanded video streaming services for years, practically ever since it became possible. Copyright issues, as well as questions about how to charge attendees for the service, have kept video on demand on the back burner, but now it is going to happen, and it is going to be a challenge.
As the article notes:
“Video is a demanding application for cloud computing. Storing and shipping massive video files, often shot in ultra-high-definition 4k format, is a useful testbed to show off cloud services.”
Anything new related to Comic-Con always proves to be a hassle. One case in point: when SDCC launched its digital waiting room for purchasing tickets, it got far more traffic than its servers could handle. The end result was a lot of angry fans unable to buy tickets. Another challenge was handling the massive crowds that started flocking to the convention halls around the mid-2000s (attendance swelled around 2011 with the Twilight movies).
Anything that will improve the Comic-Con experience and even allow non-attendees a taste of the magical July event would be welcome.
Whitney Grace, June 30, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
The Unknown Future of Google Cloud Platform
June 10, 2016
While many may have the perception that Google dominates in many business sectors, a recently published graph shows a different story when it comes to cloud computing. Datamation released a story, Why Google Will Dominate Cloud Computing, which shows Google in fourth place. Amazon, Microsoft, and IBM are above the search giant in cloud infrastructure services when looking at fourth-quarter 2015 market share and revenue growth. The article explains why Google appears to be struggling,
“Yet as impressive as its tech prowess is, GCP’s ability to cater to the prosaic needs of enterprise cloud customers has been limited, even fumbling. Google has always focused more on selling its own services rather than hosting legacy applications, but these legacy apps are the engine that drives business. Remarkably, GCP customers don’t get support for Oracle software, as they do on Amazon Web Services. Alas, catering to the needs of enterprise clients isn’t about deep genius – it’s about working with others. GCP has been like the high school student with straight A’s and perfect SAT scores that somehow doesn’t have too many friends.”
Despite the current situation, the article hypothesizes Google Cloud Platform may have an edge in the long-term. This is quite a bold prediction. We wonder if Datamation may approach the goog to sell some ads. Probably not, as real journalists do not seek money, right?
Megan Feil, June 10, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
Artificial Intelligence Spreading to More Industries
May 10, 2016
According to MIT Technology Review, it has finally happened. No longer is artificial intelligence the purview of data wonks alone: “AI Hits the Mainstream,” they declare. Targeted AI software is now being created for fields from insurance to manufacturing to health care. Reporter Nanette Byrnes is curious to see how commercialization will affect artificial intelligence, as well as how this technology will change different industries.
What about the current state of the AI field? Byrnes writes:
“Today the industry selling AI software and services remains a small one. Dave Schubmehl, research director at IDC, calculates that sales for all companies selling cognitive software platforms —excluding companies like Google and Facebook, which do research for their own use—added up to $1 billion last year. He predicts that by 2020 that number will exceed $10 billion. Other than a few large players like IBM and Palantir Technologies, AI remains a market of startups: 2,600 companies, by Bloomberg’s count. That’s because despite rapid progress in the technologies collectively known as artificial intelligence—pattern recognition, natural language processing, image recognition, and hypothesis generation, among others—there still remains a long way to go.”
The article examines ways some companies are already using artificial intelligence. For example, insurance and financial firm USAA is investigating its use to prevent identity theft, while GE is now using it to detect damage to its airplanes’ engine blades. Byrnes also points to MyFitnessPal, Under Armour’s extremely successful diet and exercise tracking app. Through a deal with IBM, Under Armour is blending data from that site with outside research to help better target potential consumers.
The article wraps up by reassuring us that, despite science fiction assertions to the contrary, machine learning will always require human guidance. If you doubt it, consider recent events: Google’s self-driving car’s errant lane change and Microsoft’s racist chatbot. It is clear the kids still need us, at least for now.
Cynthia Murrell, May 10, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
IBM Uses Watson Analytics Freebie Academic Program to Lure in Student Data Scientists
May 6, 2016
The article on eWeek titled IBM Expands Watson Analytics Program, Creates Citizen Data Scientists zooms in on the expansion of the IBM Watson Analytics academic program, which was begun last year at 400 global universities. The next phase, according to Watson Analytics public sector manager Randy Messina, is to get Watson Analytics into the hands of students beyond computer science or technical courses. The article explains,
“Other examples of universities using Watson Analytics include the University of Connecticut, which is incorporating Watson Analytics into several of its MBA courses. Northwestern University is building Watson Analytics into the curriculum of its Predictive Analytics, Marketing Mix Models and Entertainment Marketing classes. And at the University of Memphis Fogelman College of Business and Economics, undergraduate students are using Watson Analytics as part of their initial introduction to business analytics.”
Urban planning, marketing, and health care disciplines have also ushered in Watson Analytics for classroom use. Great, so students and professors get to use and learn through this advanced and intuitive platform. But that is where it gets a little shady. IBM is also interested in winning over these students and leading them into the data analytics field. Nothing wrong with that given the shortage of data scientists, but considering the free program and the creepy language IBM uses like “capturing mindshare among young people,” one gets the urge to warn these students to run away from the strange Watson guy, or at least proceed with caution into his lair.
Chelsea Kerwin, May 6, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

