UK Cybersecurity Director Outlines Agency's Failures in Ongoing Cyberwar

April 8, 2016

The article titled “GCHQ: Spy Chief Admits UK Agency Losing Cyberwar Despite £860M Funding Boost” on International Business Times examines the surprisingly frank confession made by Alex Dewdney, a director at the Government Communications Headquarters (GCHQ). He stated that in spite of the £860M funneled into cybersecurity over the past five years, the UK is unequivocally losing the fight. The article details,

“To fight the growing threat from cybercriminals chancellor George Osborne recently confirmed that, in the next funding round, spending will rocket to more than £3.2bn. To highlight the scale of the problem now faced by GCHQ, Osborne claimed the agency was now actively monitoring “cyber threats from high-end adversaries” against 450 companies across the UK aerospace, defence, energy, water, finance, transport and telecoms sectors.”

The article makes it clear that search and other tools are not getting the job done, but a major part of the problem is resource allocation and petty bureaucratic behavior. The money being poured into cybersecurity is not going toward updating the “legacy” computer systems still in place within GCHQ, even though those outdated systems represent major vulnerabilities. Dewdney argues that without basic steps like migrating to current, supported software, the agency has no hope of successfully mitigating the security risks.

 

Chelsea Kerwin, April 8, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 

Watson in the Lab: Quoth the Stakeholder Forevermore

April 7, 2016

I read “Lawrence Livermore and IBM Collaborate to Build New Brain-Inspired Supercomputer.” The article reports that one of the US national labs and Big Blue are going to work together to do something with IBM’s neurosynaptic computer chip. I know. I know. IBM is not really into making chips anymore. I think it paid another company lots of money to take the fab business off IBM’s big blue hands.

Never mind, quoth the stakeholder.

The write up reports that the TrueNorth “platform”

will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power.

I like the reference to nuclear weapons in the article. I used to work at Halliburton Nuclear in my salad days, and there are lots of calculations to perform when doing the nuclear stuff. Calculations are, in my experience, a lot better than doing lab experiments the way Marie Curie muddled forward. Big computing capability is a useful thing.

According to the write up:

The [neuromorphic] technology represents a fundamental departure from computer design that has been prevalent for the past 70 years, and could be a powerful complement in the development of next-generation supercomputers able to perform at exascale speeds, 50 times (or two orders of magnitude) faster than today’s most advanced petaflop (quadrillion floating point operations per second) systems. Like the human brain, neurosynaptic systems require significantly less electrical power and volume.

This is not exactly a free ride. The write up points out:

Under terms of the $1 million contract, LLNL will receive a 16-chip TrueNorth system representing a total of 16 million neurons and 4 billion synapses. LLNL also will receive an end-to-end ecosystem to create and program energy-efficient machines that mimic the brain’s abilities for perception, action and cognition. The ecosystem consists of a simulator; a programming language; an integrated programming environment; a library of algorithms as well as applications; firmware; tools for composing neural networks for deep learning; a teaching curriculum; and cloud enablement.

One question: Who is paying whom? Is Livermore ponying up $1 million to get its informed hands on the “platform,” or is IBM paying Livermore to take the chip and do a demonstration project?

The ambiguity in the write up is delicious. Another minor point is the cost of the support environment for the new platform. I understand the modest power draw, but perhaps there are other bits and pieces which gobble the Watts.

I recall a visit to Bell Labs.* During that visit, I saw a demo of what was then called holographic memory. The idea was that gizmos allowed data to be written to a holographic structure. The memory device was in a temperature-controlled room and sat in a glass-protected container. The room was mostly empty. After the demo, I asked one of the Bell wizards about the tidiness of the demo. He laughed and took me to a side door. Behind that door was a room filled with massive amounts of equipment. The point was that the demo looked sleek and lean. The gear required to pull off the demo was huge.

I recall that the scientist said, “The holographic part was easy. Making the system small is the challenge.”

Perhaps the neuromorphic chip has similar support equipment requirements.

I will let you know if I find out who is paying for the collaboration. I just love IBM. Watson, do you know who is paying for the collaboration?

——

* Bell Labs was one of the companies behind my ASIS Eagleton Award in the 1980s.

Stephen E Arnold, April 7, 2016

The Missing Twitter Manual Located

April 7, 2016

Once more we turn to the Fuzzy Notepad’s advice and its Pokémon mascot, Eevee.  This time we visited the fuzz pad for tips on Twitter.  The 140-character social media platform has a slew of hidden features that do not have a button on the user interface.  Check out “Twitter’s Missing Manual” to read more about these tricks.

It is inconceivable for every feature to have a shortcut on the user interface.  Twitter relies on its users to understand basic features, while the experienced user will have picked up tricks that only come with practice or from reading tips on the Internet.  The problem is:

“The hard part is striking a balance. On one end of the spectrum you have tools like Notepad, where the only easter egg is that pressing F5 inserts the current time. On the other end you have tools like vim, which consist exclusively of easter eggs.

One of Twitter’s problems is that it’s tilted a little too far towards the vim end of the scale. It looks like a dead-simple service, but those humble 140 characters have been crammed full of features over the years, and the ways they interact aren’t always obvious. There are rules, and the rules generally make sense once you know them, but it’s also really easy to overlook them.”

Twitter is a great social media platform, but it is a headache to use because it never came with an owner’s manual.  Fuzzy Notepad has lined up hints for every conceivable problem, including the elusive advanced search page.

 

Whitney Grace, April 7, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 

Potential Corporate Monitoring Concerns Tor Users

April 7, 2016

The Dark Web has been seen as a haven by anyone interested in untraceable internet activity. However, a recent article from Beta News, “Tor Project says Google, CloudFlare and others are involved in dark web surveillance and disruption,” brings to light the potential issue of Tor traffic being monitored. CloudFlare, a CDN and DDoS protection service, has introduced CAPTCHAs and cookies for Tor traffic for monitoring purposes, and accusations about Google and Yahoo have also been made. The author writes,

“There are no denials that the Tor network — thanks largely to the anonymity it offers — is used as a platform for launching attacks, hence the need for tools such as CloudFlare. As well as the privacy concerns associated with CloudFlare’s traffic interception, Tor fans and administrators are also disappointed that this fact is being used as a reason for introducing measures that affect all users. Ideas are currently being bounced around about how best to deal with what is happening, and one of the simpler suggestions that has been put forward is adding a warning that reads “Warning this site is under surveillance by CloudFlare” to sites that could compromise privacy.”

Will a simple communications solution appease Tor users? Likely not, as such a move would essentially market Tor as providing the opposite service of what users expect. This will be a fascinating story to see unfold as it could be the beginning of the end of the Dark Web as it is known, or perhaps the concerns over loss of anonymity will fuel further innovation.

 

Megan Feil, April 7, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 

Nasdaq Joins the Party for Investing in Intelligence

April 6, 2016

The financial sector is hungry for intelligence to help curb abuses in capital markets, judging by recent actions of Goldman Sachs and Credit Suisse. The article “Nasdaq invests in ‘cognitive’ technology,” from BA wire, announces the exchange’s investment in Digital Reasoning. Nasdaq plans to connect Digital Reasoning’s algorithms with its own technology, which surveils trade data. The article explains the benefits of joining these two products,

“The two companies want to pair Digital Reasoning software of unstructured data such as voicemail, email, chats and social media, with Nasdaq’s Smarts business, which is one of the foremost software for monitoring trading on global markets. It is used by more than 40 markets and 12 regulators. Combining the two products is designed to assess the context, content and relationships behind trading and spot signals that could indicate insider trading, market manipulation or even expenses rules violations.”
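
To make the pairing a bit more concrete, here is a minimal, hypothetical Python sketch of the general idea: flag communications that trip a surveillance rule, then cross-reference them with trades by the same person inside a time window. The record layouts, the six-hour window, and the matching rule are invented for illustration; they do not describe how Digital Reasoning or Nasdaq’s Smarts actually work.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical, simplified records; a real system would ingest far richer data.
communications = [
    {"trader": "T001", "sent": datetime(2016, 4, 1, 9, 30), "flags": ["possible leak"]},
    {"trader": "T002", "sent": datetime(2016, 4, 1, 10, 0), "flags": []},
]
trades = [
    {"trader": "T001", "executed": datetime(2016, 4, 1, 11, 5), "symbol": "ACME", "size": 50000},
    {"trader": "T002", "executed": datetime(2016, 4, 1, 11, 10), "symbol": "ACME", "size": 100},
]

WINDOW = timedelta(hours=6)  # invented window: look at trades shortly after a flagged message

def correlate(communications, trades, window=WINDOW):
    """Pair flagged communications with trades by the same trader inside the window."""
    flagged = defaultdict(list)
    for msg in communications:
        if msg["flags"]:
            flagged[msg["trader"]].append(msg)

    alerts = []
    for trade in trades:
        for msg in flagged.get(trade["trader"], []):
            gap = trade["executed"] - msg["sent"]
            if timedelta(0) <= gap <= window:
                alerts.append({"trader": trade["trader"], "symbol": trade["symbol"],
                               "size": trade["size"], "flags": msg["flags"]})
    return alerts

if __name__ == "__main__":
    for alert in correlate(communications, trades):
        print(alert)  # in practice, hand off to a human analyst for review
```

Real surveillance platforms add natural language analysis, market context, and analyst review on top, but correlating communications with trading activity is the core move the article describes.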

We have followed Digital Reasoning, and other intel vendors like them, for quite some time as they target sectors ranging from healthcare to law to military. This is just a case of another software intelligence vendor making the shift to the financial sector. Following the money appears to be the name of the game.

 

Megan Feil, April 6, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Glueware: A Sticky and Expensive Mess

April 5, 2016

I have been gathering open source information about DCGS, a US government information access and analysis system. I learned that the DCGS project is running a bit behind its original schedule, formulated about 13 years ago. I also learned that the project is a little over budget.

I noted “NASA Launch System Software Upgrade Now 77% over Budget.” What interested me was the reference to “glueware.” The idea appears to be that it is better, faster, and maybe cheaper to use many different products, with the “glueware” allowing these technologies to be stuck or glued together. This is an interesting idea.

According to the write up:

To develop its new launch software, NASA has essentially kluged together a bunch of different software packages, Martin noted in his report. “The root of these issues largely results from NASA’s implementation of its June 2006 decision to integrate multiple products or, in some cases, parts of products rather than developing software in-house or buying an off-the-shelf product,” the report states. “Writing computer code to ‘glue’ together disparate products has turned out to be more complex and expensive than anticipated. As of January 2016, Agency personnel had developed 2.5 million lines of ‘glue-ware,’ with almost two more years of development activity planned.”
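
For a feel of why “glue” code balloons, here is a small, hypothetical Python shim of the kind the report describes: two made-up products disagree about field names and units, so an adapter has to translate every record that passes between them. The product formats and conversion below are invented for illustration, not taken from the NASA report.

```python
# Two hypothetical products to be integrated: Product A emits telemetry with terse
# keys and thrust in pounds-force; Product B expects spelled-out keys and newtons.
# The "glue" layer below translates between them.

LBF_TO_NEWTONS = 4.448222

def adapt_record(record_a):
    """Convert one Product A record into the schema Product B consumes."""
    return {
        "engine_id": record_a["eng"],
        "thrust_newtons": record_a["thr_lbf"] * LBF_TO_NEWTONS,
        "timestamp_utc": record_a["ts"],
    }

def adapt_batch(records_a):
    """Batch-convert records, skipping any that are missing required fields."""
    adapted = []
    for rec in records_a:
        try:
            adapted.append(adapt_record(rec))
        except KeyError:
            # Every pair of integrated products needs its own error handling, too.
            continue
    return adapted

if __name__ == "__main__":
    sample = [{"eng": "E-1", "thr_lbf": 512000, "ts": "2016-01-15T12:00:00Z"}]
    print(adapt_batch(sample))
```

Multiply shims like this across dozens of integrated products, each with its own formats, error cases, and version changes, and 2.5 million lines of glue-ware starts to look plausible.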

The arguments for the approach boil down to the US government’s belief that having many flowers blooming in one greenhouse is better than buying flowers from a farm in Encinitas.

The parallels with DCGS, with its well-known government contractors, and Palantir, with its home-brew Gotham system, are interesting to me. What happens if NASA embraces a commercial provider? Good news for that commercial provider and maybe some pushback from the firms chopped out of the pork loin. What happens if Palantir gets rebuffed? Unicorn burgers, anyone?

Stephen E Arnold, April 5, 2016

Google: Data Center Tour

April 5, 2016

If you marvel at all things Google, you will enjoy “Behind the Scenes and 360 at Google’s Dalles Data Center.” Dalles is not a suburb of the Cowboys’ football stadium, however. The video is a virtual tour of a “secret” Google facility. Even better, the article tells me the data center “is a highly secure area most Google employees aren’t even able to access.” There you go.

The article points out this surprising fact:

While the video is – naturally – highly curated, it nonetheless provides an interesting insight into Google’s vast data center, which can hold more than 750,00 machines and what’s behind running it all – from the sizeable hard drive shredder to very colorful water pipes – in Google colors of course.

Of course and naturally. Here’s the link.

Enjoy as long as you have Google Chrome, the YouTube app on a mobile gizmo, or the wonderful Google Cardboard thing. For information about the security in use at some Google facilities, check out this article too.

Stephen E Arnold, April 5, 2016

Forget World Population, Domain Population Is Overcrowded

April 5, 2016

Back in the 1990s, if you had a Web site without a bunch of gobbledygook after the .com, you were considered tech savvy and very cool.  There were plenty of domain names available in those days, and as the Internet became more of a tool than a novelty, demand for names rose. It is not as easy to get a desired Web address anymore, says Phys.org in the article, “Overcrowded Internet Domain Space Is Stifling Demand, Suggesting A Future ‘Not-Com’ Boom.”

Domain names are being snapped up so quickly, in fact, that Web development is being stunted.  As much as 25% of domains, some 73 million as of summer 2015, are being withheld from the market, leaving would-be registrants unable to secure the names that would drive Internet traffic.

“However, as the Internet Corporation for Assigned Names and Numbers (ICANN) has begun to roll out the option to issue brand new top-level domains for almost any word, whether it’s dot-hotel, dot-books or dot-sex – dubbed the ‘not-coms’ – the research suggests there is substantial untapped demand that could fuel additional growth in the domain registrations.”

One of the factors that determine prime Internet real estate is a simple, catchy Web address.  With new domains opening up beyond the traditional .org, .com, .net, and .gov endings, an entirely new market is also open for entrepreneurs to profit from.  People are already buying not-coms on the cheap with the intention of reselling them for a pretty penny.  It bears mentioning, however, that once all of the hot not-coms are gone, we will be in the same predicament as we are now.  How long will that take?

 

Whitney Grace, April 5, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Google DeepMind Acquires Healthcare App

April 5, 2016

What will Google do next? An article from Business Insider, “Google’s London AI powerhouse has set up a new healthcare division and acquired a medical app called Hark,” tells us the latest. DeepMind, Google’s artificial intelligence research group, recently launched a new division called DeepMind Health and acquired a healthcare app. The article describes DeepMind Health’s new app, Hark,

“Hark — acquired by DeepMind for an undisclosed sum — is a clinical task management smartphone app that was created by Imperial College London academics Professor Ara Darzi and Dr Dominic King. Lord Darzi, director of the Institute of Global Health Innovation at Imperial College London, said in a statement: “It is incredibly exciting to have DeepMind – the world’s most exciting technology company and a true UK success story – working directly with NHS staff. The types of clinician-led technology collaborations that Mustafa Suleyman and DeepMind Health are supporting show enormous promise for patient care.”

The healthcare industry is ripe for disruptive technology, especially technologies which solve information and communications challenges. As the article alludes to, many issues in healthcare stem from too little information conveyed too late. Collaborations between researchers, medical professionals, and tech gurus appear to be a promising answer. Will Google’s Hark lead the way?

 

Megan Feil, April 5, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Venture Dollars Point to Growing Demand for Cyber Security

April 4, 2016

A UK cyber security startup has caught our attention — along with that of venture capitalists. The article “Digital Shadows Gets $14M To Keep Growing Its Digital Risk Scanning Service” from TechCrunch reports Digital Shadows received $14 million in Series B funding. This Software as a Service (SaaS) offering is geared toward enterprises with more than 1,000 employees that want to monitor risk and vulnerabilities by tracking online activity related to the enterprise. The article describes Digital Shadows’ SearchLight, which was initially launched in May 2014,

“Digital Shadows’ flagship product, SearchLight, is a continuous real-time scan of more than 100 million data sources online and on the deep and dark web — cross-referencing customer specific data with the monitored sources to flag up instances where data might have inadvertently been posted online, for instance, or where a data breach or other unwanted disclosure might be occurring. The service also monitors any threat-related chatter about the company, such as potential hackers discussing specific attack vectors. It calls the service it offers “cyber situational awareness”.”
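
For a rough sense of what “cross-referencing customer specific data with the monitored sources” can mean mechanically, here is a toy Python sketch: scan collected documents for watchlist terms a customer cares about and raise an alert on any hit. The watchlist, sources, and matching approach are my own illustration, not Digital Shadows’ implementation.

```python
import re

# Hypothetical customer watchlist: strings whose appearance in outside sources
# would suggest a leak or other unwanted exposure.
WATCHLIST = [
    "Project Nightjar",      # invented internal code name
    "@examplebank.com",      # invented corporate email domain
    "EB-CONFIDENTIAL",       # invented document marker
]

# Hypothetical documents pulled from monitored sources (paste sites, forums, etc.).
collected = [
    {"source": "paste site", "url": "https://paste.example/abc",
     "text": "dump of Project Nightjar design notes ..."},
    {"source": "forum", "url": "https://forum.example/t/1",
     "text": "selling logins, contact me"},
]

def build_pattern(terms):
    """Compile one case-insensitive regex that matches any watchlist term."""
    return re.compile("|".join(re.escape(t) for t in terms), re.IGNORECASE)

def flag_exposures(documents, terms=WATCHLIST):
    """Return documents that mention a watchlist term, with the matching terms."""
    pattern = build_pattern(terms)
    alerts = []
    for doc in documents:
        hits = sorted({m.group(0) for m in pattern.finditer(doc["text"])})
        if hits:
            alerts.append({"source": doc["source"], "url": doc["url"], "matched": hits})
    return alerts

if __name__ == "__main__":
    for alert in flag_exposures(collected):
        print(alert)
```

A production service would layer large-scale collection, deduplication, and analyst workflow on top of this, but the cross-referencing step is the heart of the alerting the article describes.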

Think oversight of employees exposing sensitive data on the Dark Web, for example, a bank employee selling client data through Tor. How will this startup fare? Time will tell, but we will be watching them, along with other vendors offering similar services.

 

Megan Feil, April 4, 2016

Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

 
