Busted Black Marketplace Pops Back Up

October 5, 2016

In June, xDedic, a marketplace selling access to hacked servers, went dark. Now, reports intelligence firm Digital Shadows, it has resurrected itself as a Tor domain. Why am I suddenly reminded of the mythical hydra? We learn of the resurgence from SecurityWeek’s article, “Hacked Server Marketplace Returns as a Tor Domain.” The article tells us:

After Kaspersky Lab researchers revealed in mid-June that they counted over 70,000 hacked servers made available for purchase on xDedic, some for as low as just $6, the marketplace operators closed the virtual shop on June 16. However, with roughly 30,000 users a month, the storefront was too popular to disappear for good, and intelligence firm Digital Shadows saw it re-emerge only a week later, but as a Tor domain now.

In an incident report shared with SecurityWeek, Digital Shadows reveals that a user named xDedic posted on 24 Jun 2016 a link to the new site on the criminal forum exploit[.]in. The user, who ‘had an established reputation on the forum and has been previously identified as associated with the site,’ posted the link on a Russian language forum thread titled ‘xDedic ???????’ (xDedic burned).

We’re told that, though the new site looks just like the old one, the user accounts did not tag along. The now-shuttered site was attracting about 30,000 users monthly, though, so it should not take long to rebuild the client list. Researchers cannot assess the new site’s traffic now that it operates as a Tor domain, but both Digital Shadows and Kaspersky Lab, another security firm, are “monitoring the situation.” We can rest assured they will inform law enforcement when they have more information.

Cynthia Murrell, October 5, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

World-Check Database Leaked by Third Party

October 4, 2016

This is the problem with sensitive data—it likes to wander from its confines. Motherboard reports, “Terrorism Database Used by Governments and Banks Leaked Online.” Security researcher Chris Vickery reported stumbling upon a copy of the World-Check intelligence database from mid-2014 that had been made available by a third party. The database is maintained by Thomson Reuters for use by governments, intelligence agencies, banks, and law firms to guard against risks. Reporter Joseph Cox specifies:

Described by Thomson Reuters as a ‘global screening solution,’ the World-Check service, which relies on information from all over the world, is designed to give deep insight into financial crime and the people potentially behind it.

We monitor over 530 sanctions, including watch and regulatory law and enforcement lists, and hundreds of thousands of information sources, often identifying heightened-risk entities months or years before they are listed. In fact, in 2012 alone we identified more than 180 entities before they appeared on the US Treasury Office of Foreign Assets Control (OFAC) list based on reputable sources identifying relevant risks,’ the Thomson Reuters website reads.

A compilation of sensitive data like the World-Check database, though built on publicly available information, is subject to strict European privacy laws. As a result, it is (normally) only used by carefully vetted organizations. The article notes that, much like the U.S.’s No Fly List, World-Check has been known to flag the innocent on occasion.

Though Vickery remained mum on just how and where he found the data, he did characterize it as a third-party leak, not a hack. Thomson Reuters reports that the leak is now plugged, and they have secured a promise from that party to never leak the database again.

Cynthia Murrell, October 4, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Pharmaceutical Research Made Simple

October 3, 2016

Pharmaceutical companies are a major power in the United States. Their power comes from the medicine they produce and the wealth they generate. To maintain both wealth and power, pharmaceutical companies conduct a lot of market research. Market research is a field built on people’s opinions and reactions; in other words, it deals in information that is hard to distill into black-and-white data. Lexalytics is a big data platform built with sentiment analysis to turn market research into usable data.

Inside Big Data explains how “Lexalytics Radically Simplifies Market Research And Voice Of Customer Programs For The Pharmaceutical Industry” with a new package called the Pharmaceutical Industry Pack. Lexalytics uses a combination of machine learning and natural language processing to understand the meaning and sentiment in text documents. The new pack can help pharmaceutical companies interpret how their customers react to medications, what their symptoms are, and the possible side effects of a medication.

Our customers in the pharmaceutical industry have told us that they’re inundated with unstructured data from social conversations, news media, surveys and other text, and are looking for a way to make sense of it all and act on it,’ said Jeff Catlin, CEO of Lexalytics. ‘With the Pharmaceutical Industry Pack — the latest in our series of industry-specific text analytics packages — we’re excited to dramatically simplify the jobs of CEM and VOC pros, market researchers and social marketers in this field.

Along with basic natural language processing features, the Lexalytics Pharmaceutical Industry Pack contains 7,000 sentiment terms drawn from healthcare content, as well as other medical references, to help interpret market research data. Lexalytics makes market research easy and offers invaluable insights that would otherwise go unnoticed.
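
Lexalytics’ actual lexicon and scoring engine are proprietary, so what follows is only a minimal sketch of the general idea behind lexicon-based sentiment scoring of free-text market research. The healthcare terms, weights, and sample responses are invented for illustration:

```python
# Minimal sketch of lexicon-based sentiment scoring over free-text survey
# responses. The terms and weights are invented for illustration and are
# not Lexalytics' actual healthcare lexicon.
import re

HEALTHCARE_LEXICON = {
    "relief": 1.0,
    "effective": 0.8,
    "tolerable": 0.4,
    "nausea": -0.7,
    "dizziness": -0.6,
    "unbearable": -1.0,
}

def score_response(text: str) -> float:
    """Sum the weights of lexicon terms found in a response."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return sum(HEALTHCARE_LEXICON.get(token, 0.0) for token in tokens)

responses = [
    "Felt real relief after two days, and the side effects were tolerable.",
    "Constant nausea and dizziness, honestly unbearable.",
]

for response in responses:
    print(f"{score_response(response):+.1f}  {response}")
```

A production package layers much more on top of this, such as negation handling, phrase matching, and entity extraction, which is where the machine learning and natural language processing come in.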

Whitney Grace, October 3, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Europol Internet Referral Unit Criticized for Methods

October 3, 2016

In July of 2015 Europol launched their Internet Referral Unit (IRU), tasked with identifying extremist propaganda online and asking ISPs to take it down. Now that the group has been operating for a year, it is facing criticism about its methods, we learn from “Europol’s Online Censorship Unit is Haphazard and Unaccountable Says NGO” at ArsTechnica. The NGO referred to in the headline is the international digital rights organization AccessNow.

As of the IRU’s first anniversary in July, the European Commission reports the unit had examined about 8,000 posts across some 45 platforms and made about 7,000 removal requests. Since May 2016, the group also has the power to hunt down terrorists; it has begun working with the UK’s National Counter Terrorism Internet Referral Unit to swiftly pursue those behind dangerous posts.

Not everyone is happy with the IRU’s methods. Writer Jennifer Baker reports:

However AccessNow, a global digital rights organization, said Europe’s approach to dealing with online extremism is ‘haphazard, alarming, tone-deaf, and entirely counter-productive.’

According to AccessNow, ‘the IRU is outside the rule of law on several grounds. First, illegal content is just that—illegal. If law enforcement encounters illegal activity, be it online or off, it is expected to proceed in dealing with that in a legal, rights-respecting manner.

Second, relegating dealing with this illegal content to a third private party, and leaving analysis and prosecution to their discretion, is both not just lazy—but extremely dangerous. Third, illegal content, if truly illegal, needs to be dealt with that way: with a court order and subsequent removal. The IRU’s blatant circumvention of the rule of law is in direct violation of international human rights standards.

For its part, Europol points to the IRU’s success at removing propaganda, including such worrisome content as bomb-making instructions and inflammatory speeches designed to spur specific acts of violence. Does this mean Europol believes the urgency of the situation calls for discarding the rule of law? Caution is warranted; we’ve been down this road before.

Cynthia Murrell, October 3, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Lexmark Upgrades Its Enterprise Search

September 30, 2016

Enterprise search has taken a back seat to news about Google’s next endeavor and the next big thing in big data. Enterprise search may have taken a back seat in my news feed, but it is still a major component of enterprise systems. You could even argue that, without a search function, enterprise systems are useless.

Lexmark, one of the largest suppliers of printers and business solutions in the country, understands the importance of enterprise search. That is why the company recently updated the description of its Perceptive Enterprise Search in the system’s technical specifications:

Perceptive Enterprise Search is a suite of enterprise applications that offer a choice of options for high performance search and mobile information access. The technical specifications in this document are specific to Perceptive Enterprise Search version 10.6…

A required amount of memory and disk space is provided. You must meet these requirements to support your Perceptive Enterprise Search system. These requirements specifically list the needs of Perceptive Enterprise Search and do not include any amount of memory or disk space you require for the operating system, environment, or other software that runs on the same machine.

Some technical specifications also provide recommendations. While requirements define the minimum system required to run Perceptive Enterprise Search, the recommended specifications serve as suggestions to improve the performance of your system. For maximum performance, review your specific environment, network, and platform capabilities and analyze your planned business usage of the system. Your specific system may require additional resources above these recommendations.”

It is pretty standard fare as technical specifications go: not that interesting, but necessary to make the enterprise system work correctly.

Whitney Grace, September 30, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Google and the Future of Search Engine Optimization

September 30, 2016

Regular readers know that we are not big fans of SEO (search engine optimization) or its champions, so you will understand our tentative glee at the Fox News headline, “Is Google Trying to Kill SEO?” The article centers on a Florida court case whose plaintiff, e.ventures Worldwide LLC, is accused by Google of engaging in “search-engine manipulation.” As it turns out, that term is a little murky. That did not stop Google from unilaterally de-indexing “hundreds” of e.ventures’ websites. Writer Dan Blacharski observes:

The larger question here is chilling to virtually any small business which seeks a higher ranking, since Google’s own definition of search engine manipulation is vague and unpredictable. According to a brief filed by e-ventures’ attorney Alexis Arena at Flaster Greenberg PC, ‘Under Google’s definition, any website owner that attempts to cause its website to rank higher, in any manner, could be guilty of ‘pure spam’ and blocked from Google’s search results, without explanation or redress. …

We cannot share Blacharski’s alarm at this turn of events. In our humble opinion, if websites focus on providing quality content, the rest will follow. The article goes on to examine Google’s First Amendment defense and to consider whether SEO is even a legitimate strategy. See the article for its take on these questions.

Cynthia Murrell, September 30, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph


EasyAsk Has a Sticky Search

September 29, 2016

When I first began reading the EasyAsk article “Search Laboratory: Rock ‘n’ Roll Lab Rats,” I found the typical story about search difficulties and the importance of an accurate, robust search engine. It even includes a video featuring personified search engines and the troubles a user goes through to locate a simple item, although the video refers to Google Analytics. The article then pokes fun at EasyAsk employees and describes their Search Lab, where they work on improving search functions.

One of the experiments the Search Lab worked on is “sticky search.” What is sticky search? Do you throw a keyword reel covered in honey into the Web pool and see what returns? Is it like Google’s “I’m Feeling Lucky” button? None of these is correct. The Search Lab ran an experiment in which the last search term was pre-loaded into the search box when a user revisited the site. The Search Lab tracked the results and discovered:

As you can see, the sticky search feature was used by close-to one third of the people searching from the homepage, but by a smaller proportion of people on other types of page. Again, this makes sense as you’re more likely to use the homepage as a starting point when your intention is to return to a previously viewed product.  We had helped 30% of people searching from our homepage get to where they wanted to go more quickly, but added inconvenience to the other two thirds (and 75% of searchers across the site as a whole) because to perform their searches, rather than just tapping the search box and beginning to type they now had to erase the old (sticky) search term too.

In other words, it was annoying. The Search Lab retracted the experiment, but it was a decent effort to try something new, even if the results could have been predicted. Keep experimenting with search options, Search Lab, but keep the search box empty.
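
For the curious, a sticky search box is easy to prototype. Below is a minimal Flask sketch, not EasyAsk’s implementation, that remembers the visitor’s last query in a cookie and pre-fills the search box on the next visit; the route names, cookie name, and markup are invented for illustration:

```python
# Minimal "sticky search" sketch: the last query is stored in a cookie and
# pre-filled into the search box on the next visit. Illustrative only.
from flask import Flask, request, make_response, render_template_string

app = Flask(__name__)

PAGE = """
<form action="/search">
  <!-- Returning visitors see (and must erase) their previous query. -->
  <input type="text" name="q" value="{{ last_query }}">
  <button type="submit">Search</button>
</form>
"""

@app.route("/")
def home():
    # Read the previous query from the cookie, defaulting to an empty box.
    last_query = request.cookies.get("last_query", "")
    return render_template_string(PAGE, last_query=last_query)

@app.route("/search")
def search():
    query = request.args.get("q", "")
    response = make_response(f"Results for: {query}")
    response.set_cookie("last_query", query)  # remember the query for next time
    return response

if __name__ == "__main__":
    app.run(debug=True)
```

Rendering the previous query as an HTML placeholder rather than a value, or clearing the cookie on an empty submission, would avoid the erase-the-old-term annoyance the Search Lab observed.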

Whitney Grace, September 29, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Googley Spin-Offs Underwhelm

September 29, 2016

One might think that starting out as a derivative of one of the most successful companies in the world would be a sure path to profits. Apparently one would be wrong. The Telegraph reports, “Alphabet’s Spin-Offs are Struggling to Repeat the Google Success Story.” Readers will recall that Alphabet was created last year as the holding company for Google and its derivatives, like Calico, Google Capital, Nest, Google Ventures, Verily, and X. Writer James Titcomb explains the logic behind the move:

The theory behind Alphabet, when Page laid it out in August, made sense. Google had become more than just an internet services and advertising company, even though the main internet business still made all the money. Google had set up units such as Calico, a life sciences division trying to eradicate death; Project Loon, which is trying to beam the internet to rural Asia with gigantic space balloons; and Boston Dynamics, which is trying to build humanoid robots.

These ‘moonshots’ weren’t able to realize their potential within the confines of a company focused on selling pay-per-click internet advertising, so they were separated from it. Page and Sergey Brin, Google’s two co-founders, left the everyday running of the internet business to their trusted lieutenant, Sundar Pichai, who had been effectively doing it anyway.

Being liberated from Google, the moonshots were supposed to thrive under the Alphabet umbrella. Have they? The early signs are not good.

The article concedes that Alphabet expected to lose money on some of these derivative projects, but notes that the loss has been more than expected—to the tune of some $3.6 billion. Titcomb examines Nest, Google’s smart-thermostat initiative, as an example; its once-bright future is not looking up at the moment. Meanwhile, we’re reminded, Apple is finding much success with its services division. See the article for more details on each company.

Will Alphabet continue to use Google Search’s stellar profits to prop up its pet projects? Consider that, from the beginning, one of the company’s winning strategies has been to try anything and run with what proves successful: repeated failure as a path to success. I predict Alphabet will never relinquish its experimental streak.

Cynthia Murrell, September 29, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph

Recent Developments in Deep Learning Architecture from AlexNet to ResNet

September 27, 2016

The article on GitHub titled The 9 Deep Learning Papers You Need To Know About (Understanding CNNs Part 3) is not about the global media giant but rather about advancements in computer vision and convolutional neural networks (CNNs). The article frames its discussion around the ImageNet Large Scale Visual Recognition Challenge (ILSVRC), which it terms the ‘annual Olympics of computer vision…where teams compete to see who has the best computer vision model for tasks such as classification, localization, detection and more.’ The article explains that the 2012 winners and their network (AlexNet) revolutionized the field.

This was the first time a model performed so well on a historically difficult ImageNet dataset. Utilizing techniques that are still used today, such as data augmentation and dropout, this paper really illustrated the benefits of CNNs and backed them up with record breaking performance in the competition.

In 2013, CNN submissions flooded in, and ZF Net was the winner with an error rate of 11.2% (down from AlexNet’s 15.4%). Prior to AlexNet, though, the lowest error rate was 26.2%. The article also discusses other progress in general network architecture, including VGG Net, which emphasized the depth and simplicity CNNs need for hierarchical data representation, and GoogLeNet, which tossed the deep-and-simple rule out the window and paved the way for future creative structuring with its Inception modules.
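
For readers who want a concrete feel for the building blocks these papers describe, here is a minimal Keras sketch of a small convolutional network that uses dropout, one of the techniques the article credits AlexNet with popularizing. The layer sizes are arbitrary, and the network is nowhere near AlexNet’s actual architecture:

```python
# Minimal convolutional network with dropout; layer sizes are arbitrary
# and chosen only to illustrate the conv -> pool -> dense -> dropout pattern.
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(224, 224, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),                       # randomly drop units to curb overfitting
    layers.Dense(1000, activation="softmax"),  # 1,000 ImageNet classes
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Data augmentation, the other AlexNet-era technique mentioned above, would be applied to the training images before they reach a network like this one.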

Chelsea Kerwin, September 27, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
There is a Louisville, Kentucky Hidden Web/Dark Web meet up on September 27, 2016.
Information is at this link: https://www.meetup.com/Louisville-Hidden-Dark-Web-Meetup/events/233599645/

Open Source CRM Galore for Salespeople, Manufacturers, and Even Freelancers

September 26, 2016

The article titled Top 10 Open Source CRM on Datamation weighs customer relationship management (CRM) options based on individual needs as well as features and functions. It highlights key benefits and points of strength, such as EspoCRM’s excellent website, SugarCRM’s competitive edge over Salesforce, and the low cost of Dolibarr. A typical entry reads like this:

EPESI – The last in this list of Linux compatible CRM options is called EPESI. What makes it unique is the ability to take the mail page of the CRM and rearrange how things are laid out visually…it’s pretty nice to have when customizing ones workflow. In addition to expected CRM functionality, this tool also offers ERP options as well. With its modular design and cloud, enterprise and DIY editions, odds are there is a CRM solution available for everyone.

What strikes one most about this list is how few familiar names appear. The list is certainly worth consulting to get a sense of the landscape, particularly since it does at least allude now and then to the specialties of the various CRM packages. For example, Dolibarr supports freelancers, Compiere is built around the needs of warehousing and manufacturing companies, and Zurmo was designed for salespeople. It is a good time to be in the market for CRM apps.

Chelsea Kerwin, September 26, 2016
Sponsored by ArnoldIT.com, publisher of the CyberOSINT monograph
There is a Louisville, Kentucky Hidden Web/Dark Web meet up on September 27, 2016.
Information is at this link: https://www.meetup.com/Louisville-Hidden-Dark-Web-Meetup/events/233599645/
