Tag Archives: Report

Accenture: Intelligent operations goal requires data backbone

A newly released report co-authored by Accenture and market researcher HfS reveals that 80% of the global enterprises surveyed worry about digital disruption, but many of those companies lack the data backbone that could help them compete.

The report stated that large organizations are “concerned with disruption and competitive threats, especially from new digital-savvy entrants.” Indeed, digital disrupters such as Uber and Lyft in personal transportation, Airbnb in travel and hospitality, and various fintech startups have upset the established order in those industries. The Accenture-HfS report views “intelligent operations” as the remedy for the digital challenge and the key to bolstering customer experience. But the task of improving operations calls for organizations to pursue more than a few mild course corrections, according to Debbie Polishook, group chief executive at Accenture Operations, a business segment that includes business process and cloud services.

In the past, enterprises that encountered friction in their operations would tweak the errant process, add a few more people and take on a Lean Six Sigma project, she noted. Those steps, however, won’t suffice in the current business climate, Polishook said.

“Given what is happening today with the multichannel, with the various ways customers and employees can interact with you, making tiny tweaks is not going to get it done and meet the expectations of your stakeholders,” she said.

[Graphic: Organizations struggle to leverage their data; the chart details data quality problems within organizations.]

Hard work ahead

The report, which surveyed 460 technology and services decision-makers in organizations with more than $3 billion in revenue, suggested professional services firms such as Accenture will have their work cut out for them as they prepare clients for the digital era.

The survey noted most enterprises struggle to harness data with an eye toward improving operations and achieving competitive advantage. The report stated “nearly 80% of respondents estimate that 50% [to] 90% of their data is unstructured” and largely inaccessible. A 2017 Accenture report also pointed to a data backbone deficit among corporations: More than 90% of the respondents to that survey said they struggle with data access.

In addition, half of the Accenture-HfS report respondents who were surveyed acknowledged their back office isn’t keeping pace with the front office demands to support digital capabilities.

“Eighty percent of the organizations we talked to are concerned with digital disruption and are starting to note that their back office is not quite keeping up with their front office,” Polishook said. “The entire back office is the boat anchor holding them back.”

That lagging back office is at odds with enterprises’ desire to rapidly roll out products and services. An organization’s operations must be able to accommodate the demand for speed in the context of a digital, online and mobile world, Polishook said.

Enterprises need a “set of operations that can respond to these pressures,” she added. “Most companies are not there yet.”

One reason for the lag: Organizations tend to prioritize new product development and front office concerns when facing digital disruption. Back office systems such as procurement tend to languish.

“Naturally, as clients … are becoming disrupted in the market, they pay attention first to products and services,” Polishook said. “They are finding that is not enough.”

The report’s emphasis on revamped operations as critical to fending off digital disruption mirrors research from MIT Sloan’s Center for Information Systems Research. In a presentation in 2017, Jeanne Ross, principal research scientist at the center, identified a solid operational backbone as one of four keys to digital transformation. The other elements were strategic vision, a focus on customer engagement or digitized solutions and a plan for rearchitecting the business.

The path to intelligent operations

The Accenture-HfS report identified five essential components necessary for intelligent operations: innovative talent, a data backbone, applied intelligence, cloud computing and a “smart partnership ecosystem.”

As for innovative talent, the report cited “entrepreneurial drive, creativity and partnering ability” as enterprises’ top areas of talent focus.

“One of the most important pieces getting to intelligent operations is the talent,” Polishook said. She said organizations in the past looked to ERP or business process management to boost operations, but contended there is no technology silver bullet.

The data-driven backbone is becoming an important focus for large organizations. The report stated more than 85% of enterprises “are developing a data strategy around data aggregation, data lakes, or data curation, as well as mechanisms to turn data into insights and then actions.” Big data consulting is already a growing market for channel partners.

In the area of applied intelligence, about 90% of the enterprises surveyed identified automation, analytics and AI as the technologies that will become the cornerstone of business and process transformation. Channel partners likewise rank AI and the expanded use of automation tools such as robotic process automation among the top anticipated trends of 2018.

Meanwhile, more than 90% of large enterprises expect to realize “plug-and-play digital services, coupled with enterprise-grade security, via the cloud,” according to the Accenture-HfS report. A similar percentage of respondents viewed partnering with an ecosystem as important for exploiting market opportunities. The report said enterprises of the future will create “symbiotic relationships with startups, academia, technology providers and platform players.”

The path to achieving intelligent operations calls for considerable effort among all partners involved in the transformation.

“There is a lot of heavy lifting to be done,” Polishook said.

CIA attributes NotPetya attacks to Russian spy agency

An unreleased CIA report reportedly names Russia’s military spy agency, the GRU, as the source of the NotPetya ransomware and the initial attacks against Ukraine.

The CIA reportedly concluded in November 2017 that Russia’s GRU military intelligence agency was responsible for the NotPetya attacks in June 2017. According to The Washington Post, unnamed officials said the classified CIA report attributed the NotPetya attacks to the GRU and said the hackers who created the ransomware worked for the Russian military’s GTsST, or Main Center for Special Technology.

The NotPetya attacks began by targeting Ukrainian agencies but quickly spread through the use of the EternalBlue exploit developed by the NSA and used in the WannaCry ransomware attacks.

Attributing the attacks to Russia is not in itself surprising, as security researchers said in June that Russia was the likely threat actor, given that the initial NotPetya attacks targeted Ukrainian government agencies through multiple software backdoors in the M.E.Doc tax program. However, experts noted that the CIA likely wanted to be certain before making any statement.

Tim Erlin, vice president of product management and strategy at Tripwire, the information security company headquartered in Portland, Ore., said “attributing cyberattacks to specific attackers or groups can be a challenging task.”

“It’s not always possible to make a direct connect, and indirect inferences are required to come to a conclusion. Accurate attribution is broadly valuable. While organizations should focus on the solid application of foundational controls first, characterizing the threat environment in terms of changing attackers can help prioritize more advanced protections,” Erlin told SearchSecurity. “It’s hard to say why the CIA didn’t publish this information sooner, though it’s important to realize that a three-month delay in disclosing this kind of nation-state attribution isn’t a very long time.”

The NotPetya attacks and Russian aggression

Tom Kellermann, CEO of Strategic Cyber Ventures LLC in Washington, D.C., said the CIA likely “withheld attribution to prevent their sources and methods from being discovered.”

“The public announcement is significant as it is meant to warn the American public of the significant cyber threat posed by Russia,” Kellermann told SearchSecurity. “Cold War cyberattacks against the U.S. have dramatically increased over the past six weeks, as evidenced by the resurgence of Fancy Bear coming out of hibernation. We are under siege.”

Chris Morales, head of security analytics at Vectra Networks, a cybersecurity company based in San Jose, said the security industry felt comfortable attributing the NotPetya attacks to Russia “due to similarities of the NotPetya attack to prior attacks from Russia targeting the Ukraine.”

“Russia has engaged in what the Pentagon calls ‘hybrid warfare’ against […] Ukraine, with three previously known attacks against the Ukrainian voting system and power grid dating back to 2014. With the CIA confirmation, NotPetya now looks like another attack in a succession of state-sponsored attacks,” Morales told SearchSecurity. “The bigger concern here for the U.S. is that we believe Russia is practicing and honing their craft against […] Ukraine, where they face little opposition from global powers. Cyberspace is the next major battle ground between major nation-states. Russia is arming themselves with cyber weapons that could be used against us or any other state as Russia would deem necessary in a bigger attack campaign. The irony of this attack is that it leveraged exploits developed by the NSA in their pursuit of weaponizing cyber space.”

Kaspersky sheds more light on Equation Group malware detection

Kaspersky Lab published a lengthy report that shed further light on its discovery of Equation Group malware and its possession of classified U.S. government materials.

The antivirus company, which has been under intense scrutiny by government officials and lawmakers this year, disclosed that classified materials were transmitted to Kaspersky’s network between September 11, 2014 and November 17, 2014. In a previous explanation, the company said Kaspersky antivirus software detected malware on a computer located in the greater Baltimore area. Kaspersky later discovered a 7zip archive on the computer that had Equation Group malware and other materials with U.S. government classified markings.

Kaspersky’s new investigation details were issued in response to several media reports that claimed Russian state-sponsored hackers used Kaspersky’s antivirus software to identify and locate U.S. government data. The reports claimed that in 2015 an NSA contractor’s system was compromised by Russian hackers using Kaspersky antivirus scans, which led to a massive leak of confidential NSA files and Equation Group malware. The news reports also claimed Israeli intelligence penetrated Kaspersky’s network in 2014 and found classified NSA materials on its network.

The Equation Group is an advanced persistent threat (APT) group first identified by Kaspersky researchers in 2015 and later linked to the U.S. National Security Agency (NSA) in 2016, following disclosures by the hacking group known as the Shadow Brokers.

New details in Kaspersky’s investigation

Thursday’s report provided new details about the computer with Equation Group malware, which was believed to be the NSA contractor’s system. Kaspersky did not confirm or deny these reports, saying its software anonymizes users’ information and divulging details about the specific user in this case would violate its ethical and privacy standards.

The Kaspersky investigation revealed the suspected NSA contractor’s computer was “compromised by a malicious actor on October 4, 2014” as a result of a backdoor Trojan known as Smoke Loader or Smoke Bot. The compromise occurred during the nearly two-month span, from Sept. 11 to Nov. 17, 2014, in which Kaspersky’s software was scanning the computer.

Kaspersky said it believes the user turned Kaspersky’s antivirus software off at some point during that time frame in order to install a pirated version of Microsoft Office, which allowed Smoke Loader to activate. The report also noted Smoke Loader was attributed to a Russian hacker in 2011 and was known to be distributed on Russian hacker forums.

Kaspersky said once the classified markings were discovered in the 7zip archive materials, all data except the malware binaries was deleted under order of CEO Eugene Kaspersky. The company also said it “found no indication the information ever left our corporate networks.”

Kaspersky’s report appeared to suggest the threat actors who reportedly found the classified NSA data and Equation Group malware likely did so by hacking the computer directly with Smoke Loader and not, as media reports claimed, by hacking into Kaspersky’s network and abusing the company’s antivirus technology.

The company also said it’s possible the computer had other malware on it that Kaspersky didn’t detect.

“Given that system owner’s potential clearance level, the user could have been a prime target of nation states,” the report stated. “Adding the user’s apparent need for cracked versions of Windows and Office, poor security practices, and improper handling of what appeared to be classified materials, it is possible that the user could have leaked information to many hands. What we are certain about is that any non-malware data that we received based on passive consent of the user was deleted from our storage.”

Thursday’s report followed comments from Jeanette Manfra, assistant secretary for cybersecurity and communications at the U.S. Department of Homeland Security, who told the House Science, Space and Technology Oversight Subcommittee earlier this week that there was no conclusive evidence that Kaspersky software had been exploited to breach government systems.

Policy changes

The report also contained new information about how Kaspersky responded to the 2014 Equation Group malware discovery and the company policy changes that followed.

“The reason we deleted those files and will delete similar ones in the future is two-fold; We don’t need anything other than malware binaries to improve protection of our customers and secondly, because of concerns regarding the handling of potential classified materials,” the report states. “Assuming that the markings were real, such information cannot and will not [be] consumed even to produce detection signatures based on descriptions.”

Kaspersky said that those concerns led to the adoption of a new policy for the company that requires all analysts to “delete any potential classified materials that have been accidentally collected during anti-malware research or received from a third party.”

The report didn’t say whether or not Kaspersky ever notified the NSA or other government agencies about the Equation Group malware it discovered or the classified data contained in the 7zip archive. In a previous statement on the situation, the company stated, “As a routine procedure, Kaspersky Lab has been informing the relevant U.S. government institutions about active APT infections in the USA.” It’s also unclear why, after finding the classified U.S. government files, the company never disclosed Equation Group was connected to the NSA.

Kaspersky has not responded to requests for comment on these questions.

The company responded to media reports that claimed threat actors used Kaspersky antivirus scans to hunt for classified markings.

“We have done a thorough search for keywords and classification markings in our signature databases,” Kaspersky said. “The result was negative: we never created any signatures on known classification markings.”

Kaspersky did, however, acknowledge that a malware analyst created a signature for the word “secret” based on the discovery of the TeamSpy malware in 2013; the signature used a wildcard string pattern built from several keywords, including “secret.” The company hypothesized that a third party may have either misinterpreted the malware signature or maliciously used it against Kaspersky to spread false allegations.
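
Kaspersky did not publish the signature itself, but a wildcard keyword rule of the kind described is easy to picture. Below is a minimal, purely hypothetical Python sketch of how such a string-pattern signature might flag file names; the keyword list and patterns are illustrative assumptions, not Kaspersky's actual rule.

```python
import fnmatch

# Hypothetical wildcard patterns of the kind an analyst might write after
# finding malware that hunts for files by keyword. Illustrative only; not
# Kaspersky's actual TeamSpy signature.
SUSPICIOUS_NAME_PATTERNS = ["*secret*", "*confidential*", "*pass*"]

def matches_signature(filename: str) -> bool:
    """Return True if the file name matches any wildcard keyword pattern."""
    name = filename.lower()
    return any(fnmatch.fnmatch(name, pattern) for pattern in SUSPICIOUS_NAME_PATTERNS)

print(matches_signature("top_secret_plans.docx"))  # True
print(matches_signature("holiday_photos.zip"))     # False
```

As the company's hypothesis implies, a pattern this broad matches any file whose name contains "secret," classified or not, which is how such a signature could be misread as deliberately hunting classification markings.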

Three years in a row – Microsoft is a leader in the ODBMS Magic Quadrant

We’re happy to report that Gartner has positioned Microsoft in the Leaders Quadrant in the 2017 Magic Quadrant for Operational Database Management Systems again this year. This is the third consecutive year that Microsoft has been positioned farthest in completeness of vision and ability to execute in the operational database management systems market.

At Microsoft, we’re dedicated to helping both enterprises and individuals realize their full potential. Our industry position in Operational DBMS is due to the unequaled capabilities of SQL Server.

The release of SQL Server 2017 brings the power of SQL Server to Windows, Linux, and Docker containers for the first time ever. Developers are able to build intelligent applications using preferred languages and environments, while enjoying in-memory performance across workloads, mission-critical high availability, and in-database advanced analytics. You can develop once and deploy anywhere in a consistent experience across your datacenter and public cloud.
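
For a sense of what the container support looks like in practice, here is a brief sketch using the Docker SDK for Python to start a SQL Server 2017 Linux container. The image tag reflects the name Microsoft used at the 2017 release, and the SA password is a placeholder; treat both as assumptions to adapt rather than canonical values.

```python
import docker  # pip install docker

client = docker.from_env()

# Image name Microsoft used for SQL Server 2017 on Linux at release;
# adjust the tag if your registry differs.
IMAGE = "microsoft/mssql-server-linux:2017-latest"

container = client.containers.run(
    IMAGE,
    detach=True,
    name="sql2017",
    environment={
        "ACCEPT_EULA": "Y",
        "SA_PASSWORD": "Placeholder!Passw0rd",  # placeholder; supply a real secret
    },
    ports={"1433/tcp": 1433},  # expose SQL Server's default port
)
print(container.name, container.status)
```

This is the SDK equivalent of the docker run one-liner in Microsoft's quickstart documentation; either way, the point is that the same engine now runs anywhere a Linux container runs.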

SQL Server proves itself, year over year, to be the least vulnerable DBMS in the industry. Built for security from the ground up, SQL Server offers customers a layered protection approach that incorporates encryption, authentication, and monitoring and auditing at the disk, database, and application levels. Innovative security technologies like Always Encrypted, for encryption at rest and in motion, help transform global operations for the better.

Perhaps most noteworthy for organizations living in the real world of cost-benefit analyses, SQL Server 2017 remains one of the most cost-competitive DBMS offerings in the enterprise space. In fact, you can get all the robust business capabilities in SQL Server 2017 built into a single product SKU, without expensive add-ons — for one great, low total cost of ownership.

But don’t just take our word for it. We encourage you to take the time to read the full Gartner report.

And then take a moment to see how you can get free licenses when you migrate to SQL Server 2017. We’re confident you’ll find the industry-leading database you know and love — now across operating systems and application platforms, on-premises and in the cloud.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The complete Gartner document is available now.

Warning for Equifax security issues came months before breach

A new report alleges the Equifax security issues were far worse than originally thought and that some warnings may have gone unheeded for months before the company was breached.

A security researcher claims to have disclosed a number of Equifax security issues in December 2016 — approximately three months before the initial breach of Equifax systems.

One of the Equifax security issues detailed by the unnamed researcher, in a report by Motherboard, was an Equifax website that exposed personally identifiable information (PII), including names, city and state locations, Social Security numbers and birthdates, through a forced browsing bug.

Jake Williams, founder of consulting firm Rendition InfoSec LLC in Augusta, Ga., said this sort of bug was “inexcusable in this day and age.”

“[Andrew] ‘Weev’ Auernheimer went to prison over exploiting a forced browsing bug that revealed far less sensitive information than that revealed through the Equifax web applications,” Williams told SearchSecurity. “That a company would be notified about a forced browsing issue exposing PII and then fail to fix it in the current security climate borders on negligence.”

Beyond that bug, the researcher said they found Equifax servers running outdated software and vulnerable to SQL injection attacks, allowing shell access to those systems.
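
Both bug classes have standard, well-known mitigations. The sketch below is a minimal illustration (not Equifax's actual code, which has never been published, and with hypothetical names throughout): an explicit ownership check defeats forced browsing, and a parameterized query defeats SQL injection.

```python
import sqlite3

def get_customer_record(db: sqlite3.Connection, requested_id: int, session_user_id: int):
    """Hypothetical record lookup showing the two standard mitigations.

    1. Forced browsing: verify the authenticated user may see the
       requested record instead of trusting whatever ID appears in the URL.
    2. SQL injection: pass user input as a bound parameter (?) rather
       than concatenating it into the query string.
    """
    if requested_id != session_user_id:
        raise PermissionError("not authorized for this record")

    # The driver binds requested_id safely; no string splicing.
    return db.execute(
        "SELECT name, city, state FROM customers WHERE id = ?",
        (requested_id,),
    ).fetchone()

# The vulnerable pattern the researcher's findings imply, shown for contrast:
#   db.execute("SELECT ... WHERE id = " + request_args["id"])  # injectable,
#   and with no ownership check, any ID enumerates any record
```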

Peter Tran, general manager and senior director of worldwide advanced cyber defense practice at RSA Security, said the Equifax security issues were not unique but “the table stakes increase exponentially in PII intensive businesses.”

“Blind spots in vulnerability monitoring and visibility can go off the rails very fast, particularly over publicly web facing assets open to overwhelming amounts of probing and reconnaissance,” Tran told SearchSecurity. “It’s a double bubble — if one security layer pops, you can pop the other — i.e., the classic SQL injection blind spot.”

Equifax security response

After disclosing these problems to Equifax, the security researcher asked the company to at least take down public access to these servers. However, Equifax didn’t take action until June — approximately three months after the company had been breached via an unrelated Apache Struts vulnerability, and one month before the company detected that breach.

Hector Monsegur, director of assessment services at Seattle-based Rhino Security Labs, said the “entire situation is inexcusable,” but unfortunately he could also “see how vulnerability warnings may have gone under the radar.”

“This is common among organizations with large attack surfaces, vast amount of employees and no coordination between its various IT departments. Unless they drastically change their current state of security, I fear the situation may be getting worse,” Monsegur told SearchSecurity. “Eventually, large organizations with lax security will be facing a reality check: There are consequences to major blunders in security. Attorneys General across the United States have been taking action against companies who are not properly safeguarding financial or customer information. Being ‘too large to fail’ is no longer a free pass.”

Rick Holland, vice president of strategy for Digital Shadows, said the revelation of these latest Equifax security issues makes it “even more difficult to accept former CEO Richard Smith’s explanation that a single employee ‘not doing their job’ was the reason this intrusion occurred.”

“Systemic issues in Equifax’s vulnerability management program were more likely to have contributed to this breach than a single person. Given the nature of Equifax’s data, they were highly likely to be targeted by a vast array of threat actors, from nation-states to hacktivists to cybercriminals,” Holland told SearchSecurity. “If their security program is as weak as it is being reported, then you probably had multiple threat actors stepping all over themselves as they probed and pivoted across the environment.”

Jules Okafor, vice president of cyber risk programs at Fortress Information Security, said the Equifax security issues appeared systemic.

“Experts attribute Equifax’s breach to a combination of small, but incremental technical lapses. Yet breaches at large enterprises can be directly attributed to failed processes and priorities — innovation over security, single points of failure and a siloed approach to vulnerability risk management,” Okafor told SearchSecurity. “These are systemic issues that impact a security team’s ability to detect, respond and remediate critical threats in a timely fashion.”

DEFCON hopes voting machine hacking can secure systems

A new report offers recommendations based on the voting machine hacking research conducted at DEFCON 25, including basic cybersecurity guidelines, collaboration with local officials and an offer of free voting machine penetration testing.

It took less than an hour for hackers to break into the first voting machine at the DEFCON conference in July. This week, DEFCON organizers released a new report that details the results from the Voting Village and the steps needed to ensure election security in the future.

Douglas Lute, former U.S. ambassador to NATO and retired U.S. Army lieutenant general, wrote in the report that “last year’s attack on America’s voting process is as serious a threat to our democracy as any I have ever seen in the last 40+ years – potentially more serious than any physical attack on our Nation.”

“Loss of life and damage to property are tragic, but we are resilient and can recover. Losing confidence in the security of our voting process — the fundamental link between the American people and our government — could be much more damaging,” Lute wrote. “In short, this is a serious national security issue that strikes at the core of our democracy.”

In an effort to reduce the risks from voting machine hacking, DEFCON itself will be focusing more on the election systems. Jeff Moss, founder of DEFCON, said during a press conference for the report that access to voting machines is still a major hurdle.

“The part that’s really hard to get our hands on is the back-end software that ties the voting machines together — to tabulate, to accumulate votes, to provision a voting ballot, to run the election, to figure out a winner — and boy we really want to have a complete voting system to attack, so people can attack the network, they can attack the physical machines, they can go after the databases,” Moss said. “This is the mind-boggling part: just as this is the first time this is really being done — no NDAs — there’s never been a test of a complete system. We want a full end-to-end system so it’s one less thing people can argue about. We can say, ‘See? We did it here too.’”

DEFCON obtained the voting machines tested at the 2017 conference from secondhand markets, such as eBay, but hopes for more cooperation from election officials and the companies that make the voting equipment. Moss said it is still unclear what exactly DEFCON will be allowed to do in 2018, because the DMCA exemption that allows voting machine hacking currently needs to be renewed.

Immediate voting machine security

DEFCON officials noted that election security needs to be improved before the 2018 DEFCON conference so local officials can prepare for the 2018 midterm elections.

John Gilligan, board chair and interim CEO for the Center for Internet Security (CIS), said his organization was working “to take the elections ecosystem and to develop a handbook of best practices” around election security. CIS has invited DHS, NIST, the Election Assistance Commission, the National Association of Secretaries of State and other election officials to collaborate on the process.

“We have 400 or 500 people who currently collaborate with us, but we’re going to expand that horizon a bit because there are those who have specific expertise in election systems. The view is: let’s get together and very quickly — by the end of this calendar year — produce a set of best practices that will be given to the state and local governments,” Gilligan said in a news conference on Tuesday. “Our effort will complement what the Election Assistance Commission is developing presently with NIST.”

Jake Braun, cybersecurity lecturer at the University of Chicago and CEO of private equity firm Cambridge Global, headquartered in Washington, said the DEFCON team would provide free voting machine pen testing to any election officials that want the help.

“If you’re an election official, the thing you can do coming out of this is to contact DEFCON and offer to give out your schemes, your databases, give access to whatever else you want tested,” Braun said. “This is essentially free testing and training for your staff, and that would normally cost you millions of dollars to purchase on your own.”

Moss said fear of hackers is common across industries, but stressed that the team only wanted to help.

“This is the first scrutiny the manufacturers have had and they don’t know what to do. And that’s a pretty routine response. We saw that from the medical device world, car world, access control, ATMs,” Moss said. “When these industries first come into contact with hackers and people who are giving an honest opinion of their technology, they pull back and hide for a while. If you’re doing a good job, we’ll tell you, ‘Hey, that’s awesome.’ And, if you’re doing a poor job, we’ll say, ‘Can you please fix that?’ But the best part is it’s free. You’re getting some of the world’s best hackers doing pro bono work, giving away reports for free — normally these people make thousands of dollars a day — and they’re doing it just because they want to see what’s possible.”

The DEFCON voting machine hacking report noted a number of misconceptions surrounding the security of elections, but Harri Hursti, founding partner at Nordic Innovations Lab, said one of the biggest issues was the idea that there had “never been a documented incident where votes have been changed during a real election.”

“These machines don’t have the capability of providing you forensic evidence to see that. They cannot prove they are honest; they cannot prove they were not hacked. They simply don’t have the fundamental, basic capabilities of providing you that data,” Hursti said in the press conference. “The only way you can see if the machine was hacked is if the attacker wanted to be found. That’s the sad truth. It can be done without leaving a trace.”

Users plagued by iOS app security issues, according to new research

A new report shows despite Apple iOS’ reputation as a secure mobile operating system, users are at risk more often than it seems.

San Francisco-based mobile security software company Zimperium published its Global Threat Report for the second quarter of 2017, highlighting iOS app security issues plaguing Apple users and finding that one in 50 iOS applications could potentially leak data to third parties.

“Enterprises have no way to detect this type of risk unless they are scanning apps for security or privacy issues,” the report stated, noting that 1,101 out of 50,000 iOS apps the researchers scanned had at least one security or privacy issue.

“Through deep analysis, Zimperium researchers found the 1,101 apps downloaded over 50 million times. Companies and individuals should be concerned if these iOS apps are on their devices and inside of their networks.”

Zimperium looked at the iOS app security risks and threats detected on zIPS-protected devices between April 1 and June 30, 2017. It categorized what it found as device threats and risks, network threats and app threats.

When studying device threats and risks, the researchers found that, so far in 2017, there have been more registered common vulnerabilities and exposures on both Android and iOS devices than in all of 2016.

“While not all vulnerabilities are severe, there were still hundreds that enabled remote code execution (such as Stagefright and Pegasus) that forced the business world to pay attention to mobile device security,” the report stated.

Zimperium also found that over 23% of iOS devices were not running the latest version of the operating system, which is somewhat unexpected, since Apple controls the update process itself. Despite that, the report also stated that the number of iOS devices with mobile malware was extremely low, at just 1%. However, Zimperium found that iOS devices “have a greater percentage of suspicious profiles, apps using weak encryption and potentially retrieving private information from devices.

“The most concerning risks associated with iOS devices were malicious configuration profiles and ‘leaky apps,'” the report stated. “These profiles can allow third parties to maintain persistence on a device, decrypt traffic, synchronize calendars and contacts, track the device’s location and could allow a remote connection to control the device or siphon data from the device without the user’s knowledge.”

Additional findings include man-in-the-middle attacks that were detected on 80% of the scanned devices, as well as the seven most severe iOS app security issues: malware, keychain sharing, MD2 encryption, private frameworks, private info URL, reading UDID and stored info being retrieved during public USB recharges.

In other news:

  • Following the initial breach report, new developments revealed the CCleaner malware is worse than originally thought. Security company Morphisec and networking giant Cisco found and revealed that CCleaner, a Windows utility from Avast, had been taken over by hackers who installed backdoors in the software. The companies confirmed over 700,000 computers have been affected and now have backdoors on them. A few days after the reveal, Cisco Talos, the company’s security division, analyzed the command-and-control (C2) server to which the infected versions of CCleaner connect. “In analyzing the delivery code from the C2 server, what immediately stands out is a list of organizations, including Cisco, that were specifically targeted through delivery of a second-stage loader,” the Talos team wrote in a blog post. “Based on a review of the C2 tracking database, which only covers four days in September, we can confirm that at least 20 victim machines were served specialized secondary payloads.” According to Cisco Talos’ findings, Intel, Google, Microsoft, VMware and Cisco were among the targeted companies.
  • Media company Viacom Inc. is the latest major organization to expose sensitive information to the public due to a misconfigured AWS Simple Storage Service cloud storage bucket. According to Chris Vickery, director of cyber-risk research at UpGuard, based in Mountain View, Calif., Viacom exposed a wide array of internal resources, credentials and critical data. “Exposed in the leak are a master provisioning server running Puppet, left accessible to the public internet, as well as the credentials needed to build and maintain Viacom servers across the media empire’s many subsidiaries and dozens of brands,” UpGuard explained in a blog post. “Perhaps most damaging among the exposed data are Viacom’s secret cloud keys, an exposure that, in the most damaging circumstances, could put the international media conglomerate’s cloud-based servers in the hands of hackers.” The exposure, the research firm noted, could enable hackers to perform any number of damaging attacks through Viacom’s infrastructure. (A brief sketch of how a bucket’s public-access settings can be audited programmatically appears after this list.)
  • The U.S. District Court for Washington, D.C., has dismissed two lawsuits filed in regard to the 2015 data breach of the Office of Personnel Management (OPM). One of the lawsuits was filed by the American Federation of Government Employees, a federal workers union, alleging that the data breaches occurred as a result of gross negligence by federal officials. The second suit was filed by another union, the National Treasury Employees Union. It targeted the OPM’s acting director and alleged constitutional violations of the victims’ right to information privacy. This week, the court dismissed both lawsuits because neither plaintiff “has pled sufficient facts to demonstrate that they have standing.” In 2015, the OPM revealed two data breaches, affecting over 20 million people, mostly U.S. federal employees, in which hackers stole their sensitive information.
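
As referenced in the Viacom item above, here is a brief sketch, using the AWS boto3 SDK for Python, of how a bucket's ACL can be audited for grants that expose it to the public. The bucket name is a placeholder, and this is one simple check, not a complete S3 security review.

```python
import boto3

BUCKET = "example-media-bucket"  # placeholder bucket name

s3 = boto3.client("s3")

# Grantee URIs that make a bucket readable (or worse) by everyone,
# the kind of misconfiguration behind exposures like Viacom's.
PUBLIC_URIS = (
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
)

def find_public_grants(bucket: str):
    """Return any ACL grants on the bucket that apply to the public."""
    acl = s3.get_bucket_acl(Bucket=bucket)
    return [
        grant for grant in acl["Grants"]
        if grant.get("Grantee", {}).get("URI") in PUBLIC_URIS
    ]

for grant in find_public_grants(BUCKET):
    print(grant["Permission"], "granted to", grant["Grantee"]["URI"])
```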

What story does your timeline tell? Introducing the Timeline Storyteller custom visual for Microsoft Power BI

Timeline Storyteller, a new custom visual for Power BI I created with a team of other researchers at Microsoft, is now available in the Office Store for anyone to use.

Alberto Cairo, Knight Chair at the University of Miami and renowned data visualization professor, author, designer and practitioner, shared his thoughts on the new visual after seeing it presented at a recent Microsoft event.

“When humans began transforming information into pictures to enhance understanding, two of the first things they visualized were space and time. Timeline Storyteller is the latest landmark in a tradition that spans centuries, and what a great accomplishment it is,” says Cairo.

The work on this visual began in 2015 when we drew on our expertise in information visualization and data-driven storytelling and set out to explore ways to help people tell expressive data stories with timelines while maintaining perceptual and narrative effectiveness.

People have been using timelines for centuries to visually communicate stories about sequences of events, from historical and biographical data, to project plans and medical records. From hand-drawn illustrations to contemporary infographics, storytellers have employed a wide range of visual forms for communicating with timelines. Depending on how a timeline is drawn, different types of insights and temporal characteristics can be emphasized, including periodicity and synchronicity.

In recent years, there has been an emergence of interactive timeline visualization tools used for data-rich presentations and storytelling, especially within the data journalism community. Yet, most of these presentation tools adopt the linear, chronological timeline design popularized by Joseph Priestley in the late 18th century, and thus lack the expressivity to communicate a range of timeline narratives or allow viewers to visualize timeline data themselves in new and interesting ways.

[Figure: The linear, chronological form of Joseph Priestley’s Chart of Biography (1765) has dominated the design of contemporary timelines. Source: https://en.wikipedia.org/wiki/A_Chart_of_Biography]

We conducted a survey of hundreds of timelines published over the course of history from a broad range of sources including timeline visualization tools and visualization techniques proposed in academic research literature, as well as bespoke dataset-specific interactive timelines and timeline infographics.

We identified 14 design choices characterized by three dimensions: representation, scale, and layout. Representation, which refers to the overall shape of the path across the display, is the most visually salient aspect of a timeline. Scale is used to convey relations between events (e.g., order, duration and synchronicity) and refers to the correspondence between temporal distances and distances on the display. Layout is used to communicate relations between groups of events and describes how the timeline is partitioned into separate regions of the display. Given these dimensions, we also identified the viable combinations of representation, scale, and layout that correspond to different narrative purposes. This design space for timelines became the basis for the initial design of Timeline Storyteller.

[Figure: The 14 design choices, characterized by three dimensions (representation, scale, and layout), for expressive storytelling with timelines.]
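
To make the design space concrete, the following Python sketch enumerates combinations of the three dimensions. The dimension values shown (five representations, five scales, four layouts, totaling the 14 design choices) follow the research behind Timeline Storyteller, but the viability rule is a deliberately simplified stand-in for the paper's full compatibility analysis.

```python
from itertools import product

# The 14 design choices along three dimensions, per the research behind
# Timeline Storyteller (5 representations + 5 scales + 4 layouts = 14).
REPRESENTATIONS = ["linear", "radial", "grid", "spiral", "arbitrary"]
SCALES = ["chronological", "relative", "logarithmic",
          "sequential", "interim duration"]
LAYOUTS = ["unified", "faceted", "segmented", "faceted & segmented"]

def is_viable(representation: str, scale: str, layout: str) -> bool:
    """Simplified stand-in for the paper's compatibility rules.

    Example rule only: a spiral path presumes a cyclical reading of time,
    so pair it with a chronological scale. The real analysis identifies
    viable combinations with considerably more nuance.
    """
    if representation == "spiral" and scale != "chronological":
        return False
    return True

viable = [combo for combo in product(REPRESENTATIONS, SCALES, LAYOUTS)
          if is_viable(*combo)]
print(f"{len(viable)} viable combinations of {5 * 5 * 4} total")
```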

“Timeline Storyteller is a flexible tool that enables designers, journalists, and scientists to visualize time in multiple ways, some clear and straightforward —my preferred ones— others quirky and expressive. It’s a very flexible and easy to use tool that fills an underserved niche,” says Cairo.

The data storytelling tool was developed to realize the expressive potential of the timeline design space, combining a wide range of visual design ideas with modern techniques for presenting, annotating, and interacting with data. To create a data story with Timeline Storyteller, an author creates a series of scenes, where each scene has a unique filter state, design specification, and a set of associated annotations, images, and captions. Additionally, Timeline Storyteller uses animated transitions between the scenes of a story to promote a cohesive and engaging storytelling experience.
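
The scene model described above is easy to sketch in code. The hypothetical Python dataclasses below mirror that description (a filter state, a design specification, and associated annotations, images and captions per scene); they illustrate the concept and are not the custom visual's actual internal schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignSpec:
    """One point in the timeline design space described earlier."""
    representation: str = "linear"
    scale: str = "chronological"
    layout: str = "unified"

@dataclass
class Scene:
    """One step of a timeline story, with its own filters, design and notes."""
    filter_state: dict = field(default_factory=dict)  # e.g. {"era": "18th century"}
    design: DesignSpec = field(default_factory=DesignSpec)
    annotations: List[str] = field(default_factory=list)
    images: List[str] = field(default_factory=list)   # image URLs or paths
    captions: List[str] = field(default_factory=list)

# A story is an ordered list of scenes; playback animates the transitions
# between consecutive scenes.
story: List[Scene] = [
    Scene(captions=["All events, in chronological order"]),
    Scene(filter_state={"era": "18th century"},
          design=DesignSpec(representation="radial", layout="faceted"),
          annotations=["Zoom in on one era, faceted by person"]),
]
```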

Timeline Storyteller was initially released as a standalone web application in January 2017. Within the next few months, I demonstrated Timeline Storyteller at the Tapestry data storytelling Conference and at OpenVisConf, a practitioner conference centered on visualizing data on the web. Meanwhile at Microsoft, we partnered with the Power BI product team as well as Principal Researcher Chris White and his team to bring Timeline Storyteller to Power BI as a custom visual. Following in the footsteps of SandDance, a custom visual for Power BI that originated as a Microsoft Research project, we worked to make the custom visual available to Power BI users for free in the Office Store. Both Timeline Storyteller and SandDance are examples of the growing library of custom visuals that provide experiences beyond Power BI’s out-of-the-box visualization types and set Power BI apart as a robust tool for data storytelling.

Throughout the development process of the Timeline Storyteller custom visual, we were motivated and informed by clients’ timeline stories. Using the initial prototype version of Timeline Storyteller, we worked with the UK National Trust to produce a timeline story about some of the most famous historic sites in their portfolio. During James Phillips’ keynote at the 2017 Data Insights Summit, the National Trust’s Jon Townsend presented this story, providing an audience of thousands with a first glimpse of the storytelling capabilities of the new custom visual.

We have continued to showcase Timeline Storyteller in stories ranging from the history of the U.S. Open golf tournament to the progress of Artificial Intelligence at a variety of industry events.

See how 2016 U.S. Open champion Dustin Johnson stacks up against previous tournament winners.

We are seeing a lot of excitement and interest in the Timeline Storyteller custom visual for Power BI. For the first time, a single visual can help people tell stories about the history of epidemics spanning centuries, the development of severe hurricanes over the past several decades, or even the daily routines of famous creative people. Timeline Storyteller is of particular interest to data journalists, and has been featured on several industry sites including Storybench and Visualizing Data.

We understand that interactive, data-rich stories are in high demand, and that journalists need to be able to easily create unique data stories that not only inform and educate, but also engage and entertain. Power BI is a powerful data storytelling tool, and with Timeline Storyteller, journalists can now visualize a sequence of events in a compelling way using the large palette of design options that Timeline Storyteller provides. Plus, with Power BI’s publish-to-web functionality, journalists can easily publish their interactive timeline stories to their website, reaching an unlimited number of readers with the scale of the Microsoft cloud.

“Tools like Timeline Storyteller serve another purpose: they bridge the gap between code and presentation. Many graphic designers and journalists aren’t able to code visualizations from scratch. Timeline Storyteller, and other tools like it, abstract the creation process through a graphical user interface and, at the same time, allow people to customize the results in an almost unlimited manner,” says Cairo.

We encourage you to keep learning more about how Power BI can be used to tell your data stories.

Cisco revenues fall, likely to go lower

Cisco’s latest earnings report reflects the pains of a legacy vendor struggling to overhaul an outdated business model while rivals chip away at its market share.

Overall Cisco revenues dropped 4% year over year for the quarter ended July 29, to $12.1 billion, the company reported this week. Cisco expected the decline to continue in the current quarter, forecasting a reduction of between 1% and 3%. The drop in the October quarter would mark nearly two straight years of declines.

Cisco’s troubles are mostly due to the steady weakening of its switching business — the company’s largest. Sales of switches in the July quarter fell 9%. The company reported the same decline in its router business, another important hardware line.

Falling Cisco revenues show rivals cutting into market share

While Cisco stumbled, switching rivals Arista Networks and Juniper Networks reported double-digit growth in their June quarters.

“Rivals are chipping away at Cisco, for sure,” said Glenn O’Donnell, an analyst at Forrester Research. “In many ways, they [rivals] are reacting more effectively at what the market really needs.”

Cisco’s competitors have been more successful at selling to cloud and communication service providers that favor products less likely to tie them to a single vendor. At the same time, enterprises — Cisco’s core customers — are buying fewer switches, as they migrate more software to cloud providers.

“Cisco’s decline in their core market is another signal that the general networking vendor is an old business model,” said Andre Kindness, an analyst at Forrester.

Cisco’s solution to revenue drop

Cisco understands its dilemma and is gradually moving away from its legacy hardware approach to networking. The company has introduced software that centralizes network control, so operators no longer have to make changes box by box.

In June, Cisco introduced a central software console, called the Digital Network Architecture Center, for managing a campus network. The hardware underpinning is a new line of Catalyst switches, called the 9000 Series.

Cisco attributed its latest drop in switch revenue to the product launch. “Anytime we do a major platform announcement, particularly in switching, there is a period of time where our customers pause because they want to understand what this means,” Cisco CEO Chuck Robbins told financial analysts following the latest earnings report.

Within the enterprise data center — Cisco’s historic sweet spot — the company has been pushing customers to switch to its software-defined networking platform, called Application Centric Infrastructure. ACI is also dependent on Cisco hardware, namely the Nexus 9000 Series of switches.

Enterprise adoption of ACI, which Cisco started shipping in 2014, has been slow, according to analysts. In February, Gartner reported that only 30% of companies buying Nexus 9000 switches were also using ACI.

Forrester has found that many businesses are choosing VMware’s competing NSX because it requires fewer architectural changes within the data center. “Cisco ACI is an all-or-nothing proposition,” Kindness said.

Also, ACI is less flexible when working with third-party appliance vendors. Companies using ACI are often limited to products from Cisco partners for load balancing and firewalls.

“My clients don’t want to have one vendor dictate the other vendors,” Kindness said.

Cisco revenues from security slow

To mitigate its troubles in networking, Cisco has been focusing on high-growth areas of the tech industry, such as the internet of things, technology for connecting data centers to the cloud, and security. But Cisco’s security revenues failed to meet analysts’ expectations for the July quarter, growing just 3%, significantly less than the 16% increase a year earlier and the 9% growth in the previous quarter.

Despite the slowdown, Robbins said he had “zero concerns about the business,” because the company has recently recorded “some of the strongest order growth as we’ve seen in the last two years.”

Cisco’s approach to security is to sell it as part of an overall purchase of networking infrastructure, and not as a solo product. As a result, security sales will tend to move up or down depending on sales of switches and other products.

“I think we should wait a few quarters to see where this goes,” said Patrick Moorhead, an analyst with Moor Insights & Strategy, based in Austin, Texas. “The security products are so linked to their networking products that we didn’t see the numbers that the street [Wall Street] had expected.”

If Moorhead has it right, then security sales will improve as sales of switches head north. But the indicators suggest Cisco revenues have yet to reach bottom.