Tag Archives: Report

McAfee details rise in blockchain threats, cryptocurrency attacks

A new McAfee report on blockchain threats shows cryptomining malware grew more than 600% in the first quarter this year.

McAfee’s “Blockchain Threat Report” details the massive increase in cyberattacks against cryptocurrency owners, exchanges and other companies leveraging blockchain as the value of those cryptocurrencies has surged over the last year. Steve Povolny, head of advanced threat research at McAfee, said the intent of the report is to create a baseline for the industry as it deals with increased blockchain threats that use many of the same attack techniques and methods of the last five to 10 years.

“We’ve seen an explosion in cryptocurrency value recently,” Povolny said. “Hundreds of them were created in a very short time, and now we’re seeing threat actors trying to capitalize on that value.”

While attackers have learned to adopt different attack methods that target both consumers and businesses, according to McAfee researchers, the four major attack vectors are familiar threats: phishing, malware, implementation vulnerabilities and technology attacks. Phishing is the most familiar blockchain attack due to its prevalence and success rate, the researchers wrote. Malware, meanwhile, has exploded over the last year; the report shows the total number of cryptomining malware samples increased 629% quarter-over-quarter in Q1 of this year. The report also notes that malware developers began to shift from ransomware to cryptocurrency mining in the last six months, with “ransomware attacks declining 32% in Q1 2018 from Q4 2017 while coin mining increased by 1,189%.”

Technology attacks, as explained by the researchers, are threats like dictionary attacks used against cryptocurrency private keys. Lastly, implementation vulnerabilities refer to flawed deployments of blockchain technology; the report cites examples such as the 2017 attack on blockchain startup Iota, where attackers exploited cryptographic vulnerabilities to create hash collisions and forge signatures, which enabled the hackers to steal coins from users’ digital wallets. Povolny stressed these vulnerabilities are not flaws with blockchain itself, which has proved to be secure so far.
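To make the dictionary-attack vector concrete, the toy Python sketch below shows why a private key derived from a guessable passphrase, as in so-called brain wallets, falls quickly to a wordlist attack. The key-derivation scheme, wordlist and passphrase are all illustrative assumptions, not details from the McAfee report.

```python
import hashlib

# Toy "brain wallet" key derivation: a passphrase is hashed into a private key.
# Real wallets use ECDSA keys; SHA-256 of a passphrase stands in for the idea.
def derive_private_key(passphrase):
    return hashlib.sha256(passphrase.encode()).hexdigest()

# The attacker needs no access to the victim, only a wordlist of likely
# passphrases to derive candidate keys from.
WORDLIST = ["password", "letmein", "correct horse battery staple", "hunter2"]

def dictionary_attack(target_key):
    """Try every wordlist entry until the derived key matches the target."""
    for guess in WORDLIST:
        if derive_private_key(guess) == target_key:
            return guess
    return None

victim_key = derive_private_key("hunter2")  # the victim chose a weak passphrase
print(dictionary_attack(victim_key))        # -> hunter2
```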

The “Blockchain Threat Report” states, “In most cases, the consumers of blockchain technology are the easiest targets. Due to a widespread start-up mentality, in which security often takes a backseat to growth, cryptocurrency companies often fall in this category.”

Povolny said the issue of security within cryptocurrency and blockchain creates a two-sided problem. The first side revolves around the companies that initially rushed to capitalize on cryptocurrency but didn’t complete basic security checks and risk assessments; those shortcomings, which include a lack of proper access controls, make them easy targets for threat actors, he said. The second side is the financial motivation: many cryptocurrencies’ values reached all-time highs in late 2017, when Bitcoin was valued at almost $20,000 per coin, catching the attention of hackers. This two-sided problem created a continuous cycle in which wallets and ledgers were built without a complete understanding of the security risks or any security implemented around the programs, McAfee researchers claim.

The report also notes that “recovering from cryptocurrency theft is more difficult and complicated than with most other currencies due to their decentralized nature.” In order to secure a network, the report adds, a tailored risk assessment should be conducted.

As industries begin to implement their own blockchain technology, users should prepare for cybercriminals to keep developing new techniques to compromise it, McAfee researchers wrote. However, because there is not a clear understanding of where these risks lie, trust may be placed in unwarranted blockchain applications. To keep cryptocurrency wallets safe, Povolny recommends storing them locally on a computer that lacks network access, and he notes that we may not see people flock to a currency like this again.

Despite the increase in threats, Povolny said the surge in cryptocurrency startups and blockchain deployments is expected to continue.

Research claims ‘widespread’ Google Groups misconfiguration troubles

A new report claims a significant number of G Suite users misconfigured Google Groups settings and exposed sensitive data, but the research leaves unanswered questions about the extent of the issue.

According to Kenna Security research, there is a “widespread” Google Groups misconfiguration problem wherein Groups are set to public and are exposing potentially sensitive email data that could lead to “spearphishing, account takeover, and a wide variety of case-specific fraud and abuse.” Last year, RedLock’s Cloud Security Intelligence team also found Google Groups misconfigurations responsible for the exposure of data from hundreds of accounts.

Kenna said it sampled 2.5 million top-level domains and found 9,637 public Google Groups. Of those public Groups, the researchers sampled 171 and determined 31% of those organizations “are currently leaking some form of sensitive email” with a confidence level of 90%.

“Extrapolating from the original sample, it’s reasonable to assume that in total, over 10,000 organizations are currently inadvertently exposing sensitive information,” Kenna wrote in its blog post. “The affected organizations including Fortune 500 organizations; Hospitals; Universities and Colleges; Newspapers and Television stations; Financial Organizations; and even U.S. government agencies.”
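Kenna’s 10,000-plus figure is an extrapolation, and the arithmetic behind it can be sketched from the published numbers. The sample-coverage assumption below is ours, not Kenna’s; it simply shows how the sampled counts scale up to the headline claim.

```python
# Reconstructing Kenna's extrapolation from the numbers in its post.
public_groups_found = 9_637      # public Groups across the 2.5M sampled domains
audited = 171                    # public Groups manually reviewed
leak_rate = 0.31                 # share judged to be leaking sensitive email

leaking_in_sample = leak_rate * public_groups_found
print(f"Implied leaky organizations in the sample: {leaking_in_sample:.0f}")  # ~2,987

# Reaching "over 10,000" requires assuming the 2.5M-domain sample covers only
# a fraction of all G Suite-enabled domains; the coverage figure below is an
# illustrative assumption, not a number Kenna published.
assumed_coverage = 0.25
print(f"Extrapolated total: {leaking_in_sample / assumed_coverage:.0f}")      # ~11,950
```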

For context, there are currently more than 3 million paid G Suite accounts and an unknown number of free G Suite accounts, and Kenna acknowledged via email that they “do not believe [they] tested the vast majority of G Suite enabled domains.” Additionally, Google confirmed that Groups are set to private by default and an administrator would need to actively choose to make a Group public or allow other users to create public Groups.

It is unclear how many G Suite accounts are set to public, but a source close to the situation said the vast majority of Google Groups are set to private, and Google has sent out messages to users who may be affected with instructions on how to fix the Google Groups misconfiguration.

Specifics versus extrapolation         

Kenna Security’s research likened the Google Groups misconfiguration issue to the recent spate of Amazon Web Services (AWS) exposures in which S3 buckets were accidentally left public.

“Ultimately, each organization is responsible for the configuration of their systems. However, there are steps that can be taken to ensure organizations can easily understand the public/private state for something as critical as internal email,” a Kenna spokesperson wrote via email. “For example, when the AWS buckets leak occurred, AWS changed its UX, exposing a ‘Public’ badge on buckets and communicated proactively to owners of public buckets. In practice, public Google Group configurations require less effort to find than public S3 buckets, and often have more sensitive information exposed, due to the nature of email.”

However, a major difference between the research from Kenna and that done by UpGuard in uncovering multiple public AWS buckets is in the details. Kenna is extrapolating from a sample to claim approximately 10,000 of 3 million Google Groups (0.3%) are misconfigured, and the examples of exposed emails reveal the potential for spearphishing attacks or fraud.

On the other hand, UpGuard specifically attributed the exposed data it found, including Republican National Committee voter rolls for 200 million individuals, info on 14 million Verizon customers, data scraped from LinkedIn and Facebook, and NSA files detailing military projects.  

Alex Calic, chief strategy and revenue officer of The Media Trust, said Google “made the right call by making private the default setting.”

“At the end of the day, companies are responsible for collaborating with their digital partners/vendors on improving and maintaining their security posture,” Calic wrote via email. “This requires developing and sharing their policies on what information can be shared on workplace communication tools like Google Groups and who can access that information, keeping in mind that — given how sophisticated hackers are becoming and the ever-present insider threat, whether an attack or negligence — there is always some risk that the information will see the light of day.”

Federal cybersecurity report says nearly 75% of agencies at risk

The latest federal cybersecurity report holds little good news regarding the security posture of government agencies, and experts are not surprised by the findings.

The Office of Management and Budget (OMB) and the Department of Homeland Security (DHS) developed the report in accordance with President Donald Trump’s cybersecurity executive order issued last year. The report acknowledged the difficulties agencies face in terms of budgeting, maintaining legacy systems and hiring in the face of the cybersecurity skills gap, and it identified 71 of 96 agencies as being either “at risk or high risk.”

“OMB and DHS also found that federal agencies are not equipped to determine how threat actors seek to gain access to their information. The risk assessments show that the lack of threat information results in ineffective allocations of agencies’ limited cyber resources,” OMB and DHS wrote in the report. “This situation creates enterprise-wide gaps in network visibility, IT tool and capability standardization, and common operating procedures, all of which negatively impact federal cybersecurity.”

The federal cybersecurity report tested the agencies involved under 76 metrics and identified four major areas of improvement: increasing threat awareness, standardizing IT capabilities, consolidating security operations centers (SOCs), and improving leadership and accountability.

Greg Touhill, president of Cyxtera Federal Group, based in Coral Gables, Fla., and former CISO for the United States, said the report was an “accurate characterization of the current state of cyber risk and a reflection of the improvements made over the last five years in treating cybersecurity as a risk management issue, rather than just a technology problem.”

“I am concerned that the deletions of and vacancies in key senior cyber leadership positions [are] sending the wrong message about how important cybersecurity is to the government workforce, commercial and international partners, and potential cyber adversaries,” Touhill wrote via email. “As national prosperity and national security are dependent on a strong cybersecurity program that delivers results that are effective, efficient and secure, I believe cybersecurity ought to be at the top of the agenda, and we need experienced cyber leaders sitting at the table to help guide the right decisions.”

Agencies at risk

The federal cybersecurity report said many agencies lack situational awareness and noted this has been a long-standing issue in the U.S. government.


“For the better part of the past decade, OMB, the Government Accountability Office, and agency [inspectors general] have found that agencies’ enterprise risk management programs do not effectively identify, assess, and prioritize actions to mitigate cybersecurity risks in the context of other enterprise risks,” OMB wrote. “In fact, situational awareness is so limited that federal agencies could not identify the method of attack, or attack vector, in 11,802 of the 30,899 cyber incidents (38%) that led to the compromise of information or system functionality in [fiscal year] 2016.”

Sherban Naum, senior vice president of corporate strategy and technology at Bromium, based in Cupertino, Calif., said improving information sharing might not “address the protection component.”

“Sharing information in real time of an active and fully identified attack is critical. However, more information alone won’t help if there is no contextual basis to understand what was attacked, what vulnerability was leveraged, the attacker’s intent and impact to the enterprise,” Naum said. “I wonder what systems are in place or are needed to process the real-time threat data to then automatically protect the rest of the federal space.”

Not all of the news was bad. OMB noted that 93% of users in the agencies studied use multifactor authentication in the form of personal identity verification cards. However, the report said this was only the beginning, as “agencies have not matured their access management capabilities” for modern mobile use.

“One of the most significant security concerns that results from the current decentralized and fragmented IT landscape is ineffective identity, credential, and access management processes,” OMB wrote. “Fundamentally, any organization must have a clear understanding of the people, assets, and data on its networks.”

The federal cybersecurity report acknowledged the number of high-profile data leaks and breaches across government systems in recent years and said the situation there is not improving.

“Federal agencies do not have the visibility into their networks to effectively detect data exfiltration attempts and respond to cybersecurity incidents. The risk assessment process revealed that 73 percent of agency programs are either at risk or high risk in this critical area,” OMB wrote. “Specific metrics related to data loss prevention and exfiltration demonstrate even greater problems, with only 40 percent of agencies reporting the ability to detect the encrypted exfiltration of information at government-wide target levels. Only 27 percent of agencies report that they have the ability to detect and investigate attempts to access large volumes of data, and even fewer agencies report testing these capabilities annually.”
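The 40 percent figure concerns detecting exfiltration even when the outgoing data is encrypted. One common building block for that capability is entropy analysis of outbound payloads, since encrypted or compressed data looks nearly random. The Python sketch below is a minimal illustration of the idea under our own assumptions (the threshold and port list are invented), not a description of any agency’s tooling:

```python
import math
import os
from collections import Counter

def shannon_entropy(data):
    """Bits per byte; values near 8.0 suggest encrypted or compressed content."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

ENTROPY_THRESHOLD = 7.5          # illustrative cutoff; real tools tune per protocol
PLAINTEXT_PORTS = {25, 53, 80}   # ports that normally carry readable traffic

def flag_possible_exfiltration(payload, dest_port):
    # High-entropy payloads on normally-plaintext channels are worth a look.
    return dest_port in PLAINTEXT_PORTS and shannon_entropy(payload) > ENTROPY_THRESHOLD

print(flag_possible_exfiltration(os.urandom(4096), dest_port=53))   # True
print(flag_possible_exfiltration(b"GET /index.html HTTP/1.1", 80))  # False
```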

Additionally, only 16% of agencies have properly implemented encryption on data at rest.

Suggested improvements

The federal cybersecurity report had suggestions for improving many of the poor security findings, including consolidating email systems, creating standard software configurations and a shared marketplace for software, and improving threat intelligence sharing across SOCs. However, many of the suggestions related directly to following National Institute of Standards and Technology (NIST) Cybersecurity Framework guidelines, the Cyber Threat Framework developed by the Office of the Director of National Intelligence, or DHS’ Continuous Diagnostics and Mitigation (CDM) program.

Katherine Gronberg, vice president of government affairs at ForeScout Technologies, based in San Jose, Calif., said the focus of CDM is on real-time visibility.

“For example, knowing you have 238 deployed surveillance cameras found to have a particular vulnerability is a good example of visibility. Knowing that one or more of those cameras is communicating with high-value IT assets outside of its segment is further visibility, and then seeing that a camera is communicating externally with a known, malicious command-and-control IP address is the type of visibility that helps decision-making,” Gronberg wrote via email. “CDM intends to give agencies this level of real-time domain awareness in addition to securing data. It’s worth noting that many agencies are now moving to Phase 3 of CDM, which is about taking action on the problems that are discovered.”
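Gronberg’s camera example describes an escalating series of checks: inventory, segment violations, then known-bad destinations. A minimal sketch of the last two checks might look like the following, where the device inventory, segment assignment and threat list are all invented for illustration:

```python
import ipaddress

# Invented inventory: each device and the segment it is allowed to talk within.
DEVICE_SEGMENTS = {"camera-017": ipaddress.ip_network("10.20.0.0/16")}

# Invented threat intelligence: known command-and-control addresses.
KNOWN_C2 = {ipaddress.ip_address("203.0.113.66")}

def assess_flow(device, dst):
    """Classify one outbound flow the way a CDM-style dashboard might."""
    dst_ip = ipaddress.ip_address(dst)
    if dst_ip in KNOWN_C2:
        return "ALERT: communicating with a known C2 address"
    if dst_ip not in DEVICE_SEGMENTS[device]:
        return "WARN: traffic leaving the device's assigned segment"
    return "OK: within segment"

print(assess_flow("camera-017", "10.20.4.9"))     # OK
print(assess_flow("camera-017", "10.99.1.1"))     # WARN
print(assess_flow("camera-017", "203.0.113.66"))  # ALERT
```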

Katie Lewin, federal director for the Cloud Security Alliance, said “standardization is an effective tool to get the best value from resources,” especially given that many risks faced by government agencies are due to the continued use of legacy systems.

“Standardized, professionally managed cloud systems will significantly help reduce risks and eliminate several threat vectors,” Lewin wrote via email. “If agencies adopt DHS’s Continuous Diagnostics and Mitigation process, they will not have to develop and reinvent custom programs. However, as with all standards, there needs to be some flexibility. Agencies should be able to modify a standard approach within defined limits. Failure to involve agencies in developing a common approach and in defining the boundaries of flexibility will result in limited acceptance and adoption of the common approach.”

Gary McGraw, vice president of security technology at Synopsys Inc., based in Mountain View, Calif., said focusing on standards may not hold much improvement.

“The NIST Framework has lots of very basic advice and is very useful. It would be a step in the right direction. However, it is important to keep in mind that standards generally reflect the bare minimum,” McGraw said. “Organizations that view security solely as a compliance requirement generally fall short, compared to others that treat it as a core or enabling component of their operations.”

Michael Magrath, director of global regulations and standards at OneSpan, said, “Improving resource allocations is crucial to improving our federal cyberdefenses.”

“With $5.7 billion in projected spending across federal civilian agencies, some agencies may cry poor. The report notes that email consolidation can save millions of dollars each year, and unless agencies have improved efficiencies like email consolidation, implemented electronic signatures and migrated to the cloud, there remains an opportunity to reallocate funds to better protect their systems,” Magrath said. “The report also notes that agencies are operating multiple versions of the same software. This adds unnecessary expense, and as more and more agencies migrate to the cloud, efficiencies and cost reductions should follow, enabling agencies to reallocate budget and IT resources to other areas.”

Apple transparency report shows national security requests rising

The Apple transparency report for the second half of 2017 showed national security requests on the rise, and the number of devices included in requests is up sharply.

The latest semiannual Apple Report on Government and Private Party Requests for Customer Information detailed requests by governments around the world from July 1, 2017, through Dec. 31, 2017. According to Apple, although overall device requests are down, governments around the world have been using fewer requests to attempt to get information on far more accounts.

The Apple transparency report showed a slight year-over-year decrease in the total number of device requests received worldwide (30,184 in the second half of 2016 versus 29,718 in H2 2017), but the number of devices impacted by those requests more than doubled from 151,105 to 309,362.
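The shift is easier to see as a ratio of devices to requests. Here is a quick worked calculation from the report’s own figures (the arithmetic is ours, not Apple’s):

```python
# Device requests and devices covered, from Apple's H2 2016 and H2 2017 reports.
requests_2016, devices_2016 = 30_184, 151_105
requests_2017, devices_2017 = 29_718, 309_362

print(f"H2 2016: {devices_2016 / requests_2016:.1f} devices per request")  # ~5.0
print(f"H2 2017: {devices_2017 / requests_2017:.1f} devices per request")  # ~10.4
```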

Apple is not alone in receiving more government data requests; Google has reported similar increases, but Apple noted it has complied with a higher percentage of government data requests in the second half of 2017 (79%) compared to the same time period in 2016 (72%).

Apple’s transparency report shows the company has been complying with more of the government requests across multiple request types. Apple’s compliance with financial information requests was up year over year from 76% to 85%; account-based request compliance was up from 79% to 82%; and only compliance with emergency requests went down from 86% to 82%.

National security requests also rose sharply, according to the Apple transparency report. In the second half of 2016, Apple received between 5,750 and 5,999 national security requests and complied with the majority of them (between 4,750 and 4,999). In the same time period in 2017, Apple received more than 16,000 national security requests, but only provided data to the U.S. government in about half of those cases.

Richard Goldberg, principal and litigator at the law firm Goldberg & Clements in Washington, D.C., said he was struck by the large percentage of U.S. government requests made by either national security request or subpoena.

“Although Apple has challenged certain government requests aggressively in public, we don’t know how aggressive the company has been in private — which is especially relevant because these requests typically do not require a judge’s approval,” Goldberg said via email. “So the government collects this information, and it may never see the inside of a courtroom.”

Additional information

Goldberg added that the general level of detail in Apple’s transparency report is helpful, but suggested Apple “should break out administrative subpoenas from all other types.”

“Administrative subpoenas can have broad scope, because they often need only be related to something the agency is permitted to investigate, and they need not be connected to a grand jury proceeding or active litigation,” Goldberg said. “It’s a one-sided way for the government to demand information with little to no oversight, unless the recipient chooses to fight. And we don’t know how Apple makes that decision.”

According to Apple, the predominant reason for financial information requests around the world was credit card and iTunes gift card fraud, and in multiple regions — including the U.S. — a “high number of devices specified in requests [was] predominantly due to device repair fraud investigations, fraudulent purchase investigations and stolen device investigations.”

It is unclear what data in the Apple transparency report correlates to the allegedly large number of devices the FBI and other law enforcement cannot access due to encryption, nor is it clear which data in the report correlates to iCloud backup data, which Apple has previously admitted to handing over to law enforcement.

SearchSecurity contacted Apple for clarification on these issues and Apple referred to its Legal Process Guidelines, which detailed the types of data in iCloud backups that Apple would be able to provide to law enforcement, including the subscriber’s name, address, email, telephone, mail logs, email content, iMessage data, SMS, photos and contacts.

However, Apple did note in the report that it would be adding “government requests to take down Apps from the App Store in instances related to alleged violations of legal and/or policy provisions,” starting with the transparency report for the second half of 2018.

The Microsoft Cloud can save customers 93 percent and more in energy and carbon efficiency

New report outlines how businesses moving from on-premises datacenters to the Microsoft Cloud can achieve sustainable innovation

REDMOND, Wash. — May 17, 2018 — A new report issued Thursday by Microsoft Corp. in partnership with WSP shows significant energy and carbon emissions reduction potential from the Microsoft Cloud when compared with on-premises datacenters. These gains, as much as 93 percent more energy efficient and as high as 98 percent more carbon efficient, are due to Microsoft’s extensive investments in IT efficiency from chip-to-datacenter infrastructure, as well as renewable energy.

“The world is producing more data than ever, making our infrastructure decisions about how to power this digital transformation incredibly important,” said Brad Smith, president and chief legal officer, Microsoft. “Today’s report confirms what we’ve long believed — that investing in sustainability is good for business, good for customers and good for the planet.”

Specifically, the report found that cloud investments made by Microsoft in IT operational efficiency, IT equipment efficiency, datacenter infrastructure efficiency and renewable electricity were responsible for the environmental benefits. These efficiencies translate into both energy and carbon savings for Microsoft and customers using Microsoft Cloud services.

Microsoft Cloud services achieve energy and emissions reductions in comparison with every on-premises deployment scenario assessed — Microsoft Azure Cloud Compute, Azure Storage, Exchange Online and SharePoint Online.

With more regions than any other cloud provider, Microsoft provides cloud services to customers around the world. As customers across all industries move to the cloud, sustainability and environmental responsibility are key factors in their choice of cloud provider.

“Schneider Electric chose the Microsoft Cloud to power our numerous cloud-based offerings, and it has helped us achieve our goal of becoming a global leader in sustainable energy management,” said Michael MacKenzie, vice president, EcoStruxure Technology Platform – IoT & Digital Offers, Schneider Electric. “The fact that Microsoft shares our sustainability values and focus on decreasing environmental impact makes the company a natural partner for us.”

“When organizations choose low-carbon cloud computing, they are taking an important step forward on sustainability,” said Lance Pierce, president of CDP North America. “Sustainable digital transformation, powered by a cleaner cloud, enables the creation of a sustainable and thriving economy that works for people and planet in the long term.”

Learn more about Microsoft’s investments and approach to sustainability in the cloud at https://blogs.microsoft.com/on-the-issues/?p=58951. The report can be found in full at “The Carbon Benefits of Cloud Computing: A Study on the Microsoft Cloud.”

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications, (425) 638-7777,

rrt@we-worldwide.com 

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

 


Accenture: Intelligent operations goal requires data backbone

A newly released report co-authored by Accenture and market researcher HfS reveals 80% of the global enterprises surveyed worry about digital disruption, but many of those companies lack the data backbone that could help them compete.

The report stated that large organizations are “concerned with disruption and competitive threats, especially from new digital-savvy entrants.” Indeed, digital disrupters such as Uber and Lyft in personal transportation, Airbnb in travel and hospitality, and various fintech startups have upset the established order in those industries. The Accenture-HfS report views “intelligent operations” as the remedy for the digital challenge and the key to bolstering customer experience. But the task of improving operations calls for organizations to pursue more than a few mild course corrections, according to Debbie Polishook, group chief executive at Accenture Operations, a business segment that includes business process and cloud services.

In the past, enterprises that encountered friction in their operations would tweak the errant process, add a few more people and take on a Lean Six Sigma project, she noted. Those steps, however, won’t suffice in the current business climate, Polishook said.

“Given what is happening today with the multichannel, with the various ways customers and employees can interact with you, making tiny tweaks is not going to get it done and meet the expectations of your stakeholders,” she said.

[Graphic: Organizations struggle to leverage their data]

Hard work ahead

The report, which surveyed 460 technology and services decision-makers in organizations with more than $3 billion in revenue, suggested professional services firms such as Accenture will have their work cut out for them as they prepare clients for the digital era.

The survey noted most enterprises struggle to harness data with an eye toward improving operations and achieving competitive advantage. The report stated “nearly 80% of respondents estimate that 50% [to] 90% of their data is unstructured” and largely inaccessible. A 2017 Accenture report also pointed to a data backbone deficit among corporations: More than 90% of the respondents to that survey said they struggle with data access.

In addition, half of the Accenture-HfS report respondents who were surveyed acknowledged their back office isn’t keeping pace with the front office demands to support digital capabilities.

“Eighty percent of the organizations we talked to are concerned with digital disruption and are starting to note that their back office is not quite keeping up with their front office,” Polishook said. “The entire back office is the boat anchor holding them back.”

That lagging back office is at odds with enterprises’ desire to rapidly roll out products and services. An organization’s operations must be able to accommodate the demand for speed in the context of a digital, online and mobile world, Polishook said.

Enterprises need a “set of operations that can respond to these pressures,” she added. “Most companies are not there yet.”

One reason for the lag: Organizations tend to prioritize new product development and front office concerns when facing digital disruption. Back office systems such as procurement tend to languish.

“Naturally, as clients … are becoming disrupted in the market, they pay attention first to products and services,” Polishook said. “They are finding that is not enough.”

The report’s emphasis on revamped operations as critical to fending off digital disruption mirrors research from MIT Sloan’s Center for Information Systems Research. In a presentation in 2017, Jeanne Ross, principal research scientist at the center, identified a solid operational backbone as one of four keys to digital transformation. The other elements were strategic vision, a focus on customer engagement or digitized solutions and a plan for rearchitecting the business.

The path to intelligent operations

The Accenture-HfS report identified five essential components necessary for intelligent operations: innovative talent, a data backbone, applied intelligence, cloud computing and a “smart partnership ecosystem.”

As for innovative talent, the report cited “entrepreneurial drive, creativity and partnering ability” as enterprises’ top areas of talent focus.


“One of the most important pieces getting to intelligent operations is the talent,” Polishook said. She said organizations in the past looked to ERP or business process management to boost operations, but contended there is no technology silver bullet.

The data-driven backbone is becoming an important focus for large organizations. The report stated more than 85% of enterprises “are developing a data strategy around data aggregation, data lakes, or data curation, as well as mechanisms to turn data into insights and then actions.” Big data consulting is already a growing market for channel partners.

In the area of applied intelligence, about 90% of the enterprises surveyed identified automation, analytics and AI as technologies that will emerge as the cornerstone of business and process transformation. Channel partners likewise count the AI field and the expanded use of automation tools such as robotic process automation among the top anticipated trends of 2018.

Meanwhile, more than 90% of large enterprises expect to realize “plug-and-play digital services, coupled with enterprise-grade security, via the cloud,” according to the Accenture-HfS report. And a like percentage of respondents viewed partnering with an ecosystem as important for exploiting market opportunities. The report said enterprises of the future will create “symbiotic relationships with startups, academia, technology providers and platform players.”

The path to achieving intelligent operations calls for considerable effort among all partners involved in the transformation.

“There is a lot of heavy lifting to be done,” Polishook said.

CIA attributes NotPetya attacks to Russian spy agency

An unreleased CIA report is alleged to officially name Russia’s military spy agency as the source of the NotPetya ransomware and the initial attacks against Ukraine.

The CIA reportedly concluded in November 2017 that Russia’s GRU military intelligence agency was responsible for the NotPetya attacks in June 2017. According to The Washington Post, unnamed officials said the classified CIA report attributed the NotPetya attacks to Russia’s GRU and said the hackers who created the ransomware worked for the Russian military’s GTsST, or Main Center for Special Technology.

The NotPetya attacks began by targeting Ukrainian agencies but quickly spread through the use of the EternalBlue exploit developed by the NSA and used in the WannaCry ransomware attacks.

Attributing the attacks to Russia is not in itself surprising, as security researchers said in June that Russia was the likely threat actor, given that the initial NotPetya attacks targeted Ukrainian government agencies through multiple software backdoors in the M.E.Doc tax program. However, experts noted that the CIA likely wanted to be certain before making any statement.

Tim Erlin, vice president of product management and strategy at Tripwire, the information security company headquartered in Portland, Ore., said “attributing cyberattacks to specific attackers or groups can be a challenging task.”

“It’s not always possible to make a direct connect, and indirect inferences are required to come to a conclusion. Accurate attribution is broadly valuable. While organizations should focus on the solid application of foundational controls first, characterizing the threat environment in terms of changing attackers can help prioritize more advanced protections,” Erlin told SearchSecurity. “It’s hard to say why the CIA didn’t publish this information sooner, though it’s important to realize that a three-month delay in disclosing this kind of nation-state attribution isn’t a very long time.”

The NotPetya attacks and Russian aggression

Tom Kellermann, CEO of Strategic Cyber Ventures LLC in Augusta, Ga., said the CIA likely “withheld attribution to prevent their sources and methods from being discovered.”

“The public announcement is significant as it is meant to warn the American public of the significant cyber threat posed by Russia,” Kellermann told SearchSecurity. “Cold War cyberattacks against the U.S. have dramatically increased over the past six weeks, as evidenced by the resurgence of Fancy Bear coming out of hibernation. We are under siege.”

Chris Morales, head of security analytics at Vectra Networks, a cybersecurity company based in San Jose, said the security industry felt comfortable attributing the NotPetya attacks to Russia “due to similarities of the NotPetya attack to prior attacks from Russia targeting the Ukraine.”


“Russia has engaged in what the Pentagon calls ‘hybrid warfare’ against […] Ukraine, with three previously known attacks against the Ukrainian voting system and power grid dating back to 2014. With the CIA confirmation, NotPetya now looks like another attack in a succession of state-sponsored attacks,” Morales told SearchSecurity. “The bigger concern here for the U.S. is that we believe Russia is practicing and honing their craft against […] Ukraine, where they face little opposition from global powers. Cyberspace is the next major battle ground between major nation-states. Russia is arming themselves with cyber weapons that could be used against us or any other state as Russia would deem necessary in a bigger attack campaign. The irony of this attack is that it leveraged exploits developed by the NSA in their pursuit of weaponizing cyber space.”

Kaspersky sheds more light on Equation Group malware detection

Kaspersky Lab published a lengthy report that shed further light on its discovery of Equation Group malware and its possession of classified U.S. government materials.

The antivirus company, which has been under intense scrutiny by government officials and lawmakers this year, disclosed that classified materials were transmitted to Kaspersky’s network between September 11, 2014 and November 17, 2014. In a previous explanation, the company said Kaspersky antivirus software detected malware on a computer located in the greater Baltimore area. Kaspersky later discovered a 7zip archive on the computer that had Equation Group malware and other materials with U.S. government classified markings.

Kaspersky’s new investigation details were issued in response to several media reports that claimed Russian state-sponsored hackers used Kaspersky’s antivirus software to identify and locate U.S. government data. The reports claimed that in 2015 an NSA contractor’s system was compromised by Russian hackers using Kaspersky antivirus scans, which led to a massive leak of confidential NSA files and Equation Group malware. The news reports also claimed Israeli intelligence penetrated Kaspersky’s network in 2014 and found classified NSA materials on its network.

The Equation Group was an APT group that was first identified by Kaspersky researchers in 2015 and later linked to the U.S. National Security Agency (NSA) in 2016 following disclosures by the hacking group known as the Shadow Brokers.

New details in Kaspersky’s investigation

Thursday’s report provided new details about the computer with Equation Group malware, which was believed to be the NSA contractor’s system. Kaspersky did not confirm or deny these reports, saying its software anonymizes users’ information and divulging details about the specific user in this case would violate its ethical and privacy standards.

The Kaspersky investigation revealed the suspected NSA contractor’s computer was “compromised by a malicious actor on October 4, 2014” as a result of a backdoor Trojan known as Smoke Loader or Smoke Bot. The compromise occurred during the nearly two-month span, from Sept. 11 to Nov. 17, 2014, in which Kaspersky’s software was detecting and scanning the computer.

Kaspersky said it believes the user turned Kaspersky’s antivirus software off at some point during that time frame in order to install a pirated version of Microsoft Office, which allowed Smoke Loader to activate. The report also noted Smoke Loader was attributed to a Russian hacker in 2011 and was known to be distributed on Russian hacker forums.

Kaspersky said once the classified markings were discovered in the 7zip archive materials, all data except the malware binaries was deleted under order of CEO Eugene Kaspersky. The company also said it “found no indication the information ever left our corporate networks.”

Kaspersky’s report appeared to suggest the threat actors who reportedly found the classified NSA data and Equation Group malware likely did so by hacking the computer directly with Smoke Loader and not, as media reports claimed, by hacking into Kaspersky’s network and abusing the company’s antivirus technology.

The company also said it’s possible the computer had other malware on it that Kaspersky didn’t detect.

“Given that system owner’s potential clearance level, the user could have been a prime target of nation states,” the report stated. “Adding the user’s apparent need for cracked versions of Windows and Office, poor security practices, and improper handling of what appeared to be classified materials, it is possible that the user could have leaked information to many hands. What we are certain about is that any non-malware data that we received based on passive consent of the user was deleted from our storage.”

Thursday’s report followed comments from Jeanette Manfra, assistant secretary for cybersecurity and communications at the U.S. Department of Homeland Security, who told the House Science, Space and Technology Oversight Subcommittee earlier this week that there was no conclusive evidence that Kaspersky software had been exploited to breach government systems.

Policy changes

The report also contained new information about how Kaspersky responded to the 2014 Equation Group malware discovery and the company policy changes that followed.

“The reason we deleted those files and will delete similar ones in the future is two-fold; We don’t need anything other than malware binaries to improve protection of our customers and secondly, because of concerns regarding the handling of potential classified materials,” the report states. “Assuming that the markings were real, such information cannot and will not [be] consumed even to produce detection signatures based on descriptions.”

Kaspersky said that those concerns led to the adoption of a new policy for the company that requires all analysts to “delete any potential classified materials that have been accidentally collected during anti-malware research or received from a third party.”

The report didn’t say whether or not Kaspersky ever notified the NSA or other government agencies about the Equation Group malware it discovered or the classified data contained in the 7zip archive. In a previous statement on the situation, the company stated, “As a routine procedure, Kaspersky Lab has been informing the relevant U.S. government institutions about active APT infections in the USA.” It’s also unclear why, after finding the classified U.S. government files, the company never disclosed Equation Group was connected to the NSA.

Kaspersky has not responded to requests for comment on these questions.

The company responded to media reports that claimed threat actors used Kaspersky antivirus scans to hunt for classified markings.

“We have done a thorough search for keywords and classification markings in our signature databases,” Kaspersky said. “The result was negative: we never created any signatures on known classification markings.”

Kaspersky did, however, acknowledge that a malware analyst created a signature based on the discovery of the TeamSpy malware in 2013 that used a wildcard string pattern built around several keywords, including “secret.” The company hypothesized that a third party may have either misinterpreted the malware signature or maliciously used it against Kaspersky to spread false allegations.
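For readers unfamiliar with how a keyword-based signature can misfire, the toy Python sketch below shows how a wildcard rule built around words like “secret” matches far more than malware. The pattern and file names are invented; Kaspersky has not published the actual signature:

```python
import re

# Toy wildcard signature: flags file names containing certain keywords.
# Loosely modeled on the described TeamSpy-era rule; the pattern is invented.
SIGNATURE = re.compile(r"secret", re.IGNORECASE)

file_names = [
    "report_secret_codes.tmp",    # invented malware artifact a rule might target
    "company_secret_plans.docx",  # an innocent document the same rule would flag
    "holiday_photos.zip",         # unflagged
]

for name in file_names:
    verdict = "MATCH" if SIGNATURE.search(name) else "clean"
    print(f"{name}: {verdict}")
```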

Three years in a row – Microsoft is a leader in the ODBMS Magic Quadrant

We’re happy to report that Gartner has positioned Microsoft in the Leaders Quadrant in the 2017 Magic Quadrant for Operational Database Management Systems again this year. This is the third year that Microsoft has been positioned farthest in completeness of vision and ability to execute in the operational database management systems market.

At Microsoft, we’re dedicated to helping both enterprises and individuals realize their full potential. Our industry position in Operational DBMS is due to the unequaled capabilities of SQL Server.

The release of SQL Server 2017 brings the power of SQL Server to Windows, Linux, and Docker containers for the first time ever. Developers are able to build intelligent applications using preferred languages and environments, while enjoying in-memory performance across workloads, mission-critical high availability, and in-database advanced analytics. You can develop once and deploy anywhere in a consistent experience across your datacenter and public cloud.
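As one illustration of that cross-platform story, the sketch below connects to a SQL Server 2017 instance from Python using the pyodbc driver. The server address, credentials and query are placeholders; this is our example, not Microsoft’s:

```python
import pyodbc  # pip install pyodbc; also requires Microsoft's ODBC driver

# Placeholder server, credentials and database: the same code talks to SQL
# Server 2017 whether it runs on Windows, Linux or in a Docker container.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost,1433;"
    "DATABASE=master;"
    "UID=sa;PWD=YourStrong!Passw0rd"
)

cursor = conn.cursor()
cursor.execute("SELECT @@VERSION")
print(cursor.fetchone()[0])  # e.g., "Microsoft SQL Server 2017 ... on Linux"
conn.close()
```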

SQL Server proves itself, year over year, to be the least vulnerable DBMS in the industry. Built for security from the ground up, SQL Server offers customers a layered protection approach that incorporates encryption, authentication, and monitoring and auditing at the disk, database, and application levels. Innovative security technologies like Always Encrypted, for encryption at rest and in motion, help transform global operations for the better.

Perhaps most noteworthy for organizations living in the real world of cost-benefit analyses, SQL Server 2017 remains one of the most cost-competitive DBMS offerings in the enterprise space. In fact, you can get all the robust business capabilities in SQL Server 2017 built-in to a single product SKU, without expensive add-ons — for one great, low total cost of ownership.

But don’t just take our word for it. We encourage you to take the time to read the full Gartner report.

And then take a moment to see how you can get free licenses when you migrate to SQL Server 2017. We’re confident you’ll find the industry-leading database you know and love — now across operating systems and application platforms, on-premises and in the cloud.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The complete Gartner document is available now.