Customers with multiple Amazon accounts now have a way to manage backup policies for all of them.
Druva is adding a global policy management tool to its CloudRanger software, alongside other AWS backup features. Originally, CloudRanger allowed backup policies to be set only within individual accounts. The update allows users to create backup policies first, then select or exclude the Amazon accounts to apply them to.
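The create-first, then-scope approach can be sketched in a few lines. This is an illustrative model only, not Druva's actual API: the `BackupPolicy` class, its field names and the account IDs are all invented to show the include/exclude idea.

```python
# Illustrative sketch only -- not Druva's actual API. It models defining a
# backup policy once, then resolving which AWS accounts it applies to via
# include/exclude lists (an empty include list means "all accounts").
from dataclasses import dataclass, field

@dataclass
class BackupPolicy:
    name: str
    include: set = field(default_factory=set)  # empty = apply to all accounts
    exclude: set = field(default_factory=set)

    def resolve(self, all_accounts):
        """Return the sorted list of accounts this policy applies to."""
        selected = self.include if self.include else set(all_accounts)
        return sorted(selected - self.exclude)

accounts = ["111111111111", "222222222222", "333333333333"]
policy = BackupPolicy(name="daily-ebs", exclude={"222222222222"})
print(policy.resolve(accounts))  # every account except the excluded one
```

The point of the design is that the policy, not the account, is the unit of management: adding a thousand new accounts requires no per-account configuration.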
Druva’s vice president of product David Gildea said there has been an increase in the number of enterprises that hold multiple accounts. He said Druva designed the new CloudRanger feature around the idea that customers have thousands of accounts and multiple resources, and it gives the customer a “broad stroke” approach to backup policy management.
“[Amazon] S3 is one of the biggest and most important data sources in the world now,” Gildea said, highlighting the need to protect and manage the data within it.
S3 backup is one of Druva’s key new AWS backup features. Customers can back up S3 snapshots across regions, protecting them from a regional outage. In addition, users can move EBS snapshots to S3 storage, including Glacier and Glacier Deep Archive tiers for greater cost efficiency.
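The cost-tiering mechanism underneath such a feature is S3's own lifecycle configuration, which transitions objects to the Glacier storage classes after a set age. The sketch below builds a standard lifecycle rule document; the bucket prefix and day thresholds are illustrative, and this shows the general AWS mechanism rather than Druva's implementation.

```python
# A minimal sketch of the underlying AWS mechanism: an S3 lifecycle rule
# that transitions objects (for example, exported snapshot data under a
# "snapshots/" prefix) to cheaper Glacier tiers as they age. The prefix
# and day thresholds here are illustrative choices.
import json

lifecycle_config = {
    "Rules": [{
        "ID": "tier-snapshots",
        "Status": "Enabled",
        "Filter": {"Prefix": "snapshots/"},
        "Transitions": [
            {"Days": 30, "StorageClass": "GLACIER"},        # after 30 days
            {"Days": 180, "StorageClass": "DEEP_ARCHIVE"},  # after 180 days
        ],
    }]
}

# With boto3, this document would be applied to a bucket via
# put_bucket_lifecycle_configuration (requires AWS credentials, so it is
# left as a comment here):
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-backup-bucket", LifecycleConfiguration=lifecycle_config)
print(json.dumps(lifecycle_config, indent=2))
```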
Druva CloudRanger is a management tool for AWS workloads and automated disaster recovery. The full list of Amazon workloads CloudRanger protects now includes EC2, EBS, S3, RDS, Redshift, DocumentDB and Neptune. Along with AWS backup, Druva also has products for on-premises data center, endpoint and SaaS application protection.
Druva is not alone in the AWS backup space. Clumio recently extended its backup as a service to support EBS, and Veeam recently launched a cloud-native EC2 protection product in AWS Marketplace.
Druva’s new AWS backup capabilities are available immediately to early access customers and are expected to become generally available in the first quarter of 2020.
Gildea said customers who have built apps on Amazon and use them at a large scale have a large amount of off-premises data that may not be under the protection of a business’s traditional backup. Druva’s AWS backup saves these customers the trouble of scripting and developing custom backup, which typically does not scale and needs to be continually maintained with every Amazon update.
There is a growing adoption of hybrid cloud infrastructure, said Steven Hill, a senior analyst at 451 Research. Backup vendors have products for protecting on-premises workloads, as well as offerings for the cloud. However, Hill said the challenge for vendors is eliminating the complexity that comes with managing separate environments, one of which is off premises.
Hill said as businesses push more critical workloads to the cloud, the cost of backup will be minor compared to the potential loss of data. He said some businesses have to learn this the hard way through a data loss incident before they buy in.
“Data protection is a bit like buying insurance — it’s optional,” Hill said.
Hill said over time, businesses will learn that cloud workloads need the same quality of backup and business continuity and disaster recovery (BC/DR) as on-premises environments. However, monitoring off-premises systems is an additional challenge. Therefore, he believes the future of BC/DR will lie in automation and flexibility through policy-based management regardless if an environment is on or off premises.
Similar to many software companies, CyberArk Software Ltd. has policies and practices that appeal to people with skills in high demand. They include a social responsibility policy and catered lunches. The information security software firm also has something else that appeals to younger employees — an employee activism effort that brought about some real change.
Lex Register, an associate in corporate development and strategy at CyberArk, was hired in 2018. Soon after, he saw gaps in the firm’s environmental sustainability practices. The firm wasn’t, for instance, collecting food scraps for composting.
“If you’ve never composted before, the idea of leaving food out in your office can be sort of a sticky subject,” said Register, who has a strong interest in environmental issues.
Register approached his managers at CyberArk’s U.S. headquarters in Newton, Mass., about improving its environmental sustainability. He had some specific ideas and wanted to put together an employee team to work on them. Management gave its approval and a budget.
Register helped organize a “green team,” which now makes up about 25% of the firm’s Newton office staff of 200. The firm’s global workforce is about 1,200.
CyberArk’s green team has four subgroups: transportation, energy, community and “green” habits in the office. It also has a management steering committee. Collectively, these groups undertake a variety of actions, ranging from volunteering on community projects and improving environmental practices in the office to bigger efforts, such as installing electric vehicle charging stations for the office building.
“When I think about the companies I want to work for, I really want to have pride in everything they do,” Register said.
Junior employees lead the effort
The green team subgroups are headed by junior employees, according to Register, who is 28.
“It’s a way for a lot of our junior employees who don’t necessarily have responsibility for managing people to sort of step up,” Register said. They “can run some of their own projects and show some leadership capabilities.”
Employee activism has become an increasingly public issue in the last 12 months. In May, for instance, thousands of Amazon employees signed a letter pressing the firm for action. In September, thousands walked out as part of the Global Climate Strike.
“This walkout is either a result of employees not feeling heard, or employees feeling heard but fundamentally disagreeing with their leaders,” said Henry Albrecht, CEO at Limeade Inc., which makes employee experience systems. “The first problem has a simple fix: listen to employees, regularly, intentionally and with empathy,” he said.
Some companies, such as Ford Motor Co., are using HR tools to listen to their employees and get more frequent feedback. In an interview with SearchHRSoftware, a Ford HR official said recently this kind of feedback encouraged the firm to join California in seeking emission standards that are stricter than those sought by President Trump’s administration.
But employee activism that leads to public protest doesn’t tell the full employee activism story.
Interest in green teams rising
The Green Business Bureau provides education, assessment tools and processes that firms can use to measure their sustainability practices. Bill Zujewski, CMO at the bureau, said that over the past nine months it has been hearing more about the formation of sustainability committees at firms. The employees leading the efforts are “almost always someone who’s a few years out of school,” he said.
HR managers, responding to “employee-driven” green initiatives, are often the ones Zujewski hears from.
Maggie Okponobi, funding coordination manager at School Specialty Inc., is one of the Green Business Bureau’s clients. Her employer is an educational services and products firm based in Greenville, Wisc. Her job is to help schools secure federal and state grants.
Okponobi is in an MBA program that has an emphasis on sustainability. As a final project, she proposed bringing a green certification to her company. The assessments evaluate a firm’s sustainability activities against best environmental practices.
Okponobi explained what she wanted to do to one of the executives. She got support and began her research, starting with an investigation of certification programs. She decided on Green Business Bureau assessments, as did CyberArk.
Company managers at School Specialty had been taking ad hoc steps all along to improve sustainability. Efforts included installing LED lighting, reducing paper usage by printing on both sides, and recycling, Okponobi said.
Okponobi collected data about the environmental practices for certification. The firm discovered it was eligible for gold level certification, one step below the highest level, platinum.
The results were brought to an executive group, which included members from HR as well as marketing. Executives saw value in the ranking, and Okponobi believes it will help with recruiting efforts, especially with younger candidates. The company plans to create a green team to coordinate the sustainability efforts.
HR benefits from sustainability
Sustainability may help with retention, especially with younger workers, Okponobi said. “It gives them something exciting, positive to do in their workplace, and a goal to work toward,” she said.
Some employees are coming to workplaces with training on sustainability issues. One group that provides that kind of training is Manomet Inc., a 50-year-old science-based non-profit in Plymouth, Mass.
“We can’t make the progress that we need on climate change and other issues without the for-profit sector,” said Lora Babb, program manager of sustainable economies at Manomet.
The nonprofit takes about 20 undergrad college students each year, usually enrolled in majors that often have a sustainability component, and gives them “real world skills” to meet with businesses and conduct assessments. The training enables future employees to “make changes from the inside,” and understand practical, applied sustainability, Babb said.
This is not strictly an environmental assessment. The students also ask businesses about economic and social issues, including a workforce assessment that considers employee benefits, engagement and talent development, Babb said.
A business with a strong environmental mission is “going to be far less effective at carrying out that mission if you are having constant workforce challenges,” Babb said.
And the results of such efforts can have an effect on culture. CyberArk’s employees have embraced composting, Register said. The company hired a firm that picks up food scraps about twice a week, processes them and makes compost — what master gardeners often refer to as black gold — available for employees to use in their home gardens.
The results make employee composting efforts “very tangible for them,” Register said.
Arista Networks has added to its CloudVision management console the ability to apply security policies across virtualized switching fabrics running on Amazon Web Services, Google Cloud and Microsoft Azure.
Arista also introduced this week an integration between Arista CloudVision and NSX, VMware’s software for provisioning virtualized networks. The combination lets engineers take security policies created in NSX and apply them to Arista switches running in the data center.
The latest features come about a year after Arista introduced a virtualized version of its network operating system, called vEOS, for AWS, Google and Azure. At the time, Arista added some vEOS controls to CloudVision, which competes with Cisco CloudCenter.
The new multi-cloud feature within Arista CloudVision lets engineers modify the access control lists (ACLs) in vEOS switches, said Jeff Raymond, vice president of EOS product management. The capability, which the vendor calls Zone Segmentation Security, eliminates having to worry about the unique security mechanisms in each of the three public clouds.
Companies often create virtual networks in the public clouds to deliver security, load balancing and other services to applications. Amazon and Google call the networks Virtual Private Clouds (VPCs) while Microsoft refers to them as virtual networks (VNet).
Arista has integrated its Zone Segmentation feature with Zscaler’s cloud-based web gateway. The integration lets companies use Zscaler to apply security policies for traffic heading from a campus network or remote office to the cloud provider. Arista CloudVision applies policies to traffic flowing between and within virtual networks.
Overall, Arista is using CloudVision to address a trend toward more collaboration between corporate networking and security teams, said Shamus McGillicuddy, an analyst at Enterprise Management Associates, based in Boulder, Colo. A recent EMA survey found that 91% of security and network infrastructure teams were working together using shared or integrated tools.
The latest Arista offerings also show the vendor recognizes its customers need security that stretches from the private data center to the public cloud, said Bob Laliberte, an analyst at Enterprise Strategy Group, based in Milford, Mass. “Building out a strong security ecosystem will be critical, and delivering a capable management platform for hybrid cloud environments will be important for its customers to effectively manage those hybrid environments.”
VMware NSX integration with Arista CloudVision
The NSX integration bridges the gap between VMware virtual networks and Arista physical switches in the data center. With CloudVision, engineers will be able to take security policies created for NSX environments and apply them to workloads running on the hardware.
NSX policies define the network resources accessible to groups of workloads and applications running on the virtual network. CloudVision applies those policies to an Arista fabric by converting them into a format that can become a part of the switch’s ACL.
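The translation step described above can be illustrated with a small sketch. Everything here is hypothetical: the policy fields and the ACL line format are invented for illustration and do not reflect CloudVision's or NSX's actual data models; the sketch only shows the general idea of flattening a group-level policy into per-address ACL entries.

```python
# Hypothetical illustration of the conversion concept: a high-level
# policy (which workloads may reach which service) flattened into
# switch-style ACL entries, one per source/destination pair. The field
# names and ACL syntax are invented, not CloudVision's real format.
def policy_to_acl(policy):
    """Expand a group-level policy dict into individual ACL strings."""
    action = "permit" if policy["allow"] else "deny"
    return [
        f"{action} {policy['protocol']} {src} {dst} eq {policy['port']}"
        for src in policy["source_ips"]
        for dst in policy["dest_ips"]
    ]

# Example: two web servers allowed to reach one database on TCP 3306.
web_to_db = {
    "allow": True, "protocol": "tcp", "port": 3306,
    "source_ips": ["10.0.1.10", "10.0.1.11"],
    "dest_ips": ["10.0.2.20"],
}
for line in policy_to_acl(web_to_db):
    print(line)
```

The practical payoff is the one Raymond describes: the policy is authored once, at the group level, and the per-device expansion is mechanical.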
As a result, engineers can save time by using just NSX for creating security policies, according to Raymond.
New hardware-based encryption in Arista routers
Finally, Arista plans to release four routers with built-in support for encryption standards. For the enterprise WAN, Arista embedded hardware-based IPSec in the 7020SRG for site-to-site virtual private networks. The router is a 10 GbE platform.
For the data center interconnect, Arista will provide MACsec encryption in the new 7280CR2M and the 7280SRAM. Both routers offer wire-speed encryption with 10 GbE and 100 GbE for up to 100 kilometers. For MACsec encryption up to 2,500 km, Arista introduced the 7280SRM, which has 200 GbE Coherent interfaces for metro and long-haul links.
Arista plans to release all the new technology by the end of September.
Arista sells its products primarily to tier-one and tier-two service providers, financial institutions and high-tech companies, including Microsoft, Amazon and Facebook.
Recently, however, the company has aimed some new hardware at enterprises with more mainstream data centers. In May, for example, the company introduced switches for the campus LAN.
Each year, more and more governments are developing policies to address security challenges presented by an increasingly digitized world. And to support those efforts, I’m excited today to announce the release of Microsoft’s new Cybersecurity Policy Framework, a resource for policymakers that provides an overview of the building blocks of effective cybersecurity policies and that is aligned with the best practices from around the globe. Nations coming online today, and building their cybersecurity infrastructures, should not—and need not—be burdened with the stumbling blocks that characterized previous generations of cybersecurity policies. Instead, such nations should be empowered to leapfrog outdated challenges and unnecessary hurdles.
For years, Microsoft has worked with policymakers in advanced and emerging economies, and across many social and political contexts, to support the development of policies to address a wide range of cybersecurity challenges. This new publication captures and distills the important lessons learned from those years of experience partnering with governments. And as increasing numbers of countries wrestle with how to best address cybersecurity challenges, the Cybersecurity Policy Framework is an indispensable resource for the policymakers joining this work.
According to the latest analysis provided by the United Nations, half of the countries on earth today either have or are developing national cybersecurity strategies. I have little doubt that in the next decade every remaining country will add its name to that list. And this trend highlights the importance of this new resource. The policies established today will impact how technologies are used for years to come and how safe or dangerous the online world becomes for all of us. Truly, there is no going back, only forward.
The Cybersecurity Policy Framework is not one-stop shopping for cybersecurity policymakers, but it does serve as an important “umbrella document,” providing a high-level overview of concepts and priorities that must be top of mind when developing an effective and resilient cybersecurity policy environment.
Specifically, this new resource outlines:
National strategies for cybersecurity.
How to establish a national cyber agency.
How to develop and update cybercrime laws.
How to develop and update critical infrastructure protections.
International strategies for cybersecurity.
We at Microsoft have been at this work for a long time and have developed a wide variety of resources to help those who are working to position their industries and nations to capitalize on the benefits of new technologies — so many that they can often be difficult to find! And this highlights another strength of the Cybersecurity Policy Framework: while it is not one-stop shopping, each section provides an overview of a critical policy topic as well as links to the associated, more in-depth resources my team has developed over the years to assist policymakers. In this way, this new resource serves not only as essential, high-level guidance, but also as a key to a broader catalogue of resources built on years of experience partnering with governments around the world.
Reading through this new resource, I am proud of the work we have done in pursuit of a safer online world. Important progress has been made, and these foundational principles underscore much of today’s cybersecurity discourse. However, we have — and will always have — more work to do, given the changes and innovations in technology always on the horizon and their implications for cybersecurity. I’m glad to put this resource forward today to support a new generation of policymakers and also look forward to partnering with them to tackle the new challenges we will face together tomorrow.
In an effort to be more transparent with customers, Microsoft is clarifying patch management policies that experts said have been generally understood, but never properly codified.
Alongside the June 2018 Patch Tuesday release, Microsoft published the Security Servicing Commitment, which it hopes will help customers understand whether a reported vulnerability will be addressed during the monthly patch cycle or in the next version of a product.
In order to make this determination, Microsoft has specified two key criteria for immediate security patching: whether the vulnerability is severe enough and whether it “violate[s] a promise made by a security boundary or a security feature that Microsoft has committed to defending.”
“If the answer to both questions is yes, then the vulnerability will be addressed through a security update that applies to all affected and supported offerings,” Microsoft wrote in the Security Servicing Commitment. “If the answer to either question is no, then by default the vulnerability will be considered for the next version or release of an offering but will not be addressed through a security update, though in some cases an exception may be made.”
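The two-question test quoted above amounts to a simple decision rule, sketched here as a function. This is an illustration of the published policy, not Microsoft code, and it omits the exception cases Microsoft reserves for itself.

```python
# An illustration of Microsoft's two-question servicing test (not
# Microsoft code). Both answers must be "yes" for a vulnerability to get
# an immediate security update; otherwise it is deferred by default.
def serviced_immediately(is_severe_enough: bool,
                         violates_boundary_or_feature: bool) -> str:
    """Return how a reported vulnerability is handled under the policy."""
    if is_severe_enough and violates_boundary_or_feature:
        return "security update for all affected, supported offerings"
    return "considered for the next version or release"

print(serviced_immediately(True, True))
print(serviced_immediately(True, False))  # severe, but no boundary violated
```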
The security boundaries described in the Security Servicing Commitment are the points of “logical separation between the code and data of security domains with different levels of trust,” including network boundaries, kernel boundary, virtual machine boundary and more. Security features include Windows Defender, BitLocker and Windows Resource Access Controls.
However, Microsoft makes a distinction between these features and boundaries and defense-in-depth features, which it claims “may provide protection against a threat without making a promise.” These features include address space layout randomization, data execution prevention, user account control and more.
Codifying understood policy
Experts said there wasn’t really anything new in Microsoft’s Security Servicing Commitment, although the clarification was welcomed.
Dustin Childs, communications manager for Trend Micro’s Zero Day Initiative, said the policy description was less of a change and more of a clarification.
“Some of this information was publicly available, but it wasn’t found in a consolidated source with full details,” Childs wrote via email. “It’s hard to say why they chose to publish this now. Perhaps there has been an increase in submissions that don’t meet their servicing bar and have caused confusion with researchers.”
Chris Goettl, director of product management for security for Ivanti, based in South Jordan, Utah, said it was good “to see some clarity regarding severity of vulnerabilities to better understand how updates are classified” with the Security Servicing Commitment.
“Public and private disclosure of vulnerabilities can be a messy ordeal. I think this commitment provides the ethical hackers of the world with rules of engagement for disclosing bugs with Microsoft,” Goettl wrote via email. “Overall, I think it provides transparency to those who are committing their time so they know it will be worth the effort and are not disappointed or surprised by a response where Microsoft is not committing to provide a fix or a bounty.”
Allan Liska, threat intelligence analyst at Recorded Future, based in Somerville, Mass., said the Security Servicing Commitment was “spot on and laid out in a smart, strategic way.”
“Given Microsoft’s breadth and depth of products and constant commitment to security, this is a good approach on their part. What stood out, especially, was that they made the distinction between a potential exploitable security vulnerability versus a defense in-depth feature,” Liska wrote via email. “While there will always be people who question security moves a company as large and impactful as Microsoft makes, overall, this is a good step in the direction of transparency, and I think it should be applauded.”
Childs said the Security Servicing Commitment constituted “a pretty comprehensive list” of policies, but it could be better.
“Due to the complexities of modern code, it’s unlikely any list such as this could ever be 100% complete and cover every scenario,” Childs wrote. “While this level of transparency is good to see, it would be great if they also committed to fixing bugs — especially severe bugs — faster or committed to improving patch quality or communications.”
AWS has updated its security policies and defaults for Amazon S3 encryption to address a recurring problem for customers that are ill-prepared for the complexity of the service.
Amazon Simple Storage Service (S3) is one of the most popular services on AWS, but its ever-expanding ancillary security options on both client and server sides have led customers to misconfigure settings and expose their data to the public. The latest change by AWS, which makes encryption of objects the default setting for S3 buckets, could help mitigate some of those issues.
Several household-name companies, including Accenture, Verizon and WWE, were publicly shamed this year over leaky S3 buckets — exposed not because of malicious attacks, but through the efforts of security firms scanning for vulnerabilities. There’s no evidence data was stolen or copied in those cases, but bad actors likely would follow the same path to access corporate information stored on AWS.
One of the most attractive elements of S3 is its flexibility, with multiple configurations and connections to numerous AWS tools and services. But that variety introduces choices, and sometimes users unknowingly make the wrong ones.
A simple check box for S3 encryption would be an easy fix even for enterprises with hundreds of accounts and thousands of buckets, said Zohar Alon, CEO of Dome9, a cloud security company in Mountain View, Calif. But with so many ways to configure S3, users might not realize they’ve exposed their data.
“The 22-year-old developer will not take the time to read the manual of what do the five options mean, so we need to pre-position it,” Alon said. “We need to direct them to the right answer. We need to take check boxes away rather than add more.”
Encryption is one of several policy choices for users, and until now, those who wanted to encrypt everything had to explicitly reject non-encrypted objects. The new S3 encryption default will instead automatically encrypt all new objects.
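The reject-unencrypted approach that the default replaces is typically implemented as a bucket policy denying any `PutObject` request that lacks the `s3:x-amz-server-side-encryption` header. The sketch below builds that standard policy document; the bucket name is a placeholder.

```python
# A sketch of the pre-default-encryption approach: a bucket policy that
# denies any upload missing the server-side-encryption header. The
# bucket name is a placeholder.
import json

def deny_unencrypted_policy(bucket):
    """Build a bucket policy document rejecting unencrypted PutObject calls."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyUnencryptedPuts",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
            # "Null": true matches requests where the header is absent.
            "Condition": {"Null": {"s3:x-amz-server-side-encryption": "true"}},
        }],
    }

print(json.dumps(deny_unencrypted_policy("example-bucket"), indent=2))
```

With default bucket encryption enabled, uploads without the header are encrypted rather than rejected, which removes this failure mode for users who never wrote such a policy.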
AWS was built to provide a set of tools for customers to choose how to develop their applications. In the case of encryption, Amazon has made a choice for them — and it’s the right one, because of the changing nature of workloads hosted on its platform, said Fernando Montenegro, an analyst at 451 Research.
“As these [workloads] became more critical, they recognize their customers are having additional demands,” he said. “As they add more workloads related to specific compliance regimes, they have to follow that and have the right level of encryption.”
S3 encryption is an important step because 90% of users defer to the default option, Alon said. This won’t solve every problem, however, especially as cloud workloads begin to sprawl across multiple platforms.
“There are many ways you can shoot yourself in the leg when storing data on [Microsoft] Azure just like on AWS, so it’s asking a lot to expect the security team to figure that out across an ever-growing footprint of cloud assets and subscriptions,” Alon said.
Go beyond S3 encryption
For the continued edification of AWS customers, buckets that are publicly accessible will carry a prominent indicator in the S3 console, new permission checks identify why a bucket is public, and additional information in inventory reports identifies the status of each object.
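The kind of check behind a "public" indicator can be sketched against the ACL structure S3's `GetBucketAcl` call returns: a bucket is publicly readable if any grant targets the global AllUsers group. The ACL below is hand-built example data, not fetched from AWS.

```python
# A sketch of the logic behind a "public bucket" indicator: scan a
# bucket ACL (shaped like S3's GetBucketAcl response) for grants to the
# global AllUsers group. The example ACL is hand-built sample data.
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def is_public(acl):
    """Return True if any grant in the ACL targets the AllUsers group."""
    return any(
        grant.get("Grantee", {}).get("URI") == ALL_USERS
        for grant in acl.get("Grants", [])
    )

example_acl = {
    "Grants": [
        {"Grantee": {"Type": "CanonicalUser", "ID": "owner-id"},
         "Permission": "FULL_CONTROL"},
        {"Grantee": {"Type": "Group", "URI": ALL_USERS},
         "Permission": "READ"},  # this grant makes the bucket public
    ]
}
print(is_public(example_acl))  # True
```

Bucket policies can also make data public, so a real check has to inspect those too; that is part of the complexity the console indicators are meant to absorb for users.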
S3 is a powerful service, but users often overlook the responsibilities that come along with that, Montenegro said. He’s particularly high on the permission checks and inventory reports because they can help address the knowledge gap.
“As more people begin to use this they have a clearer picture of what you’re doing might have unintended consequences,” he said.
This isn’t Amazon’s first response to this problem. In the past six months it added new Config rules and emailed customers to caution them to take note of their publicly accessible assets. Amazon Macie, a service introduced over the summer, incorporates machine learning to track S3 usage and identify anomalies. Other recent AWS updates include more control over access management when replicating to a separate destination account, and the ability to replicate encrypted data that uses AWS Key Management Service across regions.
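The KMS cross-region replication mentioned above is configured through the S3 replication rule itself: the rule opts in SSE-KMS objects and names a key in the destination region to re-encrypt replicas. The sketch below builds one such rule document; the bucket and key ARNs are placeholders.

```python
# A sketch of the replication-rule shape that enables copying
# SSE-KMS-encrypted objects across regions. Both ARNs are placeholders;
# the replica key must live in the destination bucket's region.
import json

replication_rule = {
    "ID": "replicate-kms-objects",
    "Status": "Enabled",
    "Prefix": "",  # apply to the whole bucket
    "SourceSelectionCriteria": {
        # Opt in objects encrypted with SSE-KMS (excluded by default).
        "SseKmsEncryptedObjects": {"Status": "Enabled"}
    },
    "Destination": {
        "Bucket": "arn:aws:s3:::replica-bucket",
        "EncryptionConfiguration": {
            # Key used to re-encrypt replicas in the destination region.
            "ReplicaKmsKeyID": "arn:aws:kms:us-west-2:111122223333:key/EXAMPLE"
        },
    },
}
print(json.dumps(replication_rule, indent=2))
```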
Trevor Jones is a senior news writer with SearchCloudComputing and SearchAWS. Contact him at [email protected].