
Scale-out Qumulo NAS qualifies latest Dell EMC PowerEdge servers

Qumulo today added a hardware option for its customers by qualifying its scale-out NAS software to run on Dell Technologies’ PowerEdge servers. That leaves open the possibility that Qumulo will gain customers on Dell EMC servers at the expense of Dell EMC Isilon’s clustered NAS platform.

Qumulo is nearly two years into an OEM deal with Dell EMC archrival Hewlett Packard Enterprise, which rebrands and sells Qumulo’s scale-out NAS software on its servers. There is no joint go-to-market agreement between Qumulo and Dell EMC, which is a NAS market leader. The qualification simply means customers can purchase PowerEdge hardware from their preferred Dell EMC resellers and install Qumulo NAS software on the box.

Dell qualified Qumulo NAS software to run on dual-socket 2U Dell EMC PowerEdge R740xd servers.

“There are a lot of customers who build private clouds on Dell hardware. We’re now in a position where they can choose our software to build their computing,” Qumulo chief marketing officer Peter Zaballos said.

Dell EMC’s 14th-generation PowerEdge servers are equipped with about 20% more NVMe flash capacity than R730 models. One use case Dell EMC cites is the ability to use a single PowerEdge 14G node to power its IsilonSD Edge virtual NAS software, which competes with Qumulo storage.

Will Qumulo on PowerEdge compete with Dell EMC Isilon NAS?

The Qumulo File Fabric (QF2) file system scales to support billions of files and hundreds of petabytes. QF2 is available on Qumulo C-Series hybrid arrays and all-flash P-Series arrays, or preinstalled on HPE Apollo servers. Customers may also run it as an Amazon Elastic Compute Cloud instance to burst and replicate in AWS.

Qumulo NAS gear is sold mostly to companies in media and entertainment and other sectors with large amounts of unstructured data.

Zaballos said QF2 on PowerEdge isn’t a direct attempt to displace Isilon. The goal is to give Dell EMC shops greater flexibility, he said.

“We’re looking to build the biggest footprint in the market. Between Dell and HPE, that’s about 40% of the server market for data centers,” Zaballos said.

Qumulo competes mainly with Isilon and NetApp’s NAS products and has won customers away from Isilon. Pressure on traditional NAS vendors is also coming from several file system-based cloud startups, including Elastifile, Quobyte, Stratoscale and WekaIO.

Qumulo founders Peter Godman, Aaron Passey and Neal Fachan helped develop the Isilon OneFS clustered file system, which paved the way for Isilon’s initial public offering in 2006. EMC bought Isilon for $2.25 billion in 2010, and EMC itself was acquired when the Dell-EMC merger closed in 2016.

Qumulo CEO Bill Richter was president of the EMC Isilon division for three years. He joined Qumulo in 2016.

Greg Schulz, an analyst with Server StorageIO, based in Stillwater, Minn., likened the Qumulo-PowerEdge configuration to Dell EMC’s “co-opetition” OEM agreement with hyper-converged vendor Nutanix.

“Qumulo NAS has been focused on high-performance, big-bandwidth file serving, which may not play well in environments that have many smaller files and mixed workloads. That’s an area Isilon has adapted to over the years. The other obstacle is getting [beyond] large elephant-hunt deals into broader markets, and getting traction with Dell servers can help them fill gaps in their portfolio,” Schulz said.

Ron Pugh, vice president for Dell EMC OEM sales in North America, said it’s not unusual for potential competitors to rely on Dell hardware products.

“If you look deeply inside the Dell Technologies portfolio, some of our customers can be considered competitors. Our OEM program is here to be a building block for our customers, not to build competing products,” Pugh said.

Dell EMC also sells Elastifile cloud-based NAS on its servers and is an Elastifile strategic investor.

Qumulo: AI tests on P-Series flash

Qumulo this week also previewed upcoming AI enhancements to its P-Series to enable faster prefetching of application data in RAM. Those enhancements are due to roll out in September. Grant Gumina, a Qumulo senior product manager, said the initial AI enhancements will improve performance of the all-flash P-Series. Proofs of concept are underway with media customers, Gumina said.

“A lot of studios are using SANs to power primarily file-based workloads in each playback bay. The performance features in QF2 effectively means they can install a NAS for the first time and move to a fully Ethernet-based environment,” Gumina said.

File storage vendors NetApp and Pure Storage recently added flash systems built for AI, incorporating Nvidia hardware.

Revenue ops main theme at Ramp by InsightSquared conference

Customers, potential customers and partners of InsightSquared Inc. gathered in Boston for two days for Ramp 2018, the dashboard and reporting software vendor’s second annual conference. The Pipeline podcast was there to take in the conference festivities.

Revenue ops was among the main topics discussed at Ramp, with keynotes and conversations dedicated to the idea of bringing together marketing, sales and service departments to improve ROI and revenue.

To help companies with that objective, InsightSquared also unveiled a new set of marketing analytics tools that may help companies uncover insights within the marketing process, including marketing attribution, demand management, and planning and analysis.

“There’s a natural tension between sales and marketing,” said Matisha Ladiwala, GM of marketing analytics for InsightSquared, on the conference stage. Ladiwala ran through a demo of some of the tools’ capabilities before two InsightSquared customers spoke about using the marketing analytics tools.

One of Ladiwala’s demos showed a dashboard that united data from the sales and marketing departments and determined how quickly sales followed up on leads and how many leads were making it into the funnel. This revenue ops approach is beneficial to companies that have traditionally used a more manual, time-intensive approach to reporting, according to InsightSquared.

One InsightSquared user, Guido Bartolacci, manager of acquisition and strategy at New Breed, an inbound marketing and sales agency, told conference attendees: “Aggregating information from areas was very manual and time-consuming.”

By using InsightSquared’s new marketing analytics tools while in beta, the marketing and sales agency was able to pull together data from multiple sources quickly and with more insight, Bartolacci said.

Beyond discussing the revenue ops-focused conference, this Pipeline podcast also touches on some of the other speakers at Ramp, including Nate Silver, data scientist and founder of the FiveThirtyEight blog, and TrackMaven CEO Allen Gannett, who gave a lively, entertaining keynote on creativity.

Avaya earnings show cloud, recurring revenue growth

Avaya hit revenue targets, increased cloud sales and added customers in its second full quarter as a public company — welcome news for customers and partners anxious for proof that the company is regaining its financial footing following last year’s bankruptcy.

Avaya reported revenue of $755 million in the third quarter of 2018 — down from $757 million last quarter, but within the vendor’s previously announced targets. When excluding sales from the networking division, which Avaya sold last year, adjusted revenue was 1% higher than during the third quarter of 2017.

To keep pace with competitors like Microsoft and Cisco, Avaya is looking to reduce its dependence on large, one-time hardware purchases by selling more monthly cloud subscriptions. This transition can make it difficult to show positive quarter-over-quarter and year-over-year growth in the short term.

Recurring revenue accounted for 59% of Avaya’s adjusted earnings in the third quarter — up from 58% the previous quarter. Cloud revenue represented just 11% of the quarter’s total, but monthly recurring revenue from cloud sales increased by 43% in the midmarket and 107% in the enterprise market, compared with last quarter.

Avaya reported an $88 million net loss in the third quarter. Still, the company’s operations netted $83 million in cash, which is a more critical financial indicator, in this case, than net income, said Hamed Khorsand, analyst at BWS Financial Inc., based in Woodland Hills, Calif.

“This is a company that’s still in transition as far as their accounting goes, with the bankruptcy proceedings,” Khorsand said. “[The net cash flow] actually tells you that the company is adding cash to its balance sheet.”

Also during the third quarter, Avaya regained top ratings in Gartner’s yearly rankings of unified communications (UC) and contact center infrastructure vendors. Avaya’s one-year absence from the leadership quadrant in the Gartner report probably slowed growth, Khorsand said, because C-suite executives place value in those standings.

Avaya’s stock closed up 3.61%, at $20.68 per share, following the Avaya earnings report on Thursday.

Avaya earnings report highlights product growth

The Avaya earnings report showed the company added 1,700 customers worldwide during the third quarter. It also launched and refreshed several products, including an updated workforce optimization suite for contact centers and a new version of Avaya IP Office, its UC offering for small and midsize businesses.

The product releases demonstrate that Avaya continued to invest in research and development, even as it spent most of 2017 engaged in Chapter 11 bankruptcy proceedings, said Zeus Kerravala, principal analyst at ZK Research in Westminster, Mass.

“As long as we continue to see this steady stream of new products coming out, I think it should give customers confidence,” Kerravala said. “Channel partners tend to live on new products, as well.”

The bankruptcy allowed Avaya to cut its debt in half to a level it can afford based on current revenue. But years of underinvestment in product continue to haunt the vendor, as it tries to play catch-up with rivals Cisco and Microsoft, which analysts generally agreed have pulled ahead of all other vendors in the UC market.

Avaya acquired cloud contact center vendor Spoken Communications earlier this year, gaining a multi-tenant public cloud offering. Avaya plans to use the same technology to power a UC-as-a-service product in the future.

“We are investing significantly in people and technology, investing more on technology in the last two quarters than we did in all of fiscal 2017,” said Jim Chirico, CEO at Avaya.

Avaya expects to bring in adjusted revenue of between $760 million and $780 million in the fourth quarter, which would bring the fiscal year’s total to a little more than $3 billion.

LifeLock vulnerability exposed user email addresses to public

Symantec’s identity theft protection service, LifeLock, exposed millions of customers’ email addresses.

According to security journalist Brian Krebs, the LifeLock vulnerability was in the company’s website, and it enabled unauthorized third parties to collect email addresses associated with LifeLock user accounts or to unsubscribe users from company communications. Account numbers, called subscriber keys, appear in the URL of the unsubscribe page on the LifeLock website; each key corresponds to a customer record, and the keys appear to be sequential, according to Krebs. That lends itself to a simple script that collects the email address of every subscriber.
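
The underlying flaw, a sequential and guessable record identifier doing duty as an access token, has a standard remedy: key the unsubscribe URL on an unguessable value derived server-side rather than on the raw record ID. A minimal sketch in Python (the function names and key handling here are hypothetical illustrations, not LifeLock's actual fix):

```python
import hashlib
import hmac
import secrets

# Server-side secret (hypothetical; in production this would live in a
# secrets manager, not be generated at import time).
SECRET_KEY = secrets.token_bytes(32)

def unsubscribe_token(subscriber_key: int) -> str:
    """Derive an unguessable unsubscribe token from the internal
    subscriber key, so the URL exposes nothing enumerable."""
    msg = str(subscriber_key).encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

def verify_token(subscriber_key: int, token: str) -> bool:
    """Constant-time check that a presented token matches the key."""
    return hmac.compare_digest(unsubscribe_token(subscriber_key), token)
```

Because the token is an HMAC over the subscriber key with a server-held secret, neighboring keys produce unrelated tokens, and a script that simply increments the URL parameter learns nothing.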

The biggest threat with this LifeLock vulnerability is that attackers could launch a targeted phishing scheme — and the company boasted more than 4.5 million users as of January 2017.

“The upshot of this weakness is that cyber criminals could harvest the data and use it in targeted phishing campaigns that spoof LifeLock’s brand,” Krebs wrote. “Of course, phishers could spam the entire world looking for LifeLock customers without the aid of this flaw, but nevertheless the design of the company’s site suggests that whoever put it together lacked a basic understanding of web site authentication and security.”

Krebs notified Symantec of the LifeLock vulnerability, and the security company took the affected webpage offline shortly thereafter. Krebs said he was alerted to the issue by Atlanta-based independent security researcher Nathan Reese, a former LifeLock subscriber who received an email offering him a discount if he renewed his membership. Reese then wrote a proof of concept and was able to collect 70 email addresses — enough to prove the LifeLock vulnerability worked.

Reese emphasized to Krebs how easy it would be for a malicious actor to use the two things he knows about the LifeLock customers — their email addresses and the fact that they use an identity theft protection service — to create a “sharp spear” for a spear phishing campaign, particularly because LifeLock customers are already concerned about cybersecurity.

Symantec, which acquired the identity theft protection company in 2016, issued a statement after Krebs published his report on the LifeLock vulnerability:

This issue was not a vulnerability in the LifeLock member portal. The issue has been fixed and was limited to potential exposure of email addresses on a marketing page, managed by a third party, intended to allow recipients to unsubscribe from marketing emails. Based on our investigation, aside from the 70 email address accesses reported by the researcher, we have no indication at this time of any further suspicious activity on the marketing opt-out page.

LifeLock has faced problems in the past with customer data. In 2015, the company paid $100 million to the Federal Trade Commission to settle charges that it failed to secure customers’ personal data and ran deceptive advertising.

In other news:

  • The American Civil Liberties Union (ACLU) of Northern California said Amazon’s facial recognition program, Rekognition, falsely identified 28 members of Congress as people who were arrested for a crime in its recent test. The ACLU put together a database of 25,000 publicly available mugshots and ran the database against every current member of the House and Senate using the default Rekognition settings. The false matches included a disproportionate number of people of color — 40% of the false matches, while only 20% of Congress members are people of color — and spanned both Democrats and Republicans and men and women of all ages. One of the falsely identified individuals was Rep. John Lewis (D-Ga.), who is a member of the Congressional Black Caucus; Lewis previously wrote a letter to Amazon’s CEO, Jeff Bezos, expressing concern about the potential implications of the inaccuracy of Rekognition and how it could affect law enforcement and, particularly, people of color.
  • Researchers have discovered another Spectre vulnerability variant that enables attackers to access sensitive data. The new exploit, called SpectreRSB, was detailed by researchers at the University of California, Riverside, in a paper titled, “Spectre Returns! Speculation Attacks using the Return Stack Buffer.” “Rather than exploiting the branch predictor unit, SpectreRSB exploits the return stack buffer (RSB), a common predictor structure in modern CPUs used to predict return addresses,” the research team wrote. The RSB aspect of the exploit is what’s new, compared with Spectre and its other variants. It’s also why it is, so far, unfixed by any of the mitigations put in place by Intel, Google and others. The researchers tested SpectreRSB on Intel Haswell and Skylake processors and the SGX2 secure enclave in Core i7 Skylake chips.
  • Google Chrome implemented its new policy this week that any website not using HTTPS with a valid TLS certificate will be marked as “not secure.” In the latest version of the browser, Google Chrome version 68, users will see a warning message stating that the site is not secure. Google first announced the policy in February. “Chrome’s new interface will help users understand that all HTTP sites are not secure, and continue to move the web towards a secure HTTPS web by default,” Emily Schechter, Chrome Security product manager, wrote in the announcement. “HTTPS is easier and cheaper than ever before, and it unlocks both performance improvements and powerful new features that are too sensitive for HTTP.”
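
Chrome's check comes down to whether a host completes a TLS handshake with a certificate that validates against a trust store and matches the hostname. A rough way to test a site's readiness for the new label, sketched with Python's standard library (the function name is our own; this approximates, rather than reproduces, Chrome's validation):

```python
import socket
import ssl

def serves_valid_https(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Return True if `host` completes a TLS handshake with a certificate
    that chains to the system trust store and matches the hostname --
    roughly the condition Chrome 68 requires before it will omit the
    'Not secure' label."""
    context = ssl.create_default_context()  # verifies chain and hostname
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, OSError):
        return False
```

A plain-HTTP-only site, an expired certificate, or a hostname mismatch all surface here as a handshake failure and return False.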

How gamers with disabilities helped design the new Xbox Adaptive Controller’s elegantly accessible packaging

Romney said the experience has made him think about packaging differently.

“We have customers in our store every single day who buy product. I look at our laptop boxes and how they have to be opened. How many steps, how much packaging and how much of a barrier do each of those pieces become to someone with a mobility limitation?”

Romney thinks the Xbox Adaptive Controller packaging has the potential to set a new standard.

“I think it’s going to change how we look at things in the industry, in terms of how we make boxes. And I think it has to,” he said. “I think as a case study of inclusive design, the Xbox Adaptive Controller is going to make a brilliant example of how you do it, and how you include your audience and design with a population, rather than for a population.”

For Marshall and Weiser, the packaging project was challenging, time-consuming — and ultimately rewarding.

“It was a really powerful experience,” Marshall said. “I don’t think you realize, until you’re required to think differently, what you take for granted. As a designer, when you see things through a completely different lens, it’s paradigm-shifting.”

Said Weiser: “We put in a lot of extra time on it, but it was a pleasure to be able to work on this type of project. It’s great that we’re focused on this as a company.”

Discussions are underway about how Microsoft might apply lessons from the Xbox Adaptive Controller packaging elsewhere. Marshall hopes the deceptively simple-looking box can serve as a springboard for future efforts.

“It’s certainly changing how we’re looking at packaging. We’re excited about moving forward from this point with a new lens and looking at what we can do,” he said.

“We’re really excited to take this journey on.”

Middleware tools demand to peak in 2018, before iPaaS ascends

2018 is a big year for middleware technologies, as revenues peak and cloud alternatives woo customers away from legacy, on-premises middleware suites.

The maturation of cloud, IoT and digital development platforms overall will drive enterprise investments in middleware tools to an all-time high of $30 billion in 2018, according to Gartner. Although traditional on-premises middleware technologies will serve legacy applications for years, enterprise investments in integration platform as a service (iPaaS) and middleware as a service (MWaaS) will eclipse the traditional market from now on.

Cloud middleware tools are enterprise-ready, which is good news for businesses that require both cloud and on-premises systems to support customers’ and partners’ digital touchpoints and application development.

“Businesses today need flexible, consumable and agile integration capabilities that enable more people to get involved in delivering solutions,” said Neil Ward-Dutton, research director at MWD Advisors in Horsham, U.K.

End of the middleware suite era

Investment trends signal a shift in enterprise business application development and integration. Revenues for traditional on-premises, ESB-centric middleware suites from such vendors as IBM and Oracle achieved only single-digit growth in 2016 and 2017, according to Gartner.

Meanwhile, spending is increasing for hybrid middleware tools from vendors such as Neosoft, Red Hat, WSO2 and Talend. The iPaaS market exceeded $1 billion for the first time in 2017 and grew by over 60% in 2016 and 72% in 2017, said Saurabh Sharma, principal analyst at Ovum, a London-based IT research firm. Common characteristics of these middleware technologies include an open source base, API integration, loosely coupled architecture, and subscription model.

Businesses with on-premises, legacy middleware suites run these applications and do a lot of integration in their own data centers. However, many enterprises are migrating to a hybrid cloud environment and require both on-premises and cloud integration, said Elizabeth Golluscio, a Gartner analyst. Newer vendors replace traditional ESBs with lightweight, open source service buses, largely encompassed in iPaaS offerings.

Costs, flexibility drive iPaaS adoption

Businesses’ shift away from traditional, on-premises middleware technologies is driven by lower costs, an inundation of new technologies, new digital business integration requirements, and faster time-to-integration particularly for SaaS applications. “Cloud integration platforms are now mature enough to deliver these flexible and speedy integration capabilities,” Sharma said.

The cloud middleware subscription fee model aligns enterprises’ costs with usage and return. “The era of perpetual licensing of expensive, on-premises integration middleware on a per-server basis is fading fast,” Ward-Dutton said.

In the past few years, businesses were inundated with new technologies like AI and IoT, as well as shifts in enterprise architecture and an explosion of customer touchpoints. “To consolidate the middleware tools required to support and glue all of these things together, enterprise IT has to evaluate cloud middleware,” Golluscio said.

Don’t muddle through middleware modernization

In evaluations of cloud middleware tools, don’t give in to the lure of one-size-fits-all products, Ward-Dutton said. First, get buy-in for hybrid projects from the IT team. In a business with a long history of enterprise middleware use, specialists may fiercely protect their roles. IT organizations with mature DevOps teams have an advantage because new integration models enable more open collaboration across different roles and require smooth teamwork.

To plan a middleware modernization project, determine the desired state of integration architecture, and account for the integration requirements of digital business processes, Sharma said. Create integration competency centers (ICCs) to facilitate the adoption of self-service integration tools and cloud middleware platforms. Then, plan to gradually migrate appropriate integration processes and workloads to cloud-based integration services to deliver greater agility with a lower cost of ownership.

Overall, the key theme in middleware modernization projects is API-led integration that uses both API management and service-oriented and microservices architectures to expose and consume REST APIs. Focus on adopting API platforms to implement API- and design-first principles and enable the rapid creation of APIs that can effectively meet the specific requirements of end users.

Adobe-Microsoft partnership adds key integrations

The Adobe-Microsoft partnership is becoming immediately visible to customers, as the tech giants start to more deeply integrate their products for cloud users.

The companies said they will soon start coselling applications and comarketing products.

Also, Adobe has unveiled new integrations with Document Cloud and Microsoft products, embedding the ability to create a PDF directly in applications like Office 365, Dynamics 365, SharePoint and OneDrive. The San Jose, Calif., company also released upgrades to its Adobe Scan and Adobe Sign products.

The integrations strengthen the partnership revealed almost two years ago at Microsoft Ignite, when Microsoft CEO Satya Nadella championed the promise of a significant alliance with Adobe.

By committing to deep integrations and signaling they will cosell and comarket products, the companies are making more Adobe-Microsoft integrations immediately visible to and usable for customers — particularly now that the Adobe logo is an integral feature on the Microsoft 365 UI ribbon.

Analysts said the tighter Adobe-Microsoft connection indicates the tech partners have succeeded in bridging their technologies and making product integrations tangible to users.

“Adobe and Microsoft is the tightest, most profound alliance I’ve seen with two companies,” said Paul Greenberg, founder and managing principal of The 56 Group and author of CRM at the Speed of Light. “I call it the ‘get-a-room’ partnership, because it’s that intimate — down to the bits and bytes of architecture.”

Document Cloud integration for Microsoft products has been available for on-premises versions of Office and SharePoint, but the latest integration opens the option to export PDFs into Acrobat directly from Microsoft’s cloud-based products.

Office 365 and Dynamics 365 users will also be able to use the Document Cloud integration on mobile devices. To take advantage of the integration, users must have licenses for both Document Cloud and any of the Microsoft cloud products included in the integration.

“Enterprises can roll this out for all of their users; it’s consistent across 365 tools, including Outlook,” said Lisa Croft, group marketing manager at Adobe Document Cloud. “We made the decision to do this partnership not only from a technology perspective, but also marketing and coselling together.”

Whether the companies actually go to market together matters less than how deeply their products are integrated, said Terry Frazier, research director at IDC.

“The depth of integration is going to drive sales in a significant way, regardless of if they are comarketing,” Frazier said. “They’re expanding the ease of use and integration, and all of that is critically important.”

Neither IDC nor Adobe had statistics available on how many Adobe Document Cloud customers also license Microsoft 365 products. However, anecdotally, both said there is a significant overlap.

Adobe upgrades Scan, integrates Sign

Beyond tightening integrations between core Microsoft and Adobe products, Adobe also upgraded Adobe Scan and Adobe Sign with consumer- and enterprise-friendly features.

Adobe Scan consumers can now use a free application to scan a business card and have the information on the card turn into a contact on their mobile device. Adobe also added its AI platform, Sensei, into Scan to help the app deal with common problems, such as shadows and dimly lit pictures.

“Paper business cards are something with information that can be reused and repurposed in a better way,” Croft said, adding that since releasing Scan last year, Adobe has seen more than 10.5 million downloads.

Last year, Adobe integrated Sign into Office 365 and Dynamics 365 and furthered that integration this year to include multistep workflows within Dynamics 365 and improved automation features to be compliant with the General Data Protection Regulation.

“The whole e-signature space is somewhere we see a lot of activity, and it needs to continue to get easier and faster and smarter,” Frazier said.

All these integrations and product upgrades are being done with a specific goal in mind, Greenberg said. That is recognizing the importance for customers of having an intelligent, easy-to-use platform and ecosystem of products.

“Looking at it all, these companies recognize that they need to meet the end-to-end demand of customers,” Greenberg said. “Between acquisitions and partnerships, there’s a lot of completion happening.”

Security Servicing Commitment clarifies Microsoft patch policy

In an effort to be more transparent with customers, Microsoft is clarifying patch management policies that experts said have been generally understood, but never properly codified.

Alongside the June 2018 Patch Tuesday release, Microsoft published the Security Servicing Commitment, which it hopes will help customers understand whether a reported vulnerability will be addressed during the monthly patch cycle or in the next version of a product.

In order to make this determination, Microsoft has specified two key criteria for immediate security patching: whether the vulnerability is severe enough and whether it “violate[s] a promise made by a security boundary or a security feature that Microsoft has committed to defending.”

“If the answer to both questions is yes, then the vulnerability will be addressed through a security update that applies to all affected and supported offerings,” Microsoft wrote in the Security Servicing Commitment. “If the answer to either question is no, then by default the vulnerability will be considered for the next version or release of an offering but will not be addressed through a security update, though in some cases an exception may be made.”
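
The commitment's two questions amount to a simple triage rule. As an illustrative model only (this is a paraphrase in code, not anything Microsoft publishes):

```python
def servicing_decision(severe_enough: bool, violates_committed_boundary: bool) -> str:
    """Model the triage described in the Security Servicing Commitment:
    a vulnerability gets an immediate security update only when it is
    both severe enough and violates a security boundary or feature that
    Microsoft has committed to defending; otherwise it is, by default,
    deferred to the next version or release of the product."""
    if severe_enough and violates_committed_boundary:
        return "security update for all affected, supported offerings"
    return "considered for next version or release (default)"
```

Note the rule is conjunctive: a severe bug in a defense-in-depth feature, or a low-severity bug in a committed boundary, both fall through to the default by this model (subject to the exceptions Microsoft reserves).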

The security boundaries described in the Security Servicing Commitment are the points of “logical separation between the code and data of security domains with different levels of trust,” including network boundaries, kernel boundary, virtual machine boundary and more. Security features include Windows Defender, BitLocker and Windows Resource Access Controls.

However, Microsoft makes a distinction between these features and boundaries and defense-in-depth features, which it claims “may provide protection against a threat without making a promise.” These features include address space layout randomization, data execution prevention, user account control and more.

Codifying understood policy

Experts said there wasn’t really anything new in Microsoft’s Security Servicing Commitment, although the clarification was welcomed.

Dustin Childs, communications manager for Trend Micro’s Zero Day Initiative, said the policy description was less of a change and more of a clarification.

“Some of this information was publicly available, but it wasn’t found in a consolidated source with full details,” Childs wrote via email. “It’s hard to say why they chose to publish this now. Perhaps there has been an increase in submissions that don’t meet their servicing bar and have caused confusion with researchers.”

Chris Goettl, director of product management for security for Ivanti, based in South Jordan, Utah, said it was good “to see some clarity regarding severity of vulnerabilities to better understand how updates are classified” with the Security Servicing Commitment.

“Public and private disclosure of vulnerabilities can be a messy ordeal. I think this commitment provides the ethical hackers of the world with rules of engagement for disclosing bugs with Microsoft,” Goettl wrote via email. “Overall, I think it provides transparency to those who are committing their time so they know it will be worth the effort and are not disappointed or surprised by a response where Microsoft is not committing to provide a fix or a bounty.”

Allan Liska, threat intelligence analyst at Recorded Future, based in Somerville, Mass., said the Security Servicing Commitment was “spot on and laid out in a smart, strategic way.”

“Given Microsoft’s breadth and depth of products and constant commitment to security, this is a good approach on their part. What stood out, especially, was that they made the distinction between a potential exploitable security vulnerability versus a defense in-depth feature,” Liska wrote via email. “While there will always be people who question security moves a company as large and impactful as Microsoft makes, overall, this is a good step in the direction of transparency, and I think it should be applauded.”

Childs said the Security Servicing Commitment constituted “a pretty comprehensive list” of policies, but it could be better.

“Due to the complexities of modern code, it’s unlikely any list such as this could ever be 100% complete and cover every scenario,” Childs wrote. “While this level of transparency is good to see, it would be great if they also committed to fixing bugs — especially severe bugs — faster or committed to improving patch quality or communications.”

Microsoft Office gets a makeover

New user experience updates rolling out to customers globally over the next few months

REDMOND, Wash. — June 13, 2018 — Starting Wednesday, the most-used productivity product in the world is getting a makeover.

Whether you’re writing a letter in Word, managing a budget in Excel or sending an email in Outlook, Microsoft Office is the go-to place to get stuff done for people around the world.

Beginning today, millions of people who use Office at home and work will begin to see some welcome changes designed to deliver a balance of power and simplicity. These updates are exclusive to Office.com and Office 365 — the always up-to-date versions of our apps and services:

  • Simplified ribbon. A new, updated version of the ribbon is designed to help users focus on their work and collaborate naturally with others. People who prefer to dedicate more screen space to the commands will still be able to expand the ribbon to the classic three-line view.
  • New icons and color. Across the apps you’ll start to see new icons and colors built as scalable graphics — so they render with crisp, clean lines on screens of any size. These changes are designed to both modernize Office design and make it more inclusive and accessible.
  • Search. Search will become a much more important element of the user experience, providing access to commands, content and people. With “zero query search,” simply placing your cursor in the search box will bring up recommendations powered by AI and the Microsoft Graph.

These design changes are focused on the following:

  • Customers. We’re using a customer-driven innovation process to co-create the design of the Office apps. That process consists of three phases: initial customer research and analysis; concepting and co-creation; and validation and refinement.  
  • Context. Customers love the power of Office, but they don’t need every feature at the same time. We want our new designs to understand the context that you are working in so that you can focus on your content. That means both surfacing the most relevant commands based on the work you are doing and making it easy to connect and collaborate with others.
  • Control. We recognize that established skills and routines are powerful — and that the way someone uses the apps often depends on specific parts of the user interface. So we want to give end users control, allowing them to toggle significant changes to the user experience on and off.

Over the last year, Microsoft completed extensive customer research and spent many months working side-by-side with customers to guide design changes. As a result, customers will benefit from a more simplified experience while maintaining the full power of Office and a design ethos that is more inclusive — empowering everyone to create, communicate and collaborate. The team also increased its focus not just on what people think of the product, but how they feel using it.

“Through gathering feedback from thousands of people, we’ve found that people react most positively to feeling in control, productive and secure,” said Trish Miner, principal design researcher, Microsoft.

And to ensure that it continues to listen, learn and respond quickly to customer feedback, the company is including an in-product survey to ascertain how features make people feel. “The good news is that as we better understand these correlations between the design of features and how people feel while using them, we can develop technologies that are more empathetic,” Miner said.

The design refresh will start rolling out to business and consumer customers beginning in June. More information on the new Office design can be found at https://www.microsoft.com/en-us/microsoft-365/blog/2018/06/13/power-and-simplicity-updates-to-the-office-365-user-experience/.

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications, (425) 638-7777, rrt@we-worldwide.com

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

 

Why you should bet on Azure for your infrastructure needs, today and in the future

I love all the amazing things our partners and customers are doing on Azure! Adobe, for example, is using a combination of infrastructure and platform services to deliver the Adobe Experience Manager globally. HP is using AI to help improve their customer experiences. Jet is using microservices and containers on IaaS VMs to deliver a unique ecommerce experience. These three customers are just a few examples where major businesses have bet on Azure, the most productive, hybrid, trusted and intelligent cloud.

For the last few years, infrastructure as a service (IaaS) has been the primary cloud service hosting customer applications. Azure VMs offer an easy migration path from on-premises while still enabling you to modernize your IT infrastructure, improve efficiency, enhance security, manage apps better and reduce costs. And I am proud that Azure continues to be recognized as a leader in this key area.

As you consider your move to and deployment in the cloud, I want to share a few key reasons you should bet on Azure for your infrastructure needs.

Infrastructure for every workload

We are committed to providing the right infrastructure for every workload. Across our 50 Azure regions, you can pick from a broad array of virtual machines with varying CPU, GPU, memory and disk configurations for your application needs. For HPC workloads that need an extremely fast interconnect, we have InfiniBand, and for supercomputing workloads, we offer Cray hardware in Azure. For SAP HANA, we provide virtual machines with up to 4 TB of RAM and purpose-built bare-metal infrastructure with up to 20 TB of RAM. Not only do we have some of the most high-powered infrastructure out there, but we also provide confidential computing capabilities for securing your data while in use. These latest computing capabilities even include quantum computing. Whether it’s Windows Server, Red Hat, Ubuntu, CoreOS, SQL, Postgres or Oracle, we support over 4,000 pre-built applications to run on this broad set of hardware. With the broad set of infrastructure we provide, enterprises such as Coats are moving their entire datacenter footprint to Azure.

Today, I am announcing some exciting new capabilities:

  • Industry-leading M-series VM sizes offering up to 12 TB of memory on a single VM, the largest memory for a single VM in the public cloud, for in-memory workloads such as SAP HANA. With these larger VM configurations, we are advancing not only the limits of virtualization in the cloud but also the performance of SAP HANA on VMs. These new sizes will be based on Intel Xeon Scalable (Skylake) processors, with more details available in the coming months.
  • Newer M-series VM sizes with memory as low as 192 GB, extending the M-series VM range from 192 GB to 4 TB of RAM, available now, enabling fast scale-up and scale-down with 10 different size choices. M-series VMs are certified for SAP HANA and available worldwide in 12 regions. Using Azure ARM template automation scripts for SAP HANA, you can deploy entire SAP HANA environments in just minutes, compared to weeks on-premises.
  • New SAP HANA TDIv5-optimized configurations for SAP HANA on Azure Large Instances with memory sizes of 6 TB, 12 TB and 18 TB. In addition, we now offer the industry-leading public cloud instance scale for SAP HANA with our new 24 TB TDIv5 configuration. This extends our purpose-built SAP HANA offering to 15 different instance choices. With these new configurations, you can benefit from a lower price for TDIv5 configurations, an unparalleled 99.99% SLA for SAP HANA infrastructure and the ability to step up to larger configurations.
  • New Standard SSDs provide a low-cost SSD-based Azure Disk solution, optimized for test and entry-level production workloads requiring consistent performance and high throughput. You will experience improved latency, reliability and scalability compared to Standard HDDs. Standard SSDs can be easily upgraded to Premium SSDs for more demanding and latency-sensitive enterprise workloads. Standard SSDs come with the same industry-leading durability and availability that our clients expect from Azure Disks. Learn more about Standard SSDs.

Truly consistent hybrid capabilities

When I talk with many of our customers about their cloud strategy, there is a clear need for choice and flexibility in where to run workloads and applications. Like most customers, you want to be able to bridge your on-premises and cloud investments. From VPN and ExpressRoute to File Sync and Azure Security Center, Azure offers a variety of services that help you enable, connect and manage your on-premises and cloud environments, creating a truly hybrid infrastructure. In addition, with Azure Stack, you can extend Azure services and capabilities to on-premises and the edge, allowing you to build, deploy and operate hybrid cloud applications seamlessly.

Today, I’m happy to announce that we are expanding the geographical coverage of Azure Stack to meet growing customer demand globally. Azure Stack will now be available in 92 countries throughout the world. Given your excitement over Azure Stack, we continue to expand opportunities for you to deploy this unique service. For a full list of the supported countries, please visit the Azure Stack overview page.

Liquid Telecom, a leading data, voice and IP provider in eastern, central and southern Africa, plans to use Azure Stack to deliver value to its customers and partners in some of the most remote parts of the world.

“We have a long history of delivering future-focused and innovative services to our customers. Microsoft Azure Stack strengthens this mission while also enabling us to deliver value to a new set of customers and partners in some of the most remote parts of Africa. By using Azure Stack, alongside Azure and ExpressRoute over our award-winning pan-African fiber network, we can now guide our customers through increasingly complex business challenges, such as data privacy, compliance and overall governance in the cloud. This helps not only us, but our entire channel of distribution and value-added partners in enhancing the customer experience on their digital journey.” David Behr, Chief Product Officer, Liquid Telecom

Built-in security and management

I frequently get asked about best practices for security and management in Azure. We have a unique set of services that make it incredibly easy for you to follow best practices whether you are running a single VM or thousands of VMs, including built-in backup, policy, advisor, security detection, monitoring, log analytics and patching. These services will help you proactively protect your VMs and detect potential threats to your environment. They are built upon decades of experience at Microsoft in delivering services across Xbox, Office, Windows and Azure, with thousands of security professionals and more than 70 compliance certifications. We have also taken a leadership position on topics such as privacy and compliance with standards such as the General Data Protection Regulation (GDPR), ISO 27001, HIPAA and more. Last week we announced the general availability of Azure Policy, a free service to help you control and govern your Azure resources at scale.

Today, I’m excited to announce a few additional built-in security and management capabilities:

  • Disaster recovery for Azure IaaS virtual machines general availability: You likely need disaster recovery capabilities to ensure your applications are compliant with regulations that require a business continuity plan (such as ISO 27001). You also may need your application to run continuously in the unlikely event of a natural disaster that could impact an entire region. With the general availability of this new service, you can configure disaster recovery within minutes, not days or weeks, with a built-in disaster recovery as a service that is unique to Azure. Learn more about how to get started by visiting our documentation.

“ASR has helped Finastra refine our DR posture through intuitive configuration of replication between Azure regions. It’s currently our standard platform for disaster recovery and handling thousands of systems with no issues regarding scale or adherence to our tight RPO/RTO requirements.” Bryan Heymann, Director of Systems & Architecture, D+H

  • Azure Backup for SQL in Azure Virtual Machines preview: Today we are extending the Azure Backup capability beyond virtual machines and files to also include backup of a SQL instance running on a VM. This is a zero-infrastructure backup service that provides freedom from managing backup scripts, agents, backup servers or even backup storage. Moreover, customers can perform SQL log backups at 15-minute intervals on SQL Servers and SQL Always On availability groups. Learn more on the key benefits of this capability and how to get started.
  • VM Run Command: Customers can easily run scripts on an Azure VM directly from the Azure portal without having to connect to the machine. You can run either PowerShell scripts or Bash scripts, and you can even troubleshoot a machine that has lost its network connection. Learn more about Run Command for Windows and Linux.

More ways to save money, manage costs and optimize infrastructure

Given the agility offered by cloud infrastructure, I know you not only want freedom to deploy but also want tight control over your costs. You want to optimize your spending as you transition to the cloud. We can help drive higher ROI by reducing and optimizing infrastructure costs.

Azure offers innovative products and services to help reduce costs, like low-priority VMs, burstable VMs, vCPU-constrained VMs for Oracle and SQL databases, and archive storage, so customers can choose the right cost-optimized infrastructure option for their app. Azure also uniquely offers free Cost Management so customers can manage and optimize their overall budget better.

With Azure Reserved VM Instances (RIs), you can save up to 72 percent. By combining RIs with Azure Hybrid Benefit, you can save up to 80 percent on Windows Server virtual machines, and up to 73* percent compared to AWS RIs for Windows VMs – making Azure the most cost-effective cloud to run Windows Server workloads. Customers like Smithfield Foods have been able to slash datacenter costs significantly, reduce new-application delivery time and optimize their infrastructure spend.
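To see how the two discounts stack, here is a back-of-the-envelope sketch of the arithmetic behind a combined reserved-instance and Azure Hybrid Benefit saving, using the disclaimer's methodology (two VMs, 744 hours/month for 12 months). The hourly rates below are hypothetical placeholders, not actual Azure or AWS pricing:

```python
# Sketch of how a reserved-instance (RI) discount and Azure Hybrid Benefit (AHB)
# combine. All dollar rates are hypothetical placeholders, not real pricing.

HOURS_PER_MONTH = 744   # per the pricing disclaimer's methodology
MONTHS = 12
VM_COUNT = 2            # two VMs, as in the sample comparison

payg_windows_rate = 0.188  # hypothetical pay-as-you-go rate (compute + Windows license), $/hr
base_compute_rate = 0.134  # hypothetical base compute (Linux) rate for the same size, $/hr
ri_discount = 0.72         # headline "up to 72%" 3-year reserved-instance discount

# AHB lets you reuse an existing Windows Server license, so you pay only the
# base compute rate; the RI discount then applies on top of that rate.
effective_rate = base_compute_rate * (1 - ri_discount)

annual_payg = payg_windows_rate * HOURS_PER_MONTH * MONTHS * VM_COUNT
annual_optimized = effective_rate * HOURS_PER_MONTH * MONTHS * VM_COUNT
combined_savings = 1 - annual_optimized / annual_payg

print(f"annual pay-as-you-go cost: ${annual_payg:,.0f}")
print(f"annual RI + AHB cost:      ${annual_optimized:,.0f}")
print(f"combined savings:          {combined_savings:.0%}")
```

With these placeholder rates the combined saving works out to roughly 80 percent; as the disclaimer notes, actual savings vary with region, instance type, term and usage.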

I hope you enjoyed this overview of some of the coolest new capabilities and services in Azure. We are constantly working to improve the platform and make it a simpler and easier infrastructure service for you! Please let us know how we can make it even better.

Get started today with Azure IaaS. You can also register now for the Azure IaaS webcast I am hosting on June 18, 2018, covering many of these topics.

Thanks,

Corey

*Disclaimer:

  1. Sample annual cost comparison of two D2V3 Windows Server VMs. Savings based on two D2V3 VMs in the US West 2 region running 744 hours/month for 12 months; base compute rate at the SUSE Linux Enterprise rate for US West 2. Azure pricing as of April 24, 2018. AWS pricing as of April 24, 2018. Prices subject to change.
  2. The 80 percent savings figure is based on the combined cost of Azure Hybrid Benefit for Windows Server and a 3-year Azure Reserved Instance. It does not include Software Assurance costs.
  3. Actual savings may vary based on location, instance type, or usage.