
Meeting Insights: Contextual AI to help you achieve your meeting goals

It’s a big day for you. Back-to-back meetings are scheduled with critical customers and partners, and a parent-teacher conference is sandwiched in there as well. As you head toward the last meeting, you suddenly cannot remember the key talking points. Who sent you the pre-read notes? Was it Taylor? No, possibly Drew. No luck. You are about two minutes from the meeting room, and you want more than anything to pull out your phone and scream at it.


If only there existed an intelligent system that let you find information this effortlessly. Now, there is: Meeting Insights provides AI capabilities that help you find information before, during, and after meetings as easily as if you had your own assistant to support you. Meeting Insights is now available for commercial Microsoft 365 and Office 365 customers in Outlook mobile (on both Android and iOS devices) and Outlook on the web. We would like to pull back the curtain and talk about the science and technology that drives this scenario. We’ll also share why Meeting Insights is only the tip of the iceberg in how we at Microsoft are developing AI-powered capabilities to simplify and improve customer experience and productivity. We’re currently testing two new features that expand intelligent content recommendations to new scenarios in Outlook.

Providing usefulness in every context

Customers often say that finding content from meetings is a challenge. Therefore, we set out to build an intelligent personalized solution that provides customers with information from their mailboxes, OneDrive for Business accounts, and SharePoint sites to better help them accomplish the goals of their meetings.

The solution we developed powers the Meeting Insights feature that makes meetings more effective by helping customers:

  • Prepare for their meetings by offering them content they haven’t had a chance to read or may want to revisit;
  • Access relevant content during their meetings with ease;
  • Retrieve information about completed meetings by returning content presented during the meeting, sent meeting notes, and other relevant post-meeting material.

Currently, Meeting Insights appears for more than 40% of all meetings opened in Outlook mobile and Outlook on the web.

Large-scale, personal, privacy-preserving AI

The most useful emails and files for a meeting may change over time (for example, those most useful before may differ from the ones most useful during or after). To create a relevant and useful service, we needed a way to reason across information shared by a customer as well as the files in their organization that they have permission to access and have opted to share. Microsoft 365 upholds a strict commitment to protecting customer data, promising to use customer data only for agreed-upon services and not to look at data during development or deployment of a new feature. This privacy promise, rather than being a hindrance, spurred us to think creatively and to innovate. As detailed below, we use a creative combination of weak and self-supervised machine learning (ML) algorithms in Meeting Insights to train large-scale language models without looking at any customer data.

The need to efficiently reason over millions of private corpora, each potentially containing millions of items, underscores the complexity of the problem we needed to solve in Meeting Insights. To accomplish this reasoning, Meeting Insights enlists the help of Microsoft Graph, where shared data is captured in a graph representation. Microsoft Graph provides convenient APIs to reason over all of the shared email, files, and meetings for customers as well as the relationships between these items. This enables a high level of personalization to accurately meet customer needs.
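
To make that concrete, here is a minimal sketch of pulling a user’s upcoming meetings and recent mail through the Microsoft Graph REST API, assuming a delegated OAuth access token is already in hand. It illustrates only the public API surface; Meeting Insights itself reasons over the graph server-side rather than through per-user calls like these.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def graph_get(path, token, params=None):
    # Call a Microsoft Graph endpoint with a delegated OAuth token.
    resp = requests.get(GRAPH + path,
                        headers={"Authorization": "Bearer " + token},
                        params=params)
    resp.raise_for_status()
    return resp.json()["value"]  # list responses wrap items in "value"

def meeting_context(token):
    # Upcoming meetings plus recent mail: the raw material a recommender
    # relates to one another when suggesting content for a meeting.
    events = graph_get("/me/events", token,
                       params={"$top": 10, "$select": "subject,attendees,start"})
    messages = graph_get("/me/messages", token,
                         params={"$top": 25,
                                 "$select": "subject,from,receivedDateTime"})
    return events, messages
```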

Building intelligent features like Meeting Insights in the enterprise setting poses problems beyond the standard ML workflow. Enterprise customers have high expectations of new products, especially ones in their critical workflows, and even more so when they are paying for the service. Because the initial model needs to work out of the gate, the standard ML workflow, in which a heuristic model with moderate performance is deployed and then slowly learns from interaction data, leads to a lack of product acceptance. In Meeting Insights, we use ML algorithms that require less supervision to personalize customers’ experiences more quickly.

This challenge, which we refer to as the “jump-start” problem, is therefore critical to product success in enterprise scenarios. It goes beyond the standard “cold-start” challenge, where data about a particular item or new user of a system is lacking; here the primary challenge is to get the entire process off the ground. Common approaches to improving model performance before deployment, such as getting annotations from crowd-sourced judges, have limited to no applicability due to the privacy-sensitive and personal nature of the recommendation and learning challenges. Finally, Microsoft 365 is used all over the world, and we wanted to make this technology available as broadly as possible, not simply in a few select languages.

Figure 1: Schematic depiction of how we train the model for recommending emails in Meeting Insights.

Solving the technical challenges

In order to make Meeting Insights possible, we needed to leverage three key components: weak supervision that is language agnostic, personalization enriched by the Microsoft Graph, and an agile, privacy-preserving ML pipeline.

Weak supervision: Large-scale supervised learning provides state-of-the-art results for many applications. However, it is impractical when building new enterprise search scenarios due to the privacy-sensitive and personal nature of the problem space. Instead of having annotators label data, we turned to weak supervision, an approach in which heuristics are defined to programmatically label data. To apply weak supervision to this task, we used Microsoft’s compliant experimentation platform. Emails and files attached to meetings were assigned a positive label, and all emails and files that the organizer could have attached at meeting creation time but did not were assigned a negative label. The benefit of using weak supervision went beyond preserving privacy: it also allowed us to scale quickly and cheaply across languages and communication styles, all of which would be extremely challenging with a strongly supervised modeling approach involving annotators.
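
As a rough illustration of that labeling rule (the objects here are hypothetical stand-ins, not our production schema), the weak supervision step amounts to something like the following:

```python
def weak_labels(meeting, attachable_items):
    """Programmatically label training pairs for one meeting.

    The heuristic described above: items the organizer attached to the
    meeting are positives; items they could have attached at creation
    time but did not are negatives. No human ever reads the content.
    """
    attached_ids = {item.id for item in meeting.attachments}
    examples = []
    for item in attachable_items:  # emails and files visible to the organizer
        label = 1 if item.id in attached_ids else 0
        examples.append((meeting.id, item.id, label))
    return examples
```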

Personalization: Identifying the most relevant and useful information for a customer requires understanding the people and phrases that are important to that person. To identify the candidate set of relevant items and rank them, we leverage personalized representations of the most important key phrases and key people for a person. These personalized representations are learned in a self-supervised and privacy-preserving manner from nodes and edges in the Microsoft Graph. The context meeting is then combined with these personalized key-phrase and people representations to construct a candidate set using the Microsoft Search endpoint, which is powered by the same Microsoft Search technology behind search in applications such as Outlook, Teams, and SharePoint. In the final ranking stage, these personalized representations as well as more general embeddings are used to compute semantic relatedness between the context and candidate items, relationship strength via graph features, and collaboration strength based on the relationships between key people.
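
A toy version of that final ranking stage might look like the sketch below, with placeholder weights and hypothetical candidate objects; the shipped ranker is learned rather than hand-weighted like this.

```python
import numpy as np

def cosine(a, b):
    # Semantic relatedness between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def rank_candidates(meeting_vec, meeting_people, candidates):
    """Order candidate emails/files for a context meeting. Each candidate
    is assumed to carry a text embedding, a graph-derived relationship
    score, and the set of people associated with it."""
    scored = []
    for c in candidates:
        semantic = cosine(meeting_vec, c.embedding)   # context vs. candidate content
        graph = c.graph_strength                      # relationship strength via graph features
        overlap = len(meeting_people & c.people) / max(len(meeting_people), 1)
        scored.append((0.5 * semantic + 0.3 * graph + 0.2 * overlap, c))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for _, c in scored]
```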

Agile, privacy-preserving ML pipelines: As noted above, preserving the privacy of our customers’ data is sacrosanct for Microsoft. The weak and self-supervised techniques described above allow us to train highly accurate, language-agnostic, large-scale models without looking at the customer’s data. However, to put the algorithms into practice, test them, and innovate, we needed a platform that makes such approaches possible. Innovations on the modeling front went hand in hand with the development of ML platforms and processes that allowed our scientists to remain agile. Our in-house compliant experimentation platform provides key privacy safeguards. For example, our algorithms can operate on customer content to provide recommendations directly to customers, but our engineers cannot see that content except when it’s their own. We developed many tools to assist in monitoring and debugging our ML pipelines, firing off alerts when data quality or the correlations between signals and labels diverged from expected values.

Self-hosting to improve for our customers

As we developed Meeting Insights, we first rolled it out to internal Microsoft customers and instrumented their interactions with the experience to identify areas for improvement. Early on, we saw from the instrumented data that 90% of the usage of Meeting Insights on a given day was for meetings occurring that day or the following day. Armed with this datapoint, we implemented a significant optimization: prefetching the insights for these meetings the moment the customer opens their calendar. This data-informed strategy resulted in a 50% reduction in customer-perceived latency.
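
The prefetch policy is as simple as that statistic suggests. A minimal sketch, assuming hypothetical meeting objects with a start attribute and a hypothetical warm_insights_cache helper:

```python
from datetime import datetime, timedelta

def meetings_to_prefetch(calendar_meetings, now=None):
    """Select the meetings whose insights are worth fetching eagerly.

    With ~90% of usage going to meetings that day or the next,
    prefetching that window trades a little extra work for a large
    cut in perceived latency.
    """
    now = now or datetime.utcnow()
    horizon = now + timedelta(days=2)
    return [m for m in calendar_meetings if now <= m.start < horizon]

# On calendar open:
# for m in meetings_to_prefetch(calendar_meetings):
#     warm_insights_cache(m)  # hypothetical cache-warming call
```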

Customer engagement with the deployed product showed other strong temporal effects worth calling out for this experience:

  • For meetings, freshness is important, with about 5% of insights clicks happening within 15 minutes of the meeting being created.
  • For email insights, 30% of clicks go to emails sent/received in the 24 hours preceding the time of the user request.
  • For file insights, 35% of clicks go to files created or modified in the 24 hours preceding the time of the user request.

Less than four months after shipping our first Meeting Insights experience (for meeting invitations written in English), we were able to expand support to all enterprise customers across all languages. This was made possible by effectively leveraging the Microsoft Graph, being creative in the low-cost modeling approaches we employed, and being careful in the design of our AI solutions by using weak supervision and avoiding language-specific dependencies. Over the next few months, we will be rolling Meeting Insights out to Cortana Briefing Mail recipients.

Meeting Insights is currently shown on more than 40% of opened meetings on supported Outlook clients, with customers reporting two out of three suggestions to be useful.

Providing broader contextual intelligence

Meeting Insights is not the only place where we are providing contextual intelligence that makes life easier for our customers. We are looking at how we can use Meeting Insights to accelerate our offerings in other scenarios through techniques like transfer learning, which has proven to be an effective and efficient way to gain reusable value from AI models learned for one scenario and reapplied to another.

For example, we are now transferring the learnings from our Meeting Insights models to power other intelligent content recommendation features, such as “Suggested Attachments” and “Suggested Reply with File” in Outlook. These features take a customer and an email as input and return contextually relevant attachment suggestions that significantly reduce the time and effort required to share content via email.
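
In the common transfer-learning pattern (a generic sketch, not a description of the production models), the representation learned for meetings is frozen and only a small task-specific head is retrained for the new email-context task:

```python
import torch.nn as nn

def adapt_ranker(pretrained_encoder: nn.Module, feature_dim: int) -> nn.Module:
    """Reuse an encoder trained on meeting/content pairs for a new
    surface where the context is an email rather than a meeting."""
    for p in pretrained_encoder.parameters():
        p.requires_grad = False  # keep the learned representation intact
    # Only this new scoring head is trained on email/file pairs.
    return nn.Sequential(pretrained_encoder, nn.Linear(feature_dim, 1))
```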

“Suggested Attachments” and “Suggested Reply with File” are currently in testing. We look forward to adding new intelligent content recommendation offerings for Microsoft 365 users and beyond.

Imagine you’re heading to that last meeting again after an exceptionally busy day. You’ve suddenly forgotten the talking points, and you just can’t seem to recall who sent those pre-read notes. Was it Taylor? Drew? You feel like shouting at the sky, but then a thought flashes into your mind. You calmly pull up Outlook mobile on your phone as you approach the room, and with a simple tap on the meeting, your pre-read notes appear at the bottom of the screen thanks to Meeting Insights. Now, you’ve got this.

We look forward to continuing to improve life for our customers, and we hope the next time you walk into a meeting, you also walk in with more confidence knowing that Meeting Insights is there to assist you.

Author: Steve Clarke

SAP S/4HANA Cloud 2005 focuses on supply chain

SAP customers already appeared more receptive to cloud-based software at the start of 2020, but the COVID-19 pandemic may spur momentum for SAP S/4HANA Cloud 2005, the latest release of the SaaS version of the ERP platform.

SAP reported increases in cloud-based revenue for the first quarter of 2020, and, although this was not broken out into specific product groups, SAP is seeing a shift in demand for the cloud, said Jan Gilg, president of SAP S/4HANA.


“Customers are coming in to ask how quickly they can be up and running, or maybe how quickly they can set up a subsidiary or specific business units,” Gilg said. “So we’re seeing a lot of uptake, and a lot more customers looking into the cloud model now than before.”

The cloud momentum is expected to continue even after the pandemic has passed, he said, as companies hit hard by the disruption will evaluate their IT capabilities and the status of ERP modernization and digital transformation projects.

One of the advantages of cloud-based software is that new functions can be introduced in each new version, Gilg said. SAP S/4HANA Cloud 2005 includes updates that could be valuable for companies dealing with the rapidly changing business environment brought on by COVID-19.

Supply chain, finance and integration with SAP SuccessFactors, an HCM platform, are the most prominent updates, he said.

Enabling a more flexible supply chain

Supply chain Situation Handling functionality now allows companies to monitor inventory more accurately. In the last few years, supply chains have been stretched around the globe and have focused on just-in-time delivery, keeping only as much stock in inventory as needed. The strategy has been exposed as a weakness by the pandemic, as companies have grappled with an abrupt disruption to production schedules.

This is leading companies to reassess supply chains by moving to more local suppliers and keeping more inventory in stock, Gilg said.

“S/4HANA Cloud 2005 puts more emphasis on inventory management and stock levels and gives companies intelligence that proactively alerts them when inventory levels go down, go too low or run out,” he said. “In the current situation, it’s really critical to make sure that there’s enough flow of goods to the respective consumers; it’s about being flexible.”
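
The alerting behavior Gilg describes boils down to threshold monitoring of stock levels. A generic illustration of the idea, not SAP’s Situation Handling API:

```python
def inventory_alerts(stock_levels, reorder_points):
    """Flag materials that have run out or fallen below their reorder point."""
    alerts = []
    for material, on_hand in stock_levels.items():
        if on_hand <= 0:
            alerts.append((material, "stockout"))
        elif on_hand <= reorder_points.get(material, 0):
            alerts.append((material, "below reorder point"))
    return alerts

# inventory_alerts({"valve-a": 0, "gasket-b": 12}, {"gasket-b": 20})
# -> [("valve-a", "stockout"), ("gasket-b", "below reorder point")]
```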

Flexibility is also the key to new financial functions, which allow companies to monitor and approve payments from SAP and non-SAP systems. This will help companies keep a closer eye on cash flow, which will be important as business interruption makes cash flow an issue, Gilg said.

The other significant S/4HANA Cloud update is a more seamless integration between S/4HANA and SAP SuccessFactors Employee Central, which standardizes the data model and cost centers for ERP and HCM systems.

“The ambition here is that this should really look and feel like one solution, and ideally customers should not even notice that there’s two solutions behind the scenes,” Gilg said. “The transition is seamless from a UI perspective, from process data integration, and also from some of the technical attributes like the provisioning.”

Giving the customers what they want

Although there’s no hard evidence of an increase in demand for SAP S/4HANA Cloud, it wouldn’t be a surprise given the overall increase in demand for cloud applications, said analyst Jon Reed, co-founder of Diginomica.com, an enterprise applications news and analysis site.


However, the most appropriate market for S/4HANA Cloud may not be able to invest, given the current environment.

“Keep in mind that S/4HANA Cloud’s best vertical adoption, if we are talking the full cloud solution, not hosted S/4HANA, is in professional services, which, for the most part, is not a vertical that is thriving at the moment,” Reed said. “Modern ERP cloud is going to have to be very vertical in its appeal, a topic SAP has understood for some time but has not moved nearly fast enough on.”

S/4HANA Cloud 2005’s updates should be welcomed by customers, Reed said.

“These are the types of features customers have been asking for,” he said. “In particular, the SuccessFactors integration should help S/4HANA Cloud have some response to Workday’s complete finance and HR integrations, although SAP has a long way to go there.”

S/4HANA Cloud 2005 looks impressive, with the SuccessFactors Employee Central integration and more end-to-end industry focus, said Predrag “PJ” Jakovljevic, principal industry analyst at Technology Evaluation Centers, an enterprise computing analysis firm in Montreal.


The current COVID-19 environment may spur more cloud demand, Jakovljevic said.

“Both S/4HANA Cloud 2005 and cloud ERP, SCM [supply chain management] and CRM, in general, should benefit from COVID-19, since many customer success stories nowadays talk about using cloud and mobile digital collaborative tools,” he said. “On-premises will still not necessarily fully die, however, because some places still have regulatory requirements and poor internet connectivity, and on-premises solutions can now come with remote access.”


Employees empowered to make data-driven decisions spur growth

When employees on the front lines, the ones actually meeting with customers rather than those in back offices and board rooms, are given the tools, training and authority to make data-driven decisions, it makes a huge difference in the success of a given organization.

That’s the finding of “The New Decision Makers: Equipping Frontline Workers for Success,” a new report from Harvard Business Review and ThoughtSpot, an analytics vendor founded in 2012 and based in Sunnyvale, Calif.

A total of 464 business executives across 16 industry sectors in North America, Europe and Asia were surveyed for the report.

The survey found that only 20% of organizations are giving their front-line employees both the authority and the tools — self-service analytics platforms and training — to make decisions based on analytics. Those organizations, meanwhile, were the most likely among the respondents to have seen more than 10% annual revenue growth in recent years.

As a result of their objective success — their growth — those enterprises were dubbed Leaders.

Another 43%, however, were deemed Laggards. These organizations have so far failed to give their front-line employees the ability to make data-driven decisions, either by not providing the BI tools and training or simply by not granting the authority, and they were seeing that reflected in their bottom line.


Scott Holden, chief marketing officer at ThoughtSpot, recently discussed the report in detail.

In a Q&A, he delves into the hypothesis that led ThoughtSpot and the Harvard Business Review to conduct the research, as well as many of the key findings.

Among them: even among Leaders, a vast majority of organizations (86%) believe they’re not yet doing enough to give front-line employees everything they need to make data-driven decisions. Meanwhile, among those that have at least begun giving front-line employees the necessary tools and training, 72% have seen an increase in productivity.

What was the motivation for the survey — what did you see in the market that led you and Harvard Business Review to team up and undertake the project?

Scott Holden: We sponsored this research because we had a hunch that companies that empowered their frontline employees with faster access to data would outperform their [competitors]. That was the primary premise, and we wanted to explore the idea and see what other dynamics surrounded that — what was holding them back, what were the Leaders pursuing and doing better than the Laggards — and that was the impetus for this.

What were the key findings, what did ThoughtSpot and HBR discover?

Holden: We all know that technology plays a huge role in productivity gains and empowering people, but there’s also a big cultural transformation required — a process change — and how the things you do as leaders impact how people adopt technology, so that was another big component of this. We wanted to explore both dimensions, with the goal of giving leaders that are trying to transform their companies a guide for how to do it.

When you look at the key findings, a few things stand out. Not surprisingly, but it was good to confirm this, companies want to empower the front lines. Ninety percent of all respondents said, ‘Yes, we want to do this; our success is dependent on being able to give fast access to data to all people.’ That’s not surprising, but it’s good to see the number be so high. But then it gets a little more surprising, because almost the same percentage, 86% of them, said that they need to do more. They’re basically saying they need to provide better technology and tools to empower those employees. They’re saying, ‘We’re not doing enough,’ and more specifically, only 7% of the people surveyed thought they were actually doing enough. That was our hunch, but the data proved out really strongly to say that there’s certainly a movement afoot here and people want to be doing this and they need to be doing a better job of it.

How do organizations stand to benefit from empowering employees to make data-driven decisions — what can they accomplish that they couldn’t before?

Holden: There were the benefits that companies saw — if you do this, we think this will happen — but there was a nice nuance if you dig into those performance improvements, which is the difference reported based on what the Leaders were doing versus what the Laggards were doing.


The dimensions by which people saw improvements were productivity, employee satisfaction, employee engagement, customer satisfaction, and the quality of products and services. Specifically, among those who said they would give their employees better access to data and facts, 72% said it would increase their productivity, 69% said it would increase customer and employee engagement and satisfaction, and 67% said it would increase the quality of their products and services.

Basically, everybody thinks that across the business, if we do this we’re going to see big, big improvements in what would be the core levers for any business. If you look at higher employee engagement and higher customer satisfaction, when you empower an employee with more and better access to data, they actually become a happier and more engaged employee, and if they’re the one that’s on the front line talking to your customer, that translates into better service and better customer satisfaction. There’s a nice tie-in to how this actually plays to delivering better services and experiences to your customers in a really relevant way.

What are the obstacles?

Holden: This is where it gets into Leaders versus Laggards. I was really kind of blown away. One of the things I saw — and this is a little counterintuitive — was that the Laggards were 10 times more likely to say they don’t want to empower the front lines. There’s a good chunk of the Laggards out there — 42% of them — that said they actually don’t think they should empower the front lines. This really gets into what I think underlies the big thing standing in the way of the analytics industry right now, which is that historically, analytics and data-driven decision-making were done at the top — sort of ivory-tower analytics. If you were a C-level executive, you probably had a data analyst or someone who worked for you who gave you access to information, and you were able to use your management dashboard to help you make decisions. For practical reasons, it’s hard to give every employee on the front lines access to data analysts, and there may be some trust issues.

What we’re seeing in this report is that there’s a real lack of trust, and that’s why you’re seeing a lot of Laggards say they don’t think they need it. You see that the Laggards have a backward view of what’s driving success, and that empowering the front lines really is an important thing and they’re missing it.

If an organization wants to empower front-line employees to make data-driven decisions but hasn’t begun the process, how does it get from Point A to Point B?

Holden: New technology can help. If you make it faster and easier, people are more likely to use it. But there are a couple of other key elements here. In the report, there are five key ways that folks are empowering the front lines: leadership; putting data in the hands of folks; governance, which is building the right process and security around the data that you do expose; training; and facilitation, which is a nuance that ties into training, having managers who are bought in because they’re the ones that facilitate the training and make it happen.

Technology companies across the board are so eager to talk about technology, but you can see that other than data, the other factors are about leadership, governance, management and training, and it is a full cultural experience to transform your business to be more data-driven. Fifty percent of the Leaders said that culture was a key factor, whereas only 20% of the Laggards did — and Leaders and Laggards were based on success metrics, objective measures that show whether they’re outperforming their industry or not. Building a culture around data-driven decisions is a key factor here that can’t be underestimated.

What is the danger for organizations that don’t give their front-line employees the tools to make data-driven decisions?

Holden: There’s a huge danger, and this is why the Laggards versus Leaders thing was so stark. If you aren’t buying into being a data-driven company and putting the leadership, the culture, the training, the thinking in place, you are going to fall behind. This report statistically says that you are going to miss out on a big opportunity if you’re not thinking strategically about making this shift. I think it’s a pretty big wake-up call. Data has been a key asset for companies for a while now, but pushing data further out into the front lines, and the success that can have on your business, is a newer concept — it’s not just empowering leaders to make decisions but empowering the marketing manager, the retail associate, the local hospital administrator, the person on the factory floor. Those folks need fast access to data too, and that is an eye-opening discovery.

Editor’s note: This Q&A has been edited for clarity and conciseness.


VMware’s Kubernetes-based products ease container migration

VMware hopes a raft of new Kubernetes-based enhancements can position the company as the right choice for customers interested in container migration while they retain investments in vSphere.

The strategy centers on Tanzu, a product portfolio VMware introduced at the VMworld conference in August. A chief component is Tanzu Kubernetes Grid, a distribution of the container orchestration engine that sets up clusters in a consistent way across various public clouds and on-premises infrastructure.
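
The practical payoff of consistently provisioned clusters is that the same tooling works against all of them. A small illustration using the official Kubernetes Python client; the kubeconfig context names are hypothetical:

```python
from kubernetes import client, config

def node_summary(kube_context):
    # Any conformant cluster, Tanzu-provisioned or otherwise, answers the
    # same Kubernetes API regardless of which cloud it runs in.
    config.load_kube_config(context=kube_context)
    nodes = client.CoreV1Api().list_node().items
    return [(n.metadata.name, n.status.node_info.kubelet_version) for n in nodes]

# for ctx in ("tanzu-aws", "tanzu-onprem"):  # hypothetical contexts
#     print(ctx, node_summary(ctx))
```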

Another product, Tanzu Mission Control, provides management tooling for Kubernetes clusters. VMware has also pushed its acquisition of Bitnami under the Tanzu header. Bitnami, which offers a catalog of pre-packaged software such as the MySQL database for quick deployment across multiple environments, is now called Tanzu Application Catalog.

Finally, VMware has rebranded Pivotal Application Service to Tanzu Application Service and changed its Wavefront monitoring software’s name to Tanzu Observability by Wavefront.

This flurry of product development and marketing around Kubernetes has a critical purpose for VMware.

“Kubernetes has practically stolen virtualization from VMware, so now it needs to upgrade the engine room, while keeping the promenade deck the same and hoping the passengers stay on board and do not jump ship,” said Holger Mueller, an analyst at Constellation Research.

Figure: VMware hopes to be a big player in the Kubernetes ecosystem with its Tanzu portfolio.

A big part of this plan involves the new vSphere 7, which has been reworked to run both container and virtual machine workloads by embedding Tanzu Kubernetes Grid and other components. This vSphere option is initially available only through VMware Cloud Foundation 4, which is supported on AWS, Azure, Google, Oracle, Rackspace and IBM’s public cloud services, as well as through other VMware partners.

VMware also plans to release a separate, Kubernetes-less edition of vSphere 7 for customers who don’t want that functionality. Tanzu Kubernetes Grid, Application Catalog and Mission Control are available now, while Cloud Foundation 4 and vSphere 7 are slated for release before May 1.

Users gravitate towards containers

VMware’s announcements further confirm the industrywide trend of users moving away from their core virtualization platforms and more seriously exploring container migration. With VMware the longtime industry leader in virtualization, the announcements carry added weight.

“There is a transition happening in compute technology, where what is being used to deliver the apps is moving away from virtualization to containers — not that virtualization isn’t useful for other things,” said Gary Chen, IDC’s research director of software-defined compute. “VMware is trying to make that transition, and they appear to be pretty serious about it.”

VMware’s efforts around Kubernetes stem back a few years. It previously offered Pivotal Container Service as an add-on to its core platform, and acquired a batch of Kubernetes talent and related IP through its purchase of Heptio in 2018. Two of the three original authors of Kubernetes now work at VMware.

“At the end of the day, Kubernetes is still an orchestration tool for automating containers, but what if you are not in a developer group?” said Brian Kirsch, an IT architect and instructor at Milwaukee Area Technical College. “What they are introducing here is for people writing their own software and moving toward containers, but will there be enough support on the back end for those not ready for Kubernetes or containers, or who may never need them? We support 45,000 students here, but we still buy our software and don’t write it.”

Many companies in large vertical markets, such as manufacturing and healthcare, are often slow to move to another DevOps environment once they have settled on a product. Traditionally, many applications in those markets aren’t updated often by the vendors and it can be a monumental task to pursue container migration, even for long-time vSphere users.

“Up until just a few years ago, some of the larger EHR apps were still in VB [Microsoft’s Visual Basic] for the front end,” Kirsch said. “It just takes time.”

While VMware executives tout that Cloud Foundation and vSphere products can work on competitors’ cloud platforms, Kirsch said he thinks the company is overplaying the importance of that capability.

“Writing an app once and have it run wherever you want is good for some, but I don’t know that many people who want to hop around that much,” Kirsch said. “My question is: How many times have you left your cloud provider unless it goes belly up? A lot of work is involved with this and no matter how transparent it is, it’s almost never like just flipping a switch,” he said.

Controlling the control plane

Some analysts see the VMware announcements around container migration as counterpunching the competitive efforts of IBM-Red Hat and others to gain a firm grasp of the management software piece of both the cloud and on-premises applications.

“If Red Hat succeeded in commoditizing the enterprise OS space, making RHEL and Windows Server the two de facto standards, then the next layer to be commoditized is the control plane, which I still believe to be the PaaS layer,” said Geoff Woollacott, senior strategy consultant and principal analyst at Technology Business Review. “Right now, the main rivals for that are VMware with this announcement, Azure and OpenShift.”

The U.S. Air Force is in the midst of evaluating multiple Kubernetes distributions and management tools, including Red Hat OpenShift, Rancher and a beta version of Tanzu Kubernetes Grid. The various IT teams within the military branch can use whichever Kubernetes platform they choose. For the Air Force’s purposes, the latest Red Hat OpenShift versions will beat VMware to the punch in disconnected and Kubernetes edge environments, along with real-time operating system support that the Air Force will use in F-16 fighter jets. The Air Force will also wait until all of VMware’s Tanzu product line becomes generally available before it commits to using it, and it will carefully watch how VMware brings together its new business units and their products.

“VMware is checking all the boxes, but details matter,” said Nicolas Chaillan, the Air Force’s chief software officer, and co-lead for the Enterprise DevSecOps Initiative in the office of the Department of Defense CIO. “With mergers, there are always people leaving, conflicts, and you never know what’s going to happen.”

However, VMware retains its lead in server virtualization, and the Kubernetes IP and expertise the company has assembled with its Heptio acquisition and Pivotal merger can’t be overlooked, Chaillan added.


“The vSphere piece, and the ability to tie that back to Kubernetes, is very interesting, and that alone could win the market,” he said. “A lot of companies in finance and healthcare still need a virtualization stack on premises, and otherwise would have to use Google Anthos, Azure Stack or Amazon Outposts — or they could go through vSphere, and have a single company that brings [them] the whole thing.”

Redesigning the crown jewels

vSphere 7.0, formerly called Project Pacific, has been significantly redesigned, according to Krishna Prasad, vice president and general manager of VMware’s Cloud Platform Business. A large part of that redesign was to tightly integrate Kubernetes into vSphere. One advantage for corporate users is that when they stand up a cluster based on the company’s ESXi virtualization layer, it becomes a Kubernetes cluster managed alongside the company’s vCenter control plane, Prasad said.

“When we started rearchitecting, it wasn’t driven by the need to accommodate Kubernetes workloads — that was just one of the driving factors,” Prasad said. “We realized it [Kubernetes] was a foundational piece we could bring into vSphere at the platform level that would enhance the platform itself. It would make the platform more modern like Kubernetes itself.”

Another important consideration for the redesign was a direct response to what the company’s core customers were asking for: to be able to deliver their infrastructure to their developers through a cloud consumption model.

“They want and we want to deliver infrastructure completely as code,” Prasad said.

To this end, VMware also unveiled an improved version of NSX-T that now offers full-stack networking and security services that connect and protect both VMs and containers.

“With the enhancements to NSX-T, as you deploy Kubernetes workloads it automates everything right through to the Kubernetes UI,” Prasad said. “This is about writing infrastructure as code and automating the whole deployment instead of bringing in your own components. We think it is a critical part of delivering Kubernetes with full automation.”

Senior News Writer Beth Pariseau contributed to this report.


Nvidia scoops up object storage startup SwiftStack

Nvidia plans to acquire object storage vendor SwiftStack to help its customers accelerate their artificial intelligence, high-performance computing and data analytics workloads.

The GPU vendor, based in Santa Clara, Calif., will not sell SwiftStack software but will use SwiftStack’s 1space as part of its internal artificial intelligence (AI) stack. It will also enable customers to use the SwiftStack software as part of their AI stacks, according to Nvidia’s head of enterprise computing, Manuvir Das.

SwiftStack and Nvidia disclosed the acquisition today. They did not reveal the purchase price but said they expect the deal to close within weeks.

Nvidia previously worked with SwiftStack

Nvidia worked with San Francisco-based SwiftStack for more than 18 months on tackling the data challenges associated with running AI applications at a massive scale. Nvidia found 1space particularly helpful. SwiftStack introduced 1space in 2018 to accelerate data access across public and private clouds through a single object namespace.

“Simply put, it’s a way of placing the right data in the right place at the right time, so that when the GPU is busy, the data can be sent to it quickly,” Das said.

Das said Nvidia customers would be able to use enterprise storage from any vendor. The SwiftStack 1space technology will form the “storage orchestration layer” that sits between the compute and the storage to properly place the data so the AI stack runs optimally, Das said.
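
“Right data, right place, right time” amounts to staging data onto fast storage ahead of the GPUs that will consume it. A toy sketch of that idea only; 1space itself presents a single namespace across stores rather than copying through an explicit loop like this:

```python
def stage_for_training(manifest, cold_store, fast_cache):
    """Warm the fast tier with the objects a training job will read,
    so GPUs are not left idle waiting on the cold tier."""
    for key in manifest:
        if key not in fast_cache:
            fast_cache[key] = cold_store[key]  # prefetch before the job starts
    return [fast_cache[key] for key in manifest]

# stage_for_training(manifest, cold_store=object_store, fast_cache=nvme_cache)
# (hypothetical store objects exposing dict-style access)
```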

“We are not a storage vendor. We do not intend to be a storage vendor. We’re not in the business of selling storage in any form,” Das said. “We work very closely with our storage partners. This acquisition is designed to further the integration between different storage technologies and the work we do for AI.”


Nvidia partners with storage vendors such as Pure Storage, NetApp, Dell EMC and IBM. The storage vendors integrate Nvidia GPUs into their arrays or sell the GPUs along with their storage in reference architectures.

Nvidia attracted to open source tech

Das said Nvidia found SwiftStack attractive because its software is based on open source technology. SwiftStack’s eponymous object- and file-based storage and data management software is rooted in open source OpenStack Swift. Das said Nvidia plans to continue to work with the SwiftStack team to advance and optimize the technology and make it available through open source avenues.

“The SwiftStack team is part of Nvidia now,” he said. “They’re super talented. So, the innovation will continue to happen, and all that innovation will be upstreamed into the open source SwiftStack. It will be available to anybody.”


SwiftStack laid off an undisclosed number of sales and marketing employees in late 2019, but kept the engineering and support team intact, according to president Joe Arnold. He attributed the layoffs to a shift in sales focus from classic backup and archiving to AI, machine learning and data analytics use cases.

The SwiftStack 7.0 software update that emerged late last year took aim at analytics, HPC, AI and ML use cases, such as autonomous vehicle applications that feed data to GPU-based servers. SwiftStack said at the time that it had worked with customers to design clusters that could scale to handle multiple petabytes of data and support throughput in excess of 100 GB per second.

Das said Nvidia has been using SwiftStack’s object storage technology as well as 1space. He said Nvidia’s internal work on data science and AI applications quickly showed the company that accelerating the compute shifts the bottleneck elsewhere, to the storage. That played a factor in Nvidia’s acquisition of SwiftStack, he noted.

“We recognized a long time ago that the way to help the customers is not just to provide them a GPU or a library, but to help them create the entire stack, all the way from the GPU up to the applications themselves. If you look at Nvidia now, we spend most of our energy on the software for different kinds of AI applications,” Das said.

He said Nvidia would fully support SwiftStack’s customer base. SwiftStack claims around 125 customers. Its product lineup includes SwiftStack’s object storage software, the ProxyFS file system for integrated file and object API access, and 1space. SwiftStack’s software is designed to run on commodity hardware on premises, and its 1space technology can run in the public cloud.

SwiftStack had spent more than eight years expanding its software’s capabilities since the company’s 2011 founding. Das said Nvidia has no reason to sell SwiftStack’s proprietary software because it does not compete head-to-head with other object storage providers.

“Our philosophy here at Nvidia is we are not trying to compete with infrastructure vendors by selling some kind of a stack that competes with other peoples’ stacks,” Das said. “Our goal is simply to make people successful with AI. We think, if that happens, everybody wins, including Nvidia, because we believe GPUs are the best platform for AI.”


SAP S/4HANA migration should be business-driven

As SAP customers contemplate an SAP S/4HANA migration, they have to work through big questions like what infrastructure it will run on and how to handle business processes. One of the keys to a successful S/4HANA migration will be which part of the organization sets the project parameters, IT or business.

SAP expert Ekrem Hatip, senior solution architect at Syntax Systems, advises that because an S/4HANA migration is a fundamentally different project than a normal system upgrade, such as from SAP R/3 to SAP ECC, organizations must approach it differently. In this Q&A, Hatip discusses some of the issues that organizations need to consider as they embark on an S/4HANA journey.

Syntax Systems is based in Montreal and provides managed services for SAP systems, including hosting SAP systems in Syntax Systems data centers and running SAP systems on public cloud provider infrastructures.

How are Syntax customers approaching a possible S/4HANA migration? Is it on their minds?


Ekrem Hatip: Over the last few years we have brought up the S/4HANA topic even if the customer doesn’t show immediate interest in doing that. We discuss with them what S/4HANA is, what are the advantages, and what are the innovations that S/4HANA introduces. We look at the customers’ existing landscape and discuss the possible migration paths from their system to an S/4HANA system. We talk about the business requirements, because an S/4HANA conversion is not a technical upgrade — it’s not a technical conversion from one database to another. It touches every aspect of their business processes, and we want to make sure that customers are aware that it is a sizable project.

Are customers eager to move or are they holding back now?

Hatip: Most customers are happy with what they have right now — with their SAP implementation. It satisfies their current needs and they don’t see an immediate reason to go to S/4HANA other than the fact that SAP has put the 2025 date in front of them [when SAP will end support for SAP ECC]. We can help our customers to understand what is ahead of them.

So educating them on what to expect is the first step of an S/4HANA migration?

Hatip: Absolutely. Most people don’t know much about SAP HANA let alone S/4HANA. Their expectation is, just like when they upgraded from R/3 to ECC, they will go ahead and just upgrade their system over one weekend. Then come Monday morning, they will continue running as they used to run on a shiny new system. We have to make sure that they understand this is not the case. Most of their business processes will be touched and most of the business processes might need to be modified or dropped. I also tell customers — especially if they’re going with a greenfield implementation — to keep their customizations at minimum. Everything seems to be going into cloud and S/4HANA Cloud is out there. So, I tell them if they can limit their customizations, they’ll be able to go to S/4HANA Cloud for the true SaaS experience.

Are customers considering any other systems as an alternative to an S/4HANA migration?

Hatip: For many customers SAP is the core of their business operations, and I haven’t yet seen any customers who are considering going to other solutions than SAP for their core business. So, it’s more likely they’re considering if they want to remain on ECC for as long as they can before moving to S/4HANA. With that said, I have seen that some customers are now considering alternatives to some of the peripheral systems offered by SAP. For example, one customer who was using BOB-J [SAP BusinessObjects BI] for its reporting is now considering using Microsoft Power BI on Azure. Do I know whether this is driven by the fact that they need to go to S/4HANA or not? I don’t know, but my observation is that some customers are considering alternatives for the systems surrounding their core SAP environment.

What are the most critical issues to consider as you make the S/4HANA decision?

Hatip: Unlike previous conversions or upgrades, an S/4HANA conversion is not an IT-driven decision. It should definitely be a business-driven decision. It should come from the top and be presented to the IT department, as opposed to the IT department going back and saying this operating system or that database is going out of support, so we have to upgrade. Therefore, not only should the CIO be convinced to go to S/4HANA, but CEOs and COOs should also be part of the decision-making process. An S/4HANA conversion is a lengthy and fairly complex project, but at the same time it allows customers to clean up their existing legacy systems. Customers can use the opportunity to clean up data and review hardware or server requirements, or they can leverage public cloud offerings when they decide to go to S/4HANA. Finally, CIOs and the IT department should try to keep their customizations at a minimum in order to future-proof their environment.


Cloud consultants set for massive workload shift to cloud

Cloud consultants take heed: Customers are pushing the bulk of their workloads to cloud infrastructure and a significant number are adopting related technologies such as containers.

AllCloud, a cloud and managed service provider based in Tel Aviv, Israel, said 85% of the 150 respondents to its cloud infrastructure survey expect to operate the majority of their workloads in the cloud by the end of 2020. Twenty-four percent of the IT decision-makers polled said they plan to be cloud-only organizations. The respondents work for companies with at least 300 employees and represent a range of industries.

AllCloud’s survey, published Jan. 15, also points to growing acceptance of containers, a trend other cloud consultants view as accelerating. More than 56% of respondents reported at least half of their cloud workloads use containers or microservices.

AllCloud CEO Eran Gil said cloud adoption, as reflected in the survey sample, is further along than he anticipated. He also said the extent of container adoption surprised him.

“It is interesting to see how many organizations are leveraging them,” he said of containers. “It’s far more than I expected to see.”


For cloud consultants, the transition from small-scale, individual workload migrations to more decisive shifts to the cloud may open opportunities for IT modernization.

“We are talking to [customers] about modernizing their infrastructure — not just simply taking what they have on premises and hosting it on AWS or other vendors,” Gil said.

Amid broader cloud adoption, AllCloud plans to expand in North America. The company in 2018 launched operations in North America, acquiring Figur8, a Salesforce partner with offices in San Francisco, Toronto, New York City and Vancouver, B.C. AllCloud is a Salesforce Platinum partner and an AWS Premier Consulting Partner.

“We are focusing on growing North America in particular,” Gil said, noting the company has received a new round of funding to support its expansion. “You will hear us announce acquisitions this year in either one of our ecosystems.”

The funding will also help AllCloud grow organically. Gil said the company plans to hire an AWS practice leader, who will report to Doug Shepard, AllCloud’s general manager for North America. Shepard previously was president of the Google business unit at Cloud Sherpas, a cloud consultancy Gil co-founded in 2008. Accenture acquired Cloud Sherpas in 2015.

Gil said the fundamental drivers of cloud adoption have changed dramatically since the launch of Cloud Sherpas. Back then, he said, cost was the main consideration, and security and reliability concerns were obstacles to acceptance. Security, however, emerged in AllCloud’s survey as the top consideration in cloud selection, followed by reliability. Cost ranked fourth in the list of adoption drivers.

“All the factors 10, 12 years ago that were the deterrents are now the drivers,” Gil said.

New channel hires

  • DevOps lifecycle tool provider GitLab has appointed Michelle Hodges as vice president of global channels. GitLab, which plans to go public this year, said Hodges’ hiring is part of an initiative to ramp up the company’s channel strategy. Hodges joins GitLab from Gigamon, where she served as vice president of worldwide channels.
  • Avaya named William Madison as its vice president of North America cloud sales. Madison’s prior roles included vice president of global channel development and channel chief at Masergy Communications.
  • Managed services automation company BitTitan hired Kirk Swanson as its corporate development associate. Swanson will help BitTitan pursue acquisitions in the enterprise cloud market, targeting companies with SaaS products and relationships with IT service providers and MSPs, the company said. Prior to BitTitan, Swanson served as an associate at investment firm D.A. Davidson & Co.
  • Exclusive Networks, a cloud and cybersecurity distributor, named Christine Banker as vice president of North American sales. Banker will lead vendor recruitment, inside and field sales, and Exclusive’s PC and server business, among other departments and teams, the company said.
  • Anexinet Corp., a digital business solutions provider based in Philadelphia, has appointed Suzanne Lentz as chief marketing officer. She was previously chief marketing officer of Capgemini Invent NA.
  • Workspace-as-a-service vendor CloudJumper named Amie Ray as its enterprise channel sales manager. Ray comes to CloudJumper from PrinterLogic, where she was national channel account manager.

Other news

  • WESCO International Inc. has agreed to acquire distributor Anixter International Inc. for $4.5 billion. WESCO outbid Clayton, Dubilier & Rice LLC. The deal is expected to close in the second or third quarter of 2020. According to Pittsburgh-based WESCO, the combined entity would have revenue of about $17 billion. The pending deal follows Apollo Global Management’s agreement to acquire Tech Data Corp., a distributor based in Tampa, Fla.
  • Lemongrass Consulting, a professional services and managed service provider based in Atlanta, has completed a $10 million Series C round of financing, a move the company said will help it build out its senior leadership team, boost product development, and expand sales and marketing. Rodney Rogers, co-founder and general partner of Blue Lagoon Capital, joins Lemongrass as chairman. Blue Lagoon led the new funding round. Mike Rosenbloom is taking on the group CEO role at Lemongrass. He was formerly managing director of Accenture’s Intelligent Cloud & Infrastructure business. Walter Beek, who has been group CEO at Lemongrass, will stay on with the company as co-founder and chief innovation officer. Lemongrass focuses on SAP applications running on AWS infrastructure.
  • Strategy and revenue are getting a heightened focus among CIOs, according to a Logicalis survey. The London-based IT solutions provider’s poll of 888 global CIOs found 61% of the respondents “spent more time on strategic planning in the last 12 months, while 43% are now being measured on their contribution to revenue growth.” The emphasis on strategy and revenue comes at the expense of innovation. About a third of the CIOs surveyed said the time available to spend on innovation has decreased over the last 12 months.
  • IT infrastructure management vendor Kaseya said it ended 2019 with a valuation exceeding $2 billion. Kaseya added more than 5,000 new customers and had more than $300 million in annual bookings, according to the company. Kaseya noted that the company had an organic growth rate of about 30%.
  • Cybersecurity vendor WatchGuard Technologies updated its FlexPay program with automated, monthly billing for its network security hardware and services. Partners can acquire subscriptions from WatchGuard’s distributor partners in various purchasing models, including one- and three-year contracts and pay-as-you-go terms, WatchGuard said. In the U.S., WatchGuard Subscriptions are available exclusively through the Synnex Stellr online marketplace.
  • Copper, which provides CRM for G Suite, rolled out its 2020 Partner Ambassador Program. The referral program has four partner tiers with incremental incentives, marketing resources, and training and certifications.
  • GTT Communications Inc., a cloud networking provider based in McLean, Va., has added Fortinet Secure SD-WAN to its SD-WAN service offering.
  • EditShare, a storage vendor that specializes in media creation and management, signed Key Code Media to its channel program. Key Code Media is an A/V, broadcast and post-production reseller and systems integrator.
  • Accenture opened an intelligent operation center in St. Catharines, Ont., as a hub for its intelligent sales and customer operations business. Accenture said the location is the company’s third intelligent operations center in Canada and its second in the Niagara region.

Market Share is a news roundup published every Friday.


New Oracle Enterprise Manager release advances hybrid cloud

In a bid to meet customers’ needs for hybrid cloud deployments, Oracle has injected its Oracle Enterprise Manager system with new capabilities to ease cloud migration and hybrid cloud database management.

The software giant unveiled the new Oracle Enterprise Manager release 13.4 on Wednesday, with general availability expected by the end of the first quarter.

The release includes new analytics features for users to make the most of a single database and optimize performance. Lifecycle automation for databases gets a boost in the new release. The update also provides users with new tools to enable enterprises to migrate from an on-premises database to one in the cloud.

“Managing across hybrid on-prem and public cloud resources can be challenging in terms of planning and executing database migrations,” said Mary Johnston Turner, research vice president for cloud management at IDC. “The new Migration Workbench addresses this need by providing customers with guided support for updating and modernizing across platforms, as appropriate for the customer’s specific requirements.”

Beyond helping with migration, Turner noted that Oracle Enterprise Manager 13.4 supports customer choice by enabling consistent management across Oracle Cloud and traditional on-premises resources, which is a recognition that most enterprises are adopting multi-cloud architectures.

The other key addition in Oracle Enterprise Manager 13.4 is advanced machine learning analytics, Turner noted.

“Prior to this release the analytics capabilities were mostly limited to Oracle Management Cloud SaaS [software as a service] solutions, so adding this capability to Enterprise Manager is significant,” she said.

Oracle Enterprise Manager 13.4 features

Nearly all large Oracle customers use Enterprise Manager already, said Mughees Minhas, vice president of product management at Oracle. He said Oracle doesn’t want to force a new management tool on customers that choose to adopt the cloud, which is why the vendor is increasingly integrating cloud management features with Oracle Enterprise Manager.


As users decide to move data from on-premises deployments to the cloud, it’s rarely just an exercise in moving an application from one environment to another without stopping to redesign the workflow, Minhas said.

The migration tool in the new Enterprise Manager update includes a SQL performance analyzer feature to ensure that database operations remain optimized as they move to the cloud. The tool also includes a compatibility checker to verify that on-premises database applications are compatible with the autonomous versions of the Oracle database that run in the cloud.
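Oracle has not published the analyzer’s internals, but the underlying idea of such a check can be sketched simply: replay a captured SQL workload against both the source and target databases and flag statements that slow down. The following Python sketch is only an illustration of that pattern; the adapter callables and the regression threshold are hypothetical, not Oracle’s implementation.

```python
import time

def compare_sql_performance(statements, run_on_source, run_on_target,
                            regression_factor=1.5):
    """Replay captured SQL on both systems and flag statements that regress.

    statements: SQL texts captured from the production workload.
    run_on_source / run_on_target: callables that execute one statement
    against the on-premises and cloud databases, respectively (hypothetical
    adapters around whatever database driver is in use).
    """
    regressions = []
    for sql in statements:
        start = time.perf_counter()
        run_on_source(sql)
        source_s = time.perf_counter() - start

        start = time.perf_counter()
        run_on_target(sql)
        target_s = time.perf_counter() - start

        # Flag statements that run meaningfully slower on the target.
        if target_s > source_s * regression_factor:
            regressions.append((sql, source_s, target_s))
    return regressions
```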

Migrating to new databases with Enterprise Manager 13.4

Helping organizations migrate to new database versions is one of the key capabilities of the latest version of Oracle Enterprise Manager.

“Normally, you would create a separate test system on-prem where you would install it and then once you’re done with the testing, then you’d upgrade the actual system,” Minhas said. “So we are promoting these use cases to Enterprise Manager through the use of real application testing tools, where we let you create a new database in the cloud to test.”

Intelligent analytics

The new Oracle Enterprise Manager release also benefits from Exadata Warehouse technology, which now enables analytics for Oracle database workloads.

“The goal of a great admin or cloud DBA [database administrator] is that they want to avoid problems before they happen, and not afterwards,” Minhas said. “So we are building analytical capabilities and some algorithms, so they can do some forecasting, so they know limits and are able to take action.”
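Minhas did not detail the algorithms, but forecasting against a known limit can be as simple as fitting a trend line to recent usage and projecting when it crosses the ceiling. The sketch below illustrates that idea with a least-squares fit over daily storage samples; the function and the numbers are hypothetical, not Oracle’s implementation.

```python
def days_until_limit(history_gb, limit_gb):
    """Estimate days until storage hits a limit via a least-squares trend.

    history_gb: daily storage-used samples, oldest first (needs >= 2 points).
    limit_gb: the capacity ceiling to forecast against.
    Returns estimated days from the last sample, or None if usage is flat
    or shrinking (no growth trend to project).
    """
    n = len(history_gb)
    x_mean = (n - 1) / 2
    y_mean = sum(history_gb) / n
    slope = sum((x - x_mean) * (y - y_mean)
                for x, y in enumerate(history_gb)) / \
            sum((x - x_mean) ** 2 for x in range(n))
    if slope <= 0:
        return None
    return (limit_gb - history_gb[-1]) / slope

# Example: ~2 GB/day growth against a 500 GB ceiling -> roughly 45 days
print(days_until_limit([400, 402, 404, 406, 408, 410], limit_gb=500))
```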

Minhas said hybrid management will continue to be Oracle’s focus for Oracle Enterprise Manager.

“Over time, you’ll see us doing more use cases where we also let you do the same thing you’re doing on premises in the cloud, using the same APIs users are already familiar with,” Minhas said.

Public cloud vendors launch faulty services as race heats up

The public cloud services arena has turned a corner, introducing new challenges for customers, according to the latest edition of “Technology Radar,” a biannual report by global software consultancy ThoughtWorks. Competition has heated up, so top public cloud vendors are creating new cloud services at a fast clip. But in their rush to market, those vendors can roll out flawed services, which opens the door for resellers to help clients evaluate cloud options.

Public cloud has become a widely deployed technology, overcoming much of the resistance it had seen in the past. “Fears about items like security and sovereignty have been calmed,” noted Scott Shaw, director of technology for the Asia-Pacific region at ThoughtWorks. “Regulators have become more comfortable with the technology, so cloud interest has been turning into adoption.”

The cloud market shifts

With the sales of public cloud services rising, competition has intensified. Initially, Amazon Web Services dominated the market, but recently Microsoft Azure and Google Cloud Platform have been gaining traction among enterprise customers.


One ripple effect is that the major public cloud providers have been trying to rapidly roll out differentiating new services. However, in their haste to keep pace, they can deliver services with rough edges and incomplete feature sets, according to ThoughtWorks.

Customers can get caught in this quicksand. “Corporations adopting public cloud have not had as much success as they had hoped for,” Shaw said.

Businesses try to deploy public cloud services based on the promised functionality but frequently hit roadblocks during implementations. “The emphasis on speed and product proliferation, through either acquisition or hastily created services, often results not merely in bugs but also in poor documentation, difficult automation and incomplete integration with vendors’ own parts,” the report noted.

[Chart: top public cloud vendors by global market share, 2019.]

Testing is required

ThoughtWorks recommended that organizations not assume all public cloud vendors’ services are of equal quality. They need to test out key capabilities and be open to alternatives, such as open source options and multi-cloud strategies.
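As a concrete illustration of that advice, a team might wrap the same basic operation in a vendor-neutral harness and run it against each candidate service. Everything in the sketch below is hypothetical: the adapter callables stand in for whichever SDKs are under evaluation, and none of the names refer to a real API.

```python
import time

def smoke_test(providers, upload, download, payload=b"x" * 1_000_000):
    """Time a basic store/retrieve round trip on each candidate service.

    providers: mapping of provider name -> client object (any SDK).
    upload / download: callables adapting each client to a common interface.
    Returns per-provider round-trip seconds, or the error that occurred.
    """
    results = {}
    for name, client in providers.items():
        try:
            start = time.perf_counter()
            upload(client, "smoke-test-object", payload)
            assert download(client, "smoke-test-object") == payload
            results[name] = time.perf_counter() - start
        except Exception as exc:
            # A failing or incomplete service is itself a useful finding.
            results[name] = repr(exc)
    return results
```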

Resellers can act as advisors to help customers make the right decisions as they consider new public cloud services, pointing out the strengths and flaws in individual cloud options, Shaw said.

To serve as advisors, however, resellers need in-depth, hands-on experience with the cloud services. “Channel partners cannot simply rely on a feature checklist,” Shaw explained. “To be successful, they need to have worked with the service and understand how it operates in practice and not just in theory.”

Cradlepoint NetCloud update flags unnecessary data usage

Cradlepoint has introduced technology that helps customers control costs by flagging unusual increases in data use across the wireless links managed by the vendor’s software-defined WAN.

This week, the vendor unveiled the latest analytics in its cloud-based Cradlepoint NetCloud management platform. Cradlepoint is aiming the technology at retailers, government agencies and enterprises that have widely distributed operations. Those organizations typically run WANs that depend on 4G and other wireless links.

The latest algorithms learn patterns of data usage from historical data gathered across a company’s wireless links, the vendor said. Cradlepoint NetCloud notifies network managers when usage deviates from those patterns.

The feature provides early notification of usage surges unrelated to normal business operations, such as video streaming by employees or traffic from misconfigured networking gear.
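Cradlepoint has not disclosed its algorithms, but deviation detection of this kind is commonly built on a rolling baseline: compare each period’s usage with the trailing history and flag large departures. The sketch below is a minimal illustration using a z-score threshold; the function, window and figures are hypothetical, not Cradlepoint’s implementation.

```python
from statistics import mean, stdev

def flag_usage_anomalies(daily_usage_mb, window=14, z_threshold=3.0):
    """Flag days whose data usage deviates sharply from the trailing baseline.

    daily_usage_mb: per-day usage totals for one wireless link.
    window: number of trailing days used as the baseline.
    z_threshold: standard deviations above baseline that count as unusual.
    """
    alerts = []
    for day in range(window, len(daily_usage_mb)):
        baseline = daily_usage_mb[day - window:day]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (daily_usage_mb[day] - mu) / sigma > z_threshold:
            alerts.append((day, daily_usage_mb[day]))
    return alerts

# Example: a sudden spike, such as employee video streaming, stands out.
usage = [510, 495, 530, 502, 488, 515, 507, 493, 520, 511,
         498, 505, 516, 509, 2250]
print(flag_usage_anomalies(usage))  # -> [(14, 2250)]
```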

Cradlepoint pitches itself as particularly useful to retailers. The company claims that 75% of the top retailers globally use its technology. Customers include David’s Bridal, which sells wedding dresses through 330 stores in North America and the United Kingdom. Another sizable retail customer is the jewelry manufacturer Pandora, which distributes its products through stores in more than 100 countries.

Companies outside of retail also use Cradlepoint technology. DSC Dredge LLC uses Cradlepoint for managing 4G LTE, 4G and 3G connectivity across its fleet of dredging machines. The company supplies the equipment in more than 40 countries for use in constructing dams and improving waterway drainage and navigability.

DSC has equipped each of its dredges with a Cradlepoint router and oversees the technology through the NetCloud management software.

Cradlepoint sells subscription-based packages that converge multiple network services on a single edge router. A bundle, for example, could include a router with Ethernet ports, Wi-Fi support with a guest portal, and LTE integration.

Cradlepoint sells subscriptions on a one-, three- or five-year basis.
