Tag Archives: Cloud

Zendesk Relater primes customers for remote call center work

Zendesk, the cloud vendor that made its name with its Support Suite customer service platform for SMBs, is moving into CRM. But during the coronavirus crisis, the company quickly moved its own operations to at-home virtual work as it supports its 150,000 users, many of whom are launching remote call centers amid spikes in customer service interactions.

“Even companies that are already flexible and using Zendesk are experiencing dramatic increases in their volumes, because a lot of people are trying to work remote right now,” said Colleen Berube, Zendesk CIO. “We have a piece of our business where we are having to help companies scale up their abilities to handle this shift in working.”

Even though the vendor supported some remote work before the coronavirus work-from-home orders hit, immediately rolling out work-from-home across Zendesk’s entire organization wasn’t straightforward because of laptop shortages. Like many companies, Zendesk needed a culture shift to move its entire operation to telecommuting, including new policies that allow workers to expense some purchases for home-office workstations.

“We don’t have any intention of recreating the entire workplace at home, but we wanted to give them enough so they could be productive,” Berube said.

Zendesk CEO Mikkel Svane delivers the Zendesk Relater user conference keynote from his home Tuesday.

Among Zendesk’s prominent midmarket customers so far are travel and hospitality support desks “dealing with unprecedented volumes of cancellations and refunds,” as well as companies assisting remote workforces shipping hardware to their employees, said Zendesk founder and CEO Mikkel Svane at the Zendesk Relater virtual user conference Tuesday.

“Using channels like chat have helped these customers keep up with this volume,” Svane said.

Zendesk has seen interest and network use in general grow among customers who need to bring remote call centers online during shelter-in-place orders from local and state governments. Easing the transition for users and their customers, Berube said, are self-service chatbots that Zendesk has developed over the last few years. She added that she’s seen Zendesk’s own AnswerBot keep tickets manageable on its internal help desk, which services remote employees as well as partners.

During Relater, Zendesk President of Products Adrian McDermott said that Zendesk’s AI-powered bots have saved users 600,000 agent hours by enabling customer self-service, adding that the number of Zendesk customers using AI for customer support increased more than 90% over the last year. He said the company is betting big on self-service becoming the vast majority of customer service.


“Self-service is going to be everywhere,” McDermott said. “It’s not just going to a knowledge base and reading the knowledge base … but it’s about the user being at the center of the conversation and controlling the conversation.”

While some larger cloud customer experience software vendors such as Oracle, Salesforce and Google canceled even the virtual conferences planned in lieu of live user events, Zendesk assembled pre-recorded presentations from executives at home and other speakers originally scheduled for its canceled Miami Relate conference, and staged the result as a virtual user conference renamed “Zendesk Relater.”

Earlier this month, Zendesk released upgrades to its Sunshine CRM and Support Suite platforms. At Relater, the company announced a partnership with Tata Consultancy Services to implement Zendesk CRM at large enterprises.

Zendesk has the reputation of being a customer service product tuned for B2C companies, specializing in quick interactions. Its CRM system also has potential to serve that market, said Kate Leggett, Forrester Research analyst. Whether that will translate to enterprises and gain traction in the B2B market remains to be seen.

“It’s very different from the complex products that Microsoft and Salesforce have for that long-running sales interaction, with many people on the seller side and many people on the buyer side,” Leggett said.

Go to Original Article
Author:

Workday salary bonus may pay workforce dividends

Workday Inc. is giving the majority of its employees a two-week pay bonus in response to the coronavirus pandemic. The HCM cloud vendor announced its plan this week.

The company’s action isn’t too different from the government’s idea of sending every American a check. It’s providing aid to help employees cover the costs associated with this crisis.

The Workday salary bonus, a one-time payment, will cost the company about $80 million and is meant to help employees with “any unforeseen costs and needs at their discretion,” the HR vendor said. Workday employs about 12,000 people.

It comes off as altruistic, but the Workday salary bonus is also a strategic business move, said Atta Tarki, founder and CEO of executive-search firm ECA in Santa Monica, Calif. He is also the author of the just-released Evidence-Based Recruiting (McGraw Hill, Feb. 2020).

The Workday salary bonus “builds higher loyalty, which translates to longer retention, more productivity — all those metrics,” Tarki said.

“A lot of companies and leadership teams are going to show their true colors in this environment,” Tarki said. That is not to say that every company going through layoffs is a bad company; it depends on the company’s financial situation, he said.

Firms that can help employees are showing that statements such as “we put our employees first” have weight, Tarki said. “This is an opportunity to step up and show what they mean by that,” he said.

Workday retains employees

The bonus helps the company retain employees, and it can save a substantial amount of money on the cost of replacing lost workers, Tarki said.


If employees believe that they can count on their employers, “that is going to be reciprocated by the employees when things pick up again,” Tarki said.

Employees are going to think hard about making a move to a potentially higher-paying job, Tarki said. They’ll ask themselves: “Do I really want to leave this company that was there for me?”

Facebook is reportedly planning to give $1,000 to each of its 45,000 employees, according to The Information.

But the most significant workforce impact of COVID-19 is unemployment.

As states order restaurants, theaters, gyms and casinos to close and set limits on public gatherings, unemployment claims are rising rapidly.

For instance, Pennsylvania reported more than 50,000 unemployment claims on Monday alone. The Colorado Department of Labor “has seen a surge in unemployment claims from 400 on March 7 to more than 6,800 on March 17,” the agency reported.

The amount of unemployment financial help varies greatly by state. For instance, Florida provides only 12 weeks of unemployment insurance and caps the benefit at $275 a week, according to U.S. Rep. Frederica Wilson (D-Fla.), who is seeking an increase. In contrast, the maximum weekly benefit in Texas is $465 a week, and in Massachusetts, it is $823 per week.


Mendix expands support with private, dedicated cloud options

Low-code platform provider Mendix has its cloud bases covered after introducing new private and dedicated cloud offerings this week.

Mendix for Private Cloud and Mendix Dedicated Cloud join the Mendix Public Cloud service and expand the company’s offerings across public, private and hybrid cloud deployment options.

These tools will help enterprises use low-code development in any cloud environment of their choice, as well as on premises, said Jon Scolamiero, manager of architecture and governance at Mendix.

Mendix is bucking the trend of low-code pure-plays offering their products strictly as PaaS deployments, responding instead to enterprises’ adoption of Kubernetes-based hybrid cloud strategies.


“Increasingly, leading public cloud providers are offering their own versions of low-code and automation technologies to complement their Kubernetes offerings — for instance, Microsoft with Power Automate and Google’s acquisition of AppSheet in January, which poses a competitive threat to Mendix,” said Charlotte Dunlap, an analyst at GlobalData in Santa Cruz, Calif.

Mendix for Private Cloud runs on Kubernetes in any privately configured location or data center. Its target is any enterprise with specialized security, compliance or data integration needs.

Meanwhile, the company said the Mendix Dedicated Cloud is aimed at enterprises that have more than 100 Mendix applications, as well as customers operating in highly regulated environments. This cloud is built, managed and configured by Mendix exclusively for the customer. Mendix Public Cloud is managed and hosted by Mendix, or hosted on the AWS, SAP or IBM public clouds.

Multi-cloud in demand

Some 86% of respondents to a recent Forrester Research survey said they may deploy workloads on hybrid cloud or multi-cloud environments. One-third of that group said they will use private cloud as part of their development strategy. In addition, 23% of respondents planned to deploy on-premises workloads together with public cloud deployment.


Forrester’s survey focused on the general IT landscape and not specifically low-code vendors such as Mendix. But Mendix’s cloud options could appeal to existing customers such as Kermit, a firm based in Hunt Valley, Md., that offers a spend management platform for hospitals to track the amount spent on implantable medical devices. The analytics-based platform helps medical institutions track physician preference items, which account for about 60% of a hospital’s spending for supplies.

Kermit developed its cloud-based platform with Mendix. In fact, the CEO and two of his co-founders found Mendix so approachable that they downloaded a Mendix modeler and were able to construct a running application, said Richard Palarea, CEO and co-founder of Kermit.

“I am very comfortable with the paradigm of having control over the code and wanting to actually hard-code everything, versus having an environment where you can stitch together a business process and go into the app store and grab widgets that you need to bring your idea about,” he said. “That, to me, just has always seemed like a better way of doing things.”

The Kermit analytics platform, which was built by one core developer in nine months, enables hospitals to track and manage contracts, billing and vendor compliance.

The company has primarily offered its platform to hospitals in Maryland, where Kermit manages 40% of the total spending on medical implants, Palarea said. Kermit began taking its platform nationwide at the end of last year.

“Healthcare analytics are a huge game right now,” Palarea said. “Every CIO in hospitals these days is looking at these kinds of tools to manage the business better because lower reimbursements from both Medicare primarily and also third-party insurers mean the payment that they get for their surgeries are going down.”

While it is too early for Kermit to consider using these new editions of the Mendix platform, company officials said that once use of the Kermit offering goes nationwide, they might take another look.


NetApp digs into Talon Storage for Cloud OnTap

The NetApp cloud ecosystem gained some muscle for shared file services with the acquisition of software-defined storage vendor Talon Storage.

NetApp said the Talon Storage Fast software suite is already integrated in NetApp Cloud OnTap as part of an ongoing partnership. Talon provides global file caching and syncs data locally from public clouds to a company’s remote branch offices. The NetApp integration does not include local OnTap clusters.

The NetApp cloud file storage is available with two products. NetApp Cloud Volumes OnTap allows a storage administrator to run NetApp storage in the public cloud in the same manner as on premises. Customers also may run NetApp file storage as a service in Amazon Web Services, Google Cloud Platform and Microsoft Azure.

Those NetApp cloud offerings have direct connectivity and integration with Talon, said Anthony Lye, a NetApp senior vice president and GM of its cloud business unit.

Talon Storage is designed to mitigate the performance penalties that occur when shared cloud is used for local processing. The Talon technology runs as a virtual machine. Ctera Networks, Nasuni and Panzura are Talon’s chief competitors. NetApp did not disclose the purchase price, or how many Talon employees will join NetApp.

Cloud storage works well with cloud-native applications, but performance breaks down as more and more data is consumed locally. Major cloud providers are addressing the issue with products that place storage closer to on-premises users, such as AWS Outposts and Google Anthos.

“Talon is a nice acquisition for NetApp. Small player, but good technology,” said Steve McDowell, a senior analyst for storage and data center technologies at Moor Insights & Strategy, based in Austin, Texas.

Support for Talon Storage customers

Lye said Talon gives NetApp new tools to help customers shift more data workloads to the cloud. He added that it lowers the cost for organizations that need to sync data between multiple locations.

 “The only way you could get low response times historically was to sync the files between all the different remote offices and branch offices. Talon uses memory on the remote machine and has an intelligence to place frequently accessed files into the memory. The files are cached remotely, even though the source of truth still exists on the back-end file server,” Lye said.
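The caching pattern Lye describes, serving frequently accessed files from local memory while the back-end file server stays the source of truth, can be sketched as a small LRU cache. Everything below (the class name, the dict standing in for the file server) is illustrative only, not Talon’s actual implementation:

```python
from collections import OrderedDict

class EdgeFileCache:
    """Minimal LRU-style read cache: hot files are served from local
    memory while the back-end file server remains the source of truth."""

    def __init__(self, backend_fetch, capacity=2):
        self.backend_fetch = backend_fetch  # callable: path -> bytes
        self.capacity = capacity
        self.cache = OrderedDict()          # path -> file contents
        self.misses = 0                     # round trips to the file server

    def read(self, path):
        if path in self.cache:
            self.cache.move_to_end(path)    # mark as recently used
            return self.cache[path]
        self.misses += 1
        data = self.backend_fetch(path)     # fetch from the file server
        self.cache[path] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # evict least recently used
        return data
```

Repeated reads of the same file cost only one trip to the back end; cold or evicted files trigger a new fetch, which is where the latency savings for remote offices come from.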

NetApp has been exposing functionality to third-party vendors via RESTful APIs, which Lye said helped speed up the Talon integration. He said NetApp has gained more than 1,000 Talon Storage customers. He said NetApp will support Talon customers regardless of their back-end storage.

“We are promoting aggressively to new prospects a bundle that would include Talon Storage and NetApp Cloud Volumes, at a significantly lower TCO,” Lye said.

Talon Storage is the fifth NetApp cloud software acquisition since 2017. The company added compliance software vendor Cognigo last year and Kubernetes specialist Stackpoint.io in 2018, one year after landing memory startup Plexistor and cloud orchestration player GreencloudIQ.


Developers vie for Oracle Cloud Infrastructure certs

Oracle hopes a new developer certification program can drive more interest in its cloud platform, which is well behind the likes of AWS, Microsoft and Google in terms of market share.

To this end, Oracle has introduced a new Developer Associate certification for Oracle Cloud Infrastructure, to help developers familiarize themselves with the platform and ultimately build modern enterprise applications there. Oracle now offers five distinct certifications for architects, operators and developers on OCI.

“We absolutely expect this certification will help to grow the number of developers in the Oracle cloud ecosystem,” said Bob Quillin, vice president of Oracle cloud developer relations. “This is a real, professional set of skills and we have a broad set of tools on the toolbox to get and keep developers up to speed.”

Oracle has struggled to attract enterprise customers to OCI, the second-generation version of its cloud platform. The company is hoping the certifications will help in that regard by teaching developers how to use the Oracle Cloud Infrastructure service APIs, command-line interface and SDKs to write applications.


“Certifications matter when it comes to building software,” said Holger Mueller, principal analyst at Constellation Research in Cupertino, Calif. “It matters for enterprises to know what they can expect from developers with a certain certification level. And developers want to document their skills level with a certification.”

Passing the certification means that developers have learned OCI architecture, as well as use cases and best practices, Quillin said.

Traditional leading app platform providers such as Oracle have a massive installed base of customers and developers invested in their products who will continue to adopt new cloud-based offerings.


However, “Beyond that pool, developers are less motivated to climb onboard a new brand, which is very costly, so vendors are tailoring new programs and certificates to attract non-Oracle expertise developers and broaden its developer following,” said Charlotte Dunlap, principal analyst at GlobalData in Santa Cruz, Calif.

Quillin said the move to add the new certification was based on developer demand.

“We have thousands of Oracle Cloud Infrastructure certified individuals and are seeing a strong customer demand for Oracle Cloud technical skills,” he said.

This reflects strong momentum among developers who want to build new applications, as well as extend existing applications and data.

“The creation of a certification is usually a sign of maturity for a platform or product,” Mueller said. “This is the case for Oracle’s cloud developer certification, which just launched. As with all certifications, popularity will determine relevance and we will see in a few quarters what the uptake will be.”

Oracle University is offering the new Developer Associate certification at $150. Oracle University is also reducing the price of all Oracle Cloud Infrastructure Associate level certifications — including architect and operations — to $150 and offering the Foundations certification at $95.

In addition, Oracle University has made all its Oracle Cloud Infrastructure learning content available at no charge. Oracle previously charged $2,995 per user subscription.  

The certification exam is 105 minutes long and contains 60 questions. It is currently available only in English.

The developer certification is aimed at developers who have 12 or more months of experience developing and maintaining applications. They should also have an understanding of cloud-native fundamentals and knowledge of at least one programming language.


Windows Server 2008 end of life means it’s time to move

Windows Server 2008 end of life is here, so will you go to the cloud, use containers or carry on with on-premises Windows Server?

Windows Server 2008 and Server 2008 R2 left support recently, giving administrators one last batch of security updates on January Patch Tuesday. Organizations that have not migrated to a supported platform will not get further security fixes for Server 2008/2008 R2 machines unless they enroll in the Extended Security Updates program or move those workloads into the Microsoft cloud; either route buys up to three additional years of security updates. Organizations that choose to continue without support are rolling the dice with machines that now present a liability.

In many instances, a switch to a newer version of Windows Server is not an option. For example, many hospitals run equipment that relies on applications that do not function on a 64-bit operating system, which rules out every currently supported Windows Server OS. In these cases, IT must keep those workloads running but keep them as secure as possible using various methods, such as isolating the machine with a segregated virtual LAN or even pulling the plug by air-gapping those systems.

What works best for your organization is based on many factors, such as cost and the IT department’s level of expertise and comfort level with newer technologies.

For some, a cloudy forecast

The decision to stay on Server 2008/2008 R2 comes with a price. Enrolling in the Extended Security Updates program requires Software Assurance, and the updates cost about 75% of a full Windows Server license each year.

This expense will motivate some organizations to explore ways to reduce those costs and one alternative is to push those Server 2008/2008 R2 workloads into the Azure cloud. This migration will require some adjustment as the capital expense of an on-premises system migrates to the operational expense used with the cloud consumption model.

Mentioning the cloud doesn’t fill IT with as much apprehension as it once did, but the move might require some technological gymnastics when one component, such as the database, needs to stay on premises while the application runs in the cloud.

Some other considerations include increasing the available bandwidth to accommodate the need for lower latency when working with cloud workloads and learning how to patch and do other administrative tasks when the system is in Azure.

Application virtualization is another option

While traditional virtualization is the norm for most Windows shops, there’s a relatively new form of virtualization that is another migration option. Putting a Windows Server 2008/2008 R2 workload into a Docker container might not seem as far-fetched as it did when this technology was in its infancy.

Containers versus VMs
Because each virtual machine uses a guest operating system, VMs use more disk space than a container that shares an underlying operating system.

Microsoft added support for Windows containers on Windows Server 2016 and 2019, as well as the Semi-Annual Channel releases. The migration process puts the legacy application into a container, which then runs on top of a supported Windows Server OS.

Administrators will need to get up to speed with the differences between the two forms of virtualization, and the advantages and disadvantages of migrating a server workload to a containerized application. For example, all the containerized applications run on top of a shared kernel, which might not work in environments with a requirement for kernel isolation for sensitive workloads.

Storage Migration Service eases file server upgrades

Microsoft released Windows Server 2019 with a host of new features, including the Storage Migration Service, which attempts to reduce the friction associated with moving file servers to a newer Windows Server operating system.

One standby for many organizations is the on-premises file server that holds documents, images and other data that employees rely on to do their jobs. The Windows Server 2008 end of life put many in IT in the difficult position of upgrading file servers on this legacy server OS. It’s not as simple as copying all the files over to the new server because there are numerous dependencies associated with stored data that must get carried over and, at the same time, avoid disrupting the business during the migration process.

The Storage Migration Service runs from within Microsoft’s relatively new administrative tool called the Windows Admin Center. The feature is not limited to just shifting to a supported on-premises Windows Server version but will coordinate the move to an Azure File Sync server or a VM that runs in Azure.


Using Azure AD conditional access for tighter security

As is standard with technologies in the cloud, the features in Azure Active Directory are on the move.

The Azure version of Active Directory differs from its on-premises counterpart in many ways, including its exposure to the internet. There are ways to protect your environment, but those protections aren’t in place by default. Here are two changes you should make to protect your Azure AD environment.

Block legacy authentication

Modern authentication is Microsoft’s term for a set of rules and requirements on how systems communicate and authenticate with Azure AD. It brings several security benefits, but it is not enforced by default on an Azure AD tenant.

Legacy authentication is used in many types of attacks against Azure AD-based accounts. If you block legacy authentication, you will block those attacks, but there’s a chance you’ll also block users trying to perform legitimate tasks.

This is where Azure AD conditional access can help. Instead of a simple off switch for legacy authentication, you can create one or more policies — a set of rules — that dictate what is and isn’t allowed under certain scenarios.

You can start by creating an Azure AD conditional access policy that requires modern authentication and blocks any sign-in attempt that doesn’t use it. Microsoft recently added a “report only” option to conditional access policies, which you should enable and leave on for a few days after deployment. It will show you which users are still using legacy authentication, so you can remediate them before you enforce the policy for real and avoid stopping users from doing their jobs.
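Under the hood, a conditional access policy is just a JSON document sent to the Microsoft Graph API. As a rough sketch (the field names follow the Graph conditional access schema as I understand it; verify them against current documentation before relying on them), a report-only policy that blocks legacy authentication might be shaped like this:

```python
def legacy_auth_block_policy(name="Block legacy authentication", report_only=True):
    """Build an illustrative conditional access policy payload.
    Field names are assumed from the Microsoft Graph conditional access
    schema; check the current Graph docs before POSTing this for real."""
    return {
        "displayName": name,
        # Report-only logs what *would* be blocked without enforcing it.
        "state": "enabledForReportingButNotEnforced" if report_only else "enabled",
        "conditions": {
            "users": {"includeUsers": ["All"]},
            "applications": {"includeApplications": ["All"]},
            # Legacy protocols surface as these client app types.
            "clientAppTypes": ["exchangeActiveSync", "other"],
        },
        # Block sign-ins that match the conditions above.
        "grantControls": {"operator": "OR", "builtInControls": ["block"]},
    }
```

Flipping `report_only` to `False` is the enforcement step you take only after the report-only phase shows no legitimate users would be cut off.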

However, this change will severely limit mobile phone email applications. The only ones officially supported with modern authentication are Outlook for iOS and Android, and Apple iOS Mail.

Implement multifactor authentication

This sounds like an obvious one, but there are many ways to do multifactor authentication (MFA). Your Microsoft licensing is one of the factors that dictates your choices. The good news is that options are available to all licensing tiers — including the free one — but the most flexible options come from Azure AD Premium P1 and P2.

With those paid plans, conditional access rules can be a lot nicer than just forcing MFA all the time. For example, you might not require MFA if the user accesses a Microsoft service from an IP address at your office or if the device is Azure AD-joined. You might prefer that both of those scenarios are requirements to avoid MFA while other situations, such as a user seeking access on a PC not owned by the company, will prompt for extra authentication.

MFA doesn’t have to just be SMS-based authentication. Microsoft’s Authenticator App might take a few more steps for someone to set up the first time they register, but it’s much easier to just accept a pop-up on your mobile device as a second factor of authorization, rather than waiting for an SMS, reading the six-digit number, then typing it into your PC.

Without MFA, you’re running a high risk with an internet-exposed authentication system: attackers can try leaked credentials or password-spray attacks until they hit a successful login with a username and password.

The other common attack is credential phishing. This can be particularly successful when the threat actor uses a compromised account to send out phishing emails to the person’s contacts or uses fake forms to harvest the contacts’ credentials, too. These attacks would be mostly harmless if the victims’ accounts required MFA.

Even without MFA, accounts in Azure AD lock out after 10 failed attempts, but only for a minute; the lockout period then gradually increases after further failed attempts. This is a good way to slow down attackers, and it’s also smart enough to block only the attacker and keep your user working away. But the attacker can simply move on to the next account and come back to the previous one later, eventually hitting a correct password.
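That escalating lockout is easy to picture in code. The sketch below is illustrative only: the 10-attempt threshold and one-minute starting point come from the behavior described above, but the doubling schedule is an assumption, since Azure AD doesn’t publish its exact smart lockout timings:

```python
import time

class SmartLockout:
    """Sketch of an escalating lockout: after `threshold` consecutive
    failures the account locks, and each subsequent lockout lasts longer.
    The doubling schedule is an assumption for illustration."""

    def __init__(self, threshold=10, base_lockout=60):
        self.threshold = threshold        # failed attempts before locking
        self.base_lockout = base_lockout  # first lockout length, seconds
        self.failures = 0
        self.lockouts = 0
        self.locked_until = 0.0

    def lockout_seconds(self):
        # 60s, 120s, 240s, ... for successive lockouts
        return self.base_lockout * (2 ** max(self.lockouts - 1, 0))

    def register_failure(self, now=None):
        now = time.time() if now is None else now
        self.failures += 1
        if self.failures >= self.threshold:
            self.failures = 0
            self.lockouts += 1
            self.locked_until = now + self.lockout_seconds()

    def is_locked(self, now=None):
        now = time.time() if now is None else now
        return now < self.locked_until
```

The point of the escalation is that a patient attacker rotating across many accounts is barely slowed per account, which is why lockout alone is no substitute for MFA.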

Azure AD conditional access changes are coming

The above recommendations can be enabled through four conditional access baseline policies, which should be visible in all Azure AD tenants (still in preview), but it appears these will be removed in the future.

Microsoft plans to replace the baseline protection policies with security defaults.

The policies will be replaced by a single option called Security Defaults, found under the Manage > Properties section of Azure AD. The baseline policies let you be somewhat granular about which security features you enabled; to keep that flexibility, you’ll need Azure AD Premium once the baseline policies go away.

Turning on Security Defaults in your Azure AD tenant will:

  • force administrators to use MFA;
  • force privileged actions, such as using Azure PowerShell, to use MFA;
  • force all users to register for MFA within 14 days; and
  • block legacy authentication for all users.

I suspect the uptake wasn’t high enough, which is why Microsoft is moving to a single toggle option to enable these recommendations. I’d also hazard a guess that Microsoft will make this option on by default for new tenants in the future, but there’s no need to wait: if you don’t have these protections on, you should work on enabling them as soon as you can.


Google Cloud security gets boost with Secret Manager

Google has added a new managed service called Secret Manager to its cloud platform amid a climate increasingly marked by high-profile data breaches and exposures.

Secret Manager, now in beta, builds on existing Google Cloud security services by providing a central place to store and manage sensitive data such as API keys or passwords.

The system employs the principle of least privilege, meaning only a project’s owners can look at secrets without explicitly granted permissions, Google said in a blog post. Secret Manager works in conjunction with the Cloud Audit Logging service to create access audit trails. These data sets can then be moved into anomaly detection systems to check for breaches and other abnormalities.

All data is encrypted in transit and at rest with AES-256 encryption keys. Google plans to add support for customer-managed keys later on, according to the blog.


Google Cloud customers have been able to manage sensitive data prior to now with Berglas, an open source project that runs from the command line, whereas Secret Manager adds a layer of abstraction through a set of APIs.

Berglas can be used on its own going forward, as well as directly through Secret Manager beginning with the recently released 0.5.0 version, Google said. Google also offers a migration tool for moving sensitive data out of Berglas and into Secret Manager.

Secret Manager builds on the existing Google Cloud security lineup, which also includes Key Management Service, Cloud Security Command Center and VPC Service Controls.

With Secret Manager, Google has introduced its own take on products such as HashiCorp Vault and AWS Secrets Manager, said Scott Piper, an AWS security consultant at Summit Route in Salt Lake City.


A key management service is used to keep an encryption key and perform encryption operations, Piper said. “So, you send them data, and they encrypt them. A secrets manager, on the other hand, is really no different than a database, but just with more audit logs and access checking. You request a piece of data from it — such as your database password — and it returns it back to you. The purpose of these solutions is to avoid keeping secrets in code.”


Indeed, Google’s Key Management Service and Secret Manager target two different audiences within enterprise IT, said Doug Cahill, an analyst at Enterprise Strategy Group in Milford, Mass.

“The former is focused on managing the lifecycle of data encryption keys, while the latter is focused on securing the secrets employed to securely operate API-driven infrastructure-as-code environments,” Cahill said.

As such, data security and privacy professionals and compliance officers are the likely consumers of a key management offering, whereas secret management services are targeted toward DevOps, Cahill added.

It may seem surprising that the Google Cloud security portfolio didn't already include something like Secret Manager, but AWS only released its own version in mid-2018, Piper said. Microsoft released Azure Key Vault in 2015 and has positioned it as appropriate for managing both encryption keys and other types of sensitive data.

Pricing for Secret Manager during the beta period is calculated two ways: Google charges $0.03 per 10,000 operations, and $0.06 per active secret version per regional replica, per month.
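Under that scheme, a month's bill can be estimated directly from the two published rates. The helper below is a hypothetical sketch using the beta rates quoted above:

```python
def secret_manager_monthly_cost(operations, active_versions, replicas):
    """Estimate Secret Manager's beta-period monthly charge.

    operations      -- total API operations for the month ($0.03 per 10,000)
    active_versions -- active secret versions held during the month
    replicas        -- regional replicas per secret version
                       ($0.06 per version per replica, per month)
    """
    ops_cost = operations / 10_000 * 0.03
    storage_cost = active_versions * replicas * 0.06
    return round(ops_cost + storage_cost, 2)

# 1 million operations plus 20 active versions replicated to 3 regions:
print(secret_manager_monthly_cost(1_000_000, 20, 3))  # -> 6.6
```

At these rates the storage side dominates only for mostly idle secrets; heavy API traffic quickly outweighs the per-version charge.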


Microsoft and Genesys expand partnership to help enterprises seize the power of the cloud for better customer experiences

Genesys Engage on Microsoft Azure is a new trusted and secure cloud offering built to ease the transition to the cloud for large enterprises

Microsoft CEO Satya Nadella (left), and Tony Bates, CEO of Genesys (right)

REDMOND, Wash., and SAN FRANCISCO — Jan. 23, 2020 — Microsoft Corp. and Genesys have expanded their partnership to provide enterprises with a new cloud service for contact centers that enables them to deliver superior interactions for customers. With the omnichannel customer experience solution Genesys Engage™ running on Microsoft Azure, enterprises have the security and scalability they need to manage the complexities involved with connecting every touchpoint throughout the customer journey.

Genesys Engage on Microsoft Azure will be available in late 2020. To accelerate adoption, the companies are providing Genesys Engage on Microsoft Azure through a joint co-selling and go-to-market strategy. Customers will benefit from a streamlined buying process that puts them on a clear path to the cloud.

The power of Genesys Engage on Microsoft Azure

With its multitenant architecture, Genesys Engage on Microsoft Azure gives customers the ability to innovate faster and improve their business agility. In addition, by running the Genesys customer experience solution on this dependable cloud environment, enterprises will be able to maximize their investment in Microsoft Azure through simplified management and maintenance requirements, centralized IT expertise, reduced costs, and more. These solutions make it easier for enterprises to leverage cloud and artificial intelligence (AI) technologies so they can gain deeper insights and provide tailor-made experiences for their customers.

Nemo Verbist, senior vice president of Intelligent Business and Intelligent Workplace at NTT Ltd., one of the top five global technology and services providers for the world’s largest enterprises and a partner of both Microsoft and Genesys, sees great value in the partnership. Verbist said, “Many of our customers have standardized on Microsoft solutions, and Genesys Engage on Microsoft Azure gives them an additional opportunity to take advantage of their investment. Together, these solutions provide enterprises a secure and powerful foundation to communicate with their customers in creative and meaningful ways.”

“Large contact centers receive an exceptionally high volume of inquiries across a growing list of channels and platforms. One of the biggest challenges is connecting the details of every interaction across all channels to ensure each customer has a seamless experience,” said Kate Johnson, president, Microsoft U.S. “By leveraging Microsoft’s Azure cloud and AI technologies, Genesys is helping enterprises create a seamless customer journey with Microsoft’s trusted, secure and scalable platform.”

“We are thrilled to give large enterprises the opportunity to run their mission-critical customer experience platform in the cloud environment they already know and trust — Microsoft Azure,” said Peter Graf, chief strategy officer of Genesys. “Together, we’re making it simpler for even the most complex organizations to transition to the cloud, enabling them to unlock efficiencies and accelerate innovation so they can build deeper connections with customers.”

The companies are also exploring and developing new integrations for Genesys and Microsoft Teams, Microsoft Dynamics 365 and Azure Cognitive Services to streamline collaboration and communications for employees and customers. More information will be released about these upcoming integrations later this year.

Register for the March 4 webinar, Genesys Engage + Microsoft Azure: Transform Your Customer Experience in the Cloud, to learn more.

To learn more about how Genesys and Microsoft are partnering, please visit the Microsoft Transform blog.

About Genesys

Every year, Genesys® delivers more than 70 billion remarkable customer experiences for organizations in over 100 countries. Through the power of the cloud and AI, our technology connects every customer moment across marketing, sales and service on any channel, while also improving employee experiences. Genesys pioneered Experience as a Service℠ so organizations of any size can provide true personalization at scale, interact with empathy, and foster customer trust and loyalty. This is enabled by Genesys Cloud™, an all-in-one solution and the world’s leading public cloud contact center platform, designed for rapid innovation, scalability and flexibility. Visit www.genesys.com.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

©2020 Genesys Telecommunications Laboratories, Inc. All rights reserved. Genesys and the Genesys logo are trademarks and/or registered trademarks of Genesys. All other company names and logos may be registered trademarks or trademarks of their respective companies.

For more information, press only:

Microsoft Media Relations, WE Communications for Microsoft, +1 (425) 638-7777, [email protected]

Shaunna Morgan, Genesys Media Relations, +1 (317) 493-4241, [email protected]

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at https://news.microsoft.com/microsoft-public-relations-contacts.


Cisco Webex Edge for Devices links on-prem endpoints to cloud

Businesses using on-premises video gear from Cisco can now get access to cloud services, while keeping their video infrastructure in place.

A new service, called Cisco Webex Edge for Devices, lets businesses connect on-premises video devices to cloud services like Webex Control Hub and the Webex Assistant. Customers get access to some cloud features but continue to host video traffic on their networks.

Many businesses aren’t ready to move their communications to the cloud. Vendors have responded by developing ways to mix on-premises and cloud technologies. Cisco Webex Edge for Devices is the latest offering of that kind.

“It gives users that cloudlike experience without the businesses having to fully migrate everything to the cloud,” said Zeus Kerravala, principal analyst at ZK Research.

Cisco wants to get as many businesses as possible to go all-in on the cloud. Webex Edge for Devices, introduced this month, tees up customers to make that switch. Companies will have the option of migrating their media services to the cloud after connecting devices to the service.

Webex Edge for Devices is available at no additional charge to businesses with an enterprise-wide Collaboration Flex Plan, a monthly per-user subscription. Alternatively, companies can purchase cloud licenses for the devices they want to register with the service for roughly $30 per device, per month. The service won't work with gear that Cisco no longer supports.

Video hardware linked to the cloud through the service will show up in the Webex Control Hub, a console for managing cloud devices. For on-premises devices, the control hub will provide diagnostic reports, usage data, and insight into whether the systems are online or offline.

Many businesses are already using a mix of on-premises and cloud video endpoints. Webex Edge for Devices will let those customers manage those devices from a single console. In the future, Cisco plans to add support for on-premises phones.

Businesses will also be able to sync on-premises video devices with cloud-based calendars from Microsoft and Google. That configuration will let the devices display a one-click join button for meetings scheduled on those calendars.

Another cloud feature unlocked by Webex Edge for Devices is the Webex Assistant, an AI voice system that lets users join meetings, place calls and query devices by voice.

In the future, Cisco plans to bring more cloud features to on-premises devices. Future services include People Insights, a tool that provides background information on meeting participants with information gleaned from the public internet.

Cisco first released a suite of services branded as Webex Edge in September 2018. The suite included Webex Edge Audio, Webex Edge Connect and Webex Video Mesh. The applications provide ways to use on-premises and cloud technologies in combination to improve the quality of audio and video calls.

Cisco’s release of Webex Edge for Devices underscores its strategy of supporting on-premises customers without forcing them to the cloud, said Irwin Lazar, analyst at Nemertes Research.
