Tag Archives: Azure

Microsoft is expanding the Azure Stack Edge with NVIDIA GPU preview

We’re expanding the Microsoft Azure Stack Edge with NVIDIA T4 Tensor Core GPU preview during the GPU Technology Conference (GTC Digital). Azure Stack Edge is a cloud-managed appliance that brings Azure’s compute, storage, and machine learning capabilities to the edge for fast local analysis and insights. With the included NVIDIA GPU, you can bring hardware acceleration to a diverse set of machine learning (ML) workloads.

What’s new with Azure Stack Edge

At Microsoft Ignite in November 2019, we announced a preview of the NVIDIA GPU version of Azure Stack Edge and we’ve seen incredible interest in the months that followed. Customers in industries including retail, manufacturing, and public safety are using Azure Stack Edge to bring Azure capabilities into the physical world and unlock scenarios such as the real-time processing of video powered by Azure Machine Learning.

These past few months, we’ve taken our customers’ feedback to make key improvements and are excited to make our preview available to even more customers today.

If you’re not already familiar with Azure Stack Edge, here are a few of the benefits:

  • Azure Machine Learning: Build and train your model in the cloud, then deploy it to the edge for FPGA or GPU-accelerated inferencing.
  • Edge Compute: Run IoT, AI, and business applications in containers at your location. Use these to interact with your local systems, or to pre-process your data before it transfers to Azure.
  • Cloud Storage Gateway: Automatically transfer data between the local appliance and your Azure Storage account.  Azure Stack Edge caches the hottest data locally and speaks file and object protocols to your on-prem applications.
  • Azure-managed appliance: Easily order and manage Azure Stack Edge from the Azure Portal.  No initial capex fees; pay as you go, just like any other Azure service.

Enabling our partners to bring you world-class business applications

Equally important to bringing you a great device is enabling our partners to bring you innovative applications to meet your business needs.  We’d love to share some of the continued investment we’re making with partners to bring their exciting developments to you.

As self-checkouts grow in prevalence, Malong Technologies is innovating in AI applications for loss prevention.

“For our customers in the retail industry, artificial intelligence innovation is happening at the edge,” said Matt Scott, co-founder and chief executive officer, Malong Technologies. “Along with our state-of-the-art solutions, our customers need hardware that is powerful, reliable, and custom-tailored for the cloud. Microsoft’s Azure Stack Edge fits the bill perfectly. We’re proud to be a Microsoft Gold Certified Partner, working with Microsoft to help our retail customers succeed.”

Increasing your manufacturing organization’s quality inspection accuracy is key to Mariner’s Spyglass Visual Inspection application.

“Mariner has standardized on Microsoft’s Azure Stack Edge for our Spyglass Visual Inspection and Spyglass Connected Factory products. These solutions are mission critical to our manufacturing customers. Azure Stack Edge provides the performance, stability and availability they require.” – Phil Morris, CEO, Mariner

Building computer vision solutions to improve performance and safety in manufacturing and other industries is a key area of innovation for XXII.

“XXII is thrilled to be a Microsoft partner, and we are working together to provide our clients with real-time video analysis software on the edge with the Azure Stack Edge box. With this solution, Azure allows us to harness the full potential of NVIDIA GPUs directly on the edge and provide our clients in retail, industry, and smart cities with smart video analysis that is easily deployable, scalable, and manageable with Azure Stack Edge.” – Souheil Hanoune, Chief Scientific Officer, XXII

More to come with Azure Stack Edge

There are even more exciting developments with Azure Stack Edge coming. We’re putting the final touches on much-awaited new compute and AI capabilities including virtual machines, Kubernetes clusters, and multi-node support. Along with these new features announced at Ignite 2019, Data Box Edge was renamed Azure Stack Edge to align with the Azure Stack portfolio.

Our Rugged series for sites with harsh or remote environments is also coming this year, including the battery-powered form-factor that can be carried in a backpack. The versatility of these Azure Stack Edge form-factors and cloud-managed capabilities brings cloud intelligence and compute to retail stores, factory floors, hospitals, field operations, disaster zones, and rescue operations.

Get started with the Azure Stack Edge with NVIDIA GPU preview

Thank you for continuing to partner with us as we bring new capabilities to Azure Stack Edge. We’re looking forward to hearing from you.

  • To get started with the preview, please email us and we’ll follow up to learn more about your scenarios.
  • Learn more about Azure Stack Edge.

Learn more about Azure’s Hybrid Strategy

Read about more updates from Azure during NVIDIA’s GTC.

Go to Original Article
Author: Microsoft News Center

What Exactly are Proximity Placement Groups in Azure?

In this blog post, you’ll learn all about Azure Proximity Placement Groups, why they are necessary, and how to use them.

What are Azure Proximity Placement Groups?

Microsoft defines Azure Proximity Placement groups as an Azure Virtual Machine logical grouping capability that you can use to decrease the inter-VM network latency associated with your applications (Microsoft announcement blog post). But what does that actually mean?

When you look at VM placement in Azure and the reduction of latency between VMs, you can place VMs in the same region and the same Availability Zone. With that, they are in the same group of physical datacenters. To be honest, with the growing Azure footprint, these datacenters can still be a few kilometers away from each other.

That distance may impact the latency of your application, and applications that need the lowest possible latency are affected the most. Examples include banking applications for low-latency trading or other financial stock operations.

Proximity Placement Groups place your VMs as physically close together as possible to achieve the lowest latency possible. The following scenarios are eligible for Proximity Placement Groups:

  • Low latency between stand-alone VMs.
  • Low Latency between VMs in a single availability set or a virtual machine scale set.
  • Low latency between stand-alone VMs, VMs in multiple Availability Sets, or multiple scale sets. You can have multiple compute resources in a single placement group to bring together a multi-tiered application.
  • Low latency between multiple application tiers using different hardware types. For example, running the backend using M-series in an availability set and the front end on a D-series instance, in a scale set, in a single proximity placement group.

All VMs must be in a single VNet, as shown in the drawing below:

Virtual Network Scale Set and Availability Set

I wouldn’t suggest running single VMs for production workloads on Azure. Always use a cluster within an Availability Set or a VM Scale Set.

What does that look like in an Azure datacenter environment?

The following drawing shows the placement of a VM without Proximity Groups:

Placement of a VM without Proximity Groups

With Proximity Groups for a single VM, it could look like the following:

Proximity Groups for a single VM

When you use availability sets for your VMs, the distribution can look like the following.

Distribution availability sets for VMs

With that said, let’s learn how to set up a Proximity Placement Group.

How to set up a Proximity Placement Group

Setting up a Proximity Placement Group is pretty easy.

Look for Proximity Placement Groups in the Portal:

Proximity Placement Groups in the Portal

Add a new group:

Create a new Proximity Placement Group

Select Subscription, Resource Group, Region and the name and create the group:

Proximity Placement Group Settings

When you now create a VM, you can select the Proximity Placement Group on the Advanced tab:

Proximity Placement Group advanced settings

There is also the option to use PowerShell to deploy Proximity Placement Groups, as sketched below.
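
Here is a minimal sketch using the Az PowerShell module (assuming you are already signed in with Connect-AzAccount; the resource names, region, and VM size are placeholders, and parameter names follow the Az.Compute cmdlet reference at the time of writing):

```powershell
# Create a resource group and a proximity placement group in the same region
New-AzResourceGroup -Name "rg-ppg-demo" -Location "West Europe"

$ppg = New-AzProximityPlacementGroup `
    -ResourceGroupName "rg-ppg-demo" `
    -Name "ppg-demo" `
    -Location "West Europe"

# Deploy a VM into the proximity placement group.
# New-AzVM prompts for an admin credential and uses a default image.
New-AzVM `
    -ResourceGroupName "rg-ppg-demo" `
    -Name "vm-app-01" `
    -Location "West Europe" `
    -Size "Standard_D4s_v3" `
    -ProximityPlacementGroupId $ppg.Id
```

Availability sets and virtual machine scale sets reference the same proximity placement group ID at creation time, which is how a multi-tiered application ends up co-located.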

Conclusion

The information in this blog post explains Proximity Placement Groups and the ways to use them, but if you’re stuck or there’s something you need further explanation about, let me know in the comments below and I’ll get back to you!

Go to Original Article
Author: Florian Klaffenbach

What Exactly is Azure Dedicated Host?

In this blog post, we’ll become more familiar with a new Azure service called Azure Dedicated Host. Microsoft announced the service in preview some time ago and will make it generally available in the near future.

Microsoft Azure Dedicated Host allows customers to run their virtual machines on a dedicated host not shared with other customers. While in a regular virtual machine scenario different customers or tenants share the same hosts, with Dedicated Host a customer no longer shares the hardware. The picture below illustrates the setup.

Azure Dedicated Hosts

With a Dedicated Host, Microsoft wants to address customer concerns regarding compliance, security, and regulations, which could come up when running on a shared physical server. In the past, there was only one option to get a dedicated host in Azure. The option was to use very large instances like a D64s v3 VM size. These instances were so large that they consumed one host, and the placement of other VMs was not possible.

To be honest, with the improvements in machine placement, larger hosts, and the much better density that comes with them, there was no longer a 100% guarantee that the host remained dedicated. These large instances are also extremely expensive, as you can see in the screenshot from the Azure Pricing Calculator.

Azure price calculator

How to Setup a Dedicated Host in Azure

The setup of a dedicated host is pretty easy. First, you need to create a host group with your preferences for availability, such as Availability Zones and the number of fault domains. You also need to decide on a host region, group name, and so on.

How To Setup A Dedicated Host In Azure

After you have created the host group, you can create a host within the group. In the current preview, only the Dsv3 and Esv3 VM families are available to choose from. Microsoft will add more options soon.

Create dedicated host
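
Both steps can also be scripted. A rough sketch with the Az PowerShell module (resource names, region, zone, and SKU are placeholders; check which host SKUs and zones are available in your subscription):

```powershell
# Step 1: create a host group in one availability zone with a single fault domain
New-AzHostGroup `
    -ResourceGroupName "rg-dedicated-demo" `
    -Name "hostgroup-demo" `
    -Location "East US" `
    -Zone 1 `
    -PlatformFaultDomain 1

# Step 2: create a dedicated host inside the host group
New-AzHost `
    -ResourceGroupName "rg-dedicated-demo" `
    -HostGroupName "hostgroup-demo" `
    -Name "host-demo-01" `
    -Location "East US" `
    -Sku "DSv3-Type1"

# VMs are then pinned to the host at creation time, for example with New-AzVM -HostId
```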

More Details About Pricing

As you can see in the screenshot, Microsoft added the option to use Azure Hybrid Use Benefits for Dedicated Host. That means you can use your on-prem Windows Server and SQL Server licenses with Software Assurance to reduce your costs in Azure.

Azure Hybrid Use Benefits pricing

Azure Dedicated Host also gives you more insight into the host, such as:

  • The underlying hardware infrastructure (host type)
  • Processor brand, capabilities, and more
  • Number of cores
  • Type and size of the Azure Virtual Machines you want to deploy

An Azure customer can control all host-level platform maintenance initiated by Azure, such as OS updates. Azure Dedicated Host gives you a 35-day rolling window in which to schedule when these updates are applied to your host system. During this self-maintenance window, customers can apply maintenance to their hosts at their own convenience.

Looking a bit deeper into the service, Azure becomes more like a traditional hosting provider that gives customers a very dynamic platform.

The following screenshot shows the current pricing for a Dedicated Host.

Azure Dedicated Host pricing details

The following virtual machine types can run on a dedicated host.

Virtual Machines on a Dedicated Host

Currently, there is a soft limit of 3,000 vCPUs for dedicated hosts per region. That limit can be raised by submitting a support ticket.

When Would I Use A Dedicated Host?

In most cases, you would choose a dedicated host for compliance reasons; you may not want to share a host with other customers. Another reason could be that you want a guaranteed CPU architecture and type: if you place your VMs on the same host, they are guaranteed to run on the same architecture.

Further Reading

Microsoft has already published a lot of documentation and blog posts about the topic, so you can deepen your knowledge of Dedicated Host.

Resource #1: Announcement Blog and FAQ 

Resource #2: Product Page 

Resource #3: Introduction Video – Azure Friday “An introduction to Azure Dedicated Hosts | Azure Friday”

Go to Original Article
Author: Florian Klaffenbach

Using Azure AD conditional access for tighter security

As is standard with technologies in the cloud, the features in Azure Active Directory are on the move.

The Azure version of Active Directory differs from its on-premises version in many ways, including its exposure to the internet. There are ways to protect your environment and be safe, but that’s not the case by default. Here are two changes you should make to protect your Azure AD environment.

Block legacy authentication

Modern authentication is Microsoft’s term for a set of rules and requirements on how systems can communicate and authenticate with Azure AD. It brings several security benefits, but it’s not enforced by default on an Azure AD tenant.

Legacy authentication is used for many types of attacks against Azure AD-based accounts. If you block legacy authentication, then you will block those attacks, but there’s a chance you’ll prevent users trying to perform legitimate tasks.

This is where Azure AD conditional access can help. Instead of a simple off switch for legacy authentication, you can create one or more policies — a set of rules — that dictate what is and isn’t allowed under certain scenarios.

You can start by creating an Azure AD conditional access policy that requires modern authentication and blocks sign-in attempts that don’t use it. Microsoft recently added a “report only” option to conditional access policies, which is highly recommended; leave it on for a few days after deployment. This will show you the users still using legacy authentication that you need to remediate before you enforce the policy for real, and helps ensure you don’t stop users from doing their jobs.
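
If you manage policies with PowerShell rather than the portal, a rough sketch of such a report-only policy using the AzureAD (preview) module could look like the following. The cmdlet and type names follow the module’s documented object model at the time of writing and should be verified against the module version you have; the display name is only an example:

```powershell
# Connect to the tenant first: Connect-AzureAD

# Conditions: all users, all cloud apps, legacy client app types only
$conditions = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessConditionSet
$conditions.Applications = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessApplicationCondition
$conditions.Applications.IncludeApplications = "All"
$conditions.Users = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessUserCondition
$conditions.Users.IncludeUsers = "All"
$conditions.ClientAppTypes = @("ExchangeActiveSync", "Other")

# Grant control: block the sign-in
$controls = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessGrantControls
$controls._Operator = "OR"
$controls.BuiltInControls = @("Block")

# Create the policy in report-only mode so you can review the impact first
New-AzureADMSConditionalAccessPolicy `
    -DisplayName "Block legacy authentication (report-only)" `
    -State "enabledForReportingButNotEnforced" `
    -Conditions $conditions `
    -GrantControls $controls
```

After reviewing the report-only sign-in data, switching -State to "enabled" enforces the block.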

However, this change will severely limit mobile phone email applications. The only ones officially supported with modern authentication are Outlook for iOS and Android, and Apple iOS Mail.

Implement multifactor authentication

This sounds like an obvious one, but there are many ways to do multifactor authentication (MFA). Your Microsoft licensing is one of the factors that dictates your choices. The good news is that options are available to all licensing tiers — including the free one — but the most flexible options come from Azure AD Premium P1 and P2.

With those paid plans, conditional access rules can be a lot nicer than just forcing MFA all the time. For example, you might not require MFA if the user accesses a Microsoft service from an IP address at your office or if the device is Azure AD-joined. You might prefer that both of those scenarios are requirements to avoid MFA while other situations, such as a user seeking access on a PC not owned by the company, will prompt for extra authentication.

MFA doesn’t have to just be SMS-based authentication. Microsoft’s Authenticator App might take a few more steps for someone to set up the first time they register, but it’s much easier to just accept a pop-up on your mobile device as a second factor of authorization, rather than waiting for an SMS, reading the six-digit number, then typing it into your PC.

Without MFA, you’re running a high risk: you have an internet-exposed authentication system against which attackers can try leaked credentials or password spray attacks until they hit a successful login with a username and password.

The other common attack is credential phishing. This can be particularly successful when the threat actor uses a compromised account to send out phishing emails to the person’s contacts or use fake forms to get the contact’s credentials, too. This would be mostly harmless if the victim’s account required MFA.

Without MFA, accounts in Azure AD will lock out after 10 failed attempts, but only for a minute, with the lockout time gradually increasing after further failed attempts. This is a good way to slow down the attackers, and it’s also smart enough to block only the attacker and keep your user working away. But the attacker can just move on to the next account and come back to the previous account at a later time, eventually hitting a correct password.

Azure AD conditional access changes are coming

The above recommendations can be enabled by four conditional access baseline policies, which should be visible in all Azure AD tenants (still in preview), but it appears these will be removed in the future.

baseline protection policies
Microsoft plans to replace the baseline protection policies with security defaults

The policies will be replaced by a single option called Security Defaults, found under the Manage > Properties section of Azure AD. The baseline policies let you be a bit more granular about which security features you wanted to enable. To keep that flexibility, you’ll need Azure AD Premium once these baseline policies go.

Turning on Security Defaults in your Azure AD tenant will:

  • force administrators to use MFA;
  • force privileged actions, such as using Azure PowerShell, to use MFA;
  • force all users to register for MFA within 14 days; and
  • block legacy authentication for all users.

I suspect the uptake wasn’t high enough, which is why Microsoft is moving to a single toggle option to enable these recommendations. I would also hazard a guess that Microsoft will turn this option on by default for new tenants in the future, but there’s no need for you to wait. If you don’t have these options on, you should be working on enabling them as soon as you can.

Go to Original Article
Author:

Microsoft and Genesys expand partnership to help enterprises seize the power of the cloud for better customer experiences

Genesys Engage on Microsoft Azure is a new trusted and secure cloud offering built to ease the transition to the cloud for large enterprises

Microsoft CEO Satya Nadella and Tony Bates, CEO of Genesys
Microsoft CEO Satya Nadella (left), and Tony Bates, CEO of Genesys (right)

REDMOND, Wash., and SAN FRANCISCO — Jan. 23, 2020 — Microsoft Corp. and Genesys have expanded their partnership to provide enterprises with a new cloud service for contact centers that enables them to deliver superior interactions for customers. With the omnichannel customer experience solution Genesys Engage™ running on Microsoft Azure, enterprises have the security and scalability they need to manage the complexities involved with connecting every touchpoint throughout the customer journey.

Genesys Engage on Microsoft Azure will be available in late 2020. To accelerate adoption, the companies are providing Genesys Engage on Microsoft Azure through a joint co-selling and go-to-market strategy. Customers will benefit from a streamlined buying process that puts them on a clear path to the cloud.

The power of Genesys Engage on Microsoft Azure

With its multitenant architecture, Genesys Engage on Microsoft Azure gives customers the ability to innovate faster and improve their business agility. In addition, by running the Genesys customer experience solution on this dependable cloud environment, enterprises will be able to maximize their investment in Microsoft Azure through simplified management and maintenance requirements, centralized IT expertise, reduced costs, and more. These solutions make it easier for enterprises to leverage cloud and artificial intelligence (AI) technologies so they can gain deeper insights and provide tailor-made experiences for their customers.

Nemo Verbist, senior vice president of Intelligent Business and Intelligent Workplace at NTT Ltd., one of the top five global technology and services providers for the world’s largest enterprises and a partner of both Microsoft and Genesys, sees great value in the partnership. Verbist said, “Many of our customers have standardized on Microsoft solutions, and Genesys Engage on Microsoft Azure gives them an additional opportunity to take advantage of their investment. Together, these solutions provide enterprises a secure and powerful foundation to communicate with their customers in creative and meaningful ways.”

“Large contact centers receive an exceptionally high volume of inquiries across a growing list of channels and platforms. One of the biggest challenges is connecting the details of every interaction across all channels to ensure each customer has a seamless experience,” said Kate Johnson, president, Microsoft U.S. “By leveraging Microsoft’s Azure cloud and AI technologies, Genesys is helping enterprises create a seamless customer journey with Microsoft’s trusted, secure and scalable platform.”

“We are thrilled to give large enterprises the opportunity to run their mission-critical customer experience platform in the cloud environment they already know and trust — Microsoft Azure,” said Peter Graf, chief strategy officer of Genesys. “Together, we’re making it simpler for even the most complex organizations to transition to the cloud, enabling them to unlock efficiencies and accelerate innovation so they can build deeper connections with customers.”

The companies are also exploring and developing new integrations for Genesys and Microsoft Teams, Microsoft Dynamics 365 and Azure Cognitive Services to streamline collaboration and communications for employees and customers. More information will be released about these upcoming integrations later this year.

Register for the upcoming webinar, Genesys Engage + Microsoft Azure: Transform Your Customer Experience in the Cloud, to learn more on March 4.

To learn more about how Genesys and Microsoft are partnering, please visit the Microsoft Transform blog.

About Genesys

Every year, Genesys® delivers more than 70 billion remarkable customer experiences for organizations in over 100 countries. Through the power of the cloud and AI, our technology connects every customer moment across marketing, sales and service on any channel, while also improving employee experiences. Genesys pioneered Experience as a Service℠ so organizations of any size can provide true personalization at scale, interact with empathy, and foster customer trust and loyalty. This is enabled by Genesys Cloud™, an all-in-one solution and the world’s leading public cloud contact center platform, designed for rapid innovation, scalability and flexibility. Visit www.genesys.com.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

©2020 Genesys Telecommunications Laboratories, Inc. All rights reserved. Genesys and the Genesys logo are trademarks and/or registered trademarks of Genesys. All other company names and logos may be registered trademarks or trademarks of their respective companies.

For more information, press only:

Microsoft Media Relations, WE Communications for Microsoft, +1 (425) 638-7777, [email protected]

Shaunna Morgan, Genesys Media Relations, +1 (317) 493-4241, [email protected]

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at https://news.microsoft.com/microsoft-public-relations-contacts.

Go to Original Article
Author: Microsoft News Center

Using the Windows Admin Center Azure services feature

To drive adoption of its cloud platform, Microsoft is lowering the technical barrier to Azure through the Windows Admin Center management tool.

Microsoft increasingly blurs the lines between on-premises Windows Server operating systems and its cloud platform.

One way the company has done this is by exposing Azure services alongside Windows Server services in the Windows Admin Center. Organizations that might have been reluctant to go through a lengthy deployment process that required PowerShell expertise can use the Windows Admin Center Azure functionality to set up a hybrid arrangement with just a few clicks in some instances.

Azure Backup

One of the Azure services that Windows Server 2019 can use natively is Azure Backup. This cloud service backs up on-premises resources to Azure. This service offers 9,999 recovery points for each instance and is capable of triple redundant storage within a single Azure region by creating three replicas.

Azure Backup can also provide geo-redundant storage, which insulates protected resources against regional disasters.
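
Under the hood, backups land in an Azure Recovery Services vault. As a hedged sketch of the Azure-side resources (the Windows Admin Center wizard can create these for you; the module is Az.RecoveryServices, and the names and region are placeholders):

```powershell
# Create a Recovery Services vault to hold the server's backups
$vault = New-AzRecoveryServicesVault `
    -ResourceGroupName "rg-backup-demo" `
    -Name "rsv-server-backups" `
    -Location "West Europe"

# Opt into geo-redundant storage so backups survive a regional outage
Set-AzRecoveryServicesBackupProperty `
    -Vault $vault `
    -BackupStorageRedundancy GeoRedundant
```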

You access Azure Backup through the Windows Admin Center, as shown in Figure 1. After you register Windows Server with Azure, setting up Azure Backup takes four steps.

Azure Backup setup
Figure 1: The Windows Admin Center walks you through the steps to set up Azure Backup.

Microsoft designed Azure Backup to replace on-premises backup products. Organizations may find that Azure Backup is less expensive than their existing backup system, but the opposite may also be true. The costs vary widely depending on the volume of data, the type of replication and the data retention policy.

Azure Active Directory

Microsoft positions the Windows Admin Center as one of the primary management tools for Windows Server. Because sensitive resources are exposed within the Windows Admin Center console, Microsoft offers a way to add an extra layer of security through Azure Active Directory.

When you enable the requirement for Azure Active Directory security, you will be required to authenticate to both the local machine and Azure Active Directory.

To use Azure Active Directory, you must first register the Windows Server with Azure. You can then require Azure Active Directory authentication by opening the Windows Admin Center, clicking the Settings icon, and then selecting the Access tab. Figure 2 shows a simple toggle switch to turn Azure Active Directory authentication on or off.

Azure Active Directory authentication
Figure 2: The toggle switch in the Windows Admin Center sets up Azure Active Directory authentication.

Azure Site Recovery

Azure Site Recovery replicates machines running on-premises to the Microsoft Azure cloud. If a disaster occurs, you can fail over mission-critical workloads to use the replica VMs in the cloud. Once on-premises functionality returns, you can fail back workloads to your data center. Using the Azure cloud as a recovery site is far more cost-effective than building your own recovery data center, or even using a co-location facility.

Like other Azure services, Azure Site Recovery is exposed through the Windows Admin Center. To use it, the server must be registered with Azure. Although Hyper-V is the preferred hosting platform for use with Azure Site Recovery, the service also supports the replication of VMware VMs. The service also replicates between Azure VMs.

To enable a VM for use with the Azure Site Recovery services, open the Windows Admin Center and click on the Virtual Machines tab. This portion of the console is divided into two separate tabs. A Summary tab details the host’s hardware resource consumption, while the Inventory tab lists the individual VMs on the host.

Click on the Inventory tab and then select the checkbox for the VM you want to replicate to the Azure cloud. You can select multiple VMs and there is also a checkbox above the Name column to select all the VMs on the list. After selecting one or more VMs, click on More, and then choose the Set Up VM Protection option from the drop-down list, shown in Figure 3.

VM protection
Figure 3: To set up replication to Azure with the Azure Site Recovery service, select one or more VMs and then choose the Set Up VM Protection option.

The console will open a window to set up the host with Azure Site Recovery. Select the Azure subscription to use, and create or select a resource group and a Recovery Services vault. You will also need to select a location, as shown in Figure 4.

Azure Site Recovery setup
Figure 4: After you select the VMs to protect in Azure Site Recovery, finalize the process by selecting a location in the Azure cloud.

Storage Migration Service

The Storage Migration Service migrates the contents of existing servers to new physical servers, VMs or to the Azure cloud. This can help organizations reduce costs through workload consolidation.

You access the Storage Migration Service by selecting the Storage Migration Service tab in the Windows Admin Center, which opens a dialog box outlining the storage migration process as shown in Figure 5. The migration involves getting an inventory of your servers, transferring the data from those servers to the new location, and then cutting over to the new server.

Storage Migration Services overview
Figure 5: Microsoft developed Storage Migration Services to ease migrations to new servers, VMs or Azure VMs through a three-step process.

As time goes on, it seems almost inevitable that Microsoft will update the Windows Admin Center to expose even more Azure services. Eventually, this console will likely provide access to all of the native Windows Server services and all services running in Azure.

Go to Original Article
Author:

How to Use Azure Arc for Hybrid Cloud Management and Security

Azure Arc is a new hybrid cloud management option announced by Microsoft in November of 2019. This article serves as a single point of reference for all things Azure Arc.

According to Microsoft’s CEO Satya Nadella, “Azure Arc really marks the beginning of this new era of hybrid computing where there is a control plane built for multi-cloud, multi-edge” (Microsoft Ignite 2019 Keynote at 14:40). That is a strong statement from one of the industry leaders in cloud computing, especially since hybrid cloud computing has already been around for a decade. Essentially Azure Arc allows organizations to use Azure’s management technologies (“control plane”) to centrally administer public cloud resources along with on-premises servers, virtual machines, and containers. Since Microsoft Azure already manages distributed resources at scale, Microsoft is empowering its users to utilize these same features for all of their hardware, including edge servers. All of Azure’s AI, automation, compliance and security best practices are now available to manage all of their distributed cloud resources, and their underlying infrastructure, which is known as “connected machines.” Additionally, several of Azure’s AI and data services can now be deployed on-premises and centrally managed through Azure Arc, enhancing local and offline management and offering greater data sovereignty. Again, this article will provide an overview of the Azure Arc technology and its key capabilities (currently in Public Preview) and will be updated over time.

Video Preview of Azure Arc

Contents

Getting Started with Azure Arc

Azure Services with Azure Arc

Azure Artificial Intelligence (AI) with Azure Arc

Azure Automation with Azure Arc

Azure Cost Management & Billing with Azure Arc

Azure Data Services with Azure Arc

Cloud Availability with Azure Arc

Azure Availability & Resiliency with Azure Arc

Azure Backup & Restore with Azure Arc

Azure Site Recovery & Geo-Replication with Azure Arc

Cloud Management with Azure Arc

Management Tools with Azure Arc

Managing Legacy Hardware with Azure Arc

Offline Management with Azure Arc

Always Up-To-Date Tools with Azure Arc

Cloud Security & Compliance with Azure Arc

Azure Key Vault with Azure Arc

Azure Monitor with Azure Arc

Azure Policy with Azure Arc

Azure Security Center with Azure Arc

Azure Advanced Threat Protection with Azure Arc

Azure Update Management with Azure Arc

Role-Based Access Control (RBAC) with Azure Arc

DevOps and Application Management with Azure Arc

Azure Kubernetes Service (AKS) & Kubernetes App Management with Azure Arc

Other DevOps Tools with Azure Arc

DevOps On-Premises with Azure Arc

Elastic Scalability & Rapid Deployment with Azure Arc

Hybrid Cloud Integration with Azure Arc

Azure Stack Hub with Azure Arc

Azure Stack Edge with Azure Arc

Azure Stack Hyperconverged Infrastructure (HCI) with Azure Arc

Managed Service Providers (MSPs) with Azure Arc

Azure Lighthouse Integration with Azure Arc

Third-Party Integration with Azure Arc

Amazon Web Services (AWS) Integration with Azure Arc

Google Cloud Platform (GCP) Integration with Azure Arc

IBM Kubernetes Service Integration with Azure Arc

Linux VM Integration with Azure Arc

VMware Cloud Solution Integration with Azure Arc

Getting Started with Azure Arc

The Azure Arc public preview was announced in November 2019 at the Microsoft Ignite conference to much fanfare. In its initial release, many fundamental Azure services are supported along with Azure Data Services. Over time, it is expected that a majority of Azure Services will be supported by Azure Arc.

To get started with Azure Arc, check out the following guides and documentation provided by Microsoft.

Additional information will be added once it is made available.

Azure Services with Azure Arc

One of the fundamental benefits of Azure Arc is the ability to bring Azure services to a customer’s own datacenter. In its initial release, Azure Arc includes services for AI, automation, availability, billing, data, DevOps, Kubernetes management, security, and compliance. Over time, additional Azure services will be available through Azure Arc.

Azure Artificial Intelligence (AI) with Azure Arc

Azure Arc leverages Microsoft Azure’s artificial intelligence (AI) services, to power some of its advanced decision-making abilities learned from managing millions of devices at scale. Since Azure AI is continually monitoring billions of endpoints, it is able to perform tasks that can only be achieved at scale, such as identifying an emerging malware attack. Azure AI improves security, compliance, scalability and more for all cloud resources managed by Azure Arc. The services which run Azure AI are hosted in Microsoft Azure, and in disconnected environments, much of the AI processing can run on local servers using Azure Stack Edge.

For more information about Azure AI visit https://azure.microsoft.com/en-us/overview/ai-platform.

Azure Automation with Azure Arc

Azure Automation is a service provided by Azure that automates repetitive tasks which can be time-consuming or error-prone. This saves the organization significant time and money while helping them maintain operational consistency. Custom automation scripts can get triggered by a schedule or an event to automate servicing, track changes, collect inventory and much more. Since Azure Automation uses PowerShell, Python, and graphical runbooks, it can manage diverse software and hardware that supports PowerShell or has APIs. With Azure Arc, any on-premises connected machines and the applications they host can be integrated and automated with any Azure Automation workflow. These workflows can also be run locally on disconnected machines.

For more information about Azure Automation visit https://azure.microsoft.com/en-in/services/automation.

Azure Cost Management & Billing with Azure Arc

Microsoft Azure and other cloud providers use a consumption-based billing model so that tenants only pay for the resources which they consume. Azure Cost Management and Billing provides granular information to understand how cloud storage, network, memory, CPUs and any Azure services are being used. Organizations can set thresholds and get alerts when any consumer or business unit approaches or exceeds their limits. With Azure Arc, organizations can use cloud billing to optimize and manage costs for their on-premises resources also. In addition to Microsoft Azure and Microsoft hybrid cloud workloads, all Amazon AWS spending can be integrated into the same dashboard.

For more information about Azure Cost Management and Billing visit https://azure.microsoft.com/en-us/services/cost-management.

Azure Data Services with Azure Arc

Azure Data Services is the first major service provided by Azure Arc for on-premises servers. This was the top request of many organizations which want the management capabilities of Microsoft Azure, yet need to keep their data on-premises for data sovereignty. This makes Azure Data Services accessible to companies that must keep their customer’s data onsite, such as those working within regulated industries or those which do not have an Azure datacenter within their country.

In the initial release, both Azure SQL Database and Azure Database for PostgreSQL Hyperscale will be available for on-premises deployments. Now organizations can run and offer database as a service (DBaaS) as a platform as a service (PaaS) offering to their tenants. This makes it easier for users to deploy and manage cloud databases on their own infrastructure, without the overhead of setting up and maintaining the infrastructure on a physical server or virtual machine. The Azure Data Services on Azure Arc still require an underlying Kubernetes cluster, but many management frameworks are supported by Microsoft Azure and Azure Arc.

All of the other Azure Arc benefits are included with the data services, such as automation, backup, monitoring, scaling, security, patching and cost management. Additionally, Azure Data Services can run on both connected and disconnected machines. The latest features and updates to the data services are automatically pushed down from Microsoft Azure to Azure Arc members so that the infrastructure is always current and consistent.

For more information about Azure Data Services with Azure Arc visit https://azure.microsoft.com/en-us/services/azure-arc/hybrid-data-services.

Cloud Availability with Azure Arc

One of the main advantages offered by Microsoft Azure is access to its unlimited hardware spread across multiple datacenters which provide business continuity. This gives Azure customers numerous ways to increase service availability, retain more backups, and gain disaster recovery capabilities. With the introduction of Azure Arc, these features provide even greater integration between on-premises servers and Microsoft Azure.

Azure Availability & Resiliency with Azure Arc

With Azure Arc, organizations can leverage Azure’s availability and resiliency features for their on-premises servers. Virtual Machine Scale Sets allow automatic application scaling by rapidly deploying dozens (or thousands) of VMs to quickly increase the processing capabilities of a cloud application. Integrated load-balancing will distribute network traffic, and redundancy is built into the infrastructure to eliminate single points of failure. VM Availability Sets give administrators the ability to select a group of related VMs and force them to distribute themselves across different physical servers. This is recommended for redundant servers or guest clusters where it is important to have each virtualized instance spread out so that the loss of a single host will not take down an entire service. Azure Availability Zones extend this concept across multiple datacenters by letting organizations deploy datacenter-wide protection schemes that distribute applications and their data across multiple sites. Azure’s automated updating solutions are availability-aware so they will keep services online during a patching cycle, serially updating and rebooting a subset of hosts. Azure Arc helps hybrid cloud services take advantage of all of the Azure resiliency features.

For more information about Azure availability and resiliency visit https://azure.microsoft.com/en-us/features/resiliency.

Azure Backup & Restore with Azure Arc

Many organizations limit their backup plans because of their storage constraints since it can be costly to store large amounts of data which may not need to be accessed again. Azure Backup helps organizations by allowing their data to be backed up and stored on Microsoft Azure. This usually reduces costs as users are only paying for the storage capacity they are using. Additionally storing backups offsite helps minimize data loss as offsite backups provide resiliency to site-wide outages and can protect customers from ransomware. Azure Backup also offers compression, encryption and retention policies to help organizations in regulated industries. Azure Arc manages the backups and recovery of on-premises servers with Microsoft Azure, with the backups being stored in the customer’s own datacenter or in Microsoft Azure.

For more information about Azure Backup visit https://azure.microsoft.com/en-us/services/backup.

Azure Site Recovery & Geo-Replication with Azure Arc

One of the more popular hybrid cloud features enabled with Microsoft Azure is the ability to replicate data from an on-premises location to Microsoft Azure using Azure Site Recovery (ASR). This allows users to have a disaster recovery site without needing a second datacenter. ASR is easy to deploy, configure, and operate, and it even lets you test disaster recovery plans. Using Azure Arc, it is possible to set up geo-replication to move data and services from a managed datacenter running Windows Server Hyper-V, VMware vCenter or the Amazon Web Services (AWS) public cloud. Destination datacenters can include other datacenters managed by Azure Arc, Microsoft Azure and Amazon AWS.

For more information about Azure Site Recovery visit https://azure.microsoft.com/en-us/services/site-recovery.

Cloud Management with Azure Arc

Azure Arc introduces some on-premises management benefits which were previously available only in Microsoft Azure. These help organizations administer legacy hardware and disconnected machines with Azure-consistent features using multiple management tools.

Management Tools with Azure Arc

One of the fundamental design concepts of Microsoft Azure is to have centralized management layers (“planes”) that support diverse hardware, data, and administrative tools. The fabric plane controls the hardware through a standard set of interfaces and APIs. The data plane allows unified management of structured and unstructured data. And the control plane offers centralized management through various interfaces, including the GUI-based Azure Portal, Azure PowerShell, and other APIs. Each of these layers interfaces with each other through a standard set of controls, so that the operational steps will be identical whether a user deploys a VM via the Azure Portal or via Azure PowerShell. Azure Arc can manage cloud resources with the following Azure Developer Tools:

At the time of this writing, the documentation for Azure Arc is not yet available, but some examples can be found in the quick start guides which are linked in the Getting Started with Azure Arc section.

Managing Legacy Hardware with Azure Arc

Azure Arc is hardware-agnostic, allowing Azure to manage a customer’s diverse or legacy hardware just like an Azure datacenter server. The hardware must meet certain requirements so that a virtualized Kubernetes cluster can be deployed on it, as Azure services run on this virtualized infrastructure. In the Public Preview, servers must be running Windows Server 2012 R2 (or newer) or Ubuntu 16.04 or 18.04. Over time, additional servers will be supported, with rumors of 32-bit (x86), Oracle and Linux hosts being supported as infrastructure servers.
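
As a rough sketch of what onboarding such a server looks like, the Azure Connected Machine agent (azcmagent) is installed on the machine and then connected to a resource group; the subscription, tenant, resource group, and region below are placeholders, and the flag names should be checked against the agent version you install:

```powershell
# Run on the on-premises server after installing the Azure Connected Machine agent.
# An interactive sign-in completes the registration.
azcmagent connect `
    --subscription-id "00000000-0000-0000-0000-000000000000" `
    --tenant-id "00000000-0000-0000-0000-000000000000" `
    --resource-group "rg-arc-servers" `
    --location "westeurope"

# Once connected, the server appears in Azure as a Microsoft.HybridCompute/machines resource
# and can be targeted by Azure Policy, RBAC, monitoring, and the other services described below.
```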

Offline Management with Azure Arc

Azure Arc will even be able to manage servers that are not regularly connected to the Internet, as is common with the military, emergency services, and sea vessels. Azure Arc has a concept of “connected” and “disconnected” machines. Connected servers have an Azure Resource ID and are part of an Azure Resource group. If a server does not sync with Microsoft Azure every 5 minutes, it is considered disconnected, yet it can continue to run its local resources. Azure Arc allows these organizations to use the latest Azure services when they are connected, yet still use many of these features if the servers do not maintain an active connection, including Azure Data Services. Even some services which run Azure AI and are hosted in Microsoft Azure can work in disconnected environments while running on Azure Stack Edge.

Always Up-To-Date Tools with Azure Arc

One of the advantages of using Microsoft Azure is that all the services are kept current by Microsoft. The latest features, best practices, and AI learning are automatically available to all users in real-time as soon as they are released. When an admin logs into the Azure Portal through a web browser, they are immediately exposed to the latest technology to manage their distributed infrastructure. By ensuring that all users have the same management interface and APIs, Microsoft can guarantee consistency of behavior for all users across all hardware, including on-premises infrastructure when using Azure Arc. However, if the hardware is in a disconnected environment (such as on a sea vessel), there could be some configuration drift as older versions of Azure data services and Azure management tools may still be used until they are reconnected and synced.

Cloud Security & Compliance with Azure Arc

Public cloud services like Microsoft Azure are able to offer industry-leading security and compliance due to their scale and expertise. Microsoft employs more than 3,500 of the world’s leading security engineers who have been collaborating for decades to build the industry’s safest infrastructure. Through its billions of endpoints, Microsoft Azure leverages Azure AI to identify anomalies and detect threats before they become widespread. Azure Arc extends all of the security features offered in Microsoft Azure to on-premises infrastructure, including key vaults, monitoring, policies, security, threat protection, and update management.

Azure Key Vault with Azure Arc

When working in a distributed computing environment, managing credentials, passwords, and user access can become complicated. Azure Key Vault is a service that helps enhance data protection and compliance by securely protecting all keys and monitoring access. Azure Key Vault is supported by Azure Arc, allowing credentials for on-premises services and hybrid clouds to be centrally managed through Azure.

For more information about Azure Key Vault visit https://azure.microsoft.com/en-us/services/key-vault.

Azure Monitor with Azure Arc

Azure Monitor is a service that collects and analyzes telemetry data from Azure infrastructure, networks, and applications. The logs from managed services are sent to Azure Monitor where they are aggregated and analyzed. If a problem is identified, such as an offline server, it can trigger alerts or use Azure Automation to launch recovery workflows. Azure Arc can now monitor on-premises servers, networks, virtualization infrastructure, and applications, just like they were running in Azure. It even leverages Azure AI and Azure Automation to make recommendations and fixes to hybrid cloud infrastructure.

For more information about Azure Monitor visit https://azure.microsoft.com/en-us/services/monitor.

Azure Policy with Azure Arc

Most enterprises have certain compliance requirements for their IT infrastructure, especially organizations within regulated industries. Azure Policy uses Microsoft Azure to audit an environment and aggregate all the compliance data into a single location. Administrators can get alerted about misconfigurations or configuration drift and even trigger automated remediation using Azure Automation. Azure Policy can be used with Azure Arc to apply policies on all connected machines, providing the benefits of cloud compliance to on-premises infrastructure.
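
As a minimal sketch with the Az PowerShell module, assuming a hypothetical resource group that holds Arc connected machines, assigning one of the built-in policy definitions to that scope could look like this (the policy display name is only an example):

```powershell
# Scope the assignment to the resource group that contains the Arc connected machines
$rg = Get-AzResourceGroup -Name "rg-arc-servers"

# Pick a built-in policy definition by display name (any audit or deploy policy works the same way)
$definition = Get-AzPolicyDefinition |
    Where-Object { $_.Properties.DisplayName -eq "Audit virtual machines without disaster recovery configured" }

# Assign the policy; compliance results appear in Azure Policy next to native Azure resources
New-AzPolicyAssignment `
    -Name "audit-arc-servers" `
    -DisplayName "Audit Arc-connected servers" `
    -Scope $rg.ResourceId `
    -PolicyDefinition $definition
```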

For more information about Azure Policy visit https://azure.microsoft.com/en-us/services/azure-policy.

Azure Security Center with Azure Arc

The Azure Security Center centralizes all security policies and protects the entire managed environment. When Security Center is enabled, the Azure monitoring agents will report data back from the servers, networks, virtual machines, databases, and applications. The Azure Security Center analytics engines will ingest the data and use AI to provide guidance. It will recommend a broad set of improvements to enhance security, such as closing unnecessary ports or encrypting disks. Perhaps most importantly it will scan all the managed servers and identify updates that are missing, and it can use Azure Automation and Azure Update Management to patch those vulnerabilities. Azure Arc extends these security features to connected machines and services to protect all registered resources.

For more information about Azure Security Center visit https://azure.microsoft.com/en-us/services/security-center

Azure Advanced Threat Protection with Azure Arc

Azure Advanced Threat Protection (ATP) provides industry-leading cloud security by looking for anomalies and potential attacks with Azure AI. Azure ATP will look for suspicious computer or user activities and report any alerts in real time. Azure Arc lets organizations extend this cloud protection to their hybrid and on-premises infrastructure, offering leading threat protection across all of their cloud resources.

For more information about Azure Advanced Threat Protection visit https://azure.microsoft.com/en-us/features/azure-advanced-threat-protection.

Azure Update Management with Azure Arc

Microsoft Azure automates the process of applying patches, updates and security hotfixes to the cloud resources it manages. With Update Management, a series of updates can be scheduled and deployed on non-compliant servers using Azure Automation. Update management is aware of clusters and availability sets, ensuring that a distributed workload remains online while its infrastructure is patched by live migrating running VMs or containers between hosts. Azure will centrally manage updates, assessment reports, deployment results, and can create alerts for failures or other conditions. Organizations can use Azure Arc to automatically analyze and patch their on-premises and connected servers, virtual machines, and applications.

For more information about Azure Update Management visit https://docs.microsoft.com/en-us/azure/automation/automation-update-management.

Role-Based Access Control (RBAC) with Azure Arc

Controlling access to different resources is a critical function for any organization to enforce security and compliance. Microsoft Azure Active Directory (Azure AD) allows its customers to define granular access control for every user or user role based on different types of permissions (read, modify, delete, copy, sharing, etc.). There are also over 70 user roles provided by Azure, such as a Global Administrator, Virtual Machine Contributor or Billing Administrator. Azure Arc lets businesses extend role-based access control (RBAC) managed by Azure to on-premises environments. This means that any groups, policies, settings, security principals and managed identities that were deployed by Azure AD can now access all managed cloud resources. Azure AD also provides auditing so it is easy to track any changes made by users or security principals across the hybrid cloud.
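
For example, here is a hedged sketch of granting a user read-only access to a single connected machine with the Az PowerShell module (subscription ID, resource group, machine name, and user are placeholders):

```powershell
# Scope the role assignment to one Arc connected machine (Microsoft.HybridCompute/machines)
$scope = "/subscriptions/00000000-0000-0000-0000-000000000000" +
         "/resourceGroups/rg-arc-servers" +
         "/providers/Microsoft.HybridCompute/machines/onprem-sql-01"

# Grant the built-in Reader role at that scope
New-AzRoleAssignment `
    -SignInName "operator@contoso.com" `
    -RoleDefinitionName "Reader" `
    -Scope $scope
```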

For more information about Role-Based Access Control visit https://docs.microsoft.com/en-us/azure/role-based-access-control/overview.

DevOps and Application Management with Azure Arc

Over the past few years, containers have become more commonplace as they provide certain advantages over VMs, allowing the virtualized applications and services to be abstracted from their underlying virtualized infrastructure. This means that containerized applications can be uniformly deployed anywhere with any tools so that users do not have to worry about the hardware configuration. This technology has become popular amongst application developers, enabling them to manage their entire application development lifecycle without having a dependency on the IT department to set up the physical or virtual infrastructure. This development methodology is often called DevOps. One of the key design requirements with Azure Arc was to make it hardware agnostic, so with Azure Arc, developers can manage their containerized applications the same way whether they are running in Azure, on-premises or in a hybrid configuration.

Azure Kubernetes Service (AKS) & Kubernetes App Management with Azure Arc

Kubernetes is a management tool that allows developers to deploy, manage and update their containers. Azure Kubernetes Service (AKS) is Microsoft’s Kubernetes service and this can be integrated with Azure Arc. This means that AKS can be used to manage on-premises servers running containers. In addition to Azure Kubernetes Service, Azure Arc can be integrated with other Kubernetes management platforms, including Amazon EKS, Google Kubernetes Engine, and IBM Kubernetes Service.

For more information about Azure Container Services visit https://azure.microsoft.com/en-us/product-categories/containers and for Azure Kubernetes Services (AKS) visit https://azure.microsoft.com/en-us/services/kubernetes-service.

Other DevOps Tools with Azure Arc

For container management on Azure Arc developers can use any of the common Kubernetes management platforms, including Azure Kubernetes Service, Amazon EKS, Google Kubernetes Engine, and IBM Kubernetes Service. All standard deployment and management operations are supported on Azure Arc hardware enabling cross-cloud management.

More information about the non-Azure management tools is provided in the section on Third-Party Integration with Azure Arc.

DevOps On-Premises with Azure Arc

Many developers prefer to work on their own hardware and some are required to develop applications in a private environment to keep their data secure. Azure Arc allows developers to build and deploy their applications anywhere utilizing Azure’s cloud-based AI, security and other cloud features while retaining their data, IP or other valuable assets within their own private cloud. Additionally, Azure Active Directory can use role-based access control (RBAC) and Azure Policies to manage developer access to sensitive company resources.

Elastic Scalability & Rapid Deployment with Azure Arc

Containerized applications are designed to start quickly when running on a highly-available Kubernetes cluster. The app will bypass the underlying operating system, allowing it to be rapidly deployed and scaled. These applications can quickly grow to an unlimited capacity when deployed on Microsoft Azure. When using Azure Arc, the applications can be managed across public and private clouds. Applications will usually contain several container types that can be deployed in different locations based on their requirements. A common deployment configuration for a two-tiered application is to deploy the web frontend on Microsoft Azure for scalability and the database in a secure private cloud backend.

Hybrid Cloud Integration with Azure Arc

Microsoft’s hybrid cloud initiatives over the past few years have included certifying on-premises software and hardware configurations known as Azure Stack. Azure Stack allows organizations to run Azure-like services on their own hardware in their datacenter. It allows organizations that may be restricted from using public cloud services to utilize the best parts of Azure within their own datacenter. Azure Stack is most commonly deployed by organizations that have requirements to keep their customers’ data in-house (or within their territory) for data sovereignty, making it popular for customers who could not adopt the Microsoft Azure public cloud. Azure Arc easily integrates with Azure Stack Hub, Azure Stack Edge, and all the Azure Stack HCI configurations, allowing these services to be managed from Azure.

For more information about Azure Stack visit https://azure.microsoft.com/en-us/overview/azure-stack.

Azure Stack Hub with Azure Arc

Azure Stack Hub (formerly Microsoft Azure Stack) offers organizations a way to run Azure services from their own datacenter, from a service provider’s site, or from within an isolated environment. This cloud platform allows users to deploy Windows VMs, Linux VMs and Kubernetes containers on hardware which they operate. This offering is popular with developers who want to run services locally, organizations which need to retain their customer’s data onsite, and groups which are regularly disconnected from the Internet, as is common with sea vessels or emergency response personnel. Azure Arc allows Azure Stack Hub nodes to run supported Azure services (like Azure Data Services) while being centrally managed and optimized via Azure. These applications can be distributed across public, private or hybrid clouds.

For more information about Azure Stack Hub visit https://docs.microsoft.com/en-us/azure-stack/user/?view=azs-1908.

Azure Stack Edge with Azure Arc

Azure Stack Edge (previously Azure Data Box Edge) is a cloud-managed appliance that can be deployed in a datacenter, branch office, remote site or disconnected environment. It is designed to run edge computing workloads in VMs, containers and Azure services, and it is optimized for IoT, AI and business workloads so that processing can happen onsite rather than being sent across a network to a cloud datacenter. When the appliance is (re)connected to the network, it transfers queued data to Azure at high speed, and transfers can be scheduled for off-peak hours. It supports machine learning acceleration through FPGA or GPU. Azure Arc can centrally manage Azure Stack Edge appliances and the workloads they host.

For more information about Azure Stack Edge visit https://azure.microsoft.com/en-us/services/databox/edge.

Azure Stack Hyperconverged Infrastructure (HCI) with Azure Arc

Azure Stack Hyperconverged Infrastructure (HCI) is a program that provides preconfigured hyperconverged hardware from validated OEM partners, optimized to run Azure Stack HCI. Businesses that want to run Azure-like services on-premises can purchase or rent hardware that has been validated against Microsoft’s requirements. VMs, containers, Azure services, AI, IoT and more can run consistently on the Microsoft Azure public cloud or on Azure Stack HCI hardware in a datacenter. Cloud services can be distributed across multiple datacenters or clouds and centrally managed using Azure Arc.

For more information about Azure Stack HCI visit https://azure.microsoft.com/en-us/overview/azure-stack/hci.

Managed Service Providers (MSPs) with Azure Arc

Azure Lighthouse Integration with Azure Arc

Azure Lighthouse is a technology designed for managed service providers (MSPs), ISVs and distributed organizations that need to centrally manage their tenants’ resources. Azure Lighthouse allows service providers and tenants to create a two-way trust for unified management of cloud resources: tenants grant specific permissions to approved user roles on particular cloud resources so that they can offload management to their service provider. Service providers can now also bring their tenants’ private cloud environments under Azure Arc management and take advantage of the new capabilities Azure Arc provides.

For more information about Azure Lighthouse visit https://azure.microsoft.com/en-us/services/azure-lighthouse or on the Altaro MSP Dojo.

Third-Party Integration with Azure Arc

Azure Resource Manager (ARM) is the control plane within the Azure management layer. ARM provides a way to easily create, manage, monitor and delete any Azure resource, and every native and third-party Azure resource uses ARM so that it can be centrally managed through Azure management tools. Azure Arc now allows non-Azure resources to be managed by Azure as well. These can include third-party clouds (Amazon Web Services, Google Cloud Platform), Windows and Linux VMs, VMs on non-Microsoft hypervisors and clouds (VMware vSphere, Google Compute Engine, Amazon EC2), and Kubernetes containers and clusters (including IBM Kubernetes Service, Google Kubernetes Engine and Amazon EKS). At the time of writing, limited information is available about third-party integration, but more will be added over time.

Amazon Web Services (AWS) Integration with Azure Arc

Amazon Web Services (AWS) is Amazon’s public cloud platform. Some services from AWS can be managed by Azure Arc. This includes operating virtual machines running on the Amazon Elastic Compute Cloud (EC2) and containers running on Amazon Elastic Kubernetes Service (EKS). Azure Arc also lets an AWS site be used as a geo-replicated disaster recovery location. AWS billing can also be integrated with Azure Cost Management & Billing so that expenses from both cloud providers can be viewed in a single location.

Additional information will be added once it is made available.

Google Cloud Platform (GCP) Integration with Azure Arc

Google Cloud Platform (GCP) is Google’s public cloud platform. Some services from GCP can be managed by Azure Arc. This includes operating virtual machines running on Google Compute Engine (GCE) and containers running on Google Kubernetes Engine (GKE).

Additional information will be added once it is made available.

IBM Kubernetes Service Integration with Azure Arc

IBM Cloud is IBM’s public cloud platform. Some services from IBM Cloud can be managed by Azure Arc, including containers running on IBM Kubernetes Service (IKS).

Additional information will be added once it is made available.

Linux VM Integration with Azure Arc

In 2014, Microsoft CEO Satya Nadella declared that “Microsoft loves Linux.” Since then the company has embraced Linux, making it a first-class citizen in its ecosystem; Microsoft even contributes code to the Linux kernel so that Linux runs efficiently as a VM or container on Microsoft platforms. Virtually all management features available for Windows VMs are also available for supported Linux distributions, and this extends to Azure Arc: administrators can use Azure to centrally manage and optimize Linux VMs running on-premises, just as they would a standard Windows VM.

VMware Cloud Solution Integration with Azure Arc

VMware offers a popular virtualization platform and management suite (vSphere) that runs on VMware’s hypervisor. Microsoft has acknowledged that many customers running legacy on-premises hardware use VMware, so it provides numerous integration points with Azure and Azure Arc. Organizations can even deploy their entire VMware infrastructure on Azure, rather than in their own datacenter. Microsoft makes it easy to deploy, manage, monitor and migrate VMware systems, and with Azure Arc businesses can now centrally operate their on-premises VMware infrastructure as well. While the full management functionality of VMware vSphere is not available through Azure Arc, most standard operations are supported.

For more information about VMware Management with Azure visit https://azure.microsoft.com/en-us/overview/azure-vmware.


Go to Original Article
Author: Symon Perriman

Can AI help save penguins? – Microsoft News Center India

Working on the Microsoft Azure platform, Mohanty and his colleagues used a convolutional neural network model to build a solution that can identify and count penguins with a high degree of accuracy. The model can potentially help researchers speed up their studies of penguin populations.

The team is now working on the classification, identification and counting of other species using similar deep learning techniques.

Building AI to save the planet

A long-time Microsoft partner headquartered in Hyderabad, India, Gramener is not new to leveraging AI for social good using Microsoft Azure. It was one of the earliest partners for Microsoft’s AI for Earth program announced in 2017.

“I believe that AI can help make the world a better place by accelerating biodiversity conservation and help solve the biggest environmental challenges we face today. When we came to know about Microsoft’s AI for Earth program over two years ago, we reached out to Microsoft as we wanted to find ways to partner and help with our expertise,” says Kesari.

While the program was still in its infancy, the teams from Gramener and Microsoft worked jointly to come up with quick projects to showcase what’s possible with AI and inspire those out there in the field. They started with a proof of concept for identifying flora and fauna species in a photograph.

“We worked more like an experimentation arm working with the team led by Lucas Joppa (Microsoft’s Chief Environmental Officer, and founder of AI for Earth). We built a model, using data available from iNaturalist, that could classify thousands of different species with 80 percent accuracy,” Kesari reveals.

Another proof of concept revolved around camera traps that are used for biodiversity studies in forests. The camera traps take multiple images whenever they detect motion, which leads to a large number of photos that have to be scanned manually.

Soumya Ranjan Mohanty, Lead Data Scientist, Gramener

“Most camera trap photos are blank as they don’t have any animal in the frame. Even in the frames that do, often the animal is too close to be identified or the photo is blurry,” says Mohanty, who also leads the AI for Earth partnership from Gramener.

The team came up with a two-step solution that first weeds out unusable images and then uses a deep learning model to classify the images that do contain an animal. This solution, too, was turned by the Microsoft team into what is now the Camera Trap API, which AI for Earth grantees, or anyone else, can use freely.
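The exact models behind the Camera Trap API aren’t described here, but the two-step idea is easy to picture. The sketch below assumes two hypothetical callables: a detector that scores whether an animal is present and a classifier that names the species. Both are stand-ins for whatever models a project supplies, not the actual AI for Earth APIs.

```python
# Minimal sketch of the two-stage camera-trap pipeline described above.
# `detector` and `classifier` are hypothetical model callables supplied by the caller.
from pathlib import Path

ANIMAL_THRESHOLD = 0.8  # assumed confidence cut-off for "an animal is in the frame"

def run_pipeline(image_dir, detector, classifier):
    """Stage 1: discard blank or unusable frames. Stage 2: classify the rest."""
    labels = {}
    for image_path in sorted(Path(image_dir).glob("*.jpg")):
        if detector(image_path) < ANIMAL_THRESHOLD:
            continue  # blank frame, blurry shot, or animal too close to identify
        labels[image_path.name] = classifier(image_path)
    return labels
```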

“AI is critical to conservation because we simply don’t have time to wait for humans to annotate millions of images before we can answer wildlife population questions. For the same reason, we need to rapidly prototype AI applications for conservation, and it’s been fantastic to have Gramener on board as our ‘advanced development team’,” says Dan Morris, principal scientist and program director for Microsoft’s AI for Earth program.

Anticipating the needs of grantees, Gramener and Microsoft have also worked on creating other APIs, like the Land Cover Mapping API that leverages machine learning to provide high-resolution land cover information. These APIs are now part of the public technical resources available for AI for Earth grantees or anyone to use, to accelerate their projects without having to build the base model themselves.

Go to Original Article
Author: Microsoft News Center

A year of bringing AI to the edge

This post is co-authored by Anny Dow, Product Marketing Manager, Azure Cognitive Services.

In an age where low latency and data security can be the lifeblood of an organization, containers make it possible for enterprises to meet these needs when harnessing artificial intelligence (AI).

Since introducing Azure Cognitive Services in containers this time last year, businesses across industries have unlocked new productivity gains and insights. Combining the most comprehensive set of domain-specific AI services on the market with containers enables enterprises to apply AI to more scenarios with Azure than with any other major cloud provider. Organizations ranging from healthcare to financial services have transformed their processes and customer experiences as a result.

These are some of the highlights from the past year:

Employing anomaly detection for predictive maintenance

Airbus Defense and Space, one of the world’s largest aerospace and defense companies, has tested Azure Cognitive Services in containers to develop a proof of concept for predictive maintenance. The company runs Anomaly Detector to immediately spot unusual behavior in voltage levels and mitigate unexpected downtime. By employing advanced anomaly detection in containers without further burdening its data science team, Airbus can scale this critical capability across the business globally.
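For a sense of what such a check looks like in code, here is a minimal sketch that posts a short voltage series to the Anomaly Detector “entire series” REST endpoint. The endpoint address, key and data are placeholders, and the real service expects at least 12 data points; treat this as an illustration of the call shape rather than Airbus’s implementation.

```python
# Minimal sketch: flag anomalous points in a voltage time series with Anomaly Detector.
# ENDPOINT could be a local container or an Azure resource; both values are placeholders.
# Note: the real API requires at least 12 points; the series is shortened here for brevity.
import requests

ENDPOINT = "http://localhost:5000"      # assumed container address
API_KEY = "<your-key>"                  # required for the cloud endpoint

series = [
    {"timestamp": "2019-11-01T00:00:00Z", "value": 220.1},
    {"timestamp": "2019-11-01T01:00:00Z", "value": 219.8},
    {"timestamp": "2019-11-01T02:00:00Z", "value": 305.4},  # suspicious spike
    {"timestamp": "2019-11-01T03:00:00Z", "value": 220.3},
]

response = requests.post(
    f"{ENDPOINT}/anomalydetector/v1.0/timeseries/entire/detect",
    headers={"Ocp-Apim-Subscription-Key": API_KEY},
    json={"series": series, "granularity": "hourly"},
)
response.raise_for_status()
flags = response.json()["isAnomaly"]    # one boolean per input point
print([p["timestamp"] for p, flagged in zip(series, flags) if flagged])
```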

“Innovation has always been a driving force at Airbus. Using Anomaly Detector, an Azure Cognitive Service, we can solve some aircraft predictive maintenance use cases more easily.”  —Peter Weckesser, Digital Transformation Officer, Airbus

Automating data extraction for highly regulated businesses

As enterprises grow, they begin to acquire thousands of hours of repetitive but critically important work every week. High-value domain specialists spend too much of their time on this. Today, innovative organizations use robotic process automation (RPA) to help manage, scale, and accelerate processes, and in doing so free people to create more value.

Automation Anywhere, a leader in robotic process automation, partners with these companies eager to streamline operations by applying AI. IQ Bot, their unique RPA software, automates data extraction from documents of various types. By deploying Cognitive Services in containers, Automation Anywhere can now handle documents on-premises and at the edge for highly regulated industries:

“Azure Cognitive Services in containers gives us the headroom to scale, both on-premises and in the cloud, especially for verticals such as insurance, finance, and health care where there are millions of documents to process.” —Prince Kohli, Chief Technology Officer for Products and Engineering, Automation Anywhere

For more about Automation Anywhere’s partnership with Microsoft to democratize AI for organizations, check out this blog post.

Delighting customers and employees with an intelligent virtual agent

Lowell, one of the largest credit management services in Europe, wants credit to work better for everybody, so it works hard to make every consumer interaction as painless as possible with the help of AI. Partnering with Crayon, a global leader in cloud services and solutions, Lowell set out to fix the outdated processes that kept the company’s highly trained credit counselors too busy with routine inquiries and created friction in the customer experience. Lowell turned to Cognitive Services to create an AI-enabled virtual agent that now handles 40 percent of all inquiries, making it easier for service agents to deliver greater value to consumers and better outcomes for Lowell clients.

With GDPR requirements, chatbots weren’t an option for many businesses before containers became available. Now companies like Lowell can ensure the data handling meets stringent compliance standards while running Cognitive Services in containers. As Carl Udvang, Product Manager at Lowell explains:

“By taking advantage of container support in Cognitive Services, we built a bot that safeguards consumer information, analyzes it, and compares it to case studies about defaulted payments to find the solutions that work for each individual.”

One-to-one customer care at scale in data-sensitive environments has become easier to achieve.

Empowering disaster relief organizations on the ground

A few years ago, there was a major Ebola outbreak in Liberia, and a team from USAID was sent to help mitigate the crisis. Their first task on the ground was to find and categorize information such as the state of healthcare facilities, Wi-Fi networks, and population density centers. They tracked this information manually and had to extract insights from a complex corpus of data to determine the best course of action.

With the rugged versions of Azure Stack Edge, teams responding to such crises can carry a device running Cognitive Services in a backpack. They can upload unstructured data like maps, images and pictures of documents, and then extract content, translate it, draw relationships among entities and apply a search layer. With these cloud AI capabilities available offline, at their fingertips, response teams can find the information they need in a matter of moments. In Satya Nadella’s Ignite 2019 keynote, Dean Paron, Partner Director of Azure Storage and Edge, walks through how Cognitive Services on Azure Stack Edge can be applied in such disaster relief scenarios (starting at 27:07).

Transforming customer support with call center analytics

Call centers are a critical customer touchpoint for many businesses, and being able to derive insights from customer calls is key to improving customer support. With Cognitive Services, businesses can transcribe calls with Speech to Text, analyze sentiment in real time with Text Analytics, and develop a virtual agent that responds to questions with Text to Speech. However, in highly regulated industries, businesses are typically prohibited from running AI services in the cloud due to policies against uploading, processing, and storing any data in public cloud environments. This is especially true for financial institutions.
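As a rough sketch of the sentiment step in that pipeline, the snippet below sends two transcript snippets to a Text Analytics sentiment endpoint, assuming the standard v3.0 REST path is exposed (for example by a container running on-premises). The endpoint, key and sample text are placeholders.

```python
# Minimal sketch: score call-transcript sentiment against a Text Analytics endpoint.
# Works the same way whether the endpoint is an on-premises container or an Azure resource.
import requests

ENDPOINT = "http://localhost:5000"  # assumed local container address
API_KEY = "<your-key>"

payload = {
    "documents": [
        {"id": "1", "language": "en", "text": "Thanks, that solved my problem right away."},
        {"id": "2", "language": "en", "text": "I've been on hold for an hour and I'm still stuck."},
    ]
}

response = requests.post(
    f"{ENDPOINT}/text/analytics/v3.0/sentiment",
    headers={"Ocp-Apim-Subscription-Key": API_KEY},
    json=payload,
)
response.raise_for_status()
for doc in response.json()["documents"]:
    print(doc["id"], doc["sentiment"])  # positive, neutral, negative or mixed
```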

A leading bank in Europe addressed regulatory requirements and brought the latest transcription technology to their own on-premises environment by deploying Cognitive Services in containers. Through transcribing calls, customer service agents could not only get real-time feedback on customer sentiment and call effectiveness, but also batch process data to identify broad themes and unlock deeper insights on millions of hours of audio. Using containers also gave them flexibility to integrate with their own custom workflows and scale throughput at low latency.

What’s next?

These stories touch on just a handful of the organizations leading innovation by bringing AI to where data lives. As running AI anywhere becomes more mainstream, the opportunities for empowering people and organizations will only be limited by the imagination.

Visit the container support page to get started with containers today.

For a deeper dive into these stories, visit the following

Go to Original Article
Author: Microsoft News Center

AWS, Azure and Google peppered with outages in same week

AWS, Microsoft Azure and Google Cloud all experienced service degradations or outages this week, an outcome that suggests customers should accept that cloud outages are a matter of when, not if.

In AWS’s Frankfurt region, EC2, Relational Database Service, CloudFormation and Auto Scaling were all affected Nov. 11, with the issues now resolved, according to AWS’s status page.

Azure DevOps services for Boards, Repos, Pipelines and Test Plans were affected for a few hours in the early hours of Nov. 11, according to its status page. Engineers determined that the problem had to do with identity calls and rebooted access tokens to fix the system, the page states.

Google Cloud said some of its APIs in several U.S. regions were affected, and others experienced problems globally on Nov. 11, according to its status dashboard. Affected APIs included those for Compute Engine, Cloud Storage, BigQuery, Dataflow, Dataproc and Pub/Sub. Those issues were resolved later in the day.

Google Kubernetes Engine also went through some hiccups over the past week, in which nodes in some recently upgraded container clusters experienced high rates of kernel panics. Known colloquially by terms such as the “blue screen of death,” kernel panics are conditions in which a system’s OS can’t recover from an error quickly or easily.

The company rolled out a series of fixes, but as of Nov. 13, the status page for GKE remained in orange status, which indicates a small number of projects are still affected.

AWS, Microsoft and Google have yet to provide the customary post-mortem reports on why the cloud outages occurred, although more information could emerge soon.

Move to cloud means ceding some control

The cloud outages at AWS, Azure and Google this week were far from the worst experienced by customers in recent years. In September 2018, severe weather in Texas caused a power surge that shut down dozens of Azure services for days.


Cloud providers have aggressively pursued region and zone expansions to help with disaster recovery and high-availability scenarios. But customers must still architect their systems to take advantage of the expanded footprint.

Still, customers have much less control when it comes to public cloud usage, according to Stephen Elliot, an analyst at IDC. That reality requires some operational sophistication.

“It’s a myth that outages won’t happen.” – Stephen Elliot, Analyst, IDC

“Networks are so interconnected and distributed, lots of partners are involved in making a service perform and available,” he said. “[Enterprises] need a risk mitigation strategy that covers people, process, technologies, SLAs, etc. It’s a myth that outages won’t happen. It could be from weather, a black swan event, security or a technology glitch.”


This fact underscores why more companies are experimenting with and deploying workloads across hybrid and multi-cloud infrastructures, said Jay Lyman, an analyst at 451 Research. “They either control the infrastructure and downtime with on-premises deployments or spread their bets across multiple public clouds,” he said.

Ultimately, enterprise IT shops can weigh the challenges and costs of running their own infrastructure against public cloud providers and find it difficult to match, said Holger Mueller, an analyst at Constellation Research.

“That said, performance and uptime are validated every day, and should a major and longer public cloud outage happen, it could give pause among less technical board members,” he added.

Go to Original Article
Author: