Tag Archives: Microsoft

Creating a more accessible world with Azure AI

At Microsoft, we are inspired by how artificial intelligence is transforming organizations of all sizes, empowering them to reimagine what’s possible. AI has immense potential to unlock solutions to some of society’s most pressing challenges.

One challenge is that, according to the World Health Organization, globally only 1 in 10 people with a disability have access to assistive technologies and products. We believe that AI solutions can have a profound impact on this community. To meet this need, we aim to democratize AI to make it easier for every developer to build accessibility into their apps and services, across language, speech, and vision.

Ahead of the upcoming Bett Show in London, we’re shining a light on how Immersive Reader enhances reading comprehension for people regardless of their age or ability, and we’re excited to share how Azure AI is broadly enabling developers to build accessible applications that empower everyone.

Empowering readers of all abilities

Immersive Reader is an Azure Cognitive Service that helps users of any age and reading ability with features like reading aloud, translating languages, and focusing attention through highlighting and other design elements. Millions of educators and students already use Immersive Reader to overcome reading and language barriers.

The Young Women’s Leadership School of Astoria, New York, brings together an incredible diversity of students with different backgrounds and learning styles. The teachers at The Young Women’s Leadership School support many types of learners, including students who struggle with text comprehension due to learning differences, or language learners who may not understand the primary language of the classroom. The school wanted to empower all students, regardless of their background or learning styles, to grow their confidence and love for reading and writing.

Watch the story here

Teachers at The Young Women’s Leadership School turned to Immersive Reader and an Azure AI partner, Buncee, as they looked for ways to create a more inclusive and engaging classroom. Buncee enables students and teachers to create and share interactive multimedia projects. With the integration of Immersive Reader, students who are dyslexic can benefit from features that help focus attention in their Buncee presentations, while those who are just learning the English language can have content translated into their native language.

Like Buncee, companies including Canvas, Wakelet, ThingLink, and Nearpod are also making content more accessible with Immersive Reader integration. To see the entire list of partners, visit our Immersive Reader Partners page. Discover how you can start embedding Immersive Reader into your apps today. To learn more about how Immersive Reader and other accessibility tools are fostering inclusive classrooms, visit our EDU blog.

Breaking communication barriers

Azure AI is also making conversations, lectures, and meetings more accessible to people who are deaf or hard of hearing. Because conversations can be transcribed and translated in real time, individuals can follow and fully engage with presentations.

The Balavidyalaya School in Chennai, Tamil Nadu, India teaches speech and language skills to young children who are deaf or hard of hearing. The school recently held an international conference with hundreds of alumni, students, faculty, and parents. With live captioning and translation powered by Azure AI, attendees were able to follow conversations in their native languages, while the presentations were given in English.

Learn how you can easily integrate multi-language support into your own apps with Speech Translation, and see the technology in action today with Translator, which supports more than 60 languages.

Engaging learners in new ways

We recently announced the Custom Neural Voice capability of Text to Speech, which enables customers to build a unique voice, starting from just a few minutes of training audio.

The Beijing Hongdandan Visually Impaired Service Center leads the way in applying this technology to empower users in incredible ways. Hongdandan produces educational audiobooks featuring the voice of Lina, China’s first blind broadcaster, using Custom Neural Voice. While creating audiobooks can be a time-consuming process, Custom Neural Voice allows Lina to produce high-quality audiobooks at scale, enabling Hongdandan to support over 105 schools for the blind in China like never before.

“We were amazed by how quickly Azure AI could reproduce Lina’s voice in such a natural-sounding way with her speech data, enabling us to create educational audiobooks much more quickly. We were also highly impressed by Microsoft’s commitment to protecting Lina’s voice and identity.”—Xin Zeng, Executive Director at Hongdandan

Learn how you can give your apps a new voice with Text to Speech.

Making the world visible for everyone

According to the International Agency for the Prevention of Blindness, more than 250 million people are blind or have low vision across the globe. Last month, in celebration of the United Nations International Day of Persons with Disabilities, Seeing AI, a free iOS app that describes nearby people, text, and objects, expanded support to five new languages. The additional language support for Spanish, Japanese, German, French, and Dutch makes it possible for millions of blind or low vision individuals to read documents, engage with people around them, hear descriptions of their surroundings in their native language, and much more. All of this is made possible with Azure AI.

Try Seeing AI today or extend vision capabilities to your own apps using Computer Vision and Custom Vision.

Get involved

We are humbled and inspired by what individuals and organizations are accomplishing today with Azure AI technologies. We can’t wait to see how you will continue to build on these technologies to unlock new possibilities and design more accessible experiences. Get started today with a free trial.

Check out our AI for Accessibility program to learn more about how companies are harnessing the power of AI to amplify capabilities for the millions of people around the world with a disability.

Go to Original Article
Author: Microsoft News Center

Microsoft announces it will be carbon negative by 2030 – Stories

REDMOND, Wash. — Jan. 16, 2020 — Microsoft Corp. on Thursday announced an ambitious goal and a new plan to reduce and ultimately remove its carbon footprint. By 2030 Microsoft will be carbon negative, and by 2050 Microsoft will remove from the environment all the carbon the company has emitted either directly or by electrical consumption since it was founded in 1975.

At an event at its Redmond campus, Microsoft Chief Executive Officer Satya Nadella, President Brad Smith, Chief Financial Officer Amy Hood, and Chief Environmental Officer Lucas Joppa announced the company’s new goals and a detailed plan to become carbon negative.

“While the world will need to reach net zero, those of us who can afford to move faster and go further should do so. That’s why today we are announcing an ambitious goal and a new plan to reduce and ultimately remove Microsoft’s carbon footprint,” said Microsoft President Brad Smith. “By 2030 Microsoft will be carbon negative, and by 2050 Microsoft will remove from the environment all the carbon the company has emitted either directly or by electrical consumption since it was founded in 1975.”

The Official Microsoft Blog has more information about the company’s bold goal and detailed plan to remove its carbon footprint: https://blogs.microsoft.com/?p=52558785.

The company announced an aggressive program to cut carbon emissions by more than half by 2030, both for its direct emissions and for its entire supply and value chain. This includes driving down its own direct emissions and emissions related to the energy it uses to near zero by the middle of this decade. It also announced a new initiative to use Microsoft technology to help its suppliers and customers around the world reduce their own carbon footprints and a new $1 billion climate innovation fund to accelerate the global development of carbon reduction, capture and removal technologies. Beginning next year, the company will also make carbon reduction an explicit aspect of its procurement processes for its supply chain. A new annual Environmental Sustainability Report will detail Microsoft’s carbon impact and reduction journey. And lastly, the company will use its voice and advocacy to support public policy that will accelerate carbon reduction and removal opportunities.

More information can be found at the Microsoft microsite: https://news.microsoft.com/climate.

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications, (425) 638-7777, [email protected]

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication, but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at https://news.microsoft.com/microsoft-public-relations-contacts.

Go to Original Article
Author: Microsoft News Center

Google Cloud retail strategy provides search, hosting, AI for stores

NEW YORK — Google made a pitch to chain-store brands this week, taking on Microsoft Azure and AWS with a bundle of fresh Google Cloud retail hosting and services, backing it up with blue-chip customers.

In sessions at NRF 2020 Vision: Retail’s Big Show, Google Cloud CEO Thomas Kurian and Carrie Tharp, retail and consumer vice president, wooed retailers with promises of AI, uptime and search expertise — including voice and video, in addition to traditional text content — as well as store employee collaboration tools.

Home improvement chain Lowe’s said it will embark on a multiyear plan to rebuild its customer experience, both in-store and online, with Google Cloud at its center. Lowe’s plans to spend $500 million per year through 2021 on the project.

Kohl’s, Ulta Beauty’s business drivers

“Customers expect retailers to be as good with their tech as they are with their physical stores,” said Paul Gaffney, CTO of Kohl’s. The 1,060-store chain launched a major overhaul of its digital customer experience and IT infrastructure in 2018 with Google Cloud retail services, and plans to migrate 70% of its apps into Google Anthos.

Retailers need cloud services that create value for their brands among its customers, Gaffney said, but uptime and scalability is also a major consideration during peak selling times.

“The big rush of business used to be Black Friday, last year was the Cyber Five [Thanksgiving until the following Monday], and now seems like the months of November and December,” Gaffney said in a session with Kurian. “Folks who have been doing this a long time know that we all used to provision a lot of gear that lay idle other than during that period.”

Ulta Beauty, which operates 1,124 stores, chose the Google Cloud Platform for its Ulta Rewards loyalty program hosting and customer data handling, said Michelle Pacynski, vice president of digital innovation at Ulta. The program has 33.9 million members and drives 95% of Ulta’s sales, she added.

Ulta chose Google in part for its data, analytics and personalization platform, Pacynski said. But data ownership also weighed heavily in the decision.

“We looked at the usual subjects, who you would think we would look at,” Pacynski said. “Ultimately for us, we wanted to own our data, we wanted to have power over our data. We evaluated everybody and looked at how we could remain more autonomous with our data.”

Google Cloud retail taking on Azure, AWS

Google’s charge into the retail space started last year with the introduction of retail-specific services to manage customer loyalty, store operations, inventory and product lifecycle management. At NRF 2020, Google added search, AI and hosting services to that stack. It’s part of Google’s bigger push into verticals, Tharp said.

“[Google] Cloud started as an infrastructure-as-a-service play,” Tharp said. “Really, where we see the future of cloud capabilities is in industry-specific solutions — having a deep understanding of the industry and building products specific to that. We’re constructing our entire organization around these industry-specific solutions.”

Tharp and some industry experts at NRF said that some retailers harbor resentment toward Amazon as a competitor and are looking for cloud partners other than AWS for future projects. But that is changing, as stores realize that offering Amazon-like speed of delivery and customer service in general is a more important business priority than beating Amazon.

Still, there’s enough anti-Amazon sentiment among retailers that Google has an opportunity to expand its foothold, said Sheryl Kingstone, 451 Research analyst.

“We’re seeing Google Cloud Platform pop up as one of the strategic vendors retailers are looking for in their digital transformations,” Kingstone said. “Azure is up there, and AWS is the 800-pound gorilla. But in the retail space, there’s that opportunity of stealing away someone who is very concerned about being on AWS.”

Go to Original Article
Author:

Launch of Microsoft Teams, consumer Skype integration delayed

Microsoft has again delayed the integration between Microsoft Teams and the consumer version of Skype. The company now plans to release the feature sometime between April and June, after previously saying it would go live this month.

Businesses have already waited more than three years for interoperability between Teams and consumer Skype. Some have complained that the feature’s absence has prevented their organizations from fully migrating from Skype for Business to Teams.

The integration will let users of Teams and consumer Skype message and call one another. Skype for Business has offered interoperability with consumer Skype for years. Many organizations rely on the feature to communicate with external contacts.

The lack of integration with consumer Skype underscores that Teams is still missing features available in Skype for Business, Raúl Castañón-Martinez, analyst at 451 Research, said. The feature gap persists despite Microsoft declaring in 2018 that the two apps had achieved feature parity.

Castañón-Martinez found it odd that Microsoft would push Skype for Business customers to upgrade to Teams yet fail to make it equal to the older application. “It seems like this consumer Skype integration should be a top priority for Microsoft,” he said.

A Microsoft representative declined to comment on what prompted the delay.

Microsoft is trailing competitors when it comes to enabling external collaboration within Teams, Castañón-Martinez said.

Although it supports one-to-one chats across company boundaries, Teams does not give organizations the ability to create common groups or channels for collaborating with external users. Slack supports such a setup through a feature called shared channels. 

More than 3,000 people have endorsed a request for consumer Skype integration on Microsoft’s user feedback website for Teams. Microsoft once said the feature could launch as soon as the second quarter of 2018 but shelved the item in May 2018 because of a “priority shift.”

In December, Microsoft updated its roadmap to indicate the feature would launch in January. But the company changed course this week, pushing the feature off until sometime in the second quarter of 2020.

Microsoft has struggled to meet deadlines for launching Teams features in the past. Last year, the company delayed the launch of private channels in Teams by roughly two months. The postponement came after users had been demanding the feature for more than two years.

The company has scheduled more than two dozen additional items on its roadmap for Teams. Meanwhile, the company has yet to announce release dates for several other features in high demand by users. Those include accessing Office 365 group calendars in Teams and the ability to archive channels.

Go to Original Article
Author:

January Patch Tuesday fixes cryptography bug found by NSA

With help from the National Security Agency, Microsoft closed a flaw in a key cryptographic feature as part of the January Patch Tuesday security updates.

Microsoft issued fixes for Windows, Internet Explorer, Office, several .NET technologies, OneDrive for Android and Microsoft Dynamics for January Patch Tuesday to close 49 unique vulnerabilities, with eight rated as critical. Microsoft said there were no exploited or publicly disclosed vulnerabilities. This month’s updates were the last free security fixes for Windows 7 and Windows Server 2008/2008 R2 as those operating systems left extended support.

Windows cryptographic library flaw fixed

The bug that drew the most attention from various security researchers on January Patch Tuesday is a spoofing vulnerability (CVE-2020-0601), rated important, that affects Windows 10 and Windows Server 2016 and 2019 systems. The NSA uncovered a flaw in the crypt32.dll file that handles certificate and cryptographic messaging functions in the Windows CryptoAPI. The bug would allow an attacker to produce a malicious program that appears to have an authenticated signature from a trusted source.

A successful exploit using a spoofed certificate could be used to launch several types of attacks, such as delivering a malicious file that appears trustworthy, performing man-in-the-middle campaigns and decoding sensitive data. An unpatched system could be particularly susceptible because the malicious file could appear legitimate and even skirt Microsoft’s AppLocker protection.

“The guidance from us would be, regardless of Microsoft’s ‘important’ classification, to treat this as a priority one and get the patch pushed out,” said Chris Goettl, director of product management and security at Ivanti, a security and IT management vendor based in South Jordan, Utah.

Goettl noted that companies might not be directly attacked with exploits that use the CryptoAPI bug, but could be at risk from attacks on the back-end system of a vendor or another outside entity, such as when attackers embedded the NotPetya ransomware in tax software to slip past defenses.

“It’s not a very common occurrence because good code-signing certificates can establish a level of trust, while this [vulnerability] invalidates that trust and allows an attacker to try and spoof that. It introduces a lot of potential for risk, so we recommend people close [CVE-2020-0601] down as quickly as possible,” he said.
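For administrators acting on that guidance, one quick sanity check is whether the update actually shows up in a host's installed hotfix list. The sketch below is not from the article; it calls PowerShell's Get-HotFix from Python, and the KB identifier is a hypothetical placeholder you would replace with the number of the January 2020 cumulative update for your Windows build.

```python
# Minimal sketch (not from the article): check whether a given Windows update
# is reported as installed, by invoking PowerShell's Get-HotFix from Python.
# KB_ID is a hypothetical placeholder, not a real KB number.
import subprocess

KB_ID = "KB0000000"  # placeholder -- substitute the relevant cumulative update

def hotfix_installed(kb_id: str) -> bool:
    """Return True if Get-HotFix lists the given KB on this machine."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", "Get-HotFix"],
        capture_output=True, text=True,
    )
    return kb_id.lower() in result.stdout.lower()

if __name__ == "__main__":
    print(f"{KB_ID}: {'installed' if hotfix_installed(KB_ID) else 'MISSING'}")
```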

Bugs in Windows remote connection technology patched

January Patch Tuesday also closed three vulnerabilities related to Remote Desktop Services rated critical.

CVE-2020-0609 and CVE-2020-0610 are both remote code execution vulnerabilities in the Remote Desktop Gateway that affect server operating systems on Windows Server 2012 and newer. Microsoft said both CVEs can be exploited pre-authentication without any interaction from the user. Attackers who use the exploit can run arbitrary code on the target system, then perform other tasks, including installing programs, deleting data or adding a new account with full user rights.

CVE-2020-0611 is a remote code execution vulnerability in the Remote Desktop Client that affects Windows 7 and newer on desktops, and Windows Server 2008 R2 and newer on server systems, when the attacker tricks a user into connecting to a malicious server. The threat actor could then perform a range of actions, such as installing programs, viewing or changing data, or creating a new account with full user rights.

Legacy operating systems reach end-of-life

January Patch Tuesday marks the last time Microsoft will provide security updates and other fixes for the Windows 7, Windows Server 2008 and 2008 R2 operating systems unless customers pay to enter the Extended Security Updates (ESU) program. Companies must also have Software Assurance coverage or subscription licenses to purchase ESU keys for the server operating systems. Users will need to add the ESU key to systems they want to keep protected. ESU for those systems will end in three years.

Companies that plan to keep these legacy operating systems and have signed up for the ESU program should install the servicing stack updates Microsoft released for all three operating systems on January Patch Tuesday, Goettl said. Administrators also need to deploy and activate the ESU key using Microsoft’s instructions.

ESU is an expensive option. For on-premises server workloads, organizations will need either Software Assurance or a subscription license at a cost of about 75% of the license cost each year.  

ESU does not add new or updated features, just security fixes.

For organizations that plan to keep these operating systems running without the safety net of ESU, there are a few ways to minimize the risk around those workloads, including adding more security layers and removing the workload from a direct connection to the internet, Goettl said.

“If there’s an application or something that needs to run on Windows 7, then virtualize that environment. Get the users on the Windows 10 platform and have them connect into the Windows 7 environment to access that critical app. You will spend more money doing it that way, but you will reduce your risk significantly,” he said.

Go to Original Article
Author:

Microsoft shares new technique to address online grooming of children for sexual purposes – Microsoft on the Issues

Online child exploitation is a horrific crime that requires a whole-of-society approach. Microsoft has a long-standing commitment to child online protection. First and foremost, as a technology company, we have a responsibility to create software, devices and services that have safety features built in from the outset. We leverage technology across our services to detect, disrupt and report illegal content, including child sexual exploitation. And we innovate and invest in tools, technology and partnerships to support the global fight needed to address online child sexual exploitation.

In furtherance of those commitments, today Microsoft is sharing a grooming detection technique, code name “Project Artemis,” by which online predators attempting to lure children for sexual purposes can be detected, addressed and reported. Developed in collaboration with The Meet Group, Roblox, Kik and Thorn, this technique builds off Microsoft patented technology and will be made freely available via Thorn to qualified online service companies that offer a chat function. Thorn is a technology nonprofit that builds technology to defend children from sexual abuse.

The development of this new technique began in November 2018 at a Microsoft “360 Cross-Industry Hackathon,” which was co-sponsored by the WePROTECT Global Alliance in conjunction with the Child Dignity Alliance. These “360” hackathons are multifaceted, focusing not just on technology and engineering but also on legal and policy aspects as well as operations and policy implementation. Today’s announcement marks the technical and engineering progress over the last 14 months by a cross-industry v-team from Microsoft, The Meet Group, Roblox, Kik, Thorn and others to help identify potential instances of child online grooming for sexual purposes and to operationalize an effective response. The teams were led by Dr. Hany Farid, a leading academic who, in 2009, partnered with Microsoft and Dartmouth College on the development of PhotoDNA, a free tool that has assisted in the detection, disruption and reporting of millions of child sexual exploitation images and is used by more than 150 companies and organizations around the world.

Building on the Microsoft patent, the technique is applied to historical text-based chat conversations. It evaluates and “rates” conversation characteristics and assigns an overall probability rating. This rating can then be used as a determiner, set by individual companies implementing the technique, as to when a flagged conversation should be sent to human moderators for review. Human moderators could then identify imminent threats for referral to law enforcement and report incidents of suspected child sexual exploitation to the National Center for Missing and Exploited Children (NCMEC). NCMEC, along with ECPAT International, INHOPE and the Internet Watch Foundation (IWF), provided valuable feedback throughout the collaborative process.
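To make that flow concrete, here is a purely illustrative sketch of the rate-then-threshold pattern the article describes. The scoring heuristic, threshold value and pattern list are hypothetical stand-ins; Project Artemis’ actual model and criteria are not public.

```python
# Illustrative sketch only: rate a historical text conversation, then route it
# to human review if the rating crosses a company-defined threshold. The
# scoring logic is a trivial stand-in for a real trained classifier.
from dataclasses import dataclass
from typing import List

REVIEW_THRESHOLD = 0.8  # hypothetical value, set by each implementing company

@dataclass
class Conversation:
    conversation_id: str
    messages: List[str]

def rate_conversation(conv: Conversation) -> float:
    """Return an overall probability-style rating between 0.0 and 1.0."""
    risky_patterns = {"example_pattern_one", "example_pattern_two"}  # placeholders
    hits = sum(any(p in m.lower() for p in risky_patterns) for m in conv.messages)
    return min(1.0, hits / max(1, len(conv.messages)))

def flag_for_review(conversations: List[Conversation]) -> List[str]:
    """Return IDs of conversations a human moderator should examine."""
    return [c.conversation_id for c in conversations
            if rate_conversation(c) >= REVIEW_THRESHOLD]
```

In practice, each implementing company would tune the threshold and feed flagged conversations into its own moderation and reporting tooling.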

Beginning on January 10, 2020, licensing and adoption of the technique will be handled by Thorn. Companies and services wanting to test and adopt the technique can contact Thorn directly at [email protected]. Microsoft has been leveraging the technique in programs on our Xbox platform for several years and is exploring its use in chat services, including Skype.

“Project Artemis” is a significant step forward, but it is by no means a panacea. Child sexual exploitation and abuse online and the detection of online child grooming are weighty problems. But we are not deterred by the complexity and intricacy of such issues. On the contrary, we are making the tool available at this point in time to invite further contributions and engagement from other technology companies and organizations with the goal of continuous improvement and refinement.

At Microsoft, we embrace a multi-stakeholder model to combat online child exploitation that includes survivors and their advocates, government, tech companies and civil society working together. Combating online child exploitation should and must be a universal call to action.

Learn how to detect, remove and report child sexual abuse materials at PhotoDNA or contact [email protected]. Follow @MSFTissues on Twitter.

Go to Original Article
Author: Microsoft News Center

AWS Outposts vs. Azure Stack vs. HCI

Giants Amazon and Microsoft offer cloud products and services that compete in areas usually reserved for the strengths that traditional hyper-converged infrastructure platforms bring to the enterprise IT table. These include hybrid cloud offerings AWS Outposts, which Amazon made generally available late last year, and Azure Stack from Microsoft.

An integrated hardware and software offering, Azure Stack is designed to deliver Microsoft Azure public cloud services to enable enterprises to construct hybrid clouds in a local data center. It delivers IaaS and PaaS for organizations developing web apps. By sharing its code, APIs and management portal with Microsoft Azure, Azure Stack provides a common platform to address hybrid cloud issues, such as maintaining consistency between cloud and on-premises environments. Stack is for those who want the benefits of a cloud-like platform but must keep certain data private due to regulations or some other constraint.

AWS Outposts is Amazon’s on-premises version of its IaaS offering. Amazon targets AWS Outposts at those who want to run workloads on Amazon Web Services, but instead of in the cloud, do so inside their own data centers to better meet regulatory requirements and, for example, to reduce latency.

Let’s delve deeper into AWS Outposts vs. Azure Stack to better see how they compete with each other and your typical hyper-converged infrastructure (HCI) deployment.

What is AWS Outposts?

AWS Outposts is Amazon’s acknowledgment that most enterprise class organizations prefer hybrid cloud to a public cloud-only model. Amazon generally has acted solely as a hyperscale public cloud provider, leaving its customers’ data center hardware needs for other vendors to handle. With AWS Outposts, however, Amazon is — for the first time — making its own appliances available for on-premises use.

AWS Outposts customers can run AWS on premises. They can also extend their AWS virtual private clouds into their on-premises environments, so a single virtual private cloud can contain both cloud and data center resources. That way, workloads with low-latency or geographical requirements can remain on premises while other workloads run in the Amazon cloud. Because Outposts is essentially an on-premises extension of the Amazon cloud, it also aims to ease the migration of workloads between the data center and the cloud.

What is Microsoft Azure Stack?

Although initially marketed as simply a way to host Azure services on premises, Azure Stack has evolved into a portfolio of products. The three products that make up the Azure Stack portfolio are Azure Stack Edge, Azure Stack Hub and Azure Stack HCI.

Azure Stack Edge is a cloud-managed appliance that enables you to run managed virtual machine (VM) and container workloads on premises. While this can also be done with Windows Server, the benefit to using Azure Stack Edge is workloads can be managed with a common tool set, whether they’re running on premises or in the cloud.

Azure Stack Hub is used for running cloud applications on premises. It’s mostly for situations in which data sovereignty is required or where connectivity isn’t available.

As its name implies, Azure Stack HCI is a version of Azure Stack that runs on HCI hardware.

AWS Outposts vs. Azure Stack vs. HCI

To appreciate how AWS Outposts competes with traditional HCI, consider common HCI use cases. HCI is often used as a virtualization platform. While AWS Outposts will presumably be able to host Elastic Compute Cloud virtual machine instances, the bigger news is that Amazon is preparing to release a VMware-specific version of Outposts in 2020. The VMware Cloud on AWS Outposts will allow a managed VMware software-defined data center to run on the Outposts infrastructure.

Organizations are also increasingly using HCI as a disaster recovery platform. While Amazon isn’t marketing Outposts as a DR tool, the fact that Outposts acts as a gateway between on-premises services and services running in the Amazon cloud means the platform will likely be well positioned as a DR enabler.

Many organizations have adopted hyper-converged systems as a platform for running VMs and containers. Azure Stack Edge may end up displacing some of those HCIs if an organization is already hosting VMs and containers in the Azure cloud. As for Azure Stack Hub, it seems unlikely that it will directly compete with HCI, except possibly in some specific branch office scenarios.

The member of the Azure Stack portfolio that’s most likely to compete with traditional hyper-convergence is Azure Stack HCI. It’s designed to run scalable VMs and provide those VMs with connectivity to Azure cloud services. These systems are being marketed for use in branch offices and with high-performance workloads.

Unlike first-generation HCI systems, Azure Stack HCI will provide scalability for both compute and storage. This could make it a viable replacement for traditional HCI platforms.

In summary, when it comes to AWS Outposts vs. Azure Stack or standard hyper-convergence, all three platforms have their merits, without any one being clearly superior to the others. If an organization is trying to choose between the three, then my advice would be to choose the platform that does the best job of meshing with the existing infrastructure and the organization’s operational requirements. If the organization already has a significant AWS or Azure footprint, then Outposts or Azure Stack would probably be a better fit, respectively. Otherwise, traditional HCI is probably going to entail less of a learning curve and may also end up being less expensive.

Go to Original Article
Author:

Using the Windows Admin Center Azure services feature

To drive adoption of its cloud platform, Microsoft is lowering the technical barrier to Azure through the Windows Admin Center management tool.

Microsoft increasingly blurs the lines between on-premises Windows Server operating systems and its cloud platform.

One way the company has done this is by exposing Azure services alongside Windows Server services in the Windows Admin Center. Organizations that might have been reluctant to go through a lengthy deployment process that required PowerShell expertise can use the Windows Admin Center Azure functionality to set up a hybrid arrangement with just a few clicks in some instances.

Azure Backup

One of the Azure services that Windows Server 2019 can use natively is Azure Backup. This cloud service backs up on-premises resources to Azure. This service offers 9,999 recovery points for each instance and is capable of triple redundant storage within a single Azure region by creating three replicas.

Azure Backup can also provide geo-redundant storage, which insulates protected resources against regional disasters.

You access Azure Backup through the Windows Admin Center, as shown in Figure 1. After you register Windows Server with Azure, setting up Azure Backup takes four steps.

Figure 1: The Windows Admin Center walks you through the steps to set up Azure Backup.

Microsoft designed Azure Backup to replace on-premises backup products. Organizations may find that Azure Backup is less expensive than their existing backup system, but the opposite may also be true. The costs vary widely depending on the volume of data, the type of replication and the data retention policy.
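If you prefer to confirm the resulting configuration from code rather than the portal, the following sketch (assuming the azure-identity and azure-mgmt-recoveryservices Python packages) lists the Recovery Services vaults in a resource group; the subscription ID and resource group name are hypothetical placeholders.

```python
# Minimal sketch: list the Recovery Services vaults in a resource group so you
# can confirm the vault that the registered server is backing up to.
# Assumes the azure-identity and azure-mgmt-recoveryservices packages.
from azure.identity import DefaultAzureCredential
from azure.mgmt.recoveryservices import RecoveryServicesClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
RESOURCE_GROUP = "backup-rg"  # placeholder

client = RecoveryServicesClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
for vault in client.vaults.list_by_resource_group(RESOURCE_GROUP):
    print(vault.name, vault.location)
```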

Azure Active Directory

Microsoft positions the Windows Admin Center as one of the primary management tools for Windows Server. Because sensitive resources are exposed within the Windows Admin Center console, Microsoft offers a way to add an extra layer of security through Azure Active Directory.

When you enable the requirement for Azure Active Directory security, you will be required to authenticate to both the local machine and Azure Active Directory.

To use Azure Active Directory, you must first register the Windows Server with Azure. Then, to require Azure Active Directory authentication, open the Windows Admin Center, click the Settings icon and select the Access tab. Figure 2 shows a simple toggle switch to turn Azure Active Directory authentication on or off.

Figure 2: The toggle switch in the Windows Admin Center sets up Azure Active Directory authentication.

Azure Site Recovery

Azure Site Recovery replicates machines running on-premises to the Microsoft Azure cloud. If a disaster occurs, you can fail over mission-critical workloads to use the replica VMs in the cloud. Once on-premises functionality returns, you can fail back workloads to your data center. Using the Azure cloud as a recovery site is far more cost-effective than building your own recovery data center, or even using a co-location facility.

Like other Azure services, Azure Site Recovery is exposed through the Windows Admin Center. To use it, the server must be registered with Azure. Although Hyper-V is the preferred hosting platform for use with Azure Site Recovery, the service also supports the replication of VMware VMs. The service also replicates between Azure VMs.

To enable a VM for use with the Azure Site Recovery services, open the Windows Admin Center and click on the Virtual Machines tab. This portion of the console is divided into two separate tabs. A Summary tab details the host’s hardware resource consumption, while the Inventory tab lists the individual VMs on the host.

Click on the Inventory tab and then select the checkbox for the VM you want to replicate to the Azure cloud. You can select multiple VMs and there is also a checkbox above the Name column to select all the VMs on the list. After selecting one or more VMs, click on More, and then choose the Set Up VM Protection option from the drop-down list, shown in Figure 3.

Figure 3: To set up replication to Azure with the Azure Site Recovery service, select one or more VMs and then choose the Set Up VM Protection option.

The console will open a window to set up the host with Azure Site Recovery. Select the Azure subscription to use, and create or select a resource group and a recovery vault. You will also need to select a location, as shown in Figure 4.

Figure 4: After you select the VMs to protect in Azure Site Recovery, finalize the process by selecting a location in the Azure cloud.

Storage Migration Service

The Storage Migration Service migrates the contents of existing servers to new physical servers, VMs or to the Azure cloud. This can help organizations reduce costs through workload consolidation.

You access the Storage Migration Service by selecting the Storage Migration Service tab in the Windows Admin Center, which opens a dialog box outlining the storage migration process as shown in Figure 5. The migration involves getting an inventory of your servers, transferring the data from those servers to the new location, and then cutting over to the new server.

Figure 5: Microsoft developed Storage Migration Services to ease migrations to new servers, VMs or Azure VMs through a three-step process.

As time goes on, it seems almost inevitable that Microsoft will update the Windows Admin Center to expose even more Azure services. Eventually, this console will likely provide access to all of the native Windows Server services and all services running in Azure.

Go to Original Article
Author:

How to Use Azure Arc for Hybrid Cloud Management and Security

Azure Arc is a new hybrid cloud management option announced by Microsoft in November of 2019. This article serves as a single point of reference for all things Azure Arc.

According to Microsoft’s CEO Satya Nadella, “Azure Arc really marks the beginning of this new era of hybrid computing where there is a control plane built for multi-cloud, multi-edge” (Microsoft Ignite 2019 Keynote at 14:40). That is a strong statement from one of the industry leaders in cloud computing, especially since hybrid cloud computing has already been around for a decade.

Essentially, Azure Arc allows organizations to use Azure’s management technologies (“control plane”) to centrally administer public cloud resources along with on-premises servers, virtual machines, and containers. Since Microsoft Azure already manages distributed resources at scale, Microsoft is empowering its users to utilize these same features for all of their hardware, including edge servers. All of Azure’s AI, automation, compliance and security best practices are now available to manage all of their distributed cloud resources, and their underlying infrastructure, which is known as “connected machines.” Additionally, several of Azure’s AI and data services can now be deployed on-premises and centrally managed through Azure Arc, enhancing local and offline management and offering greater data sovereignty.

Again, this article will provide an overview of the Azure Arc technology and its key capabilities (currently in Public Preview) and will be updated over time.

Video Preview of Azure Arc

Contents

Getting Started with Azure Arc

Azure Services with Azure Arc

Azure Artificial Intelligence (AI) with Azure Arc

Azure Automation with Azure Arc

Azure Cost Management & Billing with Azure Arc

Azure Data Services with Azure Arc

Cloud Availability with Azure Arc

Azure Availability & Resiliency with Azure Arc

Azure Backup & Restore with Azure Arc

Azure Site Recovery & Geo-Replication with Azure Arc

Cloud Management with Azure Arc

Management Tools with Azure Arc

Managing Legacy Hardware with Azure Arc

Offline Management with Azure Arc

Always Up-To-Date Tools with Azure Arc

Cloud Security & Compliance with Azure Arc

Azure Key Vault with Azure Arc

Azure Monitor with Azure Arc

Azure Policy with Azure Arc

Azure Security Center with Azure Arc

Azure Advanced Threat Protection with Azure Arc

Azure Update Management with Azure Arc

Role-Based Access Control (RBAC) with Azure Arc

DevOps and Application Management with Azure Arc

Azure Kubernetes Service (AKS) & Kubernetes App Management with Azure Arc

Other DevOps Tools with Azure Arc

DevOps On-Premises with Azure Arc

Elastic Scalability & Rapid Deployment with Azure Arc

Hybrid Cloud Integration with Azure Arc

Azure Stack Hub with Azure Arc

Azure Stack Edge with Azure Arc

Azure Stack Hyperconverged Infrastructure (HCI) with Azure Arc

Managed Service Providers (MSPs) with Azure Arc

Azure Lighthouse Integration with Azure Arc

Third-Party Integration with Azure Arc

Amazon Web Services (AWS) Integration with Azure Arc

Google Cloud Platform (GCP) Integration with Azure Arc

IBM Kubernetes Service Integration with Azure Arc

Linux VM Integration with Azure Arc

VMware Cloud Solution Integration with Azure Arc

Getting Started with Azure Arc

The Azure Arc public preview was announced in November 2019 at the Microsoft Ignite conference to much fanfare. In its initial release, many fundamental Azure services are supported along with Azure Data Services. Over time, it is expected that a majority of Azure Services will be supported by Azure Arc.

To get started with Azure Arc, check out the following guides and documentation provided by Microsoft.

Additional information will be added once it is made available.

Azure Services with Azure Arc

One of the fundamental benefits of Azure Arc is the ability to bring Azure services to a customer’s own datacenter. In its initial release, Azure Arc includes services for AI, automation, availability, billing, data, DevOps, Kubernetes management, security, and compliance. Over time, additional Azure services will be available through Azure Arc.

Azure Artificial Intelligence (AI) with Azure Arc

Azure Arc leverages Microsoft Azure’s artificial intelligence (AI) services, to power some of its advanced decision-making abilities learned from managing millions of devices at scale. Since Azure AI is continually monitoring billions of endpoints, it is able to perform tasks that can only be achieved at scale, such as identifying an emerging malware attack. Azure AI improves security, compliance, scalability and more for all cloud resources managed by Azure Arc. The services which run Azure AI are hosted in Microsoft Azure, and in disconnected environments, much of the AI processing can run on local servers using Azure Stack Edge.

For more information about Azure AI visit https://azure.microsoft.com/en-us/overview/ai-platform.

Azure Automation with Azure Arc

Azure Automation is a service provided by Azure that automates repetitive tasks which can be time-consuming or error-prone. This saves the organization significant time and money while helping them maintain operational consistency. Custom automation scripts can get triggered by a schedule or an event to automate servicing, track changes, collect inventory and much more. Since Azure Automation uses PowerShell, Python, and graphical runbooks, it can manage diverse software and hardware that supports PowerShell or has APIs. With Azure Arc, any on-premises connected machines and the applications they host can be integrated and automated with any Azure Automation workflow. These workflows can also be run locally on disconnected machines.

For more information about Azure Automation visit https://azure.microsoft.com/en-in/services/automation.
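As a rough illustration of the kind of repetitive check such a workflow might automate, here is a minimal Python runbook-style script. The host names are hypothetical placeholders, and a production runbook would typically pull its inventory and credentials from Azure Automation assets rather than hard-coding them.

```python
# Illustrative runbook-style script: check that a set of endpoints respond, and
# report the ones that do not so a follow-up workflow (or a human) can act.
# Host names and ports are hypothetical placeholders.
import socket

HOSTS = [("app-server-01", 443), ("app-server-02", 443)]  # placeholders

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host, port in HOSTS:
    if not is_reachable(host, port):
        print(f"ALERT: {host}:{port} did not respond")
```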

Azure Cost Management & Billing with Azure Arc

Microsoft Azure and other cloud providers use a consumption-based billing model so that tenants only pay for the resources which they consume. Azure Cost Management and Billing provides granular information to understand how cloud storage, network, memory, CPUs and any Azure services are being used. Organizations can set thresholds and get alerts when any consumer or business unit approaches or exceeds their limits. With Azure Arc, organizations can use cloud billing to optimize and manage costs for their on-premises resources also. In addition to Microsoft Azure and Microsoft hybrid cloud workloads, all Amazon AWS spending can be integrated into the same dashboard.

For more information about Azure Cost Management and Billing visit https://azure.microsoft.com/en-us/services/cost-management.

Azure Data Services with Azure Arc

Azure Data Services is the first major service provided by Azure Arc for on-premises servers. This was the top request of many organizations which want the management capabilities of Microsoft Azure, yet need to keep their data on-premises for data sovereignty. This makes Azure Data Services accessible to companies that must keep their customer’s data onsite, such as those working within regulated industries or those which do not have an Azure datacenter within their country.

In the initial release, both Azure SQL Database and Azure Database for PostgreSQL Hyperscale will be available for on-premises deployments. Now organizations can run and offer database as a service (DBaaS) as a platform as a service (PaaS) offering to their tenants. This makes it easier for users to deploy and manage cloud databases on their own infrastructure, without the overhead of setting up and maintaining the infrastructure on a physical server or virtual machine. The Azure Data Services on Azure Arc still require an underlying Kubernetes cluster, but many management frameworks are supported by Microsoft Azure and Azure Arc.

All of the other Azure Arc benefits are included with the data services, such as automation, backup, monitoring, scaling, security, patching and cost management. Additionally, Azure Data Services can run on both connected and disconnected machines. The latest features and updates to the data services are automatically pushed down from Microsoft Azure to Azure Arc members so that the infrastructure is always current and consistent.

For more information about Azure Data Services with Azure Arc visit https://azure.microsoft.com/en-us/services/azure-arc/hybrid-data-services.

Cloud Availability with Azure Arc

One of the main advantages offered by Microsoft Azure is access to its unlimited hardware spread across multiple datacenters which provide business continuity. This gives Azure customers numerous ways to increase service availability, retain more backups, and gain disaster recovery capabilities. With the introduction of Azure Arc, these features provide even greater integration between on-premises servers and Microsoft Azure.

Azure Availability & Resiliency with Azure Arc

With Azure Arc, organizations can leverage Azure’s availability and resiliency features for their on-premises servers. Virtual Machine Scale Sets allow automatic application scaling by rapidly deploying dozens (or thousands) of VMs to quickly increase the processing capabilities of a cloud application. Integrated load-balancing will distribute network traffic, and redundancy is built into the infrastructure to eliminate single points of failure. VM Availability Sets give administrators the ability to select a group of related VMs and force them to distribute themselves across different physical servers. This is recommended for redundant servers or guest clusters where it is important to have each virtualized instance spread out so that the loss of a single host will not take down an entire service. Azure Availability Zones extend this concept across multiple datacenters by letting organizations deploy datacenter-wide protection schemes that distribute applications and their data across multiple sites. Azure’s automated updating solutions are availability-aware so they will keep services online during a patching cycle, serially updating and rebooting a subset of hosts. Azure Arc helps hybrid cloud services take advantage of all of the Azure resiliency features.

For more information about Azure availability and resiliency visit https://azure.microsoft.com/en-us/features/resiliency.

Azure Backup & Restore with Azure Arc

Many organizations limit their backup plans because of their storage constraints since it can be costly to store large amounts of data which may not need to be accessed again. Azure Backup helps organizations by allowing their data to be backed up and stored on Microsoft Azure. This usually reduces costs as users are only paying for the storage capacity they are using. Additionally storing backups offsite helps minimize data loss as offsite backups provide resiliency to site-wide outages and can protect customers from ransomware. Azure Backup also offers compression, encryption and retention policies to help organizations in regulated industries. Azure Arc manages the backups and recovery of on-premises servers with Microsoft Azure, with the backups being stored in the customer’s own datacenter or in Microsoft Azure.

For more information about Azure Backup visit https://azure.microsoft.com/en-us/services/backup.

Azure Site Recovery & Geo-Replication with Azure Arc

One of the more popular hybrid cloud features enabled with Microsoft Azure is the ability to replicate data from an on-premises location to Microsoft Azure using Azure Site Recovery (ASR). This allows users to have a disaster recovery site without needing to have a second datacenter. ASR is easy to deploy, configure and operate, and it can even test disaster recovery plans. Using Azure Arc it is possible to set up geo-replication to move data and services from a managed datacenter running Windows Server Hyper-V, VMware vCenter or the Amazon Web Services (AWS) public cloud. Destination datacenters can include other datacenters managed by Azure Arc, Microsoft Azure and Amazon AWS.

For more information about Azure Site Recovery visit https://azure.microsoft.com/en-us/services/site-recovery.

Cloud Management with Azure Arc

Azure Arc introduces some on-premises management benefits which were previously available only in Microsoft Azure. These help organizations administer legacy hardware and disconnected machines with Azure-consistent features using multiple management tools.

Management Tools with Azure Arc

One of the fundamental design concepts of Microsoft Azure is to have centralized management layers (“planes”) that support diverse hardware, data, and administrative tools. The fabric plane controls the hardware through a standard set of interfaces and APIs. The data plane allows unified management of structured and unstructured data. And the control plane offers centralized management through various interfaces, including the GUI-based Azure Portal, Azure PowerShell, and other APIs. Each of these layers interfaces with each other through a standard set of controls, so that the operational steps will be identical whether a user deploys a VM via the Azure Portal or via Azure PowerShell. Azure Arc can manage cloud resources with the following Azure Developer Tools:

At the time of this writing, the documentation for Azure Arc is not yet available, but some examples can be found in the quick start guides which are linked in the Getting Started with Azure Arc section.
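As a stopgap until that documentation arrives, here is a rough sketch of how connected machines could be enumerated with the Azure SDK for Python, under the assumption that they surface as ordinary Azure resources of type Microsoft.HybridCompute/machines; the subscription ID is a placeholder, and the azure-identity and azure-mgmt-resource packages are assumed.

```python
# Rough sketch: enumerate Azure Arc connected machines across a subscription
# the same way as any other Azure resource. Assumes azure-identity and
# azure-mgmt-resource, and that connected machines register under the
# Microsoft.HybridCompute/machines resource type.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
machines = client.resources.list(
    filter="resourceType eq 'Microsoft.HybridCompute/machines'"
)
for machine in machines:
    print(machine.name, machine.location)
```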

Managing Legacy Hardware with Azure Arc

Azure Arc is hardware-agnostic, allowing Azure to manage a customer’s diverse or legacy hardware just like an Azure datacenter server. The hardware must meet certain requirements so that a virtualized Kubernetes cluster can be deployed on it, as Azure services run on this virtualized infrastructure. In the Public Preview, servers must be running Windows Server 2012 R2 (or newer) or Ubuntu 16.04 or 18.04. Over time, additional servers will be supported, with rumors of 32-bit (x86), Oracle and Linux hosts being supported as infrastructure servers.

Offline Management with Azure Arc

Azure Arc will even be able to manage servers that are not regularly connected to the Internet, as is common with the military, emergency services, and sea vessels. Azure Arc has a concept of “connected” and “disconnected” machines. Connected servers have an Azure Resource ID and are part of an Azure Resource group. If a server does not sync with Microsoft Azure every 5 minutes, it is considered disconnected, yet it can continue to run its local resources. Azure Arc allows these organizations to use the latest Azure services when they are connected, yet still use many of these features, including Azure Data Services, if the servers do not maintain an active connection. Even some services which run Azure AI and are hosted in Microsoft Azure can work in disconnected environments while running on Azure Stack Edge.

Always Up-To-Date Tools with Azure Arc

One of the advantages of using Microsoft Azure is that all the services are kept current by Microsoft. The latest features, best practices, and AI learning are automatically available to all users in real-time as soon as they are released. When an admin logs into the Azure Portal through a web browser, they are immediately exposed to the latest technology to manage their distributed infrastructure. By ensuring that all users have the same management interface and APIs, Microsoft can guarantee consistency of behavior for all users across all hardware, including on-premises infrastructure when using Azure Arc. However, if the hardware is in a disconnected environment (such as on a sea vessel), there could be some configuration drift as older versions of Azure data services and Azure management tools may still be used until they are reconnected and synced.

Cloud Security & Compliance with Azure Arc

Public cloud services like Microsoft Azure are able to offer industry-leading security and compliance due to their scale and expertise. Microsoft employs more than 3,500 of the world’s leading security engineers who have been collaborating for decades to build the industry’s safest infrastructure. Through its billions of endpoints, Microsoft Azure leverages Azure AI to identify anomalies and detect threats before they become widespread. Azure Arc extends all of the security features offered in Microsoft Azure to on-premises infrastructure, including key vaults, monitoring, policies, security, threat protection, and update management.

Azure Key Vault with Azure Arc

When working in a distributed computing environment, managing credentials, passwords, and user access can become complicated. Azure Key Vault is a service that helps enhance data protection and compliance by securely protecting all keys and monitoring access. Azure Key Vault is supported by Azure Arc, allowing credentials for on-premises services and hybrid clouds to be centrally managed through Azure.

For more information about Azure Key Vault visit https://azure.microsoft.com/en-us/services/key-vault.
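As a simple sketch of what that looks like from an application’s point of view (assuming the azure-identity and azure-keyvault-secrets Python packages), a hybrid workload can fetch a credential from Key Vault at runtime instead of storing it locally; the vault URL and secret name below are hypothetical placeholders.

```python
# Minimal sketch: an on-premises or hybrid workload retrieving a credential
# from Azure Key Vault instead of keeping it on the local machine.
# Assumes azure-identity and azure-keyvault-secrets; names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://contoso-vault.vault.azure.net"  # placeholder
SECRET_NAME = "db-connection-string"  # placeholder

client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())
secret = client.get_secret(SECRET_NAME)
print(f"Retrieved secret '{SECRET_NAME}' (value intentionally not printed)")
```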

Azure Monitor with Azure Arc

Azure Monitor is a service that collects and analyzes telemetry data from Azure infrastructure, networks, and applications. The logs from managed services are sent to Azure Monitor where they are aggregated and analyzed. If a problem is identified, such as an offline server, it can trigger alerts or use Azure Automation to launch recovery workflows. Azure Arc can now monitor on-premises servers, networks, virtualization infrastructure, and applications, just like they were running in Azure. It even leverages Azure AI and Azure Automation to make recommendations and fixes to hybrid cloud infrastructure.

For more information about Azure Monitor visit https://azure.microsoft.com/en-us/services/monitor.

Azure Policy with Azure Arc

Most enterprises have certain compliance requirements for the IT infrastructure, especially those organizations within regulated industries. Azure Policy uses Microsoft Azure to audit an environment and aggregate all the compliance data into a single location. Administrators can get alerted about misconfigurations or configuration drifts and even trigger automated remediation using Azure Automation. Azure Policy can be used with Azure Arc to apply policies on all connect machines, providing the benefits of cloud compliance to on-premises infrastructure.

For more information about Azure Policy visit https://azure.microsoft.com/en-us/services/azure-policy.
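A rough sketch of auditing that compliance surface from code, assuming the azure-identity and azure-mgmt-resource Python packages: list the policy assignments in effect for a subscription, which with Azure Arc would also govern connected on-premises machines. The subscription ID is a placeholder.

```python
# Rough sketch: list the policy assignments applied to a subscription.
# Assumes azure-identity and azure-mgmt-resource; the subscription ID is a
# hypothetical placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import PolicyClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

client = PolicyClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
for assignment in client.policy_assignments.list():
    print(assignment.name, assignment.display_name)
```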

Azure Security Center with Azure Arc

The Azure Security Center centralizes all security policies and protects the entire managed environment. When Security Center is enabled, the Azure monitoring agents will report data back from the servers, networks, virtual machines, databases, and applications. The Azure Security Center analytics engines will ingest the data and use AI to provide guidance. It will recommend a broad set of improvements to enhance security, such as closing unnecessary ports or encrypting disks. Perhaps most importantly it will scan all the managed servers and identify updates that are missing, and it can use Azure Automation and Azure Update Management to patch those vulnerabilities. Azure Arc extends these security features to connected machines and services to protect all registered resources.

For more information about Azure Security Center visit https://azure.microsoft.com/en-us/services/security-center.

Azure Advanced Threat Protection with Azure Arc

Azure Advanced Threat Protection (ATP) is a leading cloud security solution that uses Azure AI to look for anomalies and potential attacks. Azure ATP watches for suspicious computer or user activity and reports alerts in real time. Azure Arc lets organizations extend this cloud protection to their hybrid and on-premises infrastructure, offering leading threat protection across all of their cloud resources.

For more information about Azure Advanced Threat Protection visit https://azure.microsoft.com/en-us/features/azure-advanced-threat-protection.

Azure Update Management with Azure Arc

Microsoft Azure automates the process of applying patches, updates, and security hotfixes to the cloud resources it manages. With Update Management, a series of updates can be scheduled and deployed to non-compliant servers using Azure Automation. Update Management is aware of clusters and availability sets, keeping distributed workloads online while their infrastructure is patched by live migrating running VMs or containers between hosts. Azure centrally manages updates, assessment reports, and deployment results, and it can create alerts for failures or other conditions. Organizations can use Azure Arc to automatically assess and patch their on-premises and connected servers, virtual machines, and applications.

For more information about Azure Update Management visit https://docs.microsoft.com/en-us/azure/automation/automation-update-management.

Role-Based Access Control (RBAC) with Azure Arc

Controlling access to different resources is a critical function for any organization to enforce security and compliance. Microsoft Azure Active Directory (Azure AD) allows customers to define granular access control for every user or user role based on different types of permissions (read, modify, delete, copy, share, etc.). Azure also provides more than 70 built-in roles, such as Global Administrator, Virtual Machine Contributor, and Billing Administrator. Azure Arc lets businesses extend role-based access control (RBAC) managed by Azure to on-premises environments. This means that the groups, policies, settings, security principals, and managed identities defined in Azure AD can now govern access to all managed resources, wherever they run. Azure AD also provides auditing, so it is easy to track any changes made by users or security principals across the hybrid cloud.
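
For illustration, the following sketch uses the azure-mgmt-authorization Python package to audit the role assignments on a resource group that holds Arc-connected machines. The subscription ID and resource group name are placeholders:

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

credential = DefaultAzureCredential()
auth_client = AuthorizationManagementClient(credential, "<subscription-id>")

# Scope can be a subscription, resource group, or a single resource;
# here it is the resource group containing the Arc-connected servers.
scope = "/subscriptions/<subscription-id>/resourceGroups/<arc-resource-group>"
for assignment in auth_client.role_assignments.list_for_scope(scope):
    # Each assignment ties a principal (user, group, or managed identity)
    # to a role definition at this scope.
    print(assignment.principal_id, assignment.role_definition_id)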

For more information about Role-Based Access Control visit https://docs.microsoft.com/en-us/azure/role-based-access-control/overview.

DevOps and Application Management with Azure Arc

Over the past few years, containers have become more commonplace because they provide certain advantages over VMs, allowing applications and services to be abstracted from the underlying virtualized infrastructure. Containerized applications can be deployed uniformly anywhere with the same tools, so users do not have to worry about the hardware configuration. This technology has become popular among application developers, enabling them to manage the entire application development lifecycle without depending on the IT department to set up physical or virtual infrastructure, a development methodology often called DevOps. One of the key design requirements for Azure Arc was to make it hardware agnostic, so developers can manage their containerized applications the same way whether they are running in Azure, on-premises, or in a hybrid configuration.

Azure Kubernetes Service (AKS) & Kubernetes App Management with Azure Arc

Kubernetes is an orchestration platform that allows developers to deploy, manage, and update their containers. Azure Kubernetes Service (AKS) is Microsoft’s managed Kubernetes offering, and it can be integrated with Azure Arc. This means that AKS and Azure can be used to manage containers running on on-premises servers. In addition to Azure Kubernetes Service, Azure Arc can be integrated with other Kubernetes management platforms, including Amazon EKS, Google Kubernetes Engine, and IBM Kubernetes Service.
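
Because an Arc-connected cluster exposes the standard Kubernetes API, the same tooling works regardless of where the cluster runs. The sketch below uses the official kubernetes Python client and assumes a kubeconfig entry already points at the target cluster, whether that is AKS in Azure or an Arc-connected cluster on-premises:

from kubernetes import client, config

# Read the current kubeconfig context; the same code works against any
# conformant cluster, in the cloud or in a private datacenter.
config.load_kube_config()

apps = client.AppsV1Api()
for deployment in apps.list_deployment_for_all_namespaces().items:
    # Print each deployment and how many of its replicas are ready.
    print(deployment.metadata.namespace, deployment.metadata.name,
          deployment.status.ready_replicas)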

For more information about Azure Container Services visit https://azure.microsoft.com/en-us/product-categories/containers and for Azure Kubernetes Services (AKS) visit https://azure.microsoft.com/en-us/services/kubernetes-service.

Other DevOps Tools with Azure Arc

For container management on Azure Arc, developers can use any of the common Kubernetes management platforms, including Azure Kubernetes Service, Amazon EKS, Google Kubernetes Engine, and IBM Kubernetes Service. All standard deployment and management operations are supported on Azure Arc-enabled hardware, enabling cross-cloud management.

More information about the non-Azure management tools is provided in the section on Third-Party Management Tools.

DevOps On-Premises with Azure Arc

Many developers prefer to work on their own hardware and some are required to develop applications in a private environment to keep their data secure. Azure Arc allows developers to build and deploy their applications anywhere utilizing Azure’s cloud-based AI, security and other cloud features while retaining their data, IP or other valuable assets within their own private cloud. Additionally, Azure Active Directory can use role-based access control (RBAC) and Azure Policies to manage developer access to sensitive company resources.

Elastic Scalability & Rapid Deployment with Azure Arc

Containerized applications are designed to start quickly when running on a highly available Kubernetes cluster. Because containers share the host operating system rather than booting a full guest OS, they can be deployed and scaled rapidly. These applications can scale out to very large capacity when deployed on Microsoft Azure, and when using Azure Arc they can be managed across public and private clouds. Applications will usually contain several container types that can be deployed in different locations based on their requirements. A common deployment configuration for a two-tiered application is to run the web frontend on Microsoft Azure for scalability and the database in a secure private cloud backend.

Hybrid Cloud Integration with Azure Arc

Microsoft’s hybrid cloud initiatives over the past few years have included certifying on-premises software and hardware configurations known as Azure Stack. Azure Stack allows organizations to run Azure-like services on their own hardware in their datacenter, letting organizations that may be restricted from using public cloud services take advantage of the best parts of Azure within their own datacenter. Azure Stack is most commonly deployed by organizations that need to keep their customers’ data in-house (or within their territory) for data sovereignty, making it popular with customers who could not adopt the Microsoft Azure public cloud. Azure Arc easily integrates with Azure Stack Hub, Azure Stack Edge, and all of the Azure Stack HCI configurations, allowing these services to be managed from Azure.

For more information about Azure Stack visit https://azure.microsoft.com/en-us/overview/azure-stack.

Azure Stack Hub with Azure Arc

Azure Stack Hub (formerly Microsoft Azure Stack) offers organizations a way to run Azure services from their own datacenter, from a service provider’s site, or from within an isolated environment. This cloud platform allows users to deploy Windows VMs, Linux VMs, and Kubernetes containers on hardware which they operate. The offering is popular with developers who want to run services locally, organizations which need to retain their customers’ data onsite, and groups which are regularly disconnected from the Internet, as is common with sea vessels or emergency response personnel. Azure Arc allows Azure Stack Hub nodes to run supported Azure services (like Azure Data Services) while being centrally managed and optimized via Azure, and these applications can be distributed across public, private, or hybrid clouds.

For more information about Azure Stack Hub visit https://docs.microsoft.com/en-us/azure-stack/user/?view=azs-1908.

Azure Stack Edge with Azure Arc

Azure Stack Edge (previously Azure Data Box Edge) is an appliance that can be deployed in a datacenter, branch office, remote site, or disconnected environment. It is designed to run edge computing workloads on Hyper-V VMs, VMware VMs, containers, and Azure services. These edge servers are optimized to run IoT, AI, and business workloads so that processing can happen onsite, rather than being sent across a network to a cloud datacenter. When the Azure Stack Edge appliance is (re)connected to the network it transfers any data at high speed, and transfers can be scheduled to run during off-hours. It supports machine learning acceleration through an FPGA or GPU. Azure Arc can centrally manage Azure Stack Edge, its virtual appliances, and its physical hardware.

For more information about Azure Stack Edge visit https://azure.microsoft.com/en-us/services/databox/edge.

Azure Stack Hyperconverged Infrastructure (HCI) with Azure Arc

Azure Stack Hyperconverged Infrastructure (HCI) is a program which provides preconfigured hyperconverged hardware from validated OEM partners, optimized to run Azure Stack. Businesses which want to run Azure-like services on-premises can purchase or rent hardware which has been standardized to Microsoft’s requirements. VMs, containers, Azure services, AI, IoT, and more can run consistently on the Microsoft Azure public cloud or on Azure Stack HCI hardware in a datacenter. Cloud services can be distributed across multiple datacenters or clouds and centrally managed using Azure Arc.

For more information about Azure Stack HCI visit https://azure.microsoft.com/en-us/overview/azure-stack/hci.

Managed Service Providers (MSPs) with Azure Arc

Azure Lighthouse Integration with Azure Arc

Azure Lighthouse is a technology designed for managed service providers (MSPs), ISVs or distributed organizations which need to centrally manage their tenants’ resources. Azure Lighthouse allows service providers and tenants to create a two-way trust to allow unified management of cloud resources. Tenants will grant specific permissions for approved user roles on particular cloud resources, so that they can offload the management to their service provider. Now service providers can add their tenants’ private cloud environments under Azure Arc management, so that they can take advantage of the new capabilities which Azure Arc provides.

For more information about Azure Lighthouse visit https://azure.microsoft.com/en-us/services/azure-lighthouse or the Altaro MSP Dojo.

Third-Party Integration with Azure Arc

Within the Azure management layer (control plane) sits Azure Resource Manager (ARM). ARM provides a way to easily create, manage, monitor, and delete any Azure resource. Every native and third-party Azure resource uses ARM to ensure that it can be centrally managed through the Azure management tools. Azure Arc now allows non-Azure resources to be managed by Azure as well. This can include third-party clouds (Amazon Web Services, Google Cloud Platform), Windows and Linux VMs, VMs on non-Microsoft hypervisors (VMware vSphere, Google Compute Engine, Amazon EC2), and Kubernetes containers and clusters (including IBM Kubernetes Service, Google Kubernetes Engine, and Amazon EKS). At the time of this writing limited information is available about third-party integration, but more will be added over time.
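
As a simple example of what that projection enables, the sketch below uses the azure-mgmt-resource Python package to list Arc-connected servers, which appear under the Microsoft.HybridCompute/machines resource type, alongside any other resources in a subscription. The subscription ID is a placeholder:

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

credential = DefaultAzureCredential()
resource_client = ResourceManagementClient(credential, "<subscription-id>")

# Because Arc projects non-Azure machines into ARM, they can be filtered
# and enumerated exactly like native Azure resources.
arc_servers = resource_client.resources.list(
    filter="resourceType eq 'Microsoft.HybridCompute/machines'"
)
for server in arc_servers:
    print(server.name, server.location, server.tags)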

Amazon Web Services (AWS) Integration with Azure Arc

Amazon Web Services (AWS) is Amazon’s public cloud platform. Some services from AWS can be managed by Azure Arc. This includes operating virtual machines running on the Amazon Elastic Compute Cloud (EC2) and containers running on Amazon Elastic Kubernetes Service (EKS). Azure Arc also lets an AWS site be used as a geo-replicated disaster recovery location. AWS billing can also be integrated with Azure Cost Management & Billing so that expenses from both cloud providers can be viewed in a single location.

Additional information will be added once it is made available.

Google Cloud Platform (GCP) Integration with Azure Arc

Google Cloud Platform (GCP) is Google’s public cloud platform. Some services from GCP can be managed by Azure Arc. This includes operating virtual machines running on Google Compute Engine (GCE) and containers running on Google Kubernetes Engine (GKE).

Additional information will be added once it is made available.

IBM Kubernetes Service Integration with Azure Arc

IBM Cloud is IBM’s public cloud platform. Some services from IBM Cloud can be managed by Azure Arc, including containers running on the IBM Kubernetes Service.

Additional information will be added once it is made available.

Linux VM Integration with Azure Arc

In 2014, Microsoft’s CEO Satya Nadella declared, “Microsoft loves Linux.” Since then the company has embraced Linux, making it a first-class citizen in its ecosystem. Microsoft even contributes code to the Linux kernel so that Linux operates efficiently when running as a VM or container on Microsoft’s operating systems. Virtually all of the management features available for Windows VMs are available for supported Linux distributions, and this extends to Azure Arc: admins can use Azure to centrally create, manage, and optimize Linux VMs running on-premises, just like any standard Windows VM.

VMware Cloud Solution Integration with Azure Arc

VMware offers a popular virtualization platform and management suite (vSphere) which runs on VMware’s hypervisor. Microsoft has acknowledged that many customers running legacy on-premises hardware are using VMware, so it provides numerous integration points with Azure and Azure Arc. Organizations can even virtualize and deploy their entire VMware infrastructure on Azure, rather than in their own datacenter. Microsoft makes it easy to deploy, manage, monitor, and migrate VMware systems, and with Azure Arc, businesses can now centrally operate their on-premises VMware infrastructure too. While the full management functionality of VMware vSphere is not available through Azure Arc, most standard operations are supported.

For more information about VMware Management with Azure visit https://azure.microsoft.com/en-us/overview/azure-vmware.


Go to Original Article
Author: Symon Perriman

Customer data platform tools top priority list for big vendors

Adobe, Microsoft and Oracle released their own customer data platforms in 2019 to compete with roughly 20 CDP vendors that had been serving their users for several years. Salesforce and SAP plan to follow with their own CDPs in 2020.

Those large customer experience (CX) platforms face challenges in the marketplace. Customer data platforms, which unify data across marketing automation, customer service, sales and e-commerce applications, solve a problem the big vendors created: A lack of data flow between CX applications, according to David Raab, founder of the CDP Institute.

“Nothing else has solved it; nothing comes close to solving it,” Raab said.

That data is siloed, typically, because the vendors built their CX technologies via acquisition and continue to have difficulty integrating the marketing, sales and service clouds comprising their platforms. But users demand it, as they see value in real-time access to customer data from all channels at once. They want machine learning and analytics running with those systems, pulling data from across the platform in order to create one-to-one customer offers, in real time, to drive sales and marketing campaigns.

Tech buyers must choose big vs. small CDPs

The question is, as many B2B companies are just starting to digitally transform their commerce, will they purchase new tools from the big vendors like Oracle and Microsoft, or go with more established and technically advanced CDPs from the smaller companies, such as Lytics, Lotame, Arm Treasure Data and RedPoint Global?

“Buyers are within their rights to be skeptical [of the big-box vendors],” said Gartner analyst Benjamin Bloom. “That vendor who might not have delivered the thing that you were looking for — or [caused] unintended challenges or consequences — now they are exactly the ones who are telling you how to clean up the mess [with their new CDP].”

Smaller CDP vendors tend to be nimble and more responsive to customer needs for features and integrations with analytics tools and outside applications, Bloom said. He sees them keeping their users for some time to come, as the larger platform vendors play catch-up, which Raab agrees with.

Graphic showing value of customer data platform technologies
Customer data platforms unify data from applications tracking customer web behavior, sales, e-commerce and other sources to create personalized marketing profiles and drive revenue.

Yet another option has become available for technology buyers tasked with building customer experiences: the digital experience platform. These typically arise from cloud content management vendors that are moving into customer experience. Acquia acquired CDP vendor AgilOne earlier this month to assemble a marketing automation and e-commerce platform with more sophisticated web content management than all the large CX platform vendors, with the exception of Adobe.

One of Acquia’s main competitors that also offers a CDP, Episerver, is moving more deeply into digital experience. It expanded its B2B e-commerce offering by acquiring InSite Software this month, and hired former SAP CX platform lead Alex Atzberger as CEO to oversee its digital experience technologies.

So many companies are building B2B e-commerce operations from scratch, said Gartner analyst Jason Daigler, that it doesn’t surprise him to see content management vendors challenge companies like Oracle and SAP for customers. He sees the appeal of combining strong content management with e-commerce.


“Most commerce platforms were not built with the best content management systems out there; they’re not known for their digital experience capabilities,” Daigler said.

Salesforce takes a different approach

Other experts wonder whether CDPs are the answer to collecting real-time data from disparate sources such as social media, sales and marketing channels. Because this data is always imperfect and the customer golden record is a mythical concept, the CDP may be a “fool’s errand,” said self-described CDP skeptic Nicole France, an analyst at Constellation Research.

Salesforce is coming out with “a CDP that’s not a CDP,” as the company described it in analyst previews, France said. Rather than a whole new database, Salesforce may address the problems a customer data platform is meant to fix with upcoming features of its own; those could amount to integrations and APIs that connect data and unify customer profiles with MuleSoft tools.

“I do think that having the right data in the right time is really important to delivering a good customer experience. Does that require a single database? I don’t think it does, but it does require good understanding of data, where it is, and how to put it together.”

Go to Original Article
Author: