
Can AI help save penguins? – Microsoft News Center India

Working on the Microsoft Azure platform, Mohanty and his colleagues used a convolutional neural network (CNN) model to build a solution that can identify and count penguins with a high degree of accuracy. The model can potentially help researchers speed up their studies of penguin population status.
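The counting side of such a model can be illustrated with a small sketch. This is not Gramener's actual code: it assumes a CNN detector (trained separately) that emits bounding boxes with confidence scores per image, and all function names, thresholds, and values below are hypothetical.

```python
# Hypothetical post-processing for a CNN-based penguin counter. The
# detector itself is assumed; this sketch only shows how its raw
# per-image detections might be filtered and tallied.

def count_penguins(detections, min_confidence=0.8):
    """Count detections whose confidence clears a threshold.

    `detections` is a list of dicts like
    {"box": (x, y, w, h), "confidence": 0.93}, the kind of output a
    detection model typically produces for one image.
    """
    return sum(1 for d in detections if d["confidence"] >= min_confidence)

# Example: raw model output for one survey photo (made-up values).
raw = [
    {"box": (10, 12, 24, 30), "confidence": 0.95},
    {"box": (60, 40, 22, 28), "confidence": 0.88},
    {"box": (90, 80, 25, 31), "confidence": 0.42},  # likely a rock, not a penguin
]
print(count_penguins(raw))  # 2 confident penguin detections
```

Summing confident detections per photo, then across a whole survey, is one common way a detection model is turned into a population count.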

The team is now working on the classification, identification and counting of other species using similar deep learning techniques.

Building AI to save the planet

A long-time Microsoft partner headquartered in Hyderabad in India, Gramener is not new to leveraging AI for social good using Microsoft Azure. It was one of the earliest partners for Microsoft’s AI for Earth program announced in 2017.

“I believe that AI can help make the world a better place by accelerating biodiversity conservation and help solve the biggest environmental challenges we face today. When we came to know about Microsoft’s AI for Earth program over two years ago, we reached out to Microsoft as we wanted to find ways to partner and help with our expertise,” says Kesari.

While the program was still in its infancy, the teams from Gramener and Microsoft worked jointly to come up with quick projects to showcase what’s possible with AI and inspire those out there in the field. They started with a proof of concept for identifying flora and fauna species in a photograph.

“We worked more like an experimentation arm working with the team led by Lucas Joppa (Microsoft’s Chief Environmental Officer, and founder of AI for Earth). We built a model, using data available from iNaturalist, that could classify thousands of different species with 80 percent accuracy,” Kesari reveals.

Another proof of concept revolved around camera traps that are used for biodiversity studies in forests. The camera traps take multiple images whenever they detect motion, which leads to a large number of photos that had to be scanned manually.

Soumya Ranjan Mohanty, Lead Data Scientist, Gramener

“Most camera trap photos are blank as they don’t have any animal in the frame. Even in the frames that do, often the animal is too close to be identified or the photo is blurry,” says Mohanty, who also leads the AI for Earth partnership from Gramener.

The team came up with a two-step solution that first weeds out unusable images and then uses a deep learning model to classify the images that do contain an animal. The Microsoft team converted this solution, too, into what is now the Camera Trap API, which AI for Earth grantees, or anyone else, can use freely.
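A minimal sketch of that two-step pipeline is shown below. Both models are stubbed out: in practice step one would be a detector like the one behind the Camera Trap API and step two a species classifier, and the function names, threshold, and sample values here are illustrative only.

```python
# Sketch of a two-step camera-trap pipeline: weed out blank frames,
# then classify only the frames that contain an animal.

def contains_animal(image_result):
    """Step 1: discard blank or unusable frames.

    `image_result` stands in for a detector's per-image output,
    e.g. {"animal_confidence": 0.91}.
    """
    return image_result["animal_confidence"] >= 0.5

def classify_species(image_result):
    """Step 2: run the species classifier on frames that passed step 1."""
    return image_result.get("predicted_species", "unknown")

def process_batch(batch):
    """Return (species labels for usable frames, count of discarded frames)."""
    kept = [img for img in batch if contains_animal(img)]
    labels = [classify_species(img) for img in kept]
    return labels, len(batch) - len(kept)

# Example batch from one camera trap (made-up detector outputs).
batch = [
    {"animal_confidence": 0.95, "predicted_species": "leopard"},
    {"animal_confidence": 0.05},  # blank frame, discarded in step 1
    {"animal_confidence": 0.80, "predicted_species": "chital"},
]
labels, discarded = process_batch(batch)
print(labels, discarded)  # ['leopard', 'chital'] 1
```

Because most camera-trap frames are blank, filtering first means the heavier classification model only runs on the small fraction of images worth classifying.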

“AI is critical to conservation because we simply don’t have time to wait for humans to annotate millions of images before we can answer wildlife population questions. For the same reason, we need to rapidly prototype AI applications for conservation, and it’s been fantastic to have Gramener on board as our ‘advanced development team’,” says Dan Morris, principal scientist and program director for Microsoft’s AI for Earth program.

Anticipating the needs of grantees, Gramener and Microsoft have also worked on creating other APIs, like the Land Cover Mapping API that leverages machine learning to provide high-resolution land cover information. These APIs are now part of the public technical resources available for AI for Earth grantees or anyone to use, to accelerate their projects without having to build the base model themselves.

Go to Original Article
Author: Microsoft News Center

A year of bringing AI to the edge

This post is co-authored by Anny Dow, Product Marketing Manager, Azure Cognitive Services.

In an age where low-latency and data security can be the lifeblood of an organization, containers make it possible for enterprises to meet these needs when harnessing artificial intelligence (AI).

Since introducing Azure Cognitive Services in containers this time last year, businesses across industries have unlocked new productivity gains and insights. Combining the most comprehensive set of domain-specific AI services on the market with containers lets enterprises apply AI to more scenarios with Azure than with any other major cloud provider. Organizations from healthcare to financial services have transformed their processes and customer experiences as a result.

These are some of the highlights from the past year:

Employing anomaly detection for predictive maintenance

Airbus Defense and Space, one of the world’s largest aerospace and defense companies, has tested Azure Cognitive Services in containers to develop a proof of concept for predictive maintenance. The company runs Anomaly Detector to immediately spot unusual behavior in voltage levels and mitigate unexpected downtime. By employing advanced anomaly detection in containers without further burdening its data science team, Airbus can scale this critical capability across the business globally.
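To make the scenario concrete, here is a sketch of how a client might prepare a request for an Anomaly Detector container running on-premises. The endpoint host and the voltage readings are made up; only the general payload shape, a series of timestamped values plus a granularity, follows the Anomaly Detector time-series API.

```python
import json

# Build the JSON body a client would POST to a locally running
# Anomaly Detector container. The host below is hypothetical.
CONTAINER_ENDPOINT = "http://localhost:5000/anomalydetector/v1.0/timeseries/entire/detect"

def build_request(timestamps, voltages, granularity="minutely"):
    """Assemble an Anomaly Detector request from raw sensor readings."""
    return {
        "series": [
            {"timestamp": ts, "value": v}
            for ts, v in zip(timestamps, voltages)
        ],
        "granularity": granularity,
    }

# Three minutes of (made-up) voltage telemetry, ending in a sudden spike.
payload = build_request(
    ["2019-11-01T00:00:00Z", "2019-11-01T00:01:00Z", "2019-11-01T00:02:00Z"],
    [28.1, 28.0, 35.7],
)
body = json.dumps(payload)
# In production, `body` would be POSTed to CONTAINER_ENDPOINT; the
# response flags which points are anomalous, so maintenance can be
# scheduled before unexpected downtime occurs.
print(len(payload["series"]))  # 3
```

Because the container runs inside Airbus's own environment, the telemetry never has to leave the premises to be scored.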

“Innovation has always been a driving force at Airbus. Using Anomaly Detector, an Azure Cognitive Service, we can solve some aircraft predictive maintenance use cases more easily.”  —Peter Weckesser, Digital Transformation Officer, Airbus

Automating data extraction for highly-regulated businesses

As enterprises grow, they begin to acquire thousands of hours of repetitive but critically important work every week. High-value domain specialists spend too much of their time on this. Today, innovative organizations use robotic process automation (RPA) to help manage, scale, and accelerate processes, and in doing so free people to create more value.

Automation Anywhere, a leader in robotic process automation, partners with these companies eager to streamline operations by applying AI. IQ Bot, their unique RPA software, automates data extraction from documents of various types. By deploying Cognitive Services in containers, Automation Anywhere can now handle documents on-premises and at the edge for highly regulated industries:

“Azure Cognitive Services in containers gives us the headroom to scale, both on-premises and in the cloud, especially for verticals such as insurance, finance, and health care where there are millions of documents to process.” —Prince Kohli, Chief Technology Officer for Products and Engineering, Automation Anywhere

For more about Automation Anywhere’s partnership with Microsoft to democratize AI for organizations, check out this blog post.

Delighting customers and employees with an intelligent virtual agent

Lowell, one of the largest credit management services in Europe, wants credit to work better for everybody. So, it works hard to make every consumer interaction as painless as possible with AI. Partnering with Crayon, a global leader in cloud services and solutions, Lowell set out to solve the outdated processes that kept the company’s highly trained credit counselors too busy with routine inquiries and created friction in the customer experience. Lowell turned to Cognitive Services to create an AI-enabled virtual agent that now handles 40 percent of all inquiries—making it easier for service agents to deliver greater value to consumers and better outcomes for Lowell clients.

With GDPR requirements, chatbots weren’t an option for many businesses before containers became available. Now companies like Lowell can ensure the data handling meets stringent compliance standards while running Cognitive Services in containers. As Carl Udvang, Product Manager at Lowell explains:

“By taking advantage of container support in Cognitive Services, we built a bot that safeguards consumer information, analyzes it, and compares it to case studies about defaulted payments to find the solutions that work for each individual.”

One-to-one customer care at scale in data-sensitive environments has become easier to achieve.

Empowering disaster relief organizations on the ground

A few years ago, there was a major Ebola outbreak in Liberia. A team from USAID was sent to help mitigate the crisis. Their first task on the ground was to find and categorize information such as the state of healthcare facilities, Wi-Fi networks, and population density centers. They tracked this information manually and had to extract insights from a complex corpus of data to determine the best course of action.

With the rugged versions of Azure Stack Edge, teams responding to such crises can carry a device running Cognitive Services in a backpack. They can upload unstructured data such as maps, images, and pictures of documents, and then extract content, translate it, draw relationships among entities, and apply a search layer. With these cloud AI capabilities available offline, at their fingertips, response teams can find the information they need in a matter of moments. In Satya Nadella’s Ignite 2019 keynote, Dean Paron, Partner Director of Azure Storage and Edge, walks through how Cognitive Services on Azure Stack Edge can be applied in such disaster relief scenarios (starting at 27:07):

Transforming customer support with call center analytics

Call centers are a critical customer touchpoint for many businesses, and being able to derive insights from customer calls is key to improving customer support. With Cognitive Services, businesses can transcribe calls with Speech to Text, analyze sentiment in real-time with Text Analytics, and develop a virtual agent to respond to questions with Text to Speech. However, in highly regulated industries, businesses are typically prohibited from running AI services in the cloud due to policies against uploading, processing, and storing any data in public cloud environments. This is especially true for financial institutions.
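The sentiment step of that pipeline can be sketched as follows: transcripts already produced by Speech to Text are batched into a request for a Text Analytics container running on-premises. The endpoint host and the transcripts are illustrative, not taken from the bank described below; only the documents-list payload shape follows the Text Analytics sentiment API.

```python
import json

# Batch call transcripts into a sentiment request for a Text Analytics
# container. The host below is hypothetical.
SENTIMENT_ENDPOINT = "http://localhost:5000/text/analytics/v3.0/sentiment"

def build_sentiment_request(transcripts, language="en"):
    """Wrap transcripts in the documents-list shape the API expects."""
    return {
        "documents": [
            {"id": str(i), "language": language, "text": text}
            for i, text in enumerate(transcripts, start=1)
        ]
    }

# Two made-up call transcripts: one satisfied caller, one frustrated.
transcripts = [
    "Thanks so much, that resolved my issue right away.",
    "I've been on hold for forty minutes and nobody can help me.",
]
request_body = json.dumps(build_sentiment_request(transcripts))
# POSTing request_body to the container returns per-document sentiment
# scores, which agents can monitor in real time or batch-process to
# surface broad themes across millions of hours of audio.
print(len(build_sentiment_request(transcripts)["documents"]))  # 2
```

Running this against a container instead of the public cloud endpoint is what lets regulated institutions keep call data entirely inside their own environment.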

A leading bank in Europe addressed regulatory requirements and brought the latest transcription technology to their own on-premises environment by deploying Cognitive Services in containers. Through transcribing calls, customer service agents could not only get real-time feedback on customer sentiment and call effectiveness, but also batch process data to identify broad themes and unlock deeper insights on millions of hours of audio. Using containers also gave them flexibility to integrate with their own custom workflows and scale throughput at low latency.

What’s next?

These stories touch on just a handful of the organizations leading innovation by bringing AI to where data lives. As running AI anywhere becomes more mainstream, the opportunities for empowering people and organizations will only be limited by the imagination.

Visit the container support page to get started with containers today.

For a deeper dive into these stories, visit the following resources:

Author: Microsoft News Center

AWS, Azure and Google peppered with outages in same week

AWS, Microsoft Azure and Google Cloud all experienced service degradations or outages this week, an outcome that suggests customers should accept that cloud outages are a matter of when, not if.

In AWS’s Frankfurt region, EC2, Relational Database Service, CloudFormation and Auto Scaling were all affected Nov. 11, with the issues now resolved, according to AWS’s status page.

Azure DevOps services for Boards, Repos, Pipelines and Test Plans were affected for a few hours in the early hours of Nov. 11, according to its status page. Engineers determined that the problem had to do with identity calls and rebooted access tokens to fix the system, the page states.

Google Cloud said some of its APIs in several U.S. regions were affected, and others experienced problems globally on Nov. 11, according to its status dashboard. Affected APIs included those for Compute Engine, Cloud Storage, BigQuery, Dataflow, Dataproc and Pub/Sub. Those issues were resolved later in the day.

Google Kubernetes Engine also went through some hiccups over the past week, in which nodes in some recently upgraded container clusters experienced high levels of kernel panics. A kernel panic, the Unix-world counterpart of Windows’ “blue screen of death,” is a condition in which a system’s OS hits an error from which it can’t recover quickly or easily.

The company rolled out a series of fixes, but as of Nov. 13, the status page for GKE remained in orange status, which indicates a small number of projects are still affected.

AWS, Microsoft and Google have yet to provide the customary post-mortem reports on why the cloud outages occurred, although more information could emerge soon.

Move to cloud means ceding some control

The cloud outages at AWS, Azure and Google this week were far from the worst experienced by customers in recent years. In September 2018, severe weather in Texas caused a power surge that shut down dozens of Azure services for days.


Cloud providers have aggressively pursued region and zone expansions to help with disaster recovery and high-availability scenarios. But customers must still architect their systems to take advantage of the expanded footprint.

Still, customers have much less control when it comes to public cloud usage, according to Stephen Elliot, an analyst at IDC. That reality requires some operational sophistication.


“Networks are so interconnected and distributed, lots of partners are involved in making a service perform and available,” he said. “[Enterprises] need a risk mitigation strategy that covers people, process, technologies, SLAs, etc. It’s a myth that outages won’t happen. It could be from weather, a black swan event, security or a technology glitch.”


This fact underscores why more companies are experimenting with and deploying workloads across hybrid and multi-cloud infrastructures, said Jay Lyman, an analyst at 451 Research. “They either control the infrastructure and downtime with on-premises deployments or spread their bets across multiple public clouds,” he said.

Ultimately, enterprise IT shops can weigh the challenges and costs of running their own infrastructure against public cloud providers and find it difficult to match, said Holger Mueller, an analyst at Constellation Research.

“That said, performance and uptime are validated every day, and should a major and longer public cloud outage happen, it could give pause among less technical board members,” he added.


Microsoft and Salesforce expand strategic partnership to accelerate customer success

Salesforce names Microsoft Azure as its public cloud provider for Salesforce Marketing Cloud to help customers scale and grow; new integration between Salesforce Sales and Service Clouds with Microsoft Teams will boost productivity

REDMOND, Wash., and SAN FRANCISCO — Nov. 14, 2019 — Microsoft Corp. (Nasdaq: MSFT) and Salesforce (NYSE: CRM) on Thursday announced plans to expand their strategic partnership to help customers meet the evolving needs of their businesses and boost team productivity. Salesforce has named Microsoft Azure as its public cloud provider for Salesforce Marketing Cloud. Salesforce will also build a new integration that connects Salesforce’s Sales Cloud and Service Cloud with Microsoft Teams.

“At Salesforce, we’re relentlessly focused on driving trust and success for our customers,” said Marc Benioff and Keith Block, co-CEOs, Salesforce. “We’re excited to expand our partnership with Microsoft and bring together the leading CRM with Azure and Teams to deliver incredible customer experiences.”

“In a world where every company is becoming a digital company, we want to enable every customer and partner to build experiences on our leading platforms,” said Satya Nadella, CEO, Microsoft. “By bringing together the power of Azure and Microsoft Teams with Salesforce, our aim is to help businesses harness the power of the Microsoft Cloud to better serve customers.”

Comments on the news

“Marriott has more than 7,200 properties spanning 134 countries and territories, so driving efficiency and collaboration is critical,” said Brian King, global officer, Digital, Distribution, Revenue Strategy, and Global Sales, Marriott International. “The combination of Salesforce and Microsoft enables our teams to work better together to enhance the guest experience at every touchpoint.”

“With 400 brands and teams in 190 countries, we are always looking for ways to scale more efficiently and strengthen collaboration,” said Jane Moran, chief technology advisor, Unilever. “The powerful combination of Salesforce and Microsoft enables us to be more productive and connect with each other and our customers like never before.”

Salesforce names Microsoft Azure as its public cloud provider for marketing cloud

With Salesforce Marketing Cloud, marketers are empowered to know their customers, personalize marketing with Einstein, engage with them across any channel, and analyze the impact to improve campaign performance. By bringing its Marketing Cloud workload to Azure, Salesforce joins the more than 95 percent of Fortune 500 companies benefiting from an Azure infrastructure that offers the most global regions of any cloud provider.

Through this partnership, Salesforce will move its Marketing Cloud to Azure, unlocking new growth opportunities for customers. By moving to Azure, Salesforce will be able to optimize Marketing Cloud’s performance as customer demand scales. This will reduce customer onboarding times, let customers expand globally more quickly thanks to Azure’s global footprint, and help address local data security, privacy and compliance requirements.

​Salesforce and Microsoft Teams integration will boost productivity

As teamwork becomes a driving force in the workplace, people want to bring workflows and frequently used apps into their collaboration workspace environments. Sales and customer service are highly collaborative, team-centric functions, and many companies actively use both Salesforce CRM and Microsoft Teams. As part of this agreement, Salesforce will build a new integration that gives sales and service users the ability to search, view, and share Salesforce records directly within Teams. The new Teams integration for Salesforce Sales and Service Clouds will be made available in late 2020.

Building on a commitment to customer success

These new integrations will build on existing solutions that enable mutual customers to be more productive, including the hundreds of thousands of monthly active users using Salesforce’s Microsoft Outlook integration to create, communicate and collaborate.

​About Salesforce​

Salesforce is the global leader in Customer Relationship Management (CRM), bringing companies closer to their customers in the digital age. Founded in 1999, Salesforce enables companies of every size and industry to take advantage of powerful technologies—cloud, mobile, social, internet of things, artificial intelligence, voice and blockchain—to create a 360° view of their customers. For more information about Salesforce (NYSE: CRM), visit: www.salesforce.com.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, [email protected]

Stephanie Barnes, Salesforce PR, (415) 722-0883, [email protected]

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

Author: Microsoft News Center

Allianz partners with Microsoft to digitally transform the insurance industry

Allianz and Microsoft to reimagine the insurance industry experience with Azure to streamline insurance processes; Microsoft will partner with Syncier, the B2B2X insurtech founded by Allianz, to offer customized insurance platform solutions and related services

Jean-Philippe Courtois, EVP and president, Microsoft Global Sales, Marketing & Operations (left) and Christof Mascher, COO and member of the Board of Management of Allianz SE (right). Source: allianz.com

MUNICH, Germany, and REDMOND, Wash. — Nov. 14, 2019 — On Thursday, Allianz SE and Microsoft Corp. announced a strategic partnership focused on digitally transforming the insurance industry, making the insurance process easier while creating a better experience for insurance companies and their customers. Through the strategic partnership, Allianz will move core pieces of its global insurance platform, Allianz Business System (ABS), to Microsoft’s Azure cloud and will open-source parts of the solution’s core to improve and expand capabilities.

Syncier will offer a configurable version of the solution called ABS Enterprise Edition to insurance providers as a service, allowing them to benefit from one of the most advanced and comprehensive insurance platforms in the industry, reducing costs and centralizing their insurance portfolio management. This will increase efficiencies across all lines of insurance business, resulting in better experiences through tailored customer service and simplified product offerings.

“Teaming up with Microsoft and leveraging Azure’s secure and trusted cloud platform will support us in digitalizing the insurance industry,” said Christof Mascher, COO and member of the Board of Management of Allianz SE. “Through this partnership, Allianz and Syncier strive to offer the most advanced Insurance as a Service solutions on Microsoft Azure. The ABS Enterprise Edition is an exciting opportunity, both for larger insurers needing to replace their legacy IT, and smaller players — such as insurtechs — looking for a scalable insurance platform.”

“Allianz is setting the standard for insurance solutions globally,” said Jean-Philippe Courtois, EVP and president, Microsoft Global Sales, Marketing & Operations. “Together, Microsoft and Allianz are offering a solution that combines Allianz’s deep knowledge of the insurance sector with Microsoft’s trusted Azure cloud platform. By delivering an open-source, cloud-based insurance platform and software application marketplace, we will support innovation and transformation across this sector.”

Syncier’s ABS Enterprise Edition can handle insurance processes across all lines of business: property and casualty, life, health, and assistance. It can be customized for any insurance company, country and regulatory requirements. Insurers, brokers and agents adopting the platform can service clients and manage entire portfolios end to end in one system, gaining a unique 360-degree view of each client and the business.

To accelerate industry innovation, Syncier will also offer an Azure cloud-based marketplace for ready-made software applications and services tailored to the insurance sector. Such solutions could include, for example, customer service chatbots or AI-based fraud detection. The marketplace enables insurance providers to easily and quickly implement the available solutions in a plug-and-play manner.

Allianz uses ABS globally as a platform for all lines of business and along with Microsoft is committed to supporting the ABS Enterprise Edition long term as an industry solution. Today, ABS handles around 60 million insurance policies in 19 countries and is being rolled out to all Allianz entities.

About Allianz

The Allianz Group is one of the world’s leading insurers and asset managers with more than 92 million retail and corporate customers. Allianz customers benefit from a broad range of personal and corporate insurance services, ranging from property, life and health insurance to assistance services to credit insurance and global business insurance. Allianz is one of the world’s largest investors, managing around 729 billion euros on behalf of its insurance customers. Furthermore, our asset managers PIMCO and Allianz Global Investors manage more than 1.5 trillion euros of third-party assets. Thanks to our systematic integration of ecological and social criteria in our business processes and investment decisions, we hold the leading position for insurers in the Dow Jones Sustainability Index. In 2018, over 142,000 employees in more than 70 countries achieved total revenues of 132 billion euros and an operating profit of 11.5 billion euros for the group. For more information on Syncier, visit www.syncier.com.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information, press only:

Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, [email protected]

Gregor Wills, Allianz, +49 89 3800 61313, [email protected]

Note to editors: For more information, news and perspectives from Microsoft, please visit the Microsoft News Center at http://news.microsoft.com. Web links, telephone numbers and titles were correct at time of publication but may have changed. For additional assistance, journalists and analysts may contact Microsoft’s Rapid Response Team or other appropriate contacts listed at http://news.microsoft.com/microsoft-public-relations-contacts.

Author: Microsoft News Center

The Big Announcements from Microsoft Ignite 2019

Yesterday wrapped up Day 1 of the 2019 Microsoft Ignite event. As always, there was no shortage of announcements for IT professionals of all walks of life. In this blog post we’re going to talk about some of the bigger announcements that have an impact in the infrastructure space as well as in Office services. So let’s dive in!

Azure Arc

There is no doubt that Azure Arc was the headline of the show for many attendees. In short, Azure Arc is designed to be a control plane for multi-cloud and multi-edge deployments, meaning single-pane-of-glass management for all computing resources regardless of where they live. Many organizations have compute resources in many locations, including on-premises, Azure, and other cloud environments, and the big challenge has always been maintaining effective management of these disparate systems. Azure Arc is designed to address this issue.

With Azure Arc, your on-premises resources actually appear in the Azure portal and can be managed in much the same way as a VM running in Azure. The on-prem resources can then be integrated with Azure services such as Log Analytics and Azure Policy. Additionally, in a demo during the Azure technical keynote, an instance of the Azure SQL Database service was pushed down to one of the on-premises VMs, just as if it were another Azure resource!

If you’d like more details on this feature, we’ll be putting together some content in the near future. In the meantime check out the video below that’s been put out by the Azure Advocacy Team.

[embedded content]

Project Cortex

Another noteworthy announcement from the keynotes today was Project Cortex. The stated goal of Project Cortex is to take all of the data that resides in an organization’s Office 365 tenant and create usable knowledge out of it by leveraging AI. For example, you set a meeting with a co-worker. This feature would see that you’ve set a meeting with your co-worker and would do things like show you the last couple of documents you shared with that person. Or, another example would be topic centers and knowledge centers. These assets are automatically created and updated by AI and present applicable information to people in various office applications. The whole goal is to more readily present the information you need when you need it. More preliminary information on Project Cortex can be found here.

Azure Quantum

A year or two back at a previous Microsoft Ignite, Microsoft showed off some enhancements in the realm of quantum computing. If you follow the news in quantum computing, you likely know that it’s a radically different computing model that will fundamentally alter the tech scene. Azure Quantum is a collection of quantum services from Microsoft and a number of other vendors designed to enable open access to quantum resources. It’s also nice to see that the service is being built with openness in mind: open source is a key design decision, and it will benefit everyone involved in bringing quantum computing to the masses in the coming years.

If you’re interested in finding out a bit more about these new features, Microsoft has a website now setup for Azure Quantum here.

Additional Options for the Intelligent Edge and Azure Stack

As you likely know from previous years, Azure Stack was designed to bring a purpose-built appliance with Azure services to your datacenter. It’s seen widespread adoption since its release, especially in the enterprise and service provider space. However, Microsoft sees edge use cases (think branch offices and remote locations) as a key area where Azure Stack can be leveraged extensively. With this in mind, Microsoft has created several different variants of Azure Stack covering a vast swath of compute use cases, ranging from branch-office retail stores with a small Azure Stack Edge device in the building, to search and rescue teams processing drone data with ruggedized Azure Stack devices in their backpacks!

We’ll be talking more about Azure Stack in the coming weeks and months, but if you’re interested in more information regarding how all these options fit together, Microsoft has updated its Azure Stack product site with many of these new offerings.

Interview with Jeffrey Snover

[embedded content]

Further Information

The announcements mentioned here are simply the tip of the iceberg. There were several other smaller announcements that came out of Microsoft today, including things like new VM sizes in Azure IaaS, a new Performance Monitor application, and GA of Windows Admin Center 1910. We’ll be bringing you updated information on these topics as the week progresses, and also be on the lookout for blogs on each of these areas on the Altaro blogs in the near future!

Stay tuned for more information!

Author: Andy Syrewicze

How Microsoft re-envisioned the data warehouse with Azure Synapse Analytics

About four years ago, the Microsoft Azure team began to notice a big problem troubling many of its customers. A mass migration to the cloud was in full swing, as enterprises signed up by the thousands to reap the benefits of flexible, large-scale computing and data storage. But the next iteration of that tech revolution, in which companies would use their growing stores of data to get more tangible business benefits, had stalled.

Technology providers, including Microsoft, have built a variety of systems to collect, retrieve and analyze enormous troves of information that would uncover market trends and insights, paving the way toward a new era of improved customer service, innovation and efficiency.

But those systems were built independently by different engineering teams and sold as individual products and services. They weren’t designed to connect with one another, and customers would have to learn how to operate them separately, wasting time, money and precious IT talent.

“Instead of trying to add more features to each of our services, we decided to take a step back and figure out how to bring their core capabilities together to make it easy for customers to collect and analyze all of their increasingly diverse data, to break down data silos and work together more collaboratively,” said Raghu Ramakrishnan, Microsoft’s chief technology officer for data.

At its Ignite conference this week in Orlando, Florida, Microsoft announced the end result of a yearslong effort to address the problem: Azure Synapse Analytics, a new service that merges the capabilities of Azure SQL Data Warehouse with new enhancements such as on-demand query as a service.

Microsoft said this new offering will help customers put their data to work much more quickly, productively and securely by pulling together insights from all data sources, data warehouses and big data analytics systems. And, the company said, with deeper integration between Power BI and Azure Machine Learning, Azure Synapse Analytics can reduce the time required to process and share that data, speeding up the insights that businesses can glean.

What’s more, it will allow many more businesses to take advantage of game-changing technologies like data analytics and artificial intelligence, which are helping scientists to better predict the weather, search engines to better understand people’s intent and workers to more easily handle mundane tasks.

This newest effort to break down data silos also builds on other Microsoft projects, such as the Open Data Initiative and Azure Data Share, which allows you to share data from multiple sources and even other organizations.

Microsoft said Azure Synapse Analytics is also designed to support the increasingly popular DevOps strategy, in which development and operations staff collaborate more closely to create and implement services that work better throughout their lifecycles.

A learning process

Azure Synapse Analytics is the result of a lot of work, and a little trial and error.

At first, Ramakrishnan said, the team developed high-level guidelines showing customers how to glue the systems together themselves. But they quickly realized that was too much to ask.

“That required a lot of expertise in the nitty-gritty of our platforms,” Ramakrishnan said. “Customers made it overwhelmingly clear that we needed to do better.”

So, the company went back to the drawing board and spent an additional two years revamping the heart of its data business, Azure SQL Data Warehouse, which lets customers build, test, deploy and manage applications and services in the cloud.

A breakthrough came when the company realized that customers need to analyze all their data in a single service, without having to copy terabytes of information across various systems to use different analytic capabilities – as has traditionally been the case with data warehouses and data lakes.

With the new offering, customers can use their data analytics engine of choice, such as Apache Spark or SQL, on all their data. That’s true whether it’s structured data, such as rows of numbers on spreadsheets, or unstructured data, such as a collection of social media posts.
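The distinction between the two kinds of data can be loosely illustrated in plain Python. This is not Synapse or Spark itself, just a standard-library sketch: a SQL query aggregates structured rows, while free-form text has to be processed with code. The table, posts and figures are invented for the example.

```python
import sqlite3

# Structured data: tabular rows, queried naturally with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 120.0), ("east", 80.0), ("west", 50.0)],
)

def east_total(conn):
    # Aggregate structured data with a plain SQL query.
    (total,) = conn.execute(
        "SELECT SUM(amount) FROM sales WHERE region = 'east'"
    ).fetchone()
    return total

# Unstructured data: free-form text, processed with code rather than SQL.
posts = [
    "Loving the new store in the east!",
    "West location was crowded today.",
]

def posts_mentioning(posts, term):
    # An engine like Spark would run this kind of filter at scale.
    return [p for p in posts if term in p.lower()]

print(east_total(conn))                  # 200.0
print(posts_mentioning(posts, "east"))   # only the first post matches
```

The point of a service like Synapse is that both workloads run against the same underlying data, without copying it between a warehouse and a lake first.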

This project was risky. It involved deep technical surgery: completely rewriting the guts of the SQL query processing engine to optimize it for the cloud and make it capable of instantly handling big bursts of work as well as very large and diverse datasets.

It also required unprecedented integration among several teams within Microsoft, some of which would have to make hard choices. Established plans had to be scrapped. Resources earmarked for new features would be redirected to help make the entire system work better.

“In the beginning, the conversations were often heated. But as we got into the flow of it, they became easier. We began to come together,” Ramakrishnan said.

Microsoft also had to make sure that the product would work for any company, regardless of employees’ technical expertise.

“Most companies can’t afford to hire teams of 20 people to drive data projects and wire together multiple systems. There aren’t even enough skilled people out there to do all that work,” said Daniel Yu, director of product marketing for Azure Data and Artificial Intelligence.

Making it easy for customers

Customers can bring together various sources of data into a single feed with Azure Synapse Analytics Studio, a console (or “single pane of glass”) that allows a business professional with minimal technical expertise to locate and collect data from multiple sources such as sales, supply chain, finance and product development. They can then choose how and where to store that data, and they can use it to create reports through Microsoft’s popular Power BI analytics service.
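The “single feed” idea can be sketched in a few lines of Python. In Synapse Studio the experience is graphical rather than code, and the source names and figures below are invented; the sketch only shows the shape of the workflow: merge rows from several departmental sources into one traceable feed, then summarize it for a report.

```python
from collections import defaultdict

# Hypothetical departmental sources; in practice these would be
# connections to real systems (sales, supply chain, finance, ...).
sources = {
    "sales":        [{"region": "east", "value": 120}, {"region": "west", "value": 90}],
    "supply_chain": [{"region": "east", "value": 40}],
    "finance":      [{"region": "west", "value": 75}],
}

def combined_feed(sources):
    # Tag each row with its origin so the combined feed stays traceable.
    return [dict(row, source=name)
            for name, rows in sources.items()
            for row in rows]

def report_by_region(feed):
    # A stand-in for the kind of summary a Power BI report might show.
    totals = defaultdict(int)
    for row in feed:
        totals[row["region"]] += row["value"]
    return dict(totals)

print(report_by_region(combined_feed(sources)))  # {'east': 160, 'west': 165}
```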

In a matter of hours, Azure Synapse will deliver useful business insights that used to take days, weeks or even months, said Rohan Kumar, corporate vice president for Azure Data.

“Let’s say an executive wants a detailed report on sales performance in the eastern U.S. over the last six months,” Kumar said. “Today, a data engineer has to do a lot of work to find where that data is stored and write a lot of brittle code to tie various services together. They might even have to bring in a systems integrator partner. With Azure Synapse, there’s no code required. It’s a much more intuitive experience.”

The complexity of the technical problems Azure Synapse addressed would be hard to overstate. Microsoft had to meld multiple independent components into one coherent form factor, while giving a wide range of people – from data scientists to line of business owners – their preferred tools for accessing and using data.


“With Azure Synapse, there’s no code required. It’s a much more intuitive experience.”

~ Rohan Kumar, corporate vice president for Azure Data


That includes products like SQL Server, the open source programming interface Apache Spark, Azure Data Factory and Azure Data Studio, as well as notebook interfaces preferred by many data professionals to clean and model data.

“Getting all those capabilities to come together fluidly, making it run faster, simpler, eliminating overlapping processes – there was some scary good stuff getting done,” Ramakrishnan said.

The result is a data analytics system that will be as easy to use as a modern mobile phone. Just as the smartphone replaced several devices by making all of their core capabilities intuitively accessible in a single device, the Azure Synapse “smartphone for data” now allows a data engineer to build an entire end-to-end data pipeline in one place. It also enables data scientists and analysts to look at the underlying data in ways that are natural to them.

And just as the phone has driven waves of collaboration and business innovation, Azure Synapse will free up individuals and companies to introduce new products and services as quickly as they can dream them up, Microsoft said.

“If we can help different people view data through a lens that is natural to them, while it’s also visible to others in ways natural to them, then we will transform the way companies work,” Ramakrishnan said. “That’s how we should measure our success.”

Top photo: Rohan Kumar, corporate vice president for Azure Data, says Azure Synapse will deliver useful business insights that used to take days, weeks or even months. Photo by Scott Eklund/Red Box Pictures.

Go to Original Article
Author: Microsoft News Center

IoT security will set innovation free: Azure Sphere general availability scheduled for February 2020

Today, at the IoT Solutions World Congress, we announced that Azure Sphere will be generally available in February of 2020. General availability will mark our readiness to fulfill our security promise at scale, and to put the power of Microsoft’s expertise to work for our customers every day by delivering more than a decade of ongoing security improvements and OS updates directly to each device.

Since we first introduced Azure Sphere in 2018, the IoT landscape has quickly expanded. Today, there are more connected things than people in the world: 14.2 billion in 2019, according to Gartner, and this number is expected to hit 20 billion by 2020. Although this number appears large, we expect IoT adoption to accelerate, providing connectivity to hundreds of billions of devices. This massive growth will only raise the stakes for devices that are not secured.

Recent research by Bain & Co. lists security as the leading barrier to IoT adoption. In fact, enterprise customers would buy at least 70 percent more IoT devices if products addressed their concerns about cybersecurity. According to Bain & Co., enterprise executives, who innately understand the risk that connectivity poses to their brands and customers, are willing to pay a 22 percent premium for secured devices.

Azure Sphere’s mission is to empower every organization on the planet to connect and create secured and trustworthy IoT devices. We believe that for innovation to deliver durable value, it must be built on a foundation of security. Our customers need and expect reliable, consistent security that will set innovation free. To deliver on this, we’ve made several strategic investments and partnerships that make it possible to meet our customers wherever they are on their IoT journey.

Delivering silicon choice to enable heterogeneity at the edge

By partnering with silicon leaders, we can combine our expertise in security with their unique capabilities to best serve a diverse set of customer needs.

MediaTek’s MT3620, the first Azure Sphere certified chip produced, is designed to meet the needs of the more traditional MCU space, including Wi-Fi-enabled scenarios. Today, our customers across industries are adopting the MT3620 to design and produce everything from consumer appliances to retail and manufacturing equipment—these chips are also being used to power a series of guardian modules to securely connect and protect mission-critical equipment.

In June, we announced our collaboration with NXP to deliver a new Azure Sphere certified chip. This new chip will be an extension of their popular i.MX 8 high-performance applications processor series and be optimized for performance and power. This will bring greater compute capabilities to our line-up to support advanced workloads, including artificial intelligence (AI), graphics, and richer UI experiences.

Earlier this month, we announced our collaboration with Qualcomm to deliver the first cellular-enabled Azure Sphere chip. With ultra-low-power capabilities, this new chip will light up a broad new set of scenarios and give our customers the freedom to securely connect anytime, anywhere.

Streamlining prototyping and production with a diverse hardware ecosystem

Manufacturers are looking for ways to reduce cost, complexity, and time to market when designing new devices and equipment. Azure Sphere development kits from our partners at Seeed Studios and Avnet are designed to streamline the prototyping and planning when building Azure Sphere devices. When you’re ready to shift gears into production mode, there are a variety of modules by partners including AI-Link, USI, and Avnet to help you reduce costs and accelerate production so you can get to market faster.

Adding secured connectivity to existing mission-critical equipment

Many enterprises are looking to unlock new value from existing equipment through connectivity. Guardian modules are designed to help our customers quickly bring their existing investments online without taking on risk and jeopardizing mission-critical equipment. Guardian modules plug into existing physical interfaces on equipment, can be easily deployed with common technical skillsets, and require no device redesign. The deployment is fast, does not require equipment to be replaced before its end of life, and quickly pays for itself. The first guardian modules are available today from Avnet and AI-Link, with more expected soon.

Empowering developers with the right tools

Developers need tools that are as modern as the experiences they aspire to deliver. In September of 2018, we released our SDK preview for Visual Studio. Since then, we’ve continued to iterate rapidly, making it quicker and simpler to develop, deploy, and debug Azure Sphere apps. We also built out a set of samples and solutions on GitHub, providing easy building blocks for developers to get started. And, as we shared recently, we’ll soon have an SDK for Linux and support for Visual Studio Code. By empowering their developers, we help manufacturers bring innovation to market faster.

Creating a secure environment for running an RTOS or bare-metal code

As manufacturers transform MCU-powered devices by adding connectivity, they want to leverage existing code running on an RTOS or bare-metal. Earlier this year, we provided a secured environment for this code by enabling the M4 core processors embedded in the MediaTek MT3620 chip. Code running on these real-time cores is programmed and debugged using Visual Studio. Using these tools, such code can easily be enhanced to send and receive data via the protection of a partner app running on the Azure Sphere OS, and it can be updated seamlessly in the field to add features or to address issues. Now, manufacturers can confidently secure and service their connected devices, while leveraging existing code for real-time processing operations.

Delivering customer success

Deep partnerships with early customers have helped us understand how IoT can be implemented to propel business, and the critical role security plays in protecting their bottom line, brand, and end users. Today, we’re working with hundreds of customers who are planning Azure Sphere deployments. Here are a few highlights from across retail, healthcare, and energy:

  • Starbucks—In-store equipment is the backbone not just of commerce, but of the entire customer experience. To reduce disruptions and maintain a quality experience, Starbucks is partnering with Microsoft to deploy Azure Sphere across its existing mission-critical equipment in stores globally using guardian modules.
  • Gojo—Gojo Industries, the inventor of PURELL Hand Sanitizer, has been driving innovation to improve hygiene compliance in health organizations. Deploying motion detectors and connected PURELL dispensers in healthcare facilities made it possible to quantify hand-cleaning behavior and implement better practices. Now, PURELL SMARTLINK Technology is being upgraded with Azure Sphere to deploy secure and connected dispensers in hospitals.
  • Leoni—Leoni develops cable systems that are central components within critical application fields that manage energy and data for the automotive sector and other industries. To make cable systems safer, more reliable, and smarter, Leoni uses Azure Sphere with integrated sensors to actively monitor cable conditions, creating intelligent and connected cable systems.

Looking forward

We want to empower every organization on the planet to connect and create secure and trustworthy IoT devices. While Azure Sphere leverages deep and extensive Microsoft heritage that spans hardware, software, cloud, and security, IoT is our opportunity to prove we can deliver in a new space. Our work, our collaborations, and our partnerships are evidence of the commitment we’ve made to our customers—to give them the tools and confidence to transform the world with new experiences. As we close in on the milestone achievement of Azure Sphere general availability, we are already focused on how to give our customers greater opportunities to securely shape the future.

Go to Original Article
Author: Microsoft News Center

What are the Azure Stack HCI deployment, management options?

There are several management approaches and deployment options for organizations interested in using the Azure Stack HCI product.

Azure Stack HCI is a hyper-converged infrastructure product, similar to other offerings in which each node holds processors, memory, storage and networking components. Third-party vendors sell the nodes, which can scale should the organization need more resources. A purchase of Azure Stack HCI includes the hardware, the Windows Server 2019 operating system, management tools, and service and support from the hardware vendor. At the time of publication, Microsoft’s Azure Stack HCI catalog lists more than 150 offerings from 19 vendors.

Azure Stack HCI, not to be confused with Azure Stack, gives IT pros full administrator rights to manage the system.

Tailor the Azure Stack HCI options for different needs

The basic components of an Azure Stack HCI node might be the same, but an organization can customize them for different needs, such as better performance or lowest price. For example, a company that wants to deploy a node in a remote office/branch office might select Lenovo’s ThinkAgile MX Certified Node, or its SR650 model. The SR650 scales to two nodes that can be configured with a variety of processors offering up to 28 cores, up to 1.5 TB of memory, hard drive combinations providing up to 12 TB (or SSDs offering more than 3.8 TB), and networking with 10/25 GbE. Each node comes in a 2U physical form factor.

If the organization needs the node for more demanding workloads, one option is the Fujitsu Primeflex. Azure Stack HCI node models such as the all-SSD Fujitsu Primergy RX2540 M5 scale to 16 nodes. Each node can range from 16 to 56 processor cores, up to 3 TB of SSD storage and 25 GbE networking.

Management tools for Azure Stack HCI systems

Microsoft positions the Windows Admin Center (WAC) as the ideal GUI management tool for Azure Stack HCI, but other familiar utilities will work on the platform.

The Windows Admin Center is a relatively new browser-based tool for consolidated management for local and remote servers. The Windows Admin Center provides a wide array of management capabilities, such as managing Hyper-V VMs and virtual switches, along with failover and hyper-converged cluster management. While it is tailored for Windows Server 2019 — the server OS used for Azure Stack HCI — it fully supports Windows Server 2012/2012 R2 and Windows Server 2016, and offers some functionality for Windows Server 2008 R2.

Azure Stack HCI users can also use more established management tools such as System Center. The System Center suite components handle infrastructure provisioning, monitoring, automation, backup and IT service management. System Center Virtual Machine Manager provisions and manages the resources used to create and deploy VMs and private clouds. System Center Operations Manager monitors services, devices and operations throughout the infrastructure.

Other tools are also available, including PowerShell (both Windows PowerShell and the open source PowerShell Core), as well as third-party products such as 5nine Manager for Windows Server 2019 Hyper-V management, monitoring and capacity planning.

It’s important to evaluate each management tool’s compatibility with the Azure Stack HCI platform, as well as with other components of the enterprise infrastructure.

Go to Original Article
Author:

Do you know the difference in the Microsoft HCI programs?

While similar in name, Microsoft’s Azure Stack and Azure Stack HCI products are substantially different product offerings designed for different use cases.

Azure Stack brings Azure cloud capabilities into the data center for organizations that want to build and run cloud applications on localized resources. Azure Stack HCI operates on the same Hyper-V-based, software-driven compute, storage and networking technologies but serves a fundamentally different purpose. This new Microsoft HCI offering is a hyper-converged infrastructure product that combines vendor-specific hardware with Windows Server 2019 Datacenter edition and management tools to provide a highly integrated and optimized computing platform for local VM workloads.

Azure Stack gives users a way to employ Azure VMs for Windows and Linux, Azure IoT and Event Hubs, Azure Marketplace, Docker containers, Azure Key Vault, Azure Resource Manager, Azure Web Apps and Functions, and Azure administrative tools locally. This functionality gives an organization the benefits of Azure cloud operation, while also satisfying regulatory requirements that require workloads to run in the data center.

Azure Stack HCI offers optional connections to an array of Azure cloud services, including Azure Site Recovery, Azure Monitor, Azure Backup, Azure Update Management, Azure File Sync and Azure Network Adapter. However, these services themselves remain in the Azure cloud. Also, there is no way to convert this Microsoft HCI product into an Azure Stack deployment.

Windows Server Software-Defined products still exist

Azure Stack HCI evolved from Microsoft’s Windows Server Software-Defined (WSSD) HCI offering. The WSSD program still exists; the main difference on the software side is that hardware in the WSSD program runs the Windows Server 2016 OS.

WSSD HCI is similar to Azure Stack HCI with a foundation of vendor-specific hardware, the inclusion of Windows Server technologies — Hyper-V, Storage Spaces Direct and software-defined networking — and Windows Admin Center for systems management. Azure Stack HCI expands on WSSD through improvements to Windows Server 2019 and tighter integration with Azure services.

Go to Original Article
Author: