Tag Archives: Azure

Changing the world through data science

By Kenji Takeda, Director, Azure for Research

Alan Turing asked the question “can machines think?” in 1950 and it still intrigues us today. At The Alan Turing Institute, the United Kingdom’s national institute for data science in London, more than 150 researchers are pursuing this question by bringing their thinking to fundamental and real-world problems to push the boundaries of data science.

One year ago, The Turing first opened its doors to 37 PhD students, 117 Turing Fellows and visiting researchers, 6 research software engineers and more than 5,000 researchers for its workshops and events. I have been privileged to be one of these visiting fellows, helping the researchers take a cloud-first approach through our contribution of $5 million of Microsoft Azure cloud computing credits to The Turing. To be part of this world-leading center of data science research is exhilarating. Cloud computing is unlocking an impressive level of ambition at The Turing, allowing researchers to think bigger and unleash their creativity.

“We have had an exceptional first year of research at The Turing. Working with Microsoft, our growing community of researchers have been tooled up with skills and access to Azure for cloud computing and as a result they’ve been able to undertake complex data science tasks at speed and with maximum efficiency, as illustrated by some of the stories of Turing research showcased today. We look forward to growing our engagement with the Azure platform to help us to undertake even bigger and more ambitious research over the coming academic year.”
~ Andrew Blake, Research Director, The Alan Turing Institute

Human society is one of the most complex systems on the planet and measuring aspects of it has been extremely difficult until now. Merve Alanyali and Chanuki Seresinhe are graduate students from the University of Warwick who are spending a year at The Turing applying novel computational social science techniques to understand human happiness and frustration. They are using AI and deep neural networks to analyze millions of online photos with Microsoft Azure and their findings are providing deeper insights into the human condition.


Kenneth Heafield, Turing Fellow from the University of Edinburgh, has been using thousands of Azure GPUs (graphics processing units) to explore and optimize neural machine translation systems for multiple languages in the Conference on Machine Translation. Azure GPUs enabled the group to participate in more languages, producing substantially better results than last year and winning first place in some language pairs. The team is working closely with Intel on using new architectures, including FPGAs (field-programmable gate arrays) like Microsoft’s Project Catapult, to make even bigger gains in machine translation.

Microsoft is delighted to see The Alan Turing Institute setting up a deep research program around ethics, a crucial topic in data science, AI and machine learning. Our own human-centered design principles are that AI technology should be transparent, secure, inclusive and respectful, and also maintain the highest degree of privacy protection. We are pleased that Luciano Floridi is leading the Data Ethics research group at The Turing as his perspectives on areas such as healthcare are helping us to think about how we can ensure that technology is used in the most constructive ways.

The first year at The Turing has been impressive. We look forward to another exciting year as we work together on projects in data-centric engineering, blockchain, healthcare and secure cloud computing. Along with Microsoft’s data science collaborations at University of California, Berkeley, and through the National Science Foundation Big Data Innovation Hubs, we are perhaps getting closer to answering Alan Turing’s profound question from 67 years ago.


Meet the Azure Analysis Services team at PASS Summit 2017


Members of the Azure Analysis Services team will be presenting two sessions at this year’s PASS Summit 2017 in Seattle, WA. Team members will also be available in the SQL Clinic to answer your Analysis Services questions directly in a one-on-one setting. Sessions include the following:

Creating Enterprise Grade BI Models with Azure Analysis Services or SQL Server Analysis Services

Speakers: Bret Grinslade and Christian Wade

Level: 400

Microsoft Azure Analysis Services and SQL Server Analysis Services enable you to build comprehensive, enterprise-scale analytic solutions that deliver actionable insights through familiar data visualization tools such as Microsoft Power BI and Microsoft Excel. Analysis Services enables consistent data across reports and users of Power BI. This session will reveal new features for large, enterprise models in the areas of performance, scalability, model management, and monitoring. Learn how to use these new features to deliver tabular models of unprecedented scale with easy data loading and simplified user consumption.

Deliver Enterprise BI on Big Data

Speakers: Bret Grinslade and Josh Caplan

Level: 300

Learn how to deliver analytics at the speed of thought with Azure Analysis Services on top of a petabyte-scale SQL Data Warehouse, Azure Data Lake, or HDInsight implementation. This session will cover best practices for managing, processing, and accelerating queries at scale, implementing change management for data governance, and designing for performance and security. These advanced techniques will be demonstrated through an actual implementation, including architecture, code, and data flows, along with tips and tricks.

Learn more about PASS Summit 2017. We hope to see you there.

Blackbaud and Microsoft to strengthen strategic partnership to digitally transform the nonprofit sector – News Center

Social good software leader Blackbaud bets big on Microsoft Azure as the two companies plan to go deeper on integrations, innovation and sector leadership to scale global good

BALTIMORE — Oct. 18, 2017 — As part of bbcon 2017, Blackbaud (Nasdaq: BLKB), the world’s leading cloud software company powering social good, and Microsoft Corp. (Nasdaq: MSFT) plan to expand their partnership in support of their mutual goals to digitally transform the nonprofit sector.

The nonprofit sector is the third-largest workforce in the United States, behind retail and manufacturing, and comprises approximately 3 million organizations globally. Blackbaud, the largest vertical cloud software provider in the space, announced its intention to fully power its social good-optimized cloud, Blackbaud SKY™, with Microsoft Azure. The two companies highlighted a three-point commitment to collaboration for the good of the global nonprofit community. This announcement comes just days after Microsoft launched its new Tech for Social Impact Group, which is dedicated to accelerating technology adoption and digital transformation within the nonprofit industry to deliver greater impact on the world’s most critical social issues.

“This newly expanded partnership between Microsoft and Blackbaud will allow both companies to better meet the unique technology challenges nonprofits face,” said Justin Spelhaug, general manager of Microsoft Tech for Social Impact. “By combining Microsoft’s cloud platforms and expertise with Blackbaud’s leading industry solutions, we will create new opportunities for digital transformation to empower nonprofits to make an even bigger impact on the world.”

“The nonprofit community plays a vital role in the health of the entire social economy, and we’ve been working for more than three decades to help these inspiring organizations achieve big, bold mission outcomes,” said Mike Gianoni, president and CEO of Blackbaud. “For nearly that long we’ve also been a Microsoft partner, and we’re incredibly enthusiastic about forging new ground together as we tackle some of the most pressing issues nonprofits face. Both companies couldn’t be more committed to this space, so the nonprofit community should expect great things from this expanded partnership.”

The newly expanded partnership between Microsoft and Blackbaud will focus on three key areas:

Deeper integration between Microsoft and Blackbaud solutions, with Blackbaud’s cloud platform for social good, Blackbaud SKY, powered by Microsoft Azure

Blackbaud has been developing on the Microsoft stack for over three decades. As a leading Global ISV Partner, Blackbaud is already one of Microsoft’s top Azure-based providers. Today, Blackbaud announced its intention to fully power Blackbaud SKY™, its high-performance cloud exclusively designed for the social good community, in Microsoft’s Azure environment.

“Blackbaud’s expanded Azure commitment will be one of the most significant partner bets on Microsoft’s hyperscale cloud, and the most significant to transform the social good space,” Spelhaug said. “We often highlight the engineering work behind Blackbaud SKY™, because it demonstrates the power of Microsoft Azure and the kind of forward-looking innovation and leadership that the nonprofit sector greatly needs.”

Details of the investment are not publicly available but the companies plan to share more about the partnership in coming months. Blackbaud also announced its plans to become a CSP (Cloud Solution Provider) partner for the Microsoft platform, simplifying the purchase, provisioning and management of Blackbaud and Microsoft cloud offerings. For nonprofits that want the security, power and flexibility of the cloud plus the services and support of a trusted solution provider that deeply understands their unique needs, Blackbaud will be able to deliver both Microsoft and Blackbaud solutions through a unified purchase experience.

A commitment to pursuing best-in-class nonprofit cloud solutions that bring together the best of both companies’ innovation for a performance-enhanced experience for nonprofits — from funding, to mission operations, to program delivery

Blackbaud and Microsoft plan to pursue innovative ways to fully harness the power, security and reliability of Microsoft’s Azure-powered solutions (e.g., Office 365, Dynamics) and Blackbaud’s industry-leading, outcome-focused solutions that cater specifically to the unique workflow and operating model needs of nonprofits — all with the goal of improving nonprofit performance across the entire mission lifecycle.

This includes exploring how both companies’ respective cloud artificial intelligence (AI) and analytics innovations can be leveraged in new ways to drive even greater sector impact.

“There is massive opportunity to empower the nonprofit community through creative tech innovation,” said Kevin McDearis, chief products officer at Blackbaud. “Every 1 percent improvement in fundraising effectiveness makes $2.8 billion available for frontline program work. This is just one example of the type of impact Blackbaud focuses on with our workflows and embedded intelligence, and we couldn’t be more thrilled to team up with Microsoft to push into new areas of innovation that move the sector forward, faster.”

Joint sector leadership initiatives that make innovation, research and best practices more accessible to nonprofits around the world

Nonprofits are addressing some of the world’s most complicated issues. As shared value companies, Microsoft and Blackbaud share a commitment to helping nonprofits meet those needs. Microsoft is globally known for its unmatched philanthropic reach and impact. And Blackbaud, which exclusively builds software for social good, invests more in R&D and best-practice-driven research for global good than any technology provider. Both companies were among just 56 companies named to the Fortune 2017 Change the World list.

Together, Microsoft and Blackbaud intend to partner on initiatives that make innovation more accessible for nonprofits large and small, while also exploring ways the companies’ data assets, community outreach and sector leadership can be synergistically and responsibly applied to improve the effectiveness and impact of the entire nonprofit community.

Microsoft and Blackbaud will share further details in the coming months. Learn more about Microsoft’s Technology for Social Impact Group here. Visit www.Blackbaud.com for more on Blackbaud.

About Blackbaud

Blackbaud (NASDAQ: BLKB) is the world’s leading cloud software company powering social good. Serving the entire social good community—nonprofits, foundations, corporations, education institutions, healthcare institutions and individual change agents—Blackbaud connects and empowers organizations to increase their impact through software, services, expertise, and data intelligence. The Blackbaud portfolio is tailored to the unique needs of vertical markets, with solutions for fundraising and CRM, marketing, advocacy, peer-to-peer fundraising, corporate social responsibility, school management, ticketing, grantmaking, financial management, payment processing, and analytics. Serving the industry for more than three decades, Blackbaud is headquartered in Charleston, South Carolina and has operations in the United States, Australia, Canada and the United Kingdom. For more information, visit www.blackbaud.com.

About Microsoft

Microsoft (Nasdaq “MSFT” @microsoft) is the leading platform and productivity company for the mobile-first, cloud-first world, and its mission is to empower every person and every organization on the planet to achieve more.

Microsoft Media Relations, WE Communications for Microsoft, (425) 638-7777, rrt@we-worldwide.com

Nicole McGougan, Public Relations Manager for Blackbaud, (843) 654-3307, media@blackbaud.com

Scality Connect ports S3 apps to Azure Blob storage

Object storage vendor Scality is moving to connect Amazon S3 apps to Microsoft Azure Blob storage in multicloud setups.

Scality Connect software, which launched last week, can help customers overcome the hurdle of porting an application based on the Simple Storage Service (S3) API to Azure Blob storage.

Scality plans to announce advanced Amazon S3 API support in December, along with versioning and bucket website support, said Wally MacDermid, vice president of business development for cloud at Scality, based in San Francisco.

John Webster, a senior partner at Evaluator Group in Boulder, Colo., said the multicloud play will be of particular interest to the DevOps groups within organizations. Many developers spend a great deal of time doing API modifications to applications.

“Anytime you can relieve the user of that burden is good. [Lack of interoperability] is a big issue. This is the last thing customers want,” Webster said of the need to modify APIs. “They just hate it. They have to modify APIs to work with other APIs.”

MacDermid said there is no hardware requirement for Scality Connect. It is delivered as a stateless container inside an Azure subscription: Connect stores data in the native Microsoft Azure Blob storage format, and the container runs in a virtual machine within the customer’s subscription.

“We don’t hold any data. We just pass it to the Azure cloud,” MacDermid said. “An application that works on S3 can run in Azure without requiring any modification in the code.

“Once the data is up in Azure, you can use the Azure management services on top of it.”

Scality Connect makes it easier for developers to deploy applications within Microsoft Azure and use its advanced services. The software is available through the Azure Marketplace.

The Microsoft Azure and Google clouds do not support the Amazon S3 API, which has become the de facto cloud standard in the industry. That means the Azure Blob storage does not talk to the Amazon S3 API, which limits a customer’s ability to use multiple clouds.

“One side talks S3, and the other side talks the Azure API, and neither talks to each other,” MacDermid said. “This is a problem not only for customers, but for Azure, as well. [Microsoft] would admit that. The Scality Connect runs in the Azure Cloud. It gets your data up to the Azure Cloud and allows you to use the Azure services. We are the translation layer.”
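The translation-layer idea can be sketched in a few lines of Python. This is a hypothetical illustration of the concept only — all of the names below are invented, and it is not Scality’s actual API: the layer holds no data, it simply rewrites S3-vocabulary calls into the equivalent Azure Blob vocabulary and passes them through.

```python
# Hypothetical sketch of an S3-to-Blob translation layer.
# Names and structures are illustrative, not Scality's API.

def translate_s3_to_blob(s3_request):
    """Map an S3-style request onto the equivalent Azure Blob operation."""
    # S3 and Azure Blob use different vocabulary for the same concepts.
    op_map = {
        "PutObject": "PutBlob",
        "GetObject": "GetBlob",
        "DeleteObject": "DeleteBlob",
        "ListObjects": "ListBlobs",
        "CreateBucket": "CreateContainer",
    }
    return {
        "operation": op_map[s3_request["operation"]],
        # An S3 "bucket" corresponds to a Blob "container".
        "container": s3_request["bucket"],
        "blob": s3_request.get("key"),
    }

# An unmodified S3-style call...
req = {"operation": "PutObject", "bucket": "photos", "key": "cat.jpg"}
# ...comes out the other side as an Azure Blob-style call.
print(translate_s3_to_blob(req))
# {'operation': 'PutBlob', 'container': 'photos', 'blob': 'cat.jpg'}
```

Because the mapping is stateless, the application code that speaks S3 never needs to change; only the endpoint it talks to does.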

Scality Connect is not the vendor’s first multicloud initiative. Scality in July unveiled its Zenko open source software controller for multicloud management to store data and applications under a single user interface no matter where they reside, including Scality Ring. It helps customers match specific workloads to the best cloud service. Zenko is based on the Scality S3 Server.

A pair of big companies explain why they jumped on board the Microsoft cloud

Julia White, Microsoft Corporate VP of Azure


In the cloud service market, Microsoft finds itself
firmly in second place. 

But in trying to catch up with market leader Amazon, the
tech giant argues it has a distinct advantage — its
long history in the business software game and its
long-established relationships with companies of all sizes.
Microsoft knows how to meet companies’ needs, it argues.

That’s not just an idle boast, if my conversations with Geico and
Dun & Bradstreet are any indication. Both companies recently
turned to Microsoft’s Azure cloud service. And in both cases, the
companies saw distinct benefits to using Azure over rival cloud services.

“You can tell Microsoft is hungry,” said Pat McDevitt, the chief
technology officer of Dun & Bradstreet, which recently
started experimenting with Azure. “They are doing exactly what
they need to do.”

Azure is in the spotlight this week thanks to Ignite, Microsoft’s
annual developer conference. The company typically uses Ignite to
promote its massive cloud computing platform. At this year’s event,
Microsoft announced several tools to make it easier for large
companies in particular to use Azure. 

‘Essentially evacuating the data center’

Fikri Larguet, Director of Information Technology at Geico

One big company that’s already embraced Azure is Geico. The
insurance giant began shifting over to Microsoft’s cloud
service about three years ago, said Fikri Larguet, the
company’s director of information technology. Geico’s rationale:
Owning and operating data centers and servers is both expensive
and outside its core area of expertise.

The company, which has more than 38,000 employees, is
“essentially evacuating the data center,” Larguet
said. Geico, a wholly-owned subsidiary of Berkshire
Hathaway, has been moving a little bit at a time. But over 50% of
the company’s core business services are already in the cloud and
its goal is to be “full cloud” by 2020, he said.

Larguet said his team has a mandate to “get out of the data
center business as quickly as possible.” 

Geico bet on Azure because Microsoft had already built into its
cloud service the ability to interact with lots of different
applications. That made it a smooth process for Geico
to bring over its existing software, Larguet said. Similar
support for newer tools and technologies also made it easier
for Geico to add on things like software containers, a trendy new
Silicon Valley technology for building large-scale apps. 

The biggest challenge Geico has faced hasn’t been technical,
Larguet said. As the team tries to adjust to a post-data center
era, Larguet is trying to teach the team to “fail fast” and
be unafraid of trying new things. For him, this is a chance for a
fresh start in the software organization. 

“We don’t want to carry over our bad habits,” he said.

‘We don’t need to be bleeding edge’

Pat McDevitt, Dun & Bradstreet CTO

Dun & Bradstreet, a firm that’s provided data and analytics
to businesses since 1841, is taking its cloud migration a little
more slowly. 

The firm “doesn’t need to be a pioneer, doesn’t need to be
bleeding edge,” McDevitt, its CTO, said. Dun & Bradstreet
has been around the better part of two centuries; it can
afford to experiment with the cloud rather than rushing to push
everything over to it right away. And the firm has been
experimenting; it moved over some key applications to Amazon
Web Services a few years ago.

About three months ago, though, the firm started
experimenting with Azure. What appealed to Dun & Bradstreet
were Microsoft’s tools for managing data. Those tools make it
easy for companies to build cloud-based applications that
read and write from their existing databases. With them, the
firm could more quickly build mobile apps and other new-wave applications.

McDevitt asked one of the firm’s development teams in Austin
to test Azure by using it to build a new mobile app for Dun &
Bradstreet’s clients. Although these developers’ past experience
was primarily with Amazon’s cloud service, they found it so easy
to work with Azure that they finished ahead of schedule. 

And Azure offered the firm another benefit.
Because Microsoft has embraced technology like Docker
software containers and the Linux operating system, Azure
integrated better with Dun & Bradstreet’s existing
systems than McDevitt had first thought. Originally, the
firm was going to use Azure just for new apps. Now the firm
is considering using it for older apps also, he said. 

Microsoft worked really hard to win his business, McDevitt said.

Get the latest Microsoft stock price here.

Announcing general availability of Azure Managed Applications Service Catalog

Today we are pleased to announce the general availability of Azure Managed Applications Service Catalog.

Service Catalog allows corporate central IT teams to create and manage a catalog of solutions and applications to be used by employees in that organization. It enables organizations to centrally manage approved solutions and ensure compliance, and it lets end customers (in this case, the employees of an organization) easily discover the list of approved solutions. Employees can consume these solutions without having to learn how a solution works in order to service, upgrade or manage it; all of that is taken care of by the central IT team that publishes and owns the solution.

In this post, we will walk through the new capabilities that have been added to Managed Applications and how they improve the overall experience.


We have made improvements to the overall experience and made authoring much easier and more straightforward. Some of the major improvements are described below.

Package construction simplified

In the preview version, the publisher needed to author three files and package them in a zip. One of them was a template file containing only the Microsoft.Solutions/appliances resource; in it, the publisher also had to re-specify all of the parameters needed to deploy the actual resources, even though those parameters were already declared in the main template file. This was redundant and often confusing for publishers. Going forward, this file is auto-generated by the service.

So, only two files are now required in the package (.zip): mainTemplate.json (the template file containing the resources to be provisioned) and createUIDefinition.json.

If your solution uses nested templates, scripts or extensions, those don’t need to change.
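As a sketch, the two-file GA package can be assembled with a few lines of Python. The JSON bodies below are minimal illustrative stubs rather than a working solution, and the schema URLs and handler name are typical values, not taken from this announcement:

```python
# Assemble a GA-style Managed Application package: only
# mainTemplate.json and createUIDefinition.json are required now.
# The JSON contents are illustrative stubs, not a complete solution.
import json
import zipfile

main_template = {
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {"storageAccountName": {"type": "string"}},
    "resources": [],  # the resources to be provisioned go here
}

create_ui_definition = {
    "$schema": "https://aka.ms/azure-createuidef/0.1.2-preview/CreateUIDefinition.MultiVm.json#",
    "handler": "Microsoft.Compute.MultiVm",
    "version": "0.1.2-preview",
    "parameters": {"basics": [], "steps": [], "outputs": {}},
}

# Write both files straight into the .zip package.
with zipfile.ZipFile("managedapp.zip", "w") as pkg:
    pkg.writestr("mainTemplate.json", json.dumps(main_template, indent=2))
    pkg.writestr("createUIDefinition.json", json.dumps(create_ui_definition, indent=2))

# Verify the package contents.
with zipfile.ZipFile("managedapp.zip") as pkg:
    print(sorted(pkg.namelist()))
# ['createUIDefinition.json', 'mainTemplate.json']
```

The resulting .zip can then be uploaded to a publicly accessible blob and referenced when creating the managed application definition, or — per the inline-file option described later in this post — the two files can be passed directly to the CLI or PowerShell.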

Portal support enabled

At preview, we only had CLI support for creating a managed application definition for the Service Catalog. Now we have added portal and PowerShell support. With this, the central IT team of an organization can use the portal to quickly author a managed application definition and share it within the organization, without needing to learn the different CLI commands.

These can be discovered in the portal by clicking “More Services” and then searching for “Managed”. Don’t use the entries marked “Preview”.


To create a managed application definition, select “Service Catalog managed application definitions” and click the “Add” button. This opens the blade shown below.


Support for providing template files inline instead of packaging as .zip

Creating a .zip file, uploading it to a blob, making it publicly accessible, getting the URL, and then creating the managed application definition still required a lot of steps. So we have enabled another option: you can specify these files inline using new parameters added to the CLI and PowerShell. Support for inline template files will be added to the portal shortly.

Service Changes

Please note that the following major changes have been made to the service.

New api-version

The general availability release introduces a new api-version, 2017-09-01, which enables all of the above-mentioned improvements. The Azure portal and the latest versions of Azure CLI and Azure PowerShell use this new api-version, and you will need to switch to these latest versions to develop and manage Managed Applications. Note that creating and managing Managed Applications with the existing version of the CLI will not be supported after 9/25/2017. Existing resources created using the old api-version (old CLI) will continue to work.

Resource type names have changed

The resource type names have changed in the new api-version: Microsoft.Solutions/appliances is now Microsoft.Solutions/applications, and Microsoft.Solutions/applianceDefinitions is now Microsoft.Solutions/applicationDefinitions.
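If you maintain templates that still reference the old names, the rename is mechanical. A hedged Python sketch follows; the helper function and sample template are invented for illustration, only the two type-name mappings come from this announcement:

```python
# Rewrite the pre-GA Managed Applications resource type names
# to their GA equivalents inside a template dictionary.
RENAMES = {
    "Microsoft.Solutions/appliances": "Microsoft.Solutions/applications",
    "Microsoft.Solutions/applianceDefinitions": "Microsoft.Solutions/applicationDefinitions",
}

def upgrade_resource_types(template: dict) -> dict:
    """Return a copy of the template with old resource types replaced.
    Types outside the rename table are left untouched."""
    upgraded = dict(template)
    upgraded["resources"] = [
        {**res, "type": RENAMES.get(res["type"], res["type"])}
        for res in template.get("resources", [])
    ]
    return upgraded

old = {"resources": [{"type": "Microsoft.Solutions/appliances",
                      "apiVersion": "2016-09-01-preview"}]}
print(upgrade_resource_types(old)["resources"][0]["type"])
# Microsoft.Solutions/applications
```

Remember to also move such templates to the new api-version (2017-09-01) described above when you redeploy them.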

Upgrade to the latest CLI and PowerShell

As mentioned above, to continue using and creating Managed Applications, you will have to use the latest version of CLI and PowerShell, or you can use the Azure portal. Existing versions of these clients built on the older api-version will no longer be supported. Your existing resources will be migrated to use the new resource types and will continue to work using the new version of the clients.

Supported locations

Currently, the supported locations are West Central US and West US 2.

Please try out the new version of the service and let us know your feedback through our user voice channel or in the comments below.

Additional resources

Catch up on Azure Networking and Azure Active Directory news from Microsoft Ignite – The Fire Hose

Infographic: VNet Service Endpoints restrict Azure services so they can be accessed only from a VNet.

Virtual Network Service Endpoints in Azure and expanded conditional access capabilities for Azure Active Directory were among the announcements made at the Microsoft Ignite conference this week in Orlando, Florida.

“As you continue to bring your mission-critical workloads to Azure, we will continue to simplify the overall network management, security, scalability, availability and performance of your applications,” writes Yousef Khalidi, corporate vice president of Azure Networking.

Azure services such as Storage and SQL have internet-facing IP addresses, and many customers would prefer that their Azure services not be exposed directly to the internet, Khalidi says. “Virtual Network Service Endpoints extend your virtual network private address space and the identity of your VNet to Azure services. You can restrict Azure resources to only be accessed from your VNet and not via the internet.”

Meanwhile, Alex Simons, director of program management for the Microsoft Identity Division, writes about a new wave of scenarios that expand conditional access capabilities, including integration across the Enterprise Mobility + Security (EMS), Azure Information Protection and Microsoft Cloud App Security services, also announced at Ignite.

Learn more by visiting the Microsoft Azure Blog and the Enterprise Mobility and Security Blog.

Tags: Azure, azure active directory, Ignite 2017, Security

Use Azure Storage Explorer to manage Azure storage accounts

You might have used third-party tools in the past to manage Azure storage accounts — including storage blobs, queues and table storage — and VM files, but there’s another option. Microsoft developed an Azure storage management tool that can manage multiple Azure storage accounts, which helps increase productivity. Meet certain requirements before installing the tool, and you can realize the other benefits of Azure Storage Explorer, such as performing complete Azure storage operational tasks from your desktop in a few simple steps.

Azure Storage Explorer was released in June 2016. Although it is still in preview, many organizations use it to manage Azure storage accounts efficiently. Several earlier versions exist, but the latest reliable, production-ready version is 0.8.16.

Benefits of using Azure Storage Explorer

One of the main benefits of using Azure Storage Explorer is that you can perform Azure storage operational tasks — copying, deleting, downloading and managing snapshots — as well as other storage-related tasks, such as copying blob containers, managing the access policies configured for blob containers and setting public access levels, from a single GUI installed on your desktop machine.

Another benefit of using this tool is that if you have Azure storage accounts created in both Azure Classic and Azure Resource Manager modes, the tool allows you to manage Azure storage accounts for both modes.

You can also use Azure Storage Explorer to manage storage accounts from multiple Azure subscriptions. This helps you track storage sizes and accounts from a single UI rather than logging into the Azure portal to check the status of Azure storage for a different Azure subscription.

Azure Storage Emulator, which must be downloaded separately, allows you to test code and storage without an Azure storage account. Apart from managing storage accounts created on Azure, Azure Storage Explorer can connect to other storage accounts hosted on sovereign clouds and Azure Stack.

Requirements and installing Azure Storage Explorer

Azure Storage Explorer requires minimal resources on the desktop and can be installed on Windows Client, Windows Server, Mac and Linux platforms. All you need to do is download the tool and install it; the installation process is quite simple, so just proceed through the onscreen steps. When you launch the tool for the first time, it will ask you to connect to an Azure subscription, but you can cancel and add an Azure subscription later if you first want to explore the options available in the tool. For example, you might want to modify the proxy settings before a connection to Azure subscriptions can be established.

Configuring proxy settings

It’s important to note that Azure Storage Explorer requires a working internet connection, and many production environments sit behind a proxy server. In that case, you’ll need to modify the proxy settings in Azure Storage Explorer by navigating to the Edit menu and then clicking Configure Proxy, as shown in Figure A below:

Figure A. Launching the proxy server settings page

When you click on Configure Proxy, the tool will show you the Proxy Settings page as shown in Figure B below. From there, you can enter the proxy settings and then click on OK to save the settings.

Proxy setting configuration
Figure B. Configuring proxy settings in Azure Storage Explorer

When you configure proxy settings in Azure Storage Explorer, the tool doesn’t check whether they are correct; it simply saves them. If you run into connection issues, make sure the proxy settings are correct and that you have a reliable internet connection.
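The fields on the Proxy Settings page (host, port, and optional credentials) map onto a standard proxy URL, the same form used by the `HTTP_PROXY`/`HTTPS_PROXY` environment variables that many command-line tools honor. The sketch below (an illustration, not part of Storage Explorer) shows how such a URL is assembled, including percent-encoding credentials so special characters don’t break it:

```python
from urllib.parse import quote

def build_proxy_url(host, port, username=None, password=None, scheme="http"):
    """Build a proxy URL from the same fields a proxy settings page asks for.

    Credentials are percent-encoded so characters like '@' and ':' in a
    password can't be confused with the URL's own separators.
    """
    auth = ""
    if username:
        auth = quote(username, safe="")
        if password:
            auth += ":" + quote(password, safe="")
        auth += "@"
    return f"{scheme}://{auth}{host}:{port}"

# Hypothetical values, as you might type them into a proxy settings page:
print(build_proxy_url("proxy.corp.example.com", 8080, "svc-user", "p@ss:word"))
# http://svc-user:p%40ss%3Aword@proxy.corp.example.com:8080
```

Checking the URL your settings imply (for example with `curl --proxy <url>`) is a quick way to rule out bad proxy values before blaming the tool.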

How to use Azure Storage Explorer

If you’ve worked with third-party Azure storage management tools, you’re already familiar with storage operational tasks such as uploading VHDX files and working with blob containers, tables and queues. Azure Storage Explorer provides the same functionality, though its interface may differ from the third-party tools you’ve used so far. The first step is to connect to an Azure account by clicking the Manage Accounts icon and then clicking Add an Account. Once connected, Azure Storage Explorer retrieves all the subscriptions associated with the Azure account. To work with the storage accounts in a subscription, select the subscription and then click Apply; Azure Storage Explorer then retrieves all of the storage accounts hosted in that subscription. Once the storage accounts have been retrieved, you can work with blob containers, file shares, queues and tables from the left navigation pane, as shown in Figure C below:

Storage accounts in Azure Storage Explorer
Figure C. Working with storage accounts in Azure Storage Explorer

If you have several Azure storage accounts, you can find a particular one by typing in the search box at the top of the left pane, shown in Figure C above. Azure Storage Explorer makes blob container management easy. You can perform most blob container-related tasks, including creating a blob, setting up public access for a blob and managing access policies for blobs. By default, a blob container has public access disabled. To enable public access for a blob container, right-click the container in the left navigation pane and then click Set Public Access Level… to display the Set Container Public Access Level page shown in Figure D below.

Blob container public access level
Figure D. Setting public access level for a blob container
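Once a container’s access level is set to Blob or Container, anonymous clients can read its blobs at a predictable address built from the account, container and blob names. The sketch below constructs that URL using the standard public-cloud endpoint suffix (sovereign clouds and Azure Stack use different suffixes, which is an assumption to verify for your environment):

```python
from urllib.parse import quote

def public_blob_url(account, container, blob_name,
                    endpoint_suffix="core.windows.net"):
    """Return the anonymous-access URL for a blob in a public container.

    Uses the public Azure cloud endpoint suffix by default; sovereign
    clouds and Azure Stack expose storage under different suffixes.
    """
    # Blob names may contain slashes (virtual folders), so keep "/" unescaped.
    return (f"https://{account}.blob.{endpoint_suffix}"
            f"/{container}/{quote(blob_name, safe='/')}")

# Hypothetical account and container names:
print(public_blob_url("mystorageacct", "images", "logos/site logo.png"))
# https://mystorageacct.blob.core.windows.net/images/logos/site%20logo.png
```

Pasting such a URL into a browser is a quick way to confirm that the access level you set in Figure D took effect.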

Next Steps

Learn more about different Azure storage types

Navigate expanded Microsoft Azure features

Enhance cloud security with Azure Security Center

Microsoft Azure announces new capabilities and partnerships at IBC 2017

Over the past few months, we have aggressively added innovative new capabilities to Azure that enable content owners and partners to prepare, store, protect, distribute and monetize media in the cloud. We are thrilled to see media and cable organizations choosing Microsoft Azure for their digital transformation needs. Whether helping organizations shift their focus from infrastructure to content, distributing content with digital-era velocity, or providing reliable and relevant content, Microsoft Azure is the trusted, global-scale cloud for the media industry’s needs.

Azure Media Services

In response to enthusiastic customer feedback, we have added the following new capabilities to Azure Media Services.

  • Democratizing AI for the media industry: Video Indexer, a powerful new capability that brings AI to life for the media industry, was introduced a few months back at BUILD 2017 as a unique, integrated bundling of Microsoft’s cloud-based artificial intelligence and cognitive capabilities applied specifically to video content. It brings capabilities like face detection, sentiment extraction, object detection, audio transcripts and more. Enhancements to this service include support for the Egyptian Arabic language in speech analysis, playback at multiple speeds, and many more.
    Azure Media Redactor is a powerful cloud video processing service leveraging world-class artificial intelligence, capable of automatically detecting and blurring faces in your videos for use cases such as public safety and news media. This is a new addition to the broad set of existing capabilities in Azure Media Analytics.
  • HEVC Encoding support: The improved encoding efficiency of HEVC helps video service operators deliver higher-resolution, higher-quality video to their viewers while saving on storage and CDN costs. Apple’s iOS 11 release will support this codec, ensuring a large addressable market for operators who adopt it. HEVC encoding is now generally available in Azure Media Services via the Premium Encoder. Further, HEVC encoding is priced the same as H.264, which we believe is critical to the success of our customers.
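The storage and CDN savings from HEVC come directly from its lower bitrate at comparable quality. A back-of-the-envelope sketch (the 6 Mbps 1080p H.264 ladder rung and the roughly 40% HEVC bitrate reduction are illustrative assumptions, not figures from this announcement):

```python
def gb_delivered(bitrate_mbps, hours_watched):
    """Gigabytes a CDN delivers for a given stream bitrate and total viewing time."""
    # Mbit/s -> MB/s -> MB/hour -> MB total -> GB (decimal units)
    return bitrate_mbps / 8 * 3600 * hours_watched / 1000

# Hypothetical service: 100,000 hours watched per month at 1080p.
h264_gb = gb_delivered(6.0, 100_000)   # ~6 Mbps H.264 (assumed)
hevc_gb = gb_delivered(3.6, 100_000)   # same quality at ~40% lower bitrate (assumed)
print(round(h264_gb), round(hevc_gb))  # 270000 162000
```

Under these assumptions, a catalog re-encoded in HEVC cuts monthly CDN egress from 270 TB to 162 TB, which is why identical H.264/HEVC encoding pricing matters to operators weighing the switch.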

Azure CDN

Since NAB, the following new capabilities have been added to Azure CDN to optimize performance for media streaming and dynamic content.

Media streaming optimized delivery with Azure CDN: We have added the ability to easily optimize the delivery of streaming media. With a single click, media streaming optimizations are automatically applied, resulting in fast, efficient delivery of media assets to users and optimized requests to origins.

Dynamic content acceleration: Azure CDN now supports Dynamic Site Acceleration (DSA) to optimize the delivery of dynamically served content. When DSA is enabled, the performance of web pages with dynamic content improves significantly as a result of route and network optimizations, resource prefetching and smart image compression.

One-click integration with Storage and Web Apps: In line with our goal of simplifying CDN and making it transparent to our users, we have enabled one-click enablement of CDN for assets in Azure Storage, as well as one-click enablement of CDN for websites and web apps built on Azure.

Azure Storage

Azure Storage has made substantial improvements in 2017 to address the needs of Media & Entertainment customers. This includes:

  • Azure Archive Blob Storage – Azure Archive Blob storage is designed to provide Media organizations with a low cost means of delivering durable, highly available, secure cloud storage for rarely accessed data with flexible latency requirements. This is in addition to the existing Hot & Cool tiers which provide instantaneous high throughput object storage.
  • Blob-Level Tiering – To simplify data lifecycle management, we now allow customers to tier their data at the blob level. Customers can easily change the access tier of a blob among the Hot, Cool, or Archive tiers as usage patterns change, without having to move data between accounts. Blobs in all three access tiers can co-exist within the same account.
  • Larger Object Sizes – Azure Blob Storage now supports up to 5TB objects to address the needs of 4K and 8K media.
  • Higher performance – Azure Blob Storage now supports significantly higher throughput for object reads, up to gigabytes per second for a single object to support post production, transcoding and other workloads. Write throughput has also improved to support large media ingestion including Media Archival.
  • Encryption at rest (including customer-managed keys) – Azure Blob Storage supports encryption at rest to meet the security needs of our media customers. Azure is the only public cloud compliant with the MPAA certification. This summer, we launched a preview of customer-managed keys, which allows customers to further enhance security by encrypting data with their own keys.
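The "up to 5TB" object size follows from block blob limits; a sketch of the arithmetic (the 50,000-block and 100 MB-per-block limits are assumptions for this illustration, consistent with the announced 5 TB ceiling):

```python
import math

MAX_BLOCKS = 50_000             # blocks per block blob (assumed limit)
MAX_BLOCK_BYTES = 100 * 10**6   # 100 MB per block (assumed limit)

# The advertised ceiling: 50,000 blocks x 100 MB = 5 TB per object.
max_object_tb = MAX_BLOCKS * MAX_BLOCK_BYTES / 10**12
print(max_object_tb)  # 5.0

def min_block_bytes(total_bytes):
    """Smallest block size that fits an object within the block-count limit."""
    return math.ceil(total_bytes / MAX_BLOCKS)

# A hypothetical 4 TB 8K master needs blocks of at least 80 MB:
print(min_block_bytes(4 * 10**12))  # 80000000
```

Upload clients choose a block size along these lines: large enough to stay under the block-count ceiling, small enough to retry cheaply on a failed block.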

Growing Partner Ecosystem

At IBC this year, we are excited to announce that our partner ecosystem is growing at a rapid pace, enabling our customers with solutions that span the length and breadth of digital workflows.

Earlier this year, we announced that Avid has selected Microsoft Azure as their preferred partner to power their business in the cloud. Visitors to the Microsoft booth at IBC can see a demonstration showcasing an integrated Microsoft Azure and Avid offering.

In late July, it was announced that Ooyala’s Flex platform, which runs on Microsoft Azure, will now be utilized by Zone TV for its end-to-end workflow needs. Other notable customers driving value from Ooyala’s solution include the National Rugby League (NRL), SiriusXM and Zoomin.TV (Netherlands). We are also pleased to announce that visitors to Hall 15 at IBC can see a joint demo of Ooyala’s Flex and Azure Media Services.

We are also pleased to announce our new partners – Amagi and axle Video. Amagi, which provides cloud-managed broadcast services and targeted advertising solutions, announced that its CLOUDPORT product, a cloud playout platform, can now be deployed on Microsoft Azure. axle Video, which provides a media management solution, announced a deep integration with Microsoft Video Indexer to enhance its axle ai product.

We have also expanded the partnership with LiveArena as it launches LiveArena Broadcast Room, an end-to-end service allowing production and broadcast of Live and on-demand TV to any device, built entirely on Microsoft Azure.

Additionally, Levels Beyond, makers of Reach Engine, has announced the release of Reach Engine Media Services, which is built on Microsoft Azure and integrates with Microsoft Azure Storage. XenData, provider of high capacity data storage solutions, announced the launch of its managed hybrid cloud service by integrating with Microsoft Azure Storage.

Stay tuned for more Azure blog posts that will dive deeper into these announcements.

We’re continually innovating and forming new partnerships when it comes to enabling media workflows on the cloud, and invite you to take advantage by building on Microsoft Azure.

Come see us at IBC 2017

Learn more about Azure Media Services and Azure Storage, and visit us at our IBC booth in Hall 15 to meet the Azure Media Services and Azure Storage teams and see these services in action. We would also like to invite you to the Microsoft Hall to speak with our partners – Amagi, GameOn Technology, GrayMeta, IPV Limited, LiveArena, Make.TV, Oceaneering, Ooyala, Ownzones, StreamingBuzz, Veritone Media, and x.news information technology.

If you are not attending the conference but would like to learn more about our services for the media industry, follow the Azure Blog to stay up to date on new announcements.

New Power BI Connector for Azure Enterprise users

We are very pleased to announce the release of the Azure Consumption and Insights Connector in Power BI Desktop. Enterprise customers can use it to pull Azure charge and usage data for both Azure and Marketplace resources, and to explore, analyze, and build their own custom dashboards.

Learn more by reading the detailed documentation on getting started with the APIs. Currently, we also have a Power BI Content Pack that can be used by enterprise customers for performing detailed analysis on their Azure usage and spend details.

What’s new?

  • Four datasets are set up with the following default behavior:
    • Current Billing period data for Usage and Price Sheet
    • Data since May 2014 for Balance Summary and Marketplace
  • We have added new parameters to pull data for any historical period as a moving window. For example, to compare a month against the same month in the prior year, you can pull data for both periods using the new parameters.
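Under the hood, the connector pulls from the Enterprise consumption APIs, which you can also call directly. A hedged sketch of building such a request with only the standard library (the endpoint shape and the `consumption.azure.com` root are assumptions here; confirm them against the connector documentation before relying on them):

```python
import urllib.request

# Assumed root of the Azure Enterprise consumption API; verify against
# the current documentation before use.
BASE = "https://consumption.azure.com/v2"

def usage_details_request(enrollment, billing_period, api_key):
    """Build an authenticated usage-details request for one billing period
    (billing periods are named like '201708')."""
    url = (f"{BASE}/enrollments/{enrollment}"
           f"/billingPeriods/{billing_period}/usagedetails")
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {api_key}")
    return req

# Hypothetical enrollment number and placeholder key:
req = usage_details_request("123456", "201708", "<your-EA-API-key>")
print(req.full_url)
# To actually fetch (requires a valid Enterprise API key from the EA portal):
# body = urllib.request.urlopen(req).read()
```

Looping this over several billing periods reproduces the "moving window" behavior the new connector parameters provide.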

Get started

Please check the steps listed in the Azure Consumption and Insights Connector documentation.

We have also added a guidebook section in the above document to help customers move their existing dashboards built on Azure Enterprise Connector (Beta) to the new connector.

What’s next?

We are currently working on providing this data using ARM authentication. As always, we welcome any feedback or suggestions, which can be sent to us via the Azure Feedback Forum and the Azure MSDN forum. We will continue to enhance the connector with additional functionality to provide richer insights into your usage and spend data for all workloads running on Azure.