
The Microsoft AI Idea Challenge – Breakthrough Ideas Wanted!

This post is authored by Tara Shankar Jana, Senior Technical Product Marketing Manager at Microsoft.

All of us have creative ideas – ideas that can improve our lives and the lives of thousands, perhaps even millions of others. But how often do we act on turning those ideas into a reality? Most of the time, we do not believe in our ideas strongly enough to pursue them. Other times we feel like we lack a platform to build out our idea or showcase it. Most good ideas don’t go beyond those initial creative thoughts in our head.

If you’re a professional working in the field of artificial intelligence (AI), or an aspiring AI developer or just someone who is passionate about AI and machine learning, Microsoft is excited to offer you an opportunity to transform your most creative ideas into reality. Join the Microsoft AI Idea Challenge Contest today for a chance to win exciting prizes and get your project featured in Microsoft’s AI.lab showcase. Check out the rules, terms and conditions of the contest and then dive right in!

The Challenge

The Microsoft AI Idea Challenge is seeking breakthrough AI solutions from developers, data scientists, professionals and students – solutions preferably developed on the Microsoft AI platform and services. The challenge gives you a platform to freely share AI models and applications, so they are reusable and easily accessible. The ideas you submit are judged on three weighted criteria: 50% for the originality of your idea, 20% for the feasibility of your solution, and 30% for the complexity (i.e., level of sophistication) of your implementation.
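To make the weighting concrete, here is a minimal sketch of how those three criteria combine into a single score. The function name and the 0–100 scale are illustrative, not part of the official judging process:

```python
# Judging weights from the contest description (illustrative helper).
WEIGHTS = {"originality": 0.50, "feasibility": 0.20, "complexity": 0.30}

def weighted_score(scores):
    """Combine per-criterion scores (0-100) into one weighted total."""
    return sum(WEIGHTS[criterion] * value for criterion, value in scores.items())

# An idea strong on originality but average on the other two criteria:
print(weighted_score({"originality": 90, "feasibility": 60, "complexity": 70}))
```

As the weights suggest, a highly original idea can outscore a more polished but derivative one.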

The Microsoft AI Idea Challenge is accepting submissions from now through October 12, 2018.

To qualify for the competition, individuals or teams are required to submit a working AI model, test dataset, a demo app and a demo video that can be a maximum of three minutes long. We encourage you to register early and upload your projects soon, so that you can begin to plan and build out your solution and turn in the rest of your materials on time. We are looking for solutions across the whole spectrum of use cases – to be inspired, take a look at some of the examples at AI.lab.


The winners of the first three places in the contest will respectively receive a Surface Book 2, a DJI Drone, and an Xbox One X.

We hope that’s motivation to get you started today – good luck!


Salesforce’s Marc Benioff calls for a national privacy law

Since June, when Salesforce CEO Marc Benioff authored a widely read opinion piece for Politico calling for a national privacy law for personal data — a stance that runs counter to that of many tech vendors — California passed data privacy legislation echoing the European Union’s groundbreaking GDPR.

Benioff, who had earlier backed a more radical data privacy ballot measure that tech giants like Facebook opposed, threw his clout behind the California bill, even though that measure is not quite as stringent as the GDPR (General Data Protection Regulation).

Amid boom times for data, and with a lack of U.S. regulation making it easy for companies to consume and monetize customer data, Benioff has put the giant San Francisco-based CRM software vendor at the vanguard of the U.S. data privacy movement.

Meanwhile, the GDPR extended beyond the Atlantic and affected many U.S.-based companies that have international customers.

Between the regulatory changes around the world and consumer distrust of technology companies after recent episodes of breaches and data misuse, a national personal data privacy law such as the one Benioff envisions could be on the horizon.

But what that would look like in the U.S. is unclear, given Silicon Valley’s virtually unregulated reign and Congress’ apparent lack of appetite up to now for tech regulation.

And a big question for observers of the mercurial Benioff is whether it’s relatively easier for a company like Salesforce, which acts as a data processor and doesn’t monetize data directly, to call for stricter data privacy regulation.

‘Bit of a paradox’

Salesforce declined to make Benioff available for this story. But Lindsey Finch, senior vice president of global privacy at Salesforce, said in an interview with SearchSalesforce that the kind of corporate transparency Benioff calls for in the Politico piece can be hard to achieve.


In his article, Benioff beseeches companies to be clearer about how they use customers’ data and to make terms of service concise and comprehensible.

But, as a click on the Salesforce website shows, the company’s terms of service for using cookies on a browser comprises more than 5,000 words of legal terminology. It’s more concise than some competitors’ pages, but it’s doubtful the average consumer would take the time to read 16 pages of legalese.

“This is something the industry struggles with,” Finch, a lawyer and former privacy counsel for General Electric, said. “On one hand, privacy laws like GDPR require very detailed information be provided. On the other hand, consumers want something concise and understandable. There’s a bit of a paradox of what laws legally require and what consumers can reasonably digest.”

Finch also responded to the potential criticism that Salesforce’s position in favor of strict personal data privacy is somewhat convenient because even though the company acts as a data processor for thousands of organizations, its customers’ data doesn’t directly affect its bottom line.


“We are a data processor in delivering our services to our customers and processing data on their behalf, but we are also a data controller in terms of running our business,” Finch said. “While our core business is delivering services as a data processor, we also have a lot of responsibilities as a controller as well.”

Data controllers are the companies directly affected by GDPR and must maintain personal data protections that meet EU standards.

With some 10,000 employees, Salesforce is the biggest employer in San Francisco.                                    

Finch also elaborated on another key point in Benioff’s Politico piece — that a national personal data privacy law in the U.S., while using GDPR as a template, should be “tailored to our own traditions, values and rule of law.”

“We’re seeing this explosion of data and on the one hand you have individuals, either in a business or personal capacity, and what they want is a more personalized experience,” Finch said. “But at the same time they’re demanding that companies they do business with are more trustworthy than ever before.”

“We see an opportunity with something like GDPR,” she added. “What we’re advocating for in the U.S. is to look at how data is handled from the perspective of the end consumer and the types of things they’d reasonably expect or want to know about how their data is handled.”

The unregulated data landscape

Meanwhile, although the internet seems like a ubiquitous staple of everyday life, it is still relatively new, and matching regulation, which often moves slowly through the political system, to the pace of innovation is difficult.

“We’re seeing a backlash to this ‘Wild West’ approach to data and seeing the first signs of a more orderly digital economy,” said Steve Wilson, a data privacy expert and vice president and principal analyst at Constellation Research Inc. “You can’t have everyone for themselves when it comes to data.”


Wilson argued that widespread data breaches and concerns about Facebook’s use of data are laying the groundwork for an eventual national personal data privacy law, but others in the data privacy field still see any significant regulation as years away — if it materializes in the U.S. at all.

Another privacy expert, Ted Claypoole, a partner at the North Carolina firm Womble Bond Dickinson and a veteran data and privacy lawyer, was less sanguine on a national privacy law’s chances here.

“I think [a national privacy law] is unlikely and if it comes it will be a while from now,” Claypoole said. “If citizens don’t push for this and go collectively to Congress and demand this be changed, it won’t be. In 10 or 15 years it’ll be too hard to do because businesses are built on using data.”

One of the main reasons for Claypoole’s pessimism is his view that U.S. citizens don’t care much about personal data privacy — or if they do, their actions contradict their sentiment.

“People say one thing then jump on Facebook or give away data for a 20% off Chipotle coupon,” he said, referring to the popular Mexican food chain.

Wilson agreed that the U.S. has lagged behind on setting national privacy standards, but said he’s bullish on the response to GDPR and the movement toward better data regulation in the U.S. — starting with California.

“America has taken 20 years to get nowhere on this, but I’m not cynical about this,” Wilson said. “I’m seeing respect for GDPR and movement on this. Businesses understand that GDPR is an opportunity to get your [stuff] together.”

Wilson also said Salesforce is in a better position than companies like Facebook or Google to take the moral high ground on data privacy because Salesforce doesn’t directly make money from personal data, but that the CRM vendor still owns an important stake in the issue.

“A cynic would say that Salesforce can separate itself from the hurly burly of data and that’s fine, but they do have an important role in the custodianship of data,” Wilson said.

Data is valuable

Whether a national privacy law is a realistic goal or an idealistic dream, the national conversation on data privacy is shifting.

The California law, AB 375, is a significant first step, and its reach could be substantial: it requires companies to abide by the California rules if they do any business in California, much as companies not based in the EU must still adhere to GDPR if they have customers there.

“It’s not really about privacy. It’s about data being an important commodity in the digital economy and we can’t have a ‘Wild West’ anymore,” Wilson said. “We need some restraint and control and enterprise modesty. We see this attitude toward data because it’s so valuable.”

Enhanced debugging and faster simulation with the latest Quantum Development Kit update

This post was authored with contributions by Cathy Palmer, Program Manager, Quantum Software & Services.

Today, Microsoft released an update to the Microsoft Quantum Development Kit including an enhanced debugging experience and faster simulations, as well as several contributions from the Q# community. We’re excited about the momentum generated by the many new Q# developers joining us in building a new generation of quantum computing.

Just over six months ago, we released a preview of Q#, our new programming language for quantum development featuring rich integration with Visual Studio. The February 26 release added integration with Visual Studio Code to support Q# development on macOS and Linux as well as Python interoperability for Windows. Since then, tens of thousands of developers have begun to explore Q# and the world of quantum development.

Today’s update includes significant performance improvements for simulations, regardless of the number of qubits required, as shown in the H2 simulation below. This is a standard sample included in the Microsoft Quantum Development Kit.

Simulation comparison

This update includes new debugging functionality within Visual Studio. The probability of measuring a “1” on a qubit is now automatically shown in the Visual Studio debugging window, making it easier to check the accuracy of your code. The release also improves the display of variable properties, enhancing the readability of the quantum state.

Screen showing enhanced debugging

Adding to the debugging improvements, you’ll find two new functions, DumpMachine and DumpRegister, that output probability information about the state of the target quantum machine at a specified point in time. To learn more, you can review this additional information on debugging quantum programs.
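Under the hood, the simulator tracks a complex amplitude for each basis state, and the “probability of measuring a 1” shown in the debugger is the squared magnitude of the |1⟩ amplitude. A minimal Python sketch of that calculation for a single qubit (the helper function is illustrative, not part of the Quantum Development Kit):

```python
# For a single-qubit state a|0> + b|1>, the probability of measuring "1"
# is |b|^2 (after normalizing, in case of rounding in the amplitudes).
def prob_of_one(amp0: complex, amp1: complex) -> float:
    norm = abs(amp0) ** 2 + abs(amp1) ** 2  # ~1.0 for a valid quantum state
    return abs(amp1) ** 2 / norm

# A qubit after a Hadamard gate has equal amplitudes 1/sqrt(2),
# so the debugger would show a 50% chance of measuring "1".
h = 2 ** -0.5
print(prob_of_one(h + 0j, h + 0j))
```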

Thanks to your community contributions, the Microsoft Quantum Development Kit now includes new helper functions and operations, plus new samples to improve the onboarding and debugging experience. Check out the release notes for a full list of contributions.

Download the latest Microsoft Quantum Development Kit

We’ve been thrilled with the participation, contributions, and inspiring work of the Q# community. We can’t wait to see what you do next.

The December release of SQL Operations Studio is now available

This post is authored by Alan Yu, Program Manager, SQL Server.

We are excited to announce the December release of SQL Operations Studio is now available.

Download SQL Operations Studio and review the Release Notes to get started.

SQL Operations Studio is a data management tool that enables you to work with SQL Server, Azure SQL DB and SQL DW from Windows, macOS and Linux. To learn more, visit our GitHub.

SQL Operations Studio was announced for Public Preview on November 15th at Connect(), and this December release is the first major update since the announcement.

The December release includes several major repo updates and feature releases, including:

  • Migrating SQL Ops Studio Engineering to public GitHub repo
  • Azure Integration with Create Firewall Rule
  • Windows Setup and Linux DEB/RPM installation packages
  • Manage Dashboard visual layout editor
  • “Run Current Query with Actual Plan” command

For complete updates, refer to the Release Notes.

Migrating SQL Ops Studio Engineering to public GitHub repo

To provide better transparency for the SQL Operations Studio community, we have decided to migrate our internal GitHub branch to the public repo. This means any bug fixes, feature development, or even test builds can be publicly viewed before an update is officially announced.

We made this move because we want to collaborate with the community to continually deliver features that our users want. This gives you the opportunity to see our changes in action to address your top voted issues. Visit our GitHub page and give us your feedback.

Azure Integration with Create Firewall Rule

Now let’s get into the new features. A common issue when connecting to Azure SQL DB instances is that the connection fails because of server firewall rules. Previously, this required opening the Azure portal to configure firewall rules before you could connect to your database, which can be inconvenient.

To speed up this process, we have enabled Azure Integration with Create Firewall Rule dialog. When your connection to an Azure SQL DB instance fails because of firewall settings, this dialog will appear, allowing the user to use their Azure subscription account to automatically configure the client IP address with the server. This retains the same experience as configuration on Azure Portal, except you can do it all through SQL Operations Studio.
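Behind the scenes, Azure SQL DB server-level firewall rules can be managed with the sp_set_firewall_rule stored procedure, run against the server’s master database. A hedged sketch of the statement a tool could issue for a single client IP; the rule name, the address, and the helper function are illustrative:

```python
# Build the T-SQL that creates (or updates) a server-level firewall rule
# allowing a single client IP. sp_set_firewall_rule is the Azure SQL DB
# procedure for this; a one-address rule uses the same start and end IP.
def make_firewall_rule_sql(rule_name: str, client_ip: str) -> str:
    return (
        "EXECUTE sp_set_firewall_rule "
        f"@name = N'{rule_name}', "
        f"@start_ip_address = '{client_ip}', "
        f"@end_ip_address = '{client_ip}';"
    )

print(make_firewall_rule_sql("client-laptop", "203.0.113.42"))
```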

Windows Setup installation and Linux DEB/RPM installation packages

We are always looking for new ways to improve the installation experience. With the December release, we have added Windows Setup installation to simplify installation on Windows. This wizard will allow the user to:

  • Select the installation location
  • Select the Start Menu folder
  • Optionally add SQL Operations Studio to PATH

In addition to Windows Setup, we have also added Linux DEB/RPM installation packages. These will add new ways for Linux users to download SQL Operations Studio for their choice of installation.

Feel free to try out these new installation experiences on our download page.

Manage Dashboard visual layout editor

In the initial release, there were not many options to customize the visual layout of the dashboards. With the December release, you can now resize and move your widgets by enabling the visual layout editor mode by clicking the pencil on the top right of the Manage Dashboard screen. This gives users greater control of their dashboard in addition to building their own custom insight widgets.

Run Current Query with Actual Plan command

Another new feature we have enabled is Run Current Query with Actual Plan, a command that executes the current query and returns the actual execution plan with the query results. This feature area is still in progress as we work through the best UX for integrating the command directly into the query editor. While that design work continues, the functionality is available via the Command Palette, and you can define a keyboard shortcut if you use it frequently.
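In T-SQL, the actual (as opposed to estimated) execution plan is returned alongside the results when SET STATISTICS XML is ON. A minimal sketch of the kind of batch a client tool could send for this command; the wrapping statements are the standard T-SQL mechanism, while the helper itself is illustrative:

```python
# Wrap a query so the server returns the actual execution plan (as XML)
# along with the query results.
def with_actual_plan(query: str) -> str:
    return f"SET STATISTICS XML ON;\n{query}\nSET STATISTICS XML OFF;"

print(with_actual_plan("SELECT TOP 10 name FROM sys.objects;"))
```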

Contact us

If you have any feature requests or issues, please submit to our GitHub issues page. For any questions, feel free to comment below or tweet us @sqlopsstudio.

Introducing Microsoft Machine Learning Server 9.2 Release

This post is authored by Nagesh Pabbisetty, Partner Director of Program Management at Microsoft.

Earlier this year, Microsoft CEO Satya Nadella shared his vision for Microsoft and AI, pointing to Microsoft’s beginnings as a tools company, and our current focus on democratizing AI by putting tools “in the hands of every developer, every organization, every public sector organization around the world”, so that they can build their own intelligence and AI capabilities.

Today, we are taking a significant step in realizing Satya’s vision by launching Microsoft Machine Learning Server 9.2, our most comprehensive machine learning and advanced analytics platform for enterprises. We have exciting updates to share, including full data science lifecycle support (data preparation, modeling and operationalization) for Python as a peer to R, and a repertoire of high performance distributed ML and advanced analytics algorithm packages.

We started the journey of transforming Microsoft R Server into Machine Learning Server a year ago, by delivering innovations in Microsoft R Server 9.0 and 9.1. We made significant enhancements in this release to create the Machine Learning Server 9.2 platform which replaces Microsoft R Server and offers powerful ML capabilities.

Microsoft Machine Learning Server is the most inclusive enterprise platform, catering to the needs of all constituents – data engineers, data scientists, line-of-business programmers and IT professionals – with full support for Python and R. This flexible platform offers a choice of languages and features algorithmic innovation that brings the best of the open source and proprietary worlds together. It enables best-in-class operationalization support for both batch and real-time scoring.

Microsoft Machine Learning Server includes:

  1. High-performance ML and AI wherever your data lives.
  2. The best AI innovation from Microsoft and open source.
  3. Simple, secure and high-scale operationalization and administration.
  4. A collaborative data science environment for intelligent application development.
  5. Deep ecosystem engagements, to deliver customer success with optimal TCO.

It’s now easier than ever to procure and use Microsoft Machine Learning Server on all platforms. Licensing has been simplified to the following, effective October 1, 2017:

  • Microsoft Machine Learning Server is built into SQL Server 2017 at no additional charge.
  • Microsoft Machine Learning Server stand-alone for Linux or Windows is licensed core-for-core as SQL Server 2017.
  • All customers who have purchased Software Assurance for SQL Server Enterprise Edition are entitled to use 5 nodes of Microsoft Machine Learning Server for Hadoop/Spark for each core of SQL Server 2017 Enterprise Edition under SA. In addition, we are removing the core limit per-node; customers can have unlimited cores per node of Machine Learning Server for Hadoop/Spark.
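The Hadoop/Spark entitlement above is simple multiplication. A hypothetical helper to illustrate it (the function is mine, not part of any licensing tool, and actual entitlements are governed by the license terms):

```python
# Each SQL Server 2017 Enterprise Edition core under Software Assurance
# entitles the customer to 5 nodes of Machine Learning Server for
# Hadoop/Spark, with no per-node core limit.
NODES_PER_EE_CORE = 5

def entitled_nodes(ee_cores_under_sa: int) -> int:
    return ee_cores_under_sa * NODES_PER_EE_CORE

# A 16-core Enterprise Edition server under SA:
print(entitled_nodes(16))
```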

You can immediately download Microsoft Machine Learning Server 9.2 from MSDN. It comes packed with the power of the open source R and Python engines, making both R and Python ready for enterprise-class ML and advanced analytics. Also check out the R Client for Windows, R Client for Linux, and Visual Studio 2017 with R and Python Tools.

Let’s take a peek at each of the key areas of the new Microsoft Machine Learning Server outlined above.

1. High-performance Machine Learning and AI, Wherever Data Lives

The volume of data that enterprises use to make smart business decisions is growing exponentially. The traditional paradigm requires moving data to the compute, which introduces challenges with latency, governance and cost, even when moving the data is possible at all. The modern paradigm takes the compute to where the data lives to unlock intelligence, and this is Microsoft’s approach.

In enterprises, it is common to have data spread across multiple data platforms and to migrate data from one platform to another over time. In such a world, it is essential that ML and analytics are available on multiple platforms and are portable, and Microsoft delivers on this need. Microsoft Machine Learning Server 9.2 runs on Windows, three flavors of Linux, the most popular distributions of Hadoop/Spark, and in the latest release of SQL Server 2017. As always, we will soon make this release available on Azure as Machine Learning Server VMs, SQL Server VMs, and as Machine Learning Services on Azure HDInsight, in addition to an ever-growing portfolio of cloud services.

Today, we are also announcing Public Preview of R Services on Azure SQL DB, to make it easy for customers who are going cloud-first or transitioning to the cloud from on-premises.

For more information, review the links below:

2. The Best AI Innovation from Microsoft and Open Source

As we work to make AI accessible to every individual and organization, one of our key goals is to use this technology to amplify human ingenuity. We are designing AI innovations that extend and empower human capabilities in all aspects of life. We are infusing AI across our most popular products and services, and creating new ways to interact more naturally with technology. Offerings such as the Microsoft Cognitive Toolkit for deep learning, our Cognitive Services collection of intelligent APIs, SQL Server Machine Learning Services and Azure Machine Learning exemplify our approach.

Microsoft Machine Learning Server includes a rich set of highly scalable, distributed algorithms, in packages such as RevoScaleR, revoscalepy and MicrosoftML, that can work on data larger than physical memory and run on a wide variety of platforms in a distributed manner.
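The key idea behind such external-memory algorithms is to process data in chunks and combine partial results, so the full dataset never has to fit in RAM at once. A minimal stdlib sketch of that pattern; the chunked mean here is a stand-in for the far richer algorithms those packages provide:

```python
# Compute a mean one chunk at a time, never holding all data in memory.
# Only running totals (sum and count) are kept between chunks.
def chunked_mean(chunks):
    total, count = 0.0, 0
    for chunk in chunks:          # each chunk is a small list of numbers
        total += sum(chunk)
        count += len(chunk)
    return total / count

# Simulate a dataset streamed in three chunks:
data_stream = ([1, 2, 3], [4, 5], [6, 7, 8, 9])
print(chunked_mean(data_stream))  # mean of 1..9
```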

The open source ecosystem is innovating at a fast pace as well, with AI toolkits such as TensorFlow, MXNet, Keras and Caffe, in addition to our open source Cognitive Toolkit.

Microsoft Machine Learning Server 9.2 bridges these two worlds, enabling enterprises to build on a single ML platform where one can bring any R or Python open source ML package, and have it work side-by-side with any proprietary innovation from Microsoft. This is a key investment area for us. You can learn more from the resources below:

3. Simple, Secure and High-Scale Operationalization and Administration

Enterprises that rely on traditional paradigms and environments for operationalization end up investing a lot of time and effort towards this area. It is not uncommon for data scientists to complete their models and hand them over to line-of-business programmers to translate that into popular LOB languages and APIs. The translation time for the model, iterations to keep it valid and current, regulatory approval, managing permissions through operationalization – all of these things are big pain points, and they result in inflated costs and delays.

Microsoft Machine Learning Server offers the best-in-class operationalization solution in the industry. From the time an ML model is completed, it takes just a few clicks to generate web services APIs that can be hosted on a server grid (either on premises or in the cloud) which can then be integrated with LOB applications easily. In addition, Microsoft Machine Learning Server integrates seamlessly with Active Directory and Azure Active Directory, and includes role-based access control to make sure that the security and compliance needs of the enterprise are satisfied. The ability to deploy to an elastic grid lets you scale seamlessly with the needs of your business, both for batch and real-time scoring.
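Once a model is published as a web service, LOB applications typically call it over HTTP with a JSON payload. A hedged sketch of constructing such a request with the Python standard library; the endpoint URL, input schema, and token placeholder are illustrative, not the actual Machine Learning Server API:

```python
import json
from urllib.request import Request

# Build (but do not send) an authenticated POST request to a hypothetical
# scoring endpoint. Real endpoints and payload shapes come from the
# deployed service's API, which this sketch does not claim to reproduce.
def build_scoring_request(url: str, inputs: dict, token: str) -> Request:
    body = json.dumps({"inputs": inputs}).encode("utf-8")
    return Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = build_scoring_request(
    "https://ml-server.example.com/api/churn/1.0/score",  # illustrative URL
    {"tenure_months": 18, "monthly_spend": 42.5},
    token="<access-token>",
)
print(req.get_method(), req.full_url)
```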

For more information, refer to the links below:

4. A Collaborative Data Science Environment for Intelligent Application Development

In enterprises, different departments take the lead for different aspects of the data science life-cycle. For instance, data engineers lead data preparation, data scientists lead experimentation and model building, IT professionals lead deployment and operationalization, and LOB programmers develop and enhance applications with intelligence, tailoring them to the needs of the business. With the in-database analytics capability of SQL Server 2017 and SQL Server 2016 (powered by Microsoft Machine Learning Services), all these constituents can work collaboratively and in the context of the leading mission critical database that is trusted by enterprises all over the world.

Python and R are the most popular languages for ML and advanced analytics. The choice of a language depends on the expertise and culture of engineers and scientists, the data science problems to be solved, and the availability of algorithms toolkits for the chosen language. Each language is supported by a choice of open-source IDEs. It’s not unusual to have debates on which language to choose because enterprises think they have to make an either-or choice.

With Microsoft Machine Learning Server, both R and Python are fully supported. You can bring in and use the latest open source toolkits along with the included Microsoft toolkits for AI and advanced analytics, all on top of a single enterprise-grade platform. Specific enhancements to support Python in the current release include:

  • New Python packages: revoscalepy and microsoftml, bringing high performance and battle tested machine learning algorithms to Python users.
  • Pre-trained cognitive models for image classification and sentiment analysis.
  • Interoperability with PySpark.
  • Python models deployed as web services.
  • Real-time and batch scoring of Python models.

Concurrent with this release, Microsoft is also releasing a public preview of Azure Machine Learning, a comprehensive environment for data science and AI. We will integrate Microsoft Machine Learning Server capabilities with this platform, to realize an industry-leading workbench for data science and AI.

For more information, refer to the links below:

5. Deep Ecosystem Engagements, to Deliver Customer Success with Optimal TCO

Individuals embarking on the journey of making their applications intelligent, or, simply wanting to learn the new world of AI and ML, need the right learning resources to help them get started. Microsoft provides several learning resources, and has engaged several training partners to create a repertoire of solution templates to help you ramp up and become productive quickly, including the following:

Enterprises have big investments in infrastructure and applications and may need the help of partners such as Systems Integrators (SIs) and Independent Software Vendors (ISVs) to help them transform into the world of intelligent applications. Microsoft has nurtured a vibrant ecosystem of partners to help our customers here. Learn about some of our strategic partnerships at the links below:


With the launch of Microsoft Machine Learning Server 9.2, we are proud to bring enterprises worldwide an inclusive platform for machine learning and advanced analytics. We have created a better-together environment that brings intelligence to where the data lives, supports both R and Python as well as both open source and proprietary innovation, covers the data science lifecycle across a wide variety of platforms, and infuses intelligence at scale, in both batch and real-time contexts, with APIs for the most popular LOB languages.

Adopting machine learning and advanced analytics requires a holistic approach that transcends technology, people and processes. We are proud to continue delivering the best tools, platforms and ecosystem to ensure that enterprise users are set up for success. Our next steps are to integrate Azure Machine Learning and Microsoft Machine Learning Server closely, and continue to take machine learning to our customers’ data, wherever it may reside.


New Azure advancements remove cloud barriers for enterprises at Ignite 2017

This post is authored by Scott Guthrie, Executive Vice President, Cloud & Enterprise.

Cloud technology has enabled the era of digital transformation for enterprise customers, small businesses, and governments alike. While the vast majority of organizations have moved to a cloud-first technology strategy, most are still early in realizing it, held back by everything from technology complexity to evolving regulations. Over the past year, the Microsoft Azure team has focused explicitly on removing these barriers for enterprise customers, so that even the most complex technology and policy requirements are uniquely met with Azure. And while this opens up new opportunities for enterprise organizations, all customers benefit. Fundamentally, we believe that the success made possible by the cloud must be accessible to every business and every organization – small and large, old and new.

Today, at the Microsoft Ignite conference in Orlando, I talked about how the cloud is no longer about who has more features; it’s about how successful you can be with the cloud. With our depth of enterprise understanding and grit to do the complex technology work, Azure uniquely unlocks cloud-based success for all of our customers. To do this, we have focused our innovation into four key areas that I spoke about: enabling IT and developer productivity, providing a consistent hybrid cloud, unlocking AI solutions, and ensuring trust through security, privacy and cost controls.

Productive with Cloud

As you move increasingly large and complex applications to the cloud, you need a comprehensive set of tools to build, deploy, and manage them efficiently. Within management capabilities specifically, Azure’s integrated management tools continue to expand end-to-end monitoring and alerting, and now provide central policy management for all VMs created in Azure. Combined with the new PowerShell and Bash support in Azure Cloud Shell, you are well equipped to run apps efficiently on Azure.

To support these large and complex applications, Azure also continues to expand the infrastructure available to span all types of workloads, from our recent M series VMs for SAP HANA implementations, to the new deep learning NVIDIA GPU-based VMs and high-memory E series of VMs.

Ultimately, your cloud-first strategy also incorporates moving to a DevOps approach. To this end, we built Visual Studio Team Services, a cloud-based DevOps toolset. And, we’ve tightly integrated Azure and Visual Studio Team Services to provide an end-to-end DevOps experience across build, deploy, and run for your applications. No need to patch together different dev tools like other clouds; with Azure it’s all built in, regardless of which OS you’re using or language you choose.

Providing Consistent Hybrid Cloud

Azure has long been committed to enabling the only true consistent cloud experience from identity, to data, to platform, to security and management. We uniquely understand that a distributed hybrid cloud model is the durable cloud model. And, we uniquely understand that hybrid cloud is more than just infrastructure – it must address your entire environment.

Enabling consistent development across cloud and on-premises, Azure Stack integrated systems are now shipping and available for purchase, with Dell EMC, Lenovo, and Hewlett Packard Enterprise (HPE) showcasing their solutions here at Ignite. Developers can now build one application and have it run in both Azure and Azure Stack, opening up new use cases such as edge and disconnected solutions and helping meet even the strictest regulatory requirements.

Frequently, the most important, but also most complex, aspect of any application is the data. And dealing with data in a hybrid application or a full cloud migration can be prohibitively expensive. I was recently reviewing a statement of work for an enterprise organization to migrate their 1,000+ SQL Server based applications to AWS. The cost to modify each of these applications so they could move to AWS was over $20 million US. That's unreasonable. We've built a fully managed Azure SQL Database service, now with 100 percent SQL Server compatibility and no code changes via Managed Instance, and are introducing a new Azure Database Migration Service that enables near-zero downtime migration. The customer facing a $20 million migration to AWS can instead migrate all of their application data to Azure in significantly less time and at 70 percent lower cost.
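For scale, here is a back-of-the-envelope version of that comparison. The $20 million figure is the quoted statement of work; the 70 percent reduction is the claimed saving, applied here purely as an illustration:

```python
# Illustrative only: applies the post's "70 percent lower cost" claim
# to the quoted $20M statement-of-work estimate.
aws_rewrite_cost = 20_000_000               # quoted SOW estimate, USD
azure_cost = aws_rewrite_cost * (1 - 0.70)  # at 70 percent lower cost
print(f"${azure_cost:,.0f}")  # $6,000,000
```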

Speaking of data, today we also announced general availability of SQL Server 2017. This is an incredible milestone: the first version of SQL Server to run on Windows Server, Linux, and Docker. In fact, there have been 2,000,000 pulls of the SQL Server on Linux image on Docker Hub! In addition, SQL Server 2017 enables in-database advanced machine learning with support for scalable Python- and R-based analytics. This means you can train advanced models easily with data inside SQL Server, without having to move the data. The bottom line is that SQL Server 2017 delivers industry-leading, mission-critical performance and security with everything built in, including AI, now on the platform of your choice. These are just some of the reasons that dV01 moved to SQL Server 2017 on Linux and is experiencing unmatched performance and value.

Additionally, we're making it increasingly cost effective to run SQL Server and Windows Server on Azure. Taking advantage of the Azure Hybrid Benefit, customers can reduce licensing costs by up to 50%. Combined with the no-code-change migration enabled by the Azure Database Migration Service, it's clear that Azure is THE most cost-effective cloud for your Windows Server and SQL Server applications.

While other cloud vendors talk about hybrid as purely infrastructure, or even as simply hosting legacy virtualization infrastructure in public cloud, we know this is not sufficient. Only Microsoft offers the comprehensive, consistent, hybrid cloud to address the real-world needs of enterprise customers today and into the future.

Unlocking Intelligent Solutions

Data isn’t just a core part of apps – it is fundamental to developing breakthrough intelligent apps. Azure has a comprehensive set of both data services and AI services that enables every organization to build experiences powered by AI.

As cloud-based applications increasingly scale, reach global users, and power AI experiences, we have come to a place where you need data at planet scale and performance. This is why we built Azure Cosmos DB, the first globally distributed, multi-model database service delivering turnkey global horizontal scale out with guaranteed millisecond latency and uptime. Today, we extend what Azure Cosmos DB can do, with new integration with Azure Functions, for event-based, serverless systems. This new combination of Azure Cosmos DB and Azure Functions enables developers to use event-driven serverless computing at global scale.
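As a sketch of what the new integration looks like in practice, a Cosmos DB trigger binding in an Azure Function's function.json might look like the following. The database, collection, and connection-setting names here are illustrative, not from the announcement:

```json
{
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "name": "documents",
      "direction": "in",
      "connectionStringSetting": "CosmosDBConnection",
      "databaseName": "storeDb",
      "collectionName": "orders",
      "createLeaseCollectionIfNotExists": true
    }
  ]
}
```

With a binding like this, the function body simply receives the batch of changed documents, and the platform handles polling the change feed and scaling out.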

To enable the new generation of AI-powered apps and experiences, Azure has built the entire stack for AI – from infrastructure, to platform services, to AI dev tools. Azure offers the most complete, end-to-end AI capabilities such that AI solutions are possible for any developer and any scenario.

Within our AI services, I’m excited to announce breakthrough new Azure Machine Learning capabilities, including a new Machine Learning Workbench, that dramatically improves AI productivity of any developer and data scientist. These new capabilities provide rapid data wrangling and agile experimentation using familiar and open tools. AI developers and data scientists can now use Azure Machine Learning to develop, experiment and deploy AI models on any type of data, on any scale, in Azure and on-premises.

Ensuring Trust Through Security, Privacy, and Cost Controls

We’ve long understood that Azure would only be used if customers trust the technology. This is why we have continued to lead the industry in security and privacy certifications. And, this is why we continue to push the industry forward with new security and privacy innovations including Azure confidential computing enabling encryption of data while in use, and a new Azure DDoS protection service that monitors the public IP addresses of your resources within Azure, learns an application’s normal traffic patterns, and instantly mitigates a DDoS attack when it is detected.

Ensuring customer trust is also why we continue to invest in global infrastructure from the 42 global Azure regions to the new MAREA undersea cable reaching from Spain to Virginia. To meet even the most rigorous requirements, we just announced that Azure will extend our global regions with Availability Zones. This combination of global regions and Availability Zones provides customers with the most robust infrastructure for application resiliency of any cloud provider. Whether for high availability, redundancy, or site failover, Azure provides the full spectrum of resiliency options, so customers can run even their most mission critical applications with peace of mind.

To put security expertise in the hands of every customer, we’re expanding the integrated Azure Security Center capabilities to now also monitor and protect on-premises systems and other clouds, enabling full hybrid cloud security management and threat detection. Powered by Microsoft’s Intelligent Security Graph, the Azure Security Center provides both security recommendations and threat detection, with remediation now possible from directly within the Security Center. Combined with new Just in Time (JIT) admin access to resources in Azure, you have an end-to-end, integrated security toolset. With the global and nation-state security threats facing customers today, Azure offers built-in security and intelligence-powered security management tools – all directly in Azure.

A key aspect of trusting the cloud is fully understanding the costs. No one wants a surprise bill. We recently announced the acquisition of Cloudyn, the leader in cloud cost management. Today, I’m thrilled to announce that Cloudyn is now integrated into Azure and the new Azure Cost Management services will be free for all Azure customers.

To further enhance pricing options for customers, we announced today that Azure will begin offering Reserved VM Instances, with cost savings of up to 72% for one- or three-year commitments. With Azure Reserved VM Instances (RIs), you have unprecedented flexibility to cancel or refund your RIs at any time. RIs help provide cost predictability and ensure you have the VM capacity you need when you need it. We will continue to expand Reserved Instances to additional Azure services in the future as well. And by combining Azure RIs with the Azure Hybrid Benefit, customers can save up to 82%. Azure is providing incredible cost savings, coupled with unmatched cost management tools that give you transparency and control over your cloud costs at all times.
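To see how the quoted discounts could stack, here is a rough, illustrative calculation. The 72% and 82% figures are from the announcement; the implied Azure Hybrid Benefit share is derived here for illustration and is not an official number:

```python
ri_discount = 0.72        # up to 72% with Reserved VM Instances
combined_discount = 0.82  # up to 82% with RIs plus Azure Hybrid Benefit

# If the discounts compound multiplicatively, the Hybrid Benefit share
# implied by the two headline numbers is:
implied_hybrid = 1 - (1 - combined_discount) / (1 - ri_discount)
print(f"{implied_hybrid:.0%}")  # 36%
```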

Today’s announcements, and the entire Azure team’s work over the past year, focus on ensuring Azure is the cloud that can meet the most rigorous and mission critical requirements of governments and enterprise customers, with the cost efficiency and productivity necessary for every startup and small business. That’s because we believe that the success made possible by the cloud must be accessible to every business and every organization.

Azure. Cloud for all.

Extending Microsoft Azure IP Advantage to China

This blog post was authored by Erich Andersen, Corporate Vice President and Chief IP Counsel, Microsoft Intellectual Property. 

Cloud-fueled digital transformation enables companies around the world to create new products and services, and engage with their customers at an unprecedented pace and scale. As they become digital businesses, companies need to address legal challenges which come with participating in the digital economy. Microsoft has developed strategies and assets to manage the intellectual property infringement risks that come with digital transformation. As our customers and partners become digital businesses, we are using our IP expertise and patent portfolio to help our customers protect their innovations in the cloud and focus on developing their business to succeed in their transformation.

Today, we are announcing that Microsoft Azure IP Advantage will be available in China beginning October 1, 2017, ensuring that Azure customers in China can enjoy the same great IP protection benefits as customers in the rest of the world. 

We have had a tremendous response to the program since we launched it last February. Customers recognize that uncapped indemnification coverage, including for open source software that powers Azure experiences, access to 10,000 Microsoft patents, and the springing license right are valuable benefits that help them manage IP risk.

Many customers tell us that the patent pick benefit alone serves as a significant deterrent against patent assertions and that the breadth of our indemnification pledge is unmatched by competitors. ISVs building on Azure are excited by the ability to access 10,000 Microsoft patents to complement their own patent portfolio. TechInsights confirms that, “Microsoft Azure IP Advantage outranks competitors Oracle, Google, Amazon and VMware’s portfolios.” None of Microsoft’s Azure competitors offer a similar package of offerings. The fact that these tools are available for free to eligible Azure customers makes it all the more compelling.

Extending these benefits to China aligns well with Microsoft’s approach to delivering cloud services on a truly global scale. Azure has 42 regions around the world and that number is growing. In China, Microsoft has partnered with 21Vianet to deliver Microsoft Azure services to our customers since March 2014. No other cloud service provider can match the Azure global data center footprint, and many of them are just getting started in China while Microsoft has already been in market for several years. Beyond the public cloud, customers can leverage Azure Stack to use Azure services in their private data centers or in markets where Azure public cloud is not available yet, all through a consistent set of services and APIs.


The benefit of Azure IP Advantage is obvious. A recent study by IPlytics shows that patent assertion entities have increased their stockpile of cloud computing patents by 130% since 2011. Worse, cloud-related patent litigation in the US has grown by 700% since 2012. We can see these trends taking hold in China as well, where patent litigation increased 158% between 2011 and 2016, and where annual patent filings have surpassed those in the US since 2015.

We're pleased to be supported in our Azure IP Advantage launch in China by valued customers. MoBike, the world's largest bicycle-sharing company, headquartered in Beijing, is using the Azure platform to rapidly expand its business outside of China to Manchester in the UK and other cities worldwide. Azure IP Advantage protections follow MoBike in its international expansion.

Azure IP Advantage is already available outside of China; with this announcement, customers can rely on Azure IP Advantage protections anywhere they deploy their SaaS applications.

Sneak peek #3: Windows Server, version 1709 for developers

This blog post was authored by Patrick Lang, Senior Program Manager, Windows Server.

Windows 10 and Windows Server are built on customer feedback. We use the tools built right into Windows to maintain a continuous cycle of feedback from our customers that drives refinements and new features. After the launch of Windows Server 2016, we continued to listen to our feedback channels, and the repeated message we heard was that you want access to new Windows Server builds more frequently to test new features and fixes. First, we announced that Windows Server would be joining the Windows Insider Program so you can download and test the latest builds. However, previews alone aren't enough, so we launched the Windows Server Semi-Annual Channel to ship supported releases twice per year; this fall will be the first release on that cadence.

Since the launch of Windows Server 2016, container adoption has skyrocketed, with many customers using a “lift and shift” approach to migrate existing applications and start the journey to modernize their deployments. Hyper-V can also provide unprecedented isolation between containers, and you can leverage your existing Active Directory infrastructure for apps in containers with Group Managed Service Account support.

We heard loud and clear that developers need a platform that provides great density and performance, as well as the flexibility to run containerized applications. Here's a glimpse of what's coming in Windows Server, version 1709 for developers:

Faster downloads, builds, and deployments with Nano Server container image

In the Windows Server Semi-Annual Channel, we’ve optimized Nano Server as a container base OS image and decreased it from 390 MB to 80 MB. That’s a nearly 80% savings! This gives developers a much smaller image ideal for building new applications or adding new services to existing applications.
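The quoted percentage checks out; as a quick sanity check of the figures above:

```python
# Verify the "nearly 80% savings" claim for the Nano Server container image.
before_mb, after_mb = 390, 80
savings = (before_mb - after_mb) / before_mb
print(f"{savings:.1%}")  # 79.5%
```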

We launched Windows Server containers with a getting started guide and open-source documentation on GitHub. The community response has been excellent, and we’ve had over 150 people share their expertise and contribute back. Check out our documentation page to learn more. For those of you who joined the Windows Insiders program, you can also check out the documentation on how to use containers with Insider images.

Linux containers

We knew developers were eager to run any container, Windows or Linux, on the same machine. The crowd went wild when we announced this at DockerCon earlier this year, which showed how much demand there was for this work. This feature uses Hyper-V isolation to run a Linux kernel with just enough OS to support containers. Since then, we've been hard at work building this technology with new functionality in Hyper-V, joint work with the Linux community, and contributions to the open source Moby project on which Docker technology is built. Now it's time to share a sneak peek of how to run Linux containers and start getting feedback on how it's working for Windows Insiders.

You can get started with these features right away as a Windows Insider. To try this out, you’ll need:

  • Windows 10 or Windows Server Insider Preview build 16267 or later
  • A build of the Docker daemon based off the Moby master branch
  • Your choice of compatible Linux image

Our joint partners have published guides with steps on how to get started:

More to come!

Of course, this is just a glimpse of the news for developers in this release. We have much more to talk about in the blogs to come. Keep an eye out for other posts in this series, and join the Windows Insider Program to get access to the preview releases. Feedback is always welcome! Please use the Feedback Hub if you're a Windows 10 Insider, or join us in the Windows Server Insiders Tech Community.

Check out other blogs in this series:

Sneak Peek #1: Windows Server, version 1709

This blog post was authored by Jeff Woolsey, Principal Program Manager, Windows Server.


We’re watching the calendar and counting down to Microsoft Ignite September 25-29 in Orlando, Florida. Ignite is a great way to see the latest and greatest products and technologies with hundreds of hours of content, meet with your peers and partners, and get firsthand experience with hands-on labs. If you’re already registered for Ignite, be sure to check the event catalog and start selecting your sessions. If you haven’t grabbed a ticket yet, there are limited passes remaining, so get one while you still can!

As we count down to Ignite, we want to begin a blog series that provides a sneak peek of the next release of Windows Server: Windows Server, version 1709. We'll be launching Windows Server, version 1709 at Ignite, and it builds on the innovation in Windows Server 2016, so let's begin with a brief recap of some of the areas we focused on in Windows Server 2016.

Application innovation

One area of great interest to customers around the world is application modernization. While a large percentage of applications have moved from physical machines to virtual machines, you’ve told us that you want more. You’ve told us:

  1. You have existing business critical applications that you would like to modernize by moving to a modern platform with better security and better resource usage with minimal/no development effort. Think of this as “lift and shift.”
  2. You are building new applications and you want to build these apps with the cloud as a design point and with the flexibility to run on-premises, in the cloud, or as a hybrid service that takes advantage of the best of both worlds.

In Windows Server 2016, we delivered on both these areas in a major way and we’re just getting started with our investments in Cloud App Platform to:

  1. Provide a way for IT Pros to lift and shift traditional apps to Docker containers with Server Core.
  2. Enable cloud developers to write new cloud apps with Nano Server, .NET Core and Docker.

From a platform standpoint, Windows Server 2016 is the first version of Windows Server to include container technology. Windows Server containers provide application isolation through process and namespace isolation, so you can realize the benefits of containers with little or no code change using Windows Server Core. We then added Hyper-V isolation to Windows Server containers, expanding on that isolation by running each container in a highly optimized virtual machine, which makes it ideal for hostile multitenant environments. Containers, Nano Server, Azure Container Service, and Windows Server provide a rich set of cloud-enabling building blocks for true business agility, letting you build always-on, scalable, and distributed applications that run in Azure, on-premises, or in hybrid configurations.

Security


Windows Server 2016 is designed with security in mind throughout development as part of our SDL, and reduces risk with multiple layers of security deeply integrated in the operating system for on-premises and cloud protection such as Secure Boot, Code Integrity, Virtualization Based Security, Control Flow Guard, Windows Defender, Just in Time Administration, Just Enough Administration, and much more…

One of the most innovative solutions delivered in Windows Server 2016 was the coupling of security with our hypervisor, Hyper-V, to create Shielded VMs. Shielded VMs are a groundbreaking technology that makes a virtual machine running Windows a "black box," protecting it against a rogue administrator or against the virtual machine's data getting into the wild. Nothing in the industry compares to Shielded VMs.

Software Defined Datacenter that’s ready for the cloud

Windows Server provides the same Hyper-V hypervisor that we run in Azure, so you inherit the benefits of Azure's requirements too. A great example is industry-leading scale: Windows Server 2016 supports the largest physical servers (24 TB RAM, 512 logical processors) and the largest virtual machines (12 TB RAM, 240 virtual processors). Those massive scalability requirements were driven by Azure, and we are happy to share the same technology with you in Windows Server.

In terms of Software-defined Networking, we took our learnings from Azure and brought them to Windows Server with technologies such as the Azure Data plane, software load balancer, distributed firewall and more. With Windows Server 2016 we delivered Azure inspired, Software-defined Networking to be used on-premises, and these same technologies are also used by Microsoft Azure Stack.

In terms of storage, we took the best performing Software-defined Storage stack and enabled new flexible hyper-converged deployment capabilities to build highly available, scalable software-defined storage solutions at a fraction of the cost of a storage area network (SAN) or network-attached storage (NAS). The Storage Spaces Direct feature lets you use industry-standard servers with local storage. We then added Storage Replica which provides both synchronous and asynchronous options to meet your business requirements.

Long-Term Servicing Channel and Semi-Annual Channel Releases

As we prepare for the Windows Server, version 1709 release, we also want to make sure that folks clearly understand the new release models, including the Long-Term Servicing Channel and the Semi-annual Channel.

Before we discuss these two release models, let's provide some context. Going all the way back to Windows Server 2003, Microsoft regularly delivered Windows Server releases every two to three years. Over the years, we heard feedback that Microsoft was "too slow." Customers wanted us to go faster; they told us that being on the leading edge of a technology gave them a competitive advantage. So, we changed. Following the release of Windows Server 2012, we released Windows Server 2012 R2 less than a year later. The feedback we then received, from some customers quite loudly, was "Microsoft, you're going too fast. Slow down."

The pushback on a faster release was an interesting data point. It indicated we had two tracks of customers. One who wanted slow consistency and another who wanted continuous innovation. So, we tried another approach to better test this hypothesis.

In Windows Server 2016 development, we began by releasing frequent Technology Previews (TP). We released a total of five technology previews throughout development. Each TP included additional features, and we partnered with users to help us make changes through development. There were many organizations who were so pleased with a particular TP release that they asked us if we would support them in production.

The Windows Server 2016 development cycle reinforced the notion that we needed two tracks, which is what we are now delivering: moving forward, Windows Server delivers innovation through two channels, the Long-Term Servicing Channel and the Semi-Annual Channel.

  • Long-Term Servicing Channel (LTSC) – this is business as usual, with 5 years of mainstream support and 5 years of extended support. You'll have the option to upgrade to the next LTSC release every 2-3 years, the same way folks have for the last 20 years.
  • Semi-Annual Channel – for those of you who want to innovate faster and take advantage of new features sooner. The Semi-Annual Channel is a Software Assurance benefit and is fully supported in production; the difference is that each release is supported for 18 months, with a new version every six months.

Keep in mind that the Long-Term Servicing Channel and the Semi-Annual Channel are both fully supported in production, and that you can mix and match. For example:

  • If you have a legacy application that you rarely touch running in a VM, then maybe the LTSC release makes sense.
  • If you have a new, cloud application that your dev team is building using containers and they want the latest and greatest container features in Nano Server/Server Core, then likely the Semi-Annual Channel is the right choice.

The point is, we're providing both options, and you get to choose which makes the most sense for you. Finally, whether you choose the LTSC or the Semi-Annual Channel, you are in full control of patching your servers. To make Windows Server versions easy to identify, we are taking a cue from the Windows team and referring to each release by year and month: 1709 refers to the year 2017 and the ninth month, September. Very straightforward. The way we are delivering Windows Server moving forward offers more opportunity than ever to influence product direction, so please sign up for the Windows Server Insider Program if you haven't already!
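The naming scheme is mechanical enough to decode in a couple of lines. A minimal sketch (the function name is ours, not an official tool):

```python
import calendar

def decode_version(version: str) -> str:
    """Decode a yymm version string as described above, e.g. "1709"."""
    year = 2000 + int(version[:2])
    month = int(version[2:])
    return f"{calendar.month_name[month]} {year}"

print(decode_version("1709"))  # September 2017
```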

In the next few blogs, we're going to introduce areas of investment in Windows Server, version 1709 for developers, security, software-defined datacenter, and management.

Now available: Windows Server 2016 Security Guide!

This blog post was authored by Nir Ben Zvi, Principal PM Manager, Windows Server.

Windows Server 2016 includes major security innovations that can help protect privileged identity, make it harder for attackers to breach your servers, and detect attacks so that you can respond faster. This is powerful technology, and all that’s missing is guidance on how to best deploy and use Windows Server 2016 to protect your server workloads.

Today we are pleased to share the new Windows Server 2016 Security Guide.

This paper includes general guidance for helping secure the servers in your environment, as well as specific pointers on how you can utilize the new security features in Windows Server 2016. We are committed to continuing our efforts to provide you with the right security solutions so that you can better protect against, detect, and respond to threats in your datacenter and private cloud.

Download the Windows Server 2016 Security Guide now and check out our website for more information on Windows Server security.