
Effectively implement Azure Ultra Disk Storage

In August 2019, Microsoft announced the general availability of a new Managed Disks tier: Ultra Disk Storage. The new offering represents a significant step up from the other Managed Disks tiers, offering unprecedented performance and sub-millisecond latency to support mission-critical workloads.

The Ultra Disk tier addresses organizations reluctant to move data-intensive workloads to the cloud because of throughput and latency requirements.

According to Microsoft, Azure Ultra Disk Storage makes it possible to support these workloads by delivering next-generation storage technologies geared toward performance and scalability, while providing you with the convenience of a managed cloud service.

Understanding Azure Ultra Disk

Managed Disks is an Azure feature that simplifies disk management for infrastructure-as-a-service storage. A managed disk is a virtual hard disk that works much like a physical disk, except that the storage is abstracted and virtualized. Azure stores the disks as page blobs, in the form of random I/O storage objects.

To use managed disks, you only have to provision the necessary storage resources and Azure does the rest, deploying and managing the drives.

Azure offers four Managed Disks tiers: Standard HDD, Standard SSD, Premium SSD and the new Ultra Disk Storage, which also builds on SSD technologies. Ultra Disk SSDs support enterprise-grade workloads driven by systems such as MongoDB, SQL Server, SAP HANA and high-performing, mission-critical applications. The latest storage tier comes with configurable performance attributes, making it possible to adjust IOPS and throughput to meet evolving performance requirements.

Azure Ultra Disk Storage implements a distributed block storage architecture that uses NVMe to support I/O-intensive workloads. NVMe is a host controller interface and storage protocol that accelerates data transfers between data center systems and SSDs over a computer’s high-speed PCIe bus.


Along with the new storage tier, Azure introduced the virtual disk client (VDC), a simplified client that runs on the compute host. The client has full knowledge of the virtual disk metadata mappings in the Azure Ultra Disk cluster. This knowledge enables the client to communicate directly with the storage servers, bypassing the load balancers and front-end servers often used to establish initial disk connections.

With earlier Managed Disk storage tiers, the route was much less direct. For example, Azure Premium SSD storage is dependent on the Azure Blob storage cache. As a result, the compute host runs the Azure Blob Cache Driver, rather than the VDC. The driver communicates with a storage front end, which, in turn, communicates with partition servers. The partition servers then talk to the stream servers, which connect to the storage devices.

The VDC, on the other hand, supports a more direct connection, minimizing the number of layers that read and write operations traverse, reducing latency and increasing performance.

Deploying Ultra Disk Storage

Azure Ultra Disk Storage lets you configure capacity, IOPS and throughput independently, providing the flexibility necessary to meet specific performance requirements. For capacity, you can choose a disk size ranging from 4 GiB to 64 TiB, and you can provision the disks with up to 300 IOPS per GiB, to a maximum of 160,000 IOPS per disk. For throughput, Azure supports up to 2,000 MB per second, per disk.
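For illustration, the provisioning ceiling implied by these numbers can be sketched in a few lines of Python (a hypothetical helper, not part of any Azure SDK):

```python
def max_ultra_disk_iops(size_gib: int) -> int:
    """Maximum provisionable IOPS for an ultra disk of a given size,
    per the published limits: up to 300 IOPS per GiB, capped at
    160,000 IOPS per disk."""
    return min(size_gib * 300, 160_000)

# A 16 GiB disk can be provisioned with up to 4,800 IOPS; a 1,024 GiB
# disk already hits the per-disk cap of 160,000 IOPS.
print(max_ultra_disk_iops(16))    # 4800
print(max_ultra_disk_iops(1024))  # 160000
```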

Ultra Disk Storage makes it possible to utilize a VM’s maximum I/O limits using only a single ultra disk, without needing to stripe multiple disks. You can also configure disk IOPS or throughput without detaching the disk from the VM or restarting the VM. Azure automatically implements the new performance settings in less than an hour.

To deploy Ultra Disk Storage, you can use the Azure Resource Manager, Azure CLI or PowerShell. Ultra Disk Storage is currently available in three Azure regions: East US 2, North Europe and Southeast Asia. Microsoft plans to extend to other regions, but the company has not provided specific timelines. In addition, Ultra Disk Storage supports only the ESv3 and DSv3 Azure VMs.

Azure Ultra Disk handles data durability behind the scenes. The service is built on Azure’s locally redundant storage (LRS), which maintains three copies of the data within the same availability zone. If an application writes data to the storage service, Azure will acknowledge the operation only after the LRS system has replicated the data.

When implementing Ultra Disk Storage, you must consider the throttling limits Azure places on resources. For example, you could configure your VM with a 16-GiB ultra disk at 4,800 IOPS. However, if you’re working with a Standard_D2s_v3 VM, you won’t be able to take full advantage of the storage because the VM gets throttled to 3,200 IOPS as a result of its limitations. To realize the full benefits available to Ultra Disk Storage, you need hardware that can support its capabilities.
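The interaction between the two limits reduces to taking the smaller of the two ceilings, as this minimal sketch shows (function name is illustrative):

```python
def effective_iops(disk_iops: int, vm_iops_limit: int) -> int:
    """IOPS actually delivered: the VM throttles anything the disk
    offers beyond the VM's own limit."""
    return min(disk_iops, vm_iops_limit)

# The example above: a 16 GiB ultra disk provisioned at 4,800 IOPS,
# attached to a Standard_D2s_v3 VM throttled to 3,200 IOPS.
print(effective_iops(4800, 3200))  # 3200
```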

Where Ultra Disk fits in the Managed Disk lineup

Azure Managed Disks simplify disk management by handling deployment and management details behind the scenes. Currently, Azure provides the following four storage options for accommodating different workloads.

The Standard HDD tier is the most basic tier, providing a reliable, low-cost option that supports workloads in which IOPS, throughput and latency are not critical to application delivery. For this reason, the Standard HDD tier is well suited to backup and other non-critical workloads. The maximum disk size for this tier is 32,767 GiB, the maximum IOPS is 2,000 and the maximum throughput is 500 MiB per second.

The Standard solid-state drive tier offers a step up from the Standard HDD tier to support workloads that require better consistency, availability, reliability and latency. The Standard SSD tier is well suited to web servers and lightly used applications, as well as development and testing environments. The maximum disk size for this tier is 32,767 GiB, the maximum IOPS is 6,000 and the maximum throughput is 750 MiB per second.

Prior to the release of the Ultra Disks tier, the Premium SSD tier was the top offering in the Managed Disks stack. The Premium tier is geared toward production and performance-sensitive workloads that require greater performance than the lower tiers. This tier can benefit mission-critical applications that support I/O-intensive workloads. The maximum disk size for this tier is 32,767 GiB, the maximum IOPS is 20,000 and the maximum throughput is 900 MiB per second.

The Ultra Disks tier is the newest Managed Disks service available to customers. The new tier takes performance to the next level, delivering high IOPS and throughput, with consistently low latency. Customers can dynamically change performance settings without restarting their VMs. The Ultra Disks tier targets data-intensive applications such as SAP HANA, Oracle Database and other transaction-heavy workloads. The maximum disk size for this tier is 65,536 GiB, the maximum IOPS is 160,000 and the maximum throughput is 2,000 MiB per second.
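The per-tier maximums above can be collected into a small lookup table. The helper below (illustrative only, using the figures quoted in this article) picks the lowest tier whose IOPS ceiling covers a requirement:

```python
# Published per-tier maximums from the article: max disk size in GiB,
# max IOPS, and max throughput in MiB/s. Ordered cheapest-first.
TIER_LIMITS = {
    "Standard HDD": {"size_gib": 32_767, "iops": 2_000,   "mibps": 500},
    "Standard SSD": {"size_gib": 32_767, "iops": 6_000,   "mibps": 750},
    "Premium SSD":  {"size_gib": 32_767, "iops": 20_000,  "mibps": 900},
    "Ultra Disk":   {"size_gib": 65_536, "iops": 160_000, "mibps": 2_000},
}

def cheapest_tier_for(iops_needed: int) -> str:
    """Return the first (lowest) tier whose IOPS ceiling covers the need."""
    for tier, limits in TIER_LIMITS.items():
        if limits["iops"] >= iops_needed:
            return tier
    raise ValueError("No Managed Disks tier supports that IOPS requirement")

print(cheapest_tier_for(10_000))  # Premium SSD
```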

Because Ultra Disk Storage is a new Azure service, it comes with several limitations. The service is available in only a few regions and works with only a couple types of VMs. Additionally, you cannot attach an ultra disk to a VM running in an availability set. The service also does not support snapshots, VM scale sets, Azure disk encryption, Azure Backup or Azure Site Recovery. You can’t convert an existing disk to an ultra disk, but you can migrate the data from an existing disk to an ultra disk.

Despite these limitations, Azure Ultra Disk Storage could prove to be an asset to organizations that plan to move their data-intensive applications to the cloud. No doubt Microsoft will continue to improve the service, extending its reach to other regions and addressing the lack of support for other Azure data services, but that hasn’t happened yet, and some IT teams might insist that these issues be resolved before they consider migrating their workloads. In the meantime, Ultra Disk Storage promises to be a service worth watching, especially for organizations already committed to the Azure ecosystem.


12 TB VMs, Expanded SAP partnership on Blockchain, Azure Monitor for SAP Solutions

A few months back, at SAP’s SAPPHIRE NOW event, we announced the availability of Azure Mv2 Virtual Machines (VMs) with up to 6 TB of memory for SAP HANA. We also reiterated our commitment to making Microsoft Azure the best cloud for SAP HANA. I’m glad to share that Azure Mv2 VMs with 12 TB of memory will become generally available and production certified in the coming weeks, in US West 2, US East, US East 2, Europe North, Europe West and Southeast Asia regions. In addition, over the last few months, we have expanded regional availability for M-series VMs, offering up to 4 TB, in Brazil, France, Germany, South Africa and Switzerland. Today, SAP HANA certified VMs are available in 34 Azure regions, enabling customers to seamlessly address global growth, run SAP applications closer to their customers and meet local regulatory needs.

Learn how you can leverage Azure Mv2 VMs for SAP HANA by watching this video.

Running mission critical SAP applications requires continuous monitoring to ensure system performance and availability. Today, we are launching private preview of Azure Monitor for SAP Solutions, an Azure Marketplace offering that monitors SAP HANA infrastructure through the Azure Portal. Customers can combine monitoring data from the Azure Monitor for SAP Solutions with existing Azure Monitor data and create a unified dashboard for all their Azure infrastructure telemetry. You can sign up by contacting your Microsoft account team.

We continue to co-innovate with SAP to help accelerate our customers’ digital transformation journey. At SAPPHIRE NOW, we announced several such co-innovations with SAP. First, we announced general availability of SAP Data Custodian, a governance, risk and compliance offering from SAP, which leverages Azure’s deep investments in security and compliance features such as Customer Lockbox.

Second, we announced general availability of Azure IoT integration with SAP Leonardo IoT, offering customers the ability to contextualize and enrich their IoT data with SAP business data to drive new business outcomes. Third, we shared that SAP’s Data Intelligence solution leverages Azure Cognitive Services Containers to offer intelligence services such as face, speech, and text recognition. Lastly, we announced a joint collaboration to integrate Azure Active Directory with SAP Cloud Platform Identity Authentication Service (SAP IAS) for a seamless single sign-on (SSO) and user provisioning experience across SAP and non-SAP applications. Azure AD integration with SAP IAS for seamless SSO is generally available, and the user provisioning integration is now in public preview. Azure AD integration with SAP SuccessFactors for simplified user provisioning will become available soon.

Another place I am excited to deepen our partnership is in blockchain. SAP has long been an industry leader in solutions for supply chain, logistics, and life sciences. These industries are digitally transforming with the help of blockchain, which adds trust and transparency to these applications, and enables large consortiums to transact in a trusted manner. Today, I am excited to announce that SAP’s blockchain-integrated application portfolio will be able to connect to Azure blockchain service. This will enable our joint customers to bring the trust and transparency of blockchain to important business processes like material traceability, fraud prevention, and collaboration in life sciences.

Together with SAP, we are offering a trusted path to digital transformation with our best-in-class SAP-certified infrastructure, business process and application innovation services, and a seamless set of offerings. As a result, we are helping SAP customers across the globe who have large-scale, mission-critical SAP applications, such as Carlsberg and CONA Services, migrate to Azure. Here are a few additional customers benefiting from migrating their SAP applications to Azure:

Al Jomaih and Shell Lubricating Oil Company: JOSLOC, the joint venture between Al Jomaih Holding and Shell Lubricating Oil Company, migrated their mission critical SAP ERP to Azure, offering them enhanced business continuity and reduced IT complexity and effort, while saving costs. Migrating SAP to Azure has enabled the joint venture to prepare for their upgrade to SAP S/4HANA in 2020.

TraXall France: TraXall France provides vehicle fleet management services for upwards of 40,000 managed vehicles. TraXall chose Microsoft Azure to run their SAP S/4HANA due to the simplified infrastructure management and business agility, and to meet compliance requirements such as GDPR.

Zuellig Pharma: Amid a five-year modernization initiative, Singapore-based Zuellig Pharma wanted to migrate their SAP solution from IBM DB2 to SAP HANA. Zuellig Pharma now runs its SAP ERP on HANA with 1 million daily transactions and 12 TB of production workloads at a 40 percent savings compared to their previous hosting provider.

If you’re attending SAP TechEd in Las Vegas, stop by at the Microsoft booth #601 or attend one of the Microsoft Azure sessions to learn more about these announcements and to see these product offerings in action.

To learn more about how migrating SAP to Azure can help you accelerate your digital transformation, visit our website at https://azure.com/sap.

Author: Microsoft News Center

Tableau BI gets Extensions API in version 2018.2 update

Amid positive early reviews, Tableau announced the general availability of Tableau 2018.2, an extensive upgrade that amplifies the scope of Tableau BI tools with expanded analytics functions and a more streamlined dashboard.

The update comes a few days after Tableau released the beta version of 2018.3 that further simplifies the user interface and enables users to more easily consolidate different sources of data.

The general release version of 2018.2 brings a range of notable changes and new capabilities to Tableau BI tools, introducing customized third-party capabilities to the self-service analytics and data visualization platform.

Released in beta form in April, this week’s formal release of Tableau 2018.2 enables nonbeta users to use several new features, including automatic mobile layouts and Spatial Join, which integrates disparate data sources under a single common attribute.

Probably the two most significant features the release adds to Tableau BI tools are Dashboard Extensions and Tableau Services Manager.

Drag-and-drop dashboard extensions

The Extensions API essentially opens the platform to both first-party and third-party developers and users, allowing them to create and share their own dashboard extensions with different functionalities.


“It’s really exciting to see what the community is able to do and also the creativity of folks to take self-service analytics to the next level,” said Francois Ajenstat, chief product officer at Tableau.

Introducing third-party extensions to a dashboard is a drag-and-drop process, and the new Extensions Gallery enables users to browse and select extensions made by Tableau partners. For example, the feature could let users who, on their own, might not be able to design, say, a predictive analytics model simply drag and drop one in.

The Extensions API and several other recent dashboard design enhancements will be welcomed by Tableau users, said Jen Underwood, founder and principal analyst of Florida-based Impact Analytix.

It could “open up a new world of possibilities for augmented analytics, machine learning, statistics, advanced analytics, workflow and other types of apps to integrate directly within Tableau,” Underwood said.

The other standout feature of the new release of 2018.2 is Tableau Services Manager, which allows Tableau Server administration to be done completely through the browser, and generally tries to make server management simpler and faster.

New update enters beta

The beta release of Tableau 2018.3 brings its own expanded capabilities to Tableau BI tools, including dashboard navigation buttons in Tableau Desktop, transparent worksheet backgrounds and a mixed content display in Tableau Server and Online that can show all of a user’s content on the same page.

Heatmaps, a new mark or chart type for Tableau, are expected to be added to the beta in a future update, Tableau said in a blog post.

While Tableau did not say when 2018.3 would be officially released, in keeping with the company’s quarterly schedule it can likely be expected to leave beta this fall. Betas can see numerous tweaks, adjustments and even “fundamental changes” before the official release, depending on customer feedback or Tableau’s own observations, Ajenstat said.

Seeking to simplify

The new Tableau BI capabilities introduced in the updates are indicative of Tableau’s business mission, “to help people see and understand data,” Ajenstat said.

Future updates, he said, will likely be aimed at making “analytics easier for everyone,” and will incorporate smart capabilities with tools like AI, machine learning and natural language processing, in part due to the organization’s recent acquisition of MIT AI startup Empirical Systems.

Tableau’s announcements came as competitor Qlik, which research and advisory firm Gartner regularly ranks highly alongside Tableau and Microsoft’s Power BI, announced the acquisition of self-service BI and data visualization startup Podium Data. According to Qlik, the move will increase the company’s ability to compete with Tableau.

Announcing Azure Location Based Services public preview

Today we announced the Public Preview availability of Azure Location Based Services (LBS). LBS is a portfolio of geospatial service APIs natively integrated into Azure that enable developers, enterprises and ISVs to create location-aware apps and IoT, mobility, logistics and asset tracking solutions. The portfolio currently comprises services for Map Rendering, Routing, Search, Time Zones and Traffic. In partnership with TomTom, and in support of our enterprise customers, Microsoft has added native location capabilities to the Azure public cloud.

Azure LBS offers a robust set of geospatial services atop a global geographic data set. These services comprise five primary REST services and a JavaScript Map Control. Each service has a unique set of capabilities atop the base map data, and all are built in unison and in accordance with Azure standards, making it easy to interoperate between the services. Additionally, Azure LBS is fully hosted and integrated into the Azure cloud, meaning the services comply with all Azure fundamentals for privacy, usability, global readiness, accessibility and localization. Users can manage all Azure LBS account information from within the Azure portal and are billed like any other Azure service.

Azure LBS uses key-based authentication. To get a key, go to the Azure portal and create an Azure LBS account. Creating an Azure LBS account automatically generates two Azure LBS keys; either key will authenticate requests to the various Azure LBS services. Once you have your account and keys, you’re ready to start accessing Azure Location Based Services. The API model is simple to use: just parameterize your URL request to get rich responses from the service:

Sample Address Search Request: atlas.microsoft.com/search/address/json?api-version=1&query=1 Microsoft Way, Redmond, WA
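As a sketch, such a request URL can be assembled in code. The `subscription-key` parameter name below is an assumption based on the key-based authentication described above, not confirmed by this article:

```python
from urllib.parse import urlencode

def address_search_url(query: str, key: str) -> str:
    """Build an address search request like the sample above.
    Note: the 'subscription-key' parameter name is assumed."""
    base = "https://atlas.microsoft.com/search/address/json"
    params = {"api-version": 1, "subscription-key": key, "query": query}
    return f"{base}?{urlencode(params)}"

url = address_search_url("1 Microsoft Way, Redmond, WA", "[AZURE_LBS_KEY]")
print(url)
```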

Azure LBS enters public preview with five distinct services: Render (for maps), Route (for directions), Search, Time Zones and Traffic, plus a JavaScript Map Control. Each of these services is described in more detail below.

Azure Map Control

The Azure Map Control is a JavaScript web control with built-in capabilities for fetching Azure LBS vector map tiles, drawing data atop them and interacting with the map canvas. It allows developers to layer their data atop Azure LBS maps in both vector and raster layers. If enterprise customers have coordinates for points, lines and polygons, or geo-annotated maps of a manufacturing plant, a shopping mall or a theme park, they can overlay these rasterized maps as a new layer atop the Azure Map Control. The map control also has listeners for clicking the map canvas and translating pixels into coordinates. Customers can send those coordinates to the services to search for businesses around that point, find the nearest address or cross street, generate a route to or from that point, or even connect to their own database to find nearby geospatially referenced information important to their business.

Azure Location Based Services Map Control

The Azure Map Control makes it simple for developers to jumpstart their development. By adding a few lines of code to any HTML document, you get a fully functional map. The snippet below is a minimal reconstruction of the sample; the SDK URLs and option names follow the preview-era atlas API and should be checked against the current reference:

<!DOCTYPE html>
<html>
<head>
    <title>Hello Azure LBS</title>
    <link rel="stylesheet" href="https://atlas.microsoft.com/sdk/css/atlas.min.css?api-version=1" type="text/css" />
    <script src="https://atlas.microsoft.com/sdk/js/atlas.min.js?api-version=1"></script>
</head>
<body>
    <!-- The element that will host the map canvas -->
    <div id="map"></div>
    <script>
        // Instantiate the map control with your Azure LBS key
        var map = new atlas.Map("map", {
            "subscription-key": "[AZURE_LBS_KEY]"
        });
    </script>
</body>
</html>
In the above code sample, be sure to replace [AZURE_LBS_KEY] with your actual Azure LBS Key created with your Azure LBS Account in the Azure portal.

Render Service

The Azure LBS Render Service is used for fetching maps. The Render Service is the basis for maps in Azure LBS and powers the visualizations in the Azure Map Control. Users can request vector-based map tiles to render data and apply styling on the client. The Render Service also provides raster maps for embedding a map image into a web page or application. Azure LBS maps have high-fidelity geographic information for more than 200 regions around the world and are available in 35 languages and two versions of neutral ground truth.

Azure Location Based Services Render Service

The Azure LBS cartography was designed from the ground up with the enterprise customer in mind. There is less information at lower levels of delineation (zoomed out) and higher-fidelity information as you zoom in. The design is meant to encourage enterprise customers to render their data atop Azure LBS maps without additional detail bleeding through and disrupting the value of customer data.

Routing Service

The Azure LBS Routing Service is used for getting directions, but not just point A to point B directions. The Routing Service has a slew of map data available to the routing engine, allowing it to modify the calculated directions based on a variety of scenarios. First, the Routing Service provides customers the standard routing capabilities they would expect, with a step-by-step itinerary. The route calculation can optimize for the fastest or shortest path, or avoid highly congested roads or traffic incidents. Traffic-based routing comes in two flavors: “historic,” which is great for future route-planning scenarios when users would like a general idea of what traffic tends to look like on a given route; and “live,” which is ideal for active routing scenarios when a user is leaving now and wants to know where traffic exists and the best ways to avoid it.

Azure LBS Routing also allows for commercial vehicle routing, providing alternate routes made just for trucks. Commercial vehicle routing supports parameters such as vehicle height, weight, number of axles and hazardous material contents, all used to choose the best, safest and recommended roads for transporting the haul. The Routing Service provides a variety of travel modes, including walking, biking, motorcycling, taxi and van routing.

Azure Location Based Services Route Service

Customers can also specify up to 50 waypoints along their route if they have pre-determined stops to make. If customers are looking for the best order in which to stop along their route, they can have Azure LBS determine the best order in which to route to multiple stops by passing up to 20 waypoints into the Routing Service where an itinerary will be generated for them.

Using the Azure LBS Route Service, customers can also specify arrival times when they need to be at a specific location by a certain time. Drawing on a massive amount of traffic data, with nearly a decade of probe data captured per road geometry at high-frequency intervals, Azure LBS can tell customers the best time of departure for a given day of the week and time. Additionally, Azure LBS can use current traffic conditions to notify customers of a road change that may impact their route and provide updated times and/or alternate routes.

Azure LBS can also take into consideration the engine type being used. By default, Azure LBS assumes a combustion engine; however, if an electric engine is in use, Azure LBS will accept input parameters for power settings and generate the most energy-efficient route.

The Routing Service also allows multiple, alternate routes to be generated in a single query, saving on over-the-wire data transfer. Customers can also specify that they would like to avoid specific route types, such as toll roads, freeways, ferries or carpool roads.

Sample Commercial Vehicle Route Request: atlas.microsoft.com/route/directions/json?api-version=1&query=52.50931,13.42936:52.50274,13.43872&travelMode=truck
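The same URL pattern extends to commercial routing. The sketch below builds a truck route request and passes vehicle attributes through as query parameters; the exact vehicle parameter names the service accepts are assumptions to be verified against the Azure LBS reference:

```python
from urllib.parse import urlencode

def truck_route_url(origin, destination, **vehicle_params) -> str:
    """Build a commercial vehicle routing request like the sample above.
    origin/destination are (lat, lon) tuples; extra keyword arguments
    (e.g. vehicle weight) are hypothetical pass-through parameters."""
    base = "https://atlas.microsoft.com/route/directions/json"
    query = f"{origin[0]},{origin[1]}:{destination[0]},{destination[1]}"
    params = {"api-version": 1, "query": query, "travelMode": "truck",
              **vehicle_params}
    return f"{base}?{urlencode(params)}"

url = truck_route_url((52.50931, 13.42936), (52.50274, 13.43872))
print(url)
```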

Search Service

The Azure LBS Search Service provides the ability for customers to find real world objects and their respective location. The Search Service provides for three major functions:

  1. Geocoding: Finding addresses, places and landmarks
  2. POI Search: Finding businesses based on a location
  3. Reverse Geocoding: Finding addresses or cross streets based on a location

Azure Location Based Services Search Service

With the Search Service, customers can find addresses and places from around the world. Azure LBS supports address-level geocoding in 38 regions, cascading to house-number, street-level and city-level geocoding for other regions of the world. Customers can pass addresses into the service in a structured address form, or they can use an unstructured form when they want to allow their customers to search for addresses, places or businesses in a single query. Users can restrict their searches by region or bounding box, and can supply a specific coordinate to influence the search results and improve quality. Reversing the query, by providing a coordinate, say from a GPS receiver, customers can get the nearest address or cross street returned from the service.

The Azure LBS Search Service also allows customers to query for business listings. The Search Service contains hundreds of categories and hundreds of sub-categories for finding businesses or points of interest around a specific point or within a bounding area. Customers can query for businesses based on brand name or general category and filter those results based on location, bounding box or region.

Sample POI Search Request (Key Required): atlas.microsoft.com/search/poi/category/json?api-version=1&query=electric%20vehicle%20station&countrySet=FRA

Time Zone Service

The Azure LBS Time Zone Service is a first of its kind, providing the ability to query time zones and times for locations around the world. Customers can submit a location to Azure LBS and receive the respective time zone, the current time in that zone and the offset to Coordinated Universal Time (UTC). The Time Zone Service provides access to historical and future time zone information, including changes for daylight saving time. Additionally, customers can query for a list of all time zones and the current version of the data, allowing customers to optimize their queries and downloads. For IoT customers, the Azure LBS Time Zone Service allows for POSIX output, so users can download information to devices that only infrequently access the internet. Additionally, for Microsoft Windows users, Azure LBS can transform Windows time zone IDs to IANA time zone IDs.

Sample Time Zone Request (Key Required): atlas.microsoft.com/timezone/byCoordinates/json?api-version=1&query=32.533333333333331,-117.01666666666667
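Once the offset comes back, converting a UTC timestamp to the zone’s local time is simple arithmetic. The response fields below are illustrative only, not the exact LBS response schema:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical response fragment for the coordinates above (Tijuana area);
# field names are illustrative, not the exact service schema.
response = {"Id": "America/Tijuana", "UtcOffsetHours": -8.0}

# Apply the returned UTC offset to a known UTC instant (winter date,
# so no daylight saving adjustment applies).
utc_now = datetime(2019, 1, 15, 12, 0, tzinfo=timezone.utc)
local = utc_now + timedelta(hours=response["UtcOffsetHours"])
print(response["Id"], local.strftime("%H:%M"))  # America/Tijuana 04:00
```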

Traffic Service

The Azure LBS Traffic Service provides customers with the ability to overlay and query traffic flow and incident information. In partnership with TomTom, Azure LBS has access to a best-in-class traffic product with coverage in 55 regions around the world. The Traffic Service can natively overlay traffic information atop the Azure Map Control for a quick and easy means of viewing traffic issues. Additionally, customers have access to traffic incident information: real-time issues happening on the road, collected through probe information. Incident data provides additional detail such as the type of incident and its exact location. The Traffic Service also gives customers details of incidents and flow, such as the distance and time from one’s current position to the “back of the line,” and, once a user is in the congestion, the distance and time until they’re out of it.

Azure Location Based Services Traffic Service

Sample Traffic Flow Segment Request: atlas.azure-api.net/traffic/flow/segment/json?api-version=1&unit=MPH&style=absolute&zoom=10&query=52.41072,4.84239

Azure Location Based Services are available now in public preview via the Azure portal. Get your account created today.

Managed Applications are now Generally Available in the Azure Marketplace

I am excited to announce the general availability of Managed Applications in the Azure Marketplace. Managed Applications, an offering unique to Azure, enables you to deploy entire applications and empower your partner to fully manage and maintain the application in your environment. This means a partner like Xcalar can deliver more than just deployment on a set of VMs. Xcalar can now deliver both the application and a fully operated solution, offering “Ap/Ops.” Partners like Xcalar will also maintain and service the application solution directly in your Azure environment.

This new distribution channel for our partners will change customer expectations in the public cloud. Unlike our competitors, in Azure, a marketplace application can now be much more than just deployment and set-up. Now it can be a fully supported and managed solution. This is a first in the public cloud!

You and your partners have been the inspiration for Azure Managed Applications. We hear from many of our customers looking to transform IT operations, that they need the simplicity of fully managed applications without all the infrastructure hubbub. Furthermore, our partners are seeking opportunities to offer their customers more value by adding service operations to their portfolio. With Managed Applications, you and your partners can achieve these goals. Here are a few of the details of the offering:

  • Managed Service Providers (MSPs), Independent Software Vendors (ISVs) and System Integrators (SIs) can build turnkey cloud solutions using Azure Resource Manager templates. Both the application IP and the underlying pre-configured Azure infrastructure can easily be packaged into a sealed and serviceable solution. This enables “Ap/Ops,” offering both the application and operations together in one package.
  • Customers can deploy these turnkey solutions in their own Azure subscription as a sealed service, which is fully operated and maintained by the partner across the solution lifecycle. In addition, only the minimal level of access is granted, specifically for the sealed solution and its lifecycle operations.
  • The result is a higher quality of service for our customers, fully managed by our partners. Thanks to the sealed and immutable nature of Managed Applications, nothing changes in the application or the infrastructure configuration, unless it is an explicit lifecycle operation by the trusted partner.

Whether these solutions are complex applications that are custom-built and maintained by MSPs or packaged applications delivered and serviced by ISVs, you can focus on what you need to do to accelerate your business transformation without having to worry so much about running someone else’s software. Managed Applications accelerate innovation, even in the most advanced application scenarios, bringing the best of software-as-a-service and infrastructure-as-a-service together.

At Ignite we enabled our partners to deploy and service Managed Applications inside customer owned enterprise service catalogs. With today’s addition of Azure Marketplace as a distribution channel, partners can add value to their marketplace offering by adding lifecycle and support services with an incremental flat monthly fee. Our launch partners are excited and ready to go…

“Azure Managed Applications enables OpsLogix customers to easily deploy and use our solutions without having to undertake on-going maintenance and servicing complexities.”
Vincent de Vries, CEO, OpsLogix

“Azure Managed Applications allows Xcalar Data Platform to provide a higher quality of service and help forge stronger partnerships with our customers. Enterprise applications, Azure cloud infrastructure, and production operations can now be packaged into a single solution.”
Vikram Joshi, Co-founder and CEO, Xcalar

“We’re delighted to be bringing the power and simplicity of Cisco Meraki cloud networking to customers using Microsoft Azure. Our two companies both share a passion for better IT, and by leveraging our powerful Auto VPN and SD-WAN features we can seamlessly connect users to the resources they depend on every day.”
Raviv Levi, Senior Product Line Manager, Cisco Meraki

Go ahead, give it a try. We would love to hear from you about your experience with Azure Managed Applications and our partner solutions. Leave us feedback via the UserVoice channel or in the comments section below.

More innovation for our partners is on the horizon, so stay tuned! And, do not forget to check Gaurav’s blog for a closer look under the hood at how this works.

See ya around,

Corey Sanders

Announcing general availability of Azure Managed Applications Service Catalog

Today we are pleased to announce the general availability of Azure Managed Applications Service Catalog.

Service Catalog allows corporate central IT teams to create and manage a catalog of solutions and applications to be used by employees in that organization. It enables organizations to centrally manage approved solutions and ensure compliance, and it lets end customers – in this case, the employees of an organization – easily discover the list of approved solutions. Employees can consume these solutions without having to learn how a solution works in order to service, upgrade or manage it; all of that is taken care of by the central IT team that publishes and owns the solution.

In this post, we will walk through the new capabilities that have been added to Managed Applications and how they improve the overall experience.

Improvements

We have made improvements to the overall experience and made authoring much easier and more straightforward. Some of the major improvements are described below.

Package construction simplified

In the preview version, the publisher needed to author three files and package them in a .zip. One of them was a template file that contained only the Microsoft.Solutions/appliances resource. In this file, the publisher also had to re-specify all of the parameters needed to deploy the actual resources, even though those parameters were already specified in the other template file. Although this was needed, it caused redundant and often confusing work for publishers. Going forward, this file will be auto-generated by the service.

So, the package (.zip) now requires only two files: i) mainTemplate.json, the template file that contains the resources to be provisioned, and ii) createUIDefinition.json, which defines the portal experience for creating the solution.

If your solution uses nested templates, scripts or extensions, those don’t need to change.
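For illustration, here is a minimal sketch of what a mainTemplate.json might contain. The storage account resource and the parameter name are hypothetical placeholders; your real template would declare whatever resources your solution actually provisions:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountNamePrefix": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2017-06-01",
      "name": "[concat(parameters('storageAccountNamePrefix'), uniqueString(resourceGroup().id))]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "Storage",
      "properties": {}
    }
  ]
}
```

The createUIDefinition.json file then drives the portal blade that collects values for these parameters from the person deploying the solution.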

Portal support enabled

During preview, we had only CLI support for creating a managed application definition for the Service Catalog. Now, we have added Portal and PowerShell support. With this, the central IT team of an organization can use the portal to quickly author a managed application definition and share it with people in the organization, without having to install the CLI and learn the different commands offered there.

These can be discovered in the portal by clicking on “More Services” and then searching for “Managed”. Don’t use the entries that say “Preview”.


To create a managed application definition, select “Service Catalog managed application definitions” and click the “Add” button. This opens the definition-creation blade.

Support for providing template files inline instead of packaging as .zip

Creating a .zip file, uploading it to a blob, making it publicly accessible, getting the URL, and then creating the managed application definition still required a lot of steps. So, we have enabled another option: you can specify these files inline using new parameters that have been added to the CLI and PowerShell. Support for inline template files will be added to the portal shortly.
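As a sketch of what inline authoring could look like, the following hypothetical Azure CLI invocation passes both files directly instead of a packaged .zip URI. The resource group, names, and authorization IDs are placeholders, and parameter names may vary across CLI versions:

```shell
# Create a Service Catalog managed application definition from inline files.
# All names and the <principalId>:<roleDefinitionId> pair below are placeholders.
az managedapp definition create \
  --name "myAppDef" \
  --resource-group "myServiceCatalogRG" \
  --location "westcentralus" \
  --display-name "My managed solution" \
  --description "A sample Service Catalog solution" \
  --lock-level ReadOnly \
  --authorizations "<principalId>:<roleDefinitionId>" \
  --main-template @mainTemplate.json \
  --create-ui-definition @createUiDefinition.json
```

The `@file` syntax tells the CLI to read the JSON content from the local files, so no blob upload is needed.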

Service Changes

Please note that the following major changes have been made to the service.

New api-version

The general availability release introduces a new api-version, 2017-09-01, which enables all of the above-mentioned improvements. The Azure Portal uses this new api-version, and the latest versions of Azure CLI and Azure PowerShell leverage it as well. You will need to switch to these latest versions to develop and manage Managed Applications. Note that creating and managing Managed Applications will not be supported using the existing version of the CLI after 9/25/2017. Existing resources that were created using the old api-version (old CLI) will continue to work.

Resource type names have changed

The resource type names have changed in the new api-version: Microsoft.Solutions/appliances is now Microsoft.Solutions/applications, and Microsoft.Solutions/applianceDefinitions is now Microsoft.Solutions/applicationDefinitions.
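To make the rename concrete, here is a hedged sketch of how a deployed Service Catalog solution might be declared under the new resource type and api-version. The resource name, managed resource group, and definition ID are placeholders:

```json
{
  "type": "Microsoft.Solutions/applications",
  "apiVersion": "2017-09-01",
  "name": "myManagedApp",
  "location": "[resourceGroup().location]",
  "kind": "ServiceCatalog",
  "properties": {
    "managedResourceGroupId": "[concat(subscription().id, '/resourceGroups/myManagedAppRG')]",
    "applicationDefinitionId": "<resource ID of a Microsoft.Solutions/applicationDefinitions resource>"
  }
}
```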

Upgrade to the latest CLI and PowerShell

As mentioned above, to continue using and creating Managed Applications, you will have to use the latest versions of the CLI and PowerShell, or you can use the Azure portal. Existing versions of these clients, built on the older api-version, will no longer be supported. Your existing resources will be migrated to the new resource types and will continue to work with the new versions of the clients.

Supported locations

Currently, the supported locations are West Central US and West US 2.

Please try out the new version of the service and let us know your feedback through our user voice channel or in the comments below.

Additional resources

Data Management Gateway – High Availability and Scalability Preview

We are excited to announce the preview for Data Management Gateway – High Availability and Scalability.

You can now associate multiple on-premises machines with a single logical gateway. The benefits are:

  • Higher availability of Data Management Gateway (DMG) – the DMG is no longer a single point of failure in your big data solution or cloud data integration with Azure Data Factory, ensuring continuity with up to 4 nodes.


  • Improved performance and throughput during data movement between on-premises and cloud data stores. Get more information on performance comparisons.
  • Both scale-out and scale-up support – Not only can the DMG be installed across up to 4 nodes (scale out), but you can also increase or decrease the number of concurrent data movement jobs on each node (scale up/down) as needed.
    Note: The scale up/down feature is now available for all existing Single Node (GA) gateways. This update is not limited to this preview. 
  • Richer Data Management Gateway monitoring experience – You can monitor each node’s status and resource utilization in one place in the Azure Portal. This helps simplify DMG management. 


Note: Monitoring is now available for all existing Single Node (GA) gateways. This update is not limited to this preview. 

For more information on the Data Management Gateway ‘High Availability and Scalability’ feature, check our documentation.

Getting started

Scenario 1 – Setting up a new ‘Highly Available and Scalable’ Data Management Gateway.


Scenario 2 – Upgrading existing Data Management Gateway to enable the ‘High Availability and Scalability’ feature.


Prerequisite – This preview feature is supported on Data Management Gateway version 2.12.xxxx.x and above. Download the latest version of Data Management Gateway here.

In case you have any queries, please feel free to reach out to us at [email protected]

Online training for Azure Data Lake

We are pleased to announce the availability of new, free online training for Azure Data Lake. We’ve designed this training to get developers ramped up fast. It covers all the topics a developer needs to know to start being productive with big data and how to address the challenges of authoring, debugging, and optimizing at scale.

Explore the training

Click on the link below to start!

Microsoft Virtual Academy: Introduction to Azure Data Lake

Looking for more?

You can find this training and many more resources for developers.

Course outline

1 | Introduction to Azure Data Lake

Get an overview of the entire Azure Data Lake set of services, including HDInsight (HDI), ADL Store, and ADL Analytics.

2 | Introduction to Azure Data Lake Tools for Visual Studio

Since ADL developers of all skill levels use Azure Data Lake Tools for Visual Studio, review the basic set of capabilities offered in Visual Studio.

3 | U-SQL Programming

Explore the fundamentals of the U-SQL language, and learn to perform the most common U-SQL data transformations.

4 | Introduction to Azure Data Lake U-SQL Batch Job

Find out what’s happening behind the scenes, when running a batch U-SQL script in Azure.

5 | Advanced U-SQL

Learn about the more sophisticated features of the U-SQL language to calculate more useful statistics and learn how to extend U-SQL to meet many diverse needs.

6 | Debugging U-SQL Job Failures

Since at some point all developers encounter a failed job, get familiar with the causes of failure and how they manifest themselves.

7 | Introduction to Performance and Optimization

Review the basic concepts that drive performance in a batch U-SQL job, and examine strategies available to address those issues when they come up, along with the tools that are available to help.

8 | ADLS Access Control Model

Explore how Azure Data Lake Store uses the POSIX access control model, which can be unfamiliar to users coming from a Windows background.

9 | Azure Data Lake Outro and Resources

Learn about course resources.
