
Tableau BI gets Extensions API in version 2018.2 update

Amid positive early reviews, Tableau announced the general availability of Tableau 2018.2, an extensive upgrade that amplifies the scope of Tableau BI tools with expanded analytics functions and a more streamlined dashboard.

The update comes a few days after Tableau released the beta version of 2018.3 that further simplifies the user interface and enables users to more easily consolidate different sources of data.

The general release of 2018.2 brings a range of notable changes to Tableau BI tools, introducing customized third-party capabilities to the self-service analytics and data visualization platform.

Released in beta form in April, Tableau 2018.2 is now formally available, giving all users access to several new features, including automatic mobile layouts and Spatial Join, which integrates disparate data sources under a single common attribute.

Probably the two most significant features the release adds to Tableau BI tools are Dashboard Extensions and Tableau Services Manager.

Drag-and-drop dashboard extensions

The Extensions API essentially opens the platform to both first-party and third-party developers and users, allowing them to create and share their own dashboard extensions with different functionalities.


“It’s really exciting to see what the community is able to do and also the creativity of folks to take self-service analytics to the next level,” said Francois Ajenstat, chief product officer at Tableau.

Introducing third-party extensions to a dashboard is a drag-and-drop process, and the new Extensions Gallery enables users to browse and select extensions made by Tableau partners. For example, a user who could not design, say, a predictive analytics model on their own could simply drag and drop one in.

The Extensions API and several other recent dashboard design enhancements will be welcomed by Tableau users, said Jen Underwood, founder and principal analyst of Florida-based Impact Analytix.

It could “open up a new world of possibilities for augmented analytics, machine learning, statistics, advanced analytics, workflow and other types of apps to integrate directly within Tableau,” Underwood said.

The other standout feature of the new release of 2018.2 is Tableau Services Manager, which allows Tableau Server administration to be done completely through the browser, and generally tries to make server management simpler and faster.

New update enters beta

The beta release of Tableau 2018.3 brings its own expanded capabilities to Tableau BI tools, including dashboard navigation buttons in Tableau Desktop, transparent worksheet backgrounds and a mixed content display in Tableau Server and Online that can show all of a user’s content on the same page.

Heatmaps, a new mark or chart type for Tableau, are expected to be added to the beta in a future update, Tableau said in a blog post.

While Tableau did not say when 2018.3 would be officially released, if the company sticks to its quarterly schedule, it can be expected to leave beta this fall. Betas can see numerous tweaks, adjustments and even “fundamental changes” before the official release, depending on customer feedback or Tableau’s own observations, Ajenstat said.

Seeking to simplify

The new Tableau BI capabilities introduced in the updates are indicative of Tableau’s business mission, “to help people see and understand data,” Ajenstat said.

Future updates, he said, will likely be aimed at making “analytics easier for everyone,” and will incorporate smart capabilities with tools like AI, machine learning and natural language processing, in part due to the organization’s recent acquisition of MIT AI startup Empirical Systems.

Tableau’s announcements came as competitor Qlik, which research and advisory firm Gartner regularly ranks highly alongside Tableau and Microsoft’s Power BI, announced the acquisition of self-service BI and data visualization startup Podium Data. According to Qlik, the move will increase the company’s ability to compete with Tableau.

Computer parts incl brand new PSU

Hi All

Due to a mix of upgrading, fixes and a general clear-out, I have the following items for sale:

+ Seasonic 550FX Focus Gold PSU. This is a warranty replacement from Seasonic and is still in its original box. GBP60 (retail around GBP80)

More items to be added later today.

Postage will be by Royal Mail, or you can collect from either NW2 or SW1.


Price and currency: GBP
Delivery: Delivery cost is not included
Payment method: PPG, Revolut or cash


Data protection trends: Ransomware, M&A deals dominate news

From the constant threat of ransomware attacks to looking ahead to the European Union’s General Data Protection Regulation, backup vendors had a lot to tackle in 2017. And there was even a lot of movement among vendors themselves, with several big names making acquisitions to gain footholds in important markets.

Here we run down the year’s top data protection trends and news.

Ransomware protection gains strength

The ransomware epidemic is not slowing down. While ransomware has been out there for some time now, it made international headlines in May when the WannaCry strain simultaneously hit 300,000 machines in 150 countries. Other strains have made big news and caused problems for organizations of all sizes this year. Statistics vary, but many organizations say ransomware attacks are on the rise.

While WannaCry didn’t end up pulling in as much ransom as the attackers likely anticipated, that attack and others had organizations scrambling and making data protection a top focus. Often, backup and recovery is the only way out after ransomware hits. And that focus was evident with backup vendors as well, as data protection trends in this area included adding ransomware-specific features.

  • Acronis built a new version of its Active Protection technology — integrated into Acronis True Image backup software — that uses machine learning to help prevent ransomware viruses from corrupting data. It attempts to detect suspicious application behavior before file corruption. Active Protection is available in Acronis Backup software.
  • BackupAssist launched CryptoSafeGuard, part of its data protection software for SMBs, which works with existing antimalware software. It scans and detects suspicious activity in source files that can be related to ransomware, then sends alerts and blocks backup jobs from running.
  • Druva built ransomware monitoring and detection tools into its InSync endpoint data protection software. The software flags unusual activity occurring to data and helps identify the last good snapshot to recover the entire data set or individual files.
  • Unitrends Recovery Series physical appliances and Unitrends Backup virtual appliances use predictive analytics to determine the probability that ransomware exists in an environment. The vendor alerts customers when it detects the virus, so they can immediately restore from the last legitimate recovery point.
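The behavior-based detection these products describe can be sketched with a toy heuristic: flag a process that touches an unusually large number of files in a short time window, a pattern common to ransomware encrypting files in bulk. This is only an illustration of the general idea, not any vendor's actual algorithm; the class name and thresholds below are invented.

```python
from collections import deque

class RenameRateMonitor:
    """Toy behavior-based detector: flags a process that renames an
    unusually large number of files within a short sliding window."""

    def __init__(self, window_seconds=10, threshold=50):
        self.window = window_seconds
        self.threshold = threshold
        self.events = {}  # process name -> deque of event timestamps

    def record_rename(self, process, timestamp):
        q = self.events.setdefault(process, deque())
        q.append(timestamp)
        # Drop events that have fallen out of the sliding window.
        while q and timestamp - q[0] > self.window:
            q.popleft()
        return len(q) >= self.threshold  # True means suspicious

monitor = RenameRateMonitor(window_seconds=10, threshold=50)
# A process renaming 60 files in under a second trips the alert.
alerts = [monitor.record_rename("evil.exe", t * 0.01) for t in range(60)]
print(alerts[-1])  # True
```

Real products add many more signals (file entropy, known-bad extensions, backup-job integrity checks), but the sliding-window rate check captures the core "suspicious activity" idea the bullets above describe.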

Mergers and acquisitions aplenty

The data protection market in 2017 saw a large amount of merger and acquisition activity, particularly in the second half of the year. Cloud backup provider Carbonite was especially busy.

Here are several major moves from the past year:

  • Security and data protection vendor Barracuda is going private, following its purchase in November by equity firm Thoma Bravo for $1.6 billion.
  • Vista Equity Partners in October acquired data protection vendor Datto and will merge it with IT management provider Autotask, in a play to bring several technologies under one roof for SMBs, including backup and disaster recovery, professional services automation and networking continuity. Earlier in the year, Datto bought cloud-based networking provider Open Mesh.
  • Carbonite purchased Datacastle’s endpoint backup in August, which gives the growing cloud backup vendor better scalability and a bigger play in the SMB market. That same month, Code42 announced it is shutting down its consumer cloud backup product in 2018 to focus on other sectors and referring consumers to Carbonite. Earlier in the year, Carbonite bought Double-Take Software to improve its high-availability technology.
  • Peak 10 closed on a $1.675 billion acquisition of ViaWest in August, which will lead to a data protection suite of services between the cloud services providers that includes storage, backup and replication.
  • Axcient, which provides cloud-based disaster recovery and data protection, and EFolder, which offers cloud business continuity, cloud file sync and cloud-to-cloud backup, announced in July that they are merging.
  • Data protection vendor Arcserve in July acquired Zetta and its cloud backup and disaster recovery, following its purchase earlier in the year of FastArchiver for on-premises or public cloud emails.

The convergence and hyper-convergence of data protection

As vendors like Cohesity and Rubrik continue to lead the converged secondary storage market, backup going hyper-converged is one of the top data protection trends of 2017.

Several vendors this year launched backup for hyper-converged products, with at least one data protection product focused solely on the Nutanix Acropolis Hypervisor (AHV).

The Unitrends Recovery Series backup appliances and Unitrends Backup virtual appliances feature integration for AHV. The vendor also protects all hypervisors that run on Nutanix and supports VMware, Hyper-V and Citrix XenServer hypervisors. Veeam, Commvault and Rubrik are among the other data protection vendors that recently launched or will launch backup for AHV.

Comtrade Software in June launched HYCU, its software dedicated to AHV backup. Later in the year, the vendor updated the product with increased support for Nutanix storage and backup management features.

Commvault went to a place it didn’t originally plan on going: the hardware market. The vendor launched its first scale-out integrated hardware appliance for data protection as it attempts to compete with Rubrik and Cohesity, as well as traditional backup vendors. The HyperScale platform is part of Commvault’s product strategy to build out its data services with software-defined storage and convergence. Converged secondary storage — one of the data protection trends that continues to grow — handles such nonprimary tasks as backup, archiving, test and development, and disaster recovery.

Ready or not, here comes GDPR

Companies are scrambling to ensure compliance with the European Union’s General Data Protection Regulation, which goes into effect in May and covers data produced by EU citizens and data stored within the union. It consists of 99 articles, including a rule that gives individuals the right to force organizations to delete all personal data.

But the rule requiring companies to notify customers of a data breach within 72 hours came into sharp focus this year with the Equifax breach: the company discovered the breach in July and did not report it publicly until September. Companies not in compliance with GDPR face millions of dollars in fines.

Surveys routinely show that companies are not adequately prepared for GDPR. Some vendors, though, are trying to aid compliance. For example, Veritas’ Integrated Classification Engine uses machine learning to identify sensitive and personal data.
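As a much-simplified illustration of what such classification tools look for (real engines like Veritas' use machine learning; this sketch is only rule-based pattern matching), a scanner can flag text containing personal-data patterns that GDPR names, such as email addresses and IP addresses:

```python
import re

# Regex patterns for two kinds of personal data named by GDPR:
# email addresses and IPv4 addresses. This rule-based scan is a
# simplified sketch, not how production classification engines work.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def classify(text):
    """Return the kinds of personal data found in a piece of text."""
    return sorted(kind for kind, pat in PATTERNS.items() if pat.search(text))

print(classify("Contact jane.doe@example.com from 192.168.0.7"))
# ['email', 'ipv4']
```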

Data protection trends take on storage growth

Tape storage got a capacity bump with the release of LTO-8. The latest version, launched two years after LTO-7 hit the market, features 32 TB of compressed capacity per tape, sustained data transfer rates of up to 1,180 MBps for compressed data, uncompressed capacity of 12.8 TB and an uncompressed transfer rate of 472 MBps. Tape is seen as a safe, offline backup in the face of cyberattacks such as ransomware. Plus, the massive capacity can help with long-term retention of huge data sets that continue to grow.
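The compressed figures quoted above follow from the LTO program's nominal 2.5:1 compression ratio, which is an assumption about how compressible the data is rather than a property of the drive. The arithmetic is easy to check:

```python
# LTO-8 native (uncompressed) specifications from the article.
native_capacity_tb = 12.8
native_rate_mbps = 472

# The "compressed" figures assume LTO's nominal 2.5:1 ratio;
# actual results depend entirely on how compressible the data is.
ratio = 2.5
compressed_capacity_tb = native_capacity_tb * ratio
compressed_rate_mbps = native_rate_mbps * ratio

print(compressed_capacity_tb)  # 32.0
print(compressed_rate_mbps)    # 1180.0
```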

“No business measures data storage in terabytes anymore,” analyst Jon Toigo wrote in a November SearchDataBackup article. “… So LTO-8, with its 32 TB capacity, seems to be just what the doctor ordered for companies most likely to make big use of tape technology: cloudies and data-intensive verticals, such as healthcare, surveillance, research labs, and oil and gas. These firms are putting tape back to use in an old, secondary storage role.”

What’s old has become new again.

GDPR requirements loom for Windows Server admins

The clock is ticking to get your Windows systems ready for the General Data Protection Regulation. To assist with these compliance efforts, Microsoft offers several resources to help systems administrators.

A European Union privacy law, GDPR goes into effect in May 2018 and has wider-reaching ramifications for IT than most other regulations. For example, while the Health Insurance Portability and Accountability Act is relevant only to healthcare providers, most organizations must adhere to GDPR requirements. The regulation applies to any organization — including those based outside Europe — that processes, collects or stores data of EU citizens.

This sweeping data privacy regulation presents a compliance challenge for even the smallest companies. For example, if a U.S. company sells items from its website to an EU citizen, GDPR applies to that business. Even something minor, such as storing an EU citizen’s phone number on digital media, forces a company to either observe the rules or delete the data.

What is GDPR?

GDPR imposes stringent requirements on how businesses handle the personal data of EU citizens. GDPR will replace the EU’s Data Protection Directive, which only affected organizations with a physical presence in Europe.

The GDPR requirements state that “personal data is any information relating to an individual, whether it relates to his or her private, professional or public life. It can be anything from a name, a home address, a photo, an email address, bank details, posts on social networking websites, medical information or a computer’s IP address.”

Organizations subject to GDPR compliance rules will need to retain data processing records that show a strong effort has been made to observe the more than 100 GDPR requirements. Penalties for noncompliance go up to 20 million euros — about $24 million — or up to 4% of a company’s annual revenue, whichever is more.
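The "whichever is more" rule for the upper fine tier can be expressed as a one-line formula; the revenue figures below are made-up examples for illustration:

```python
def max_gdpr_fine(annual_revenue_eur):
    """GDPR's upper fine tier: 20 million euros or 4% of annual
    revenue, whichever is greater."""
    return max(20_000_000, 0.04 * annual_revenue_eur)

# A 100M-euro company is capped by the flat 20M-euro figure...
print(max_gdpr_fine(100_000_000))    # 20000000
# ...while a 1B-euro company hits the 4% rule instead.
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```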

How does Windows Server help with GDPR?

Although Windows Server 2016 does not have specific features related to GDPR, the OS has other functionality to protect organizations from a data breach.

For example, the Just Enough Admin and the Just in Time Admin features protect against overprivileged administrative accounts. If a business has one administrator whose main responsibility is Active Directory management, then this person usually gets full administrative privileges, even though they just need to perform one specific type of administrative task. The Just in Time Admin and Just Enough Admin features grant the permissions required for a specific task for a limited period of time. The IT department can add an additional layer of security by configuring Windows to validate the administrator’s identity through multifactor authentication before the request is granted.

Another security feature that can help with GDPR compliance initiatives is Windows Defender Credential Guard. New to Windows Server 2016, this feature uses a hypervisor to isolate authentication credentials to restrict access to privileged system software. A similar tool called Windows Defender Remote Credential Guard protects the credentials used for remote desktop sessions.

Windows Defender Device Guard is an application whitelisting tool in Windows Server 2016 that an admin uses to specify which binaries can run on the system to prevent malware attacks. If there is an attempt to execute unauthorized code, Windows Server will block it and log the activity.

Microsoft updated Windows Server’s security auditing capabilities, which is useful for GDPR compliance. The company designed Windows Server 2016 to integrate with security information and event management systems and extended the server OS to support two new types of auditing. For the first time, Windows Server can natively audit group memberships and Plug and Play (PnP) activity. PnP auditing helps admins detect the use of external storage devices.

What else does Microsoft offer?

Microsoft promotes its cloud service as a method to accelerate GDPR compliance. For companies that do not have that option, there are other Microsoft services and tools that can help.

The GDPR Benchmark is a questionnaire that asks about two dozen questions and offers a series of recommendations based on the answers. Figure 1 shows an excerpt from the site.

GDPR survey
Figure 1. Microsoft’s GDPR assessment site provides recommendations based on the answers to a series of questions.

The site asks for the company’s location, size and whether it is a Microsoft partner, and then proceeds with a number of GDPR-specific questions. The GDPR Benchmark tool is essentially a Microsoft sales utility, but it has merit in highlighting the areas an admin needs to address to meet GDPR requirements.

A Microsoft site dedicated to GDPR offers guidance through a series of documents and videos that can assist organizations through the compliance process.

GDPR Detailed Assessment
Figure 2. The GDPR Detailed Assessment package includes an Excel spreadsheet to measure a company’s level of GDPR compliance.

Figure 2 shows an Excel spreadsheet that is part of the GDPR Detailed Assessment package on the site. The spreadsheet contains more than 100 questions related to how the organization stores, maintains, secures and processes data. Complete the spreadsheet to assess the overall compliance readiness of the organization and which areas require improvement.

Managed Applications are now Generally Available in the Azure Marketplace

I am excited to announce the general availability of Managed Applications in the Azure Marketplace. Managed Applications, an offering unique to Azure, enables you to deploy entire applications and empower your partner to fully manage and maintain the application in your environment. This means a partner like Xcalar can deliver more than just a deployment on a set of VMs. Xcalar can now deliver both the application and a fully operated solution, offering “Ap/Ops.” Partners like Xcalar will also maintain and service the application solution directly in your Azure environment.

This new distribution channel for our partners will change customer expectations in the public cloud. Unlike our competitors, in Azure, a marketplace application can now be much more than just deployment and set-up. Now it can be a fully supported and managed solution. This is a first in the public cloud!

You and your partners have been the inspiration for Azure Managed Applications. We hear from many of our customers looking to transform IT operations that they need the simplicity of fully managed applications without all the infrastructure hubbub. Furthermore, our partners are seeking opportunities to offer their customers more value by adding service operations to their portfolio. With Managed Applications, you and your partners can achieve these goals. Here are a few of the details of the offering:

  • Managed Service Providers (MSPs), Independent Software Vendors (ISVs) and System Integrators (SIs) can build turnkey cloud solutions using Azure Resource Manager templates. Both the application IP and the underlying pre-configured Azure infrastructure can easily be packaged into a sealed and serviceable solution. This enables “Ap/Ops,” offering both the application and operations together in one package.
  • Customers can deploy these turnkey solutions in their own Azure subscription as a sealed service, which is fully operated and maintained by the partner across the solution lifecycle. In addition, only the minimal level of access is granted, specifically for the sealed solution and its lifecycle operations.
  • The result is a higher quality of service for our customers, fully managed by our partners. Thanks to the sealed and immutable nature of Managed Applications, nothing changes in the application or the infrastructure configuration, unless it is an explicit lifecycle operation by the trusted partner.

Whether these solutions are complex applications that are custom-built and maintained by MSPs or packaged applications delivered and serviced by ISVs, you can focus on what you need to do to accelerate your business transformation without having to worry so much about running someone else’s software. Managed Applications accelerate innovation, even in the most advanced application scenarios, bringing the best of software-as-a-service and infrastructure-as-a-service together.

At Ignite we enabled our partners to deploy and service Managed Applications inside customer owned enterprise service catalogs. With today’s addition of Azure Marketplace as a distribution channel, partners can add value to their marketplace offering by adding lifecycle and support services with an incremental flat monthly fee. Our launch partners are excited and ready to go…

“Azure Managed Applications enables OpsLogix customers to easily deploy and use our solutions without having to undertake on-going maintenance and servicing complexities.”
Vincent de Vries, CEO, OpsLogix

“Azure Managed Applications allows Xcalar Data Platform to provide a higher quality of service and help forge stronger partnerships with our customers. Enterprise applications, Azure cloud infrastructure, and production operations can now be packaged into a single solution.”
Vikram Joshi, Co-founder and CEO, Xcalar

“We’re delighted to be bringing the power and simplicity of Cisco Meraki cloud networking to customers using Microsoft Azure. Our two companies both share a passion for better IT, and by leveraging our powerful Auto VPN and SD-WAN features we can seamlessly connect users to the resources they depend on every day.”
Raviv Levi, Senior Product Line Manager, Cisco Meraki

Go ahead, give it a try. We would love to hear from you about your experience with Azure Managed Applications and our partner solutions. Leave us feedback at the user voice channel or the comments section below.

More innovation for our partners is on the horizon, so stay tuned! And, do not forget to check Gaurav’s blog for a closer look under the hood on how this works.

See ya around,

Corey Sanders

SQL Server 2017 and Azure Data Services – The ultimate hybrid data platform

This post was authored by Rohan Kumar, General Manager Database Systems Engineering.

Today at PASS Summit 2017, we are showcasing new advances across SQL Server 2017 and our Azure data services. And we’re demonstrating how these products – both on-premises and in the cloud – come together to form the ultimate hybrid data platform. Our recent announcements, here at PASS Summit and at Ignite in September, are great examples of how we’re empowering data professionals – like our community here at PASS Summit 2017 – to do more and achieve more.

With the recent launch of SQL Server 2017, the first database with AI built in, you can now run your production workloads on Linux, Windows and Docker. SQL Server 2017 delivers a mission-critical database with everything built in, on the platform of your choice. And it can unlock seamless DevOps with Docker Enterprise Edition containers, bringing efficiency and simplicity to your innovation. New features enable analysis of graph data and advanced analytics using Python and R. We have also incorporated your feedback to add features that make your day-to-day job easier, such as Adaptive Query Processing and Automatic Plan Correction, which find and fix performance regressions automatically.

In addition, SQL Server 2017 running on Windows and Linux can take advantage of new leaps forward in hardware. As demonstrated today by Bob Ward, SQL Server 2017 on SUSE Enterprise Linux Server on an HPE ProLiant DL380 Gen 10 Server with scalable persistent memory ran queries more than 5 times faster than a fast SSD drive array at 50% of the cost – making it the world’s first enterprise-grade diskless database server.

These new cross-platform capabilities have made SQL Server accessible to users of Windows, Linux and Mac. At PASS Summit, we are excited to provide a sneak peek at Microsoft SQL Operations Studio. In a few weeks, users will be able to download and evaluate this free, lightweight tool for modern database development and operations on Windows, Mac or Linux machines for SQL Server, Azure SQL Database, and Azure SQL Data Warehouse. Increase your productivity with smart T-SQL code snippets and customizable dashboards to monitor and quickly detect performance bottlenecks in your SQL databases on-premises or in Azure. You’ll be able to leverage your favorite command-line tools like Bash, PowerShell, sqlcmd, bcp and ssh in the integrated terminal window. Users can contribute directly to SQL Operations Studio via pull requests from the GitHub repository.

For customers who are ready to modernize to the cloud, Azure SQL Database Managed Instance and Azure Database Migration Service, both now in private preview, make it easy to lift and shift your on-premises SQL Server workloads with few or no changes. The upcoming Azure Hybrid Benefit for SQL Server enables customers to use on-premises SQL Server licenses for the easiest lift and shift of SQL Server workloads to the fully managed cloud.

Azure SQL Database is ready for your most mission-critical workloads. Today, we demonstrated the high scale and performance of SQL Database, with the ability to insert 1.4 million rows per second. In addition, we are making it easier than ever to gain insights from data at this scale. We recently made generally available the ability to run advanced analytics models from Azure Machine Learning at high speed from T-SQL, with new native T-SQL scoring. And in today’s demos, we show how you can use this new capability to score large amounts of data in real time, at an average of under 20 ms per row.

Starting in early 2016, we have been delivering machine learning-based customer value directly in the Azure SQL Database managed service. More recently, we’ve delivered several intelligent capabilities, including automatic tuning, performance monitoring and tuning, Adaptive Query Processing and Threat Detection. These capabilities significantly reduce the time required to manage anywhere from one to thousands of databases and help proactively defend against potential threats. And the preview Vulnerability Assessment feature helps you more easily understand your security and compliance posture under new initiatives like GDPR.

In addition to sharing the same “everything built-in” SKU model with SQL Server for lower total cost of ownership versus competitors, Azure SQL Database adds value to your database with these built-in administration features. Now it’s easier than ever to move to Azure SQL Database with our new partnership with Attunity. Customers can now take advantage of Attunity Replicate for Microsoft Migrations, a free offer that accelerates migrations from various database systems, including Oracle, Amazon Redshift, and PostgreSQL to the Microsoft Data Platform.

To simplify analytics in the cloud, we’re releasing a public preview of new hybrid data integration capabilities in Azure Data Factory including the ability to run SSIS packages within the service. This means you can run your SSIS data integration workloads in Azure, without changing the packages – for true hybrid data integration across on-premises and cloud. And our SSIS partner technologies like Biml can now work to automate and enhance data integration across on-premises and cloud.

Dramatic scale investments are now in public preview for Azure SQL Data Warehouse. With the new Compute-Optimized Tier, you can get up to 100x the performance and 5x the scale. This new tier builds on the existing optimized for elasticity tier – giving customers the benefit of a fully-managed platform that suits the demands of their workload.

Visualizing data helps users across the organization take informed action. Microsoft delivers Business Intelligence capabilities to help customers model, analyze, and deliver business insights, which can be consumed by business users on mobile devices, on the web or embedded in applications.

Analysis Services helps you transform complex data into semantic models making it easy for business users to understand and analyze large amounts of data across different data sources. The same enterprise-grade SQL Server Analysis Services you’ve used on-premises is also available as a fully managed service in Azure. With today’s announcement of the Scale Out feature for Azure Analysis Services, you can easily set the number of replicas for your Azure Analysis Services instances to support large numbers of concurrent users with blazing fast query performance.

Power BI is a market leading SaaS service that is easy to get started and simple to use, with data connections to many on-premises and cloud data sources. It allows users across roles and industries to go from data to insights in minutes. A recent key addition to the Power BI portfolio is Power BI Report Server. Power BI Report Server enables self-service BI and enterprise reporting, all in one on-premises solution by allowing you to manage your SQL Server Reporting Services (SSRS) reports alongside your Power BI reports. Today we announce the availability of a new version of Power BI Report Server that will enable keeping Power BI reports on-premises that connect to any data! Read more on the Power BI blog.

Microsoft’s guiding principle has been to build the highest performing, most secure, and consistent data platform for all your applications across on-premises and cloud. By joining us on this journey, you can build upon your investments in SQL Server to expand the scope of your role in your organization, from database systems to advanced analytics and artificial intelligence. We look forward to working with you!

If you aren’t with us at PASS Summit 2017 this week, you can still see the announcements and demonstrations by purchasing session recordings to stream or download and watch at home. Sign up at the PASS Summit Website.

If you’d like to learn more about the latest enhancements in the Microsoft Data Platform, visit our data platform webpage, or data platform overview on Azure.

Announcing general availability of Azure Managed Applications Service Catalog

Today we are pleased to announce the general availability of Azure Managed Applications Service Catalog.

Service Catalog allows corporate central IT teams to create and manage a catalog of solutions and applications to be used by employees in that organization. It enables organizations to centrally manage approved solutions and ensure compliance. It enables the end customers, in this case the employees of an organization, to easily discover the list of approved solutions. They can consume these solutions without having to worry about learning how a solution works in order to service, upgrade or manage it. All of this is taken care of by the central IT team that publishes and owns the solution.

In this post, we will walk through the new capabilities that have been added to Managed Applications and how they improve the overall experience.


We have made improvements to the overall experience and made authoring much easier and more straightforward. Some of the major improvements are described below.

Package construction simplified

In the preview version, the publisher had to author three files and package them in a .zip. One of these was a template file containing only the Microsoft.Solutions/appliances resource, in which the publisher had to specify again all of the parameters needed to deploy the actual resources, even though those parameters were already declared in the main template file. Although required, this duplication made for redundant and often confusing work for publishers. Going forward, this file is auto-generated by the service.

So, the package (.zip) now requires only two files: (i) mainTemplate.json, the template file containing the resources that need to be provisioned, and (ii) createUIDefinition.json, which defines the portal experience for creating the application.

If your solution uses nested templates, scripts or extensions, those don’t need to change.

Portal support enabled

At preview, we had only CLI support for creating a managed application definition for the Service Catalog. We have now added portal and PowerShell support, so the central IT team of an organization can use the portal to quickly author a managed application definition and share it across the organization, without needing to learn the various CLI commands.

These can be discovered in the portal by clicking “More Services” and then searching for “Managed”. Don’t use the entries marked “Preview”.


To create a managed application definition, select “Service Catalog managed application definitions” and click the “Add” button. This opens the blade shown below.


Support for providing template files inline instead of packaging as .zip

Creating a .zip file, uploading it to a blob, making it publicly accessible, getting the URL, and then creating the managed application definition still involved a lot of steps. So we have enabled another option: you can specify these files inline using new parameters added to the CLI and PowerShell. Support for inline template files will be added to the portal shortly.
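With both options available, creating a definition from the Azure CLI looks roughly like the sketch below. This is not a runnable script: it assumes the latest Azure CLI and an authenticated session, and the angle-bracket values (principal ID, role definition ID, storage account) and resource names are placeholders you would fill in.

```shell
# Option 1: reference a packaged .zip hosted in publicly accessible blob storage
az managedapp definition create \
  --name SampleAppDefinition \
  --resource-group AppDefinitionsRG \
  --location westcentralus \
  --display-name "Sample app" \
  --description "Sample Service Catalog definition" \
  --lock-level ReadOnly \
  --authorizations "<principalId>:<roleDefinitionId>" \
  --package-file-uri "https://<storage-account>.blob.core.windows.net/packages/app.zip"

# Option 2: pass the two template files inline; no .zip or blob required
az managedapp definition create \
  --name SampleAppDefinition \
  --resource-group AppDefinitionsRG \
  --location westcentralus \
  --display-name "Sample app" \
  --description "Sample Service Catalog definition" \
  --lock-level ReadOnly \
  --authorizations "<principalId>:<roleDefinitionId>" \
  --main-template @mainTemplate.json \
  --create-ui-definition @createUIDefinition.json
```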

Service Changes

Please note that the following major changes have been made to the service.

New api-version

The general availability release introduces a new api-version, 2017-09-01, which enables all of the improvements mentioned above. The Azure portal uses this new api-version, and the latest versions of the Azure CLI and Azure PowerShell leverage it as well; you will need to switch to these latest versions to develop and manage Managed Applications. Note that creating and managing Managed Applications will not be supported using the existing version of the CLI after 9/25/2017. Existing resources created using the old api-version (old CLI) will continue to work.

Resource type names have changed

The resource type names have changed in the new api-version: Microsoft.Solutions/appliances is now Microsoft.Solutions/applications, and Microsoft.Solutions/applianceDefinitions is now Microsoft.Solutions/applicationDefinitions.
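For reference, a resource of the new type declared against the new api-version would look something like the minimal sketch below; the name, location, and angle-bracket IDs are illustrative placeholders, not values from this release.

```json
{
  "type": "Microsoft.Solutions/applications",
  "apiVersion": "2017-09-01",
  "name": "sampleManagedApp",
  "location": "westcentralus",
  "kind": "ServiceCatalog",
  "properties": {
    "managedResourceGroupId": "<managed-resource-group-id>",
    "applicationDefinitionId": "<application-definition-id>"
  }
}
```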

Upgrade to the latest CLI and PowerShell

As mentioned above, to continue using and creating Managed Applications, you will have to use the latest versions of the CLI and PowerShell, or you can use the Azure portal. Existing versions of these clients, built on the older api-version, will no longer be supported. Your existing resources will be migrated to the new resource types and will continue to work with the new versions of the clients.

Supported locations

Currently, the supported locations are West Central US and West US 2.

Please try out the new version of the service and let us know your feedback through our UserVoice channel or in the comments below.

Additional resources

A new vision for intelligent communications in Office 365 – Office Blogs

Today’s post was written by Lori Wright, general manager for Microsoft Teams and Skype product marketing.

Today at Microsoft Ignite in Orlando, Florida, we introduced a new vision for intelligent communications, transforming calling and meeting experiences for people and organizations around the world. Intelligent communications go beyond traditional unified communications, enabling you to complete tasks more efficiently with minimal context switching, participate in more productive meetings that cover the entire meeting lifecycle, and better manage your everyday communications overload.

Microsoft Teams is core to our vision for intelligent communications—bringing together conversations, meetings, files, Office apps, and third-party integrations—to provide a single hub for teamwork in Office 365. Teams is now used by more than 125,000 organizations across the world, just six months after its launch. Its strong momentum has proven that teamwork is essential to the way work gets done today.

To achieve our vision for intelligent communications, we are bringing comprehensive calling and meetings capabilities into Teams, along with data and insights from the Microsoft Graph, and a strong roadmap of innovation to empower teams to achieve more.

All of this is being built on a new, modern Skype infrastructure for enterprise-grade voice and video communications. Our next-generation, cloud-born architecture already powers communication experiences in Teams and is evolving rapidly. We are excited about this new infrastructure because it will deliver both speed of innovation and higher-quality communication experiences.

As we build out these capabilities, Teams will evolve as the primary client for intelligent communications in Office 365, replacing the current Skype for Business client over time.

The future of business meetings

Combining communications, collaboration, and intelligence in this way will make new things possible across the lifecycle of a call or meeting:

  • Before a meeting, Teams will surface relevant documents and rich information about the participants to help you prepare.
  • During the meeting, the conversation can be captured, transcribed, and time-coded, with closed captioning and voice recognition for attributing remarks to specific individuals.
  • After the meeting, the cloud recording and transcript can be automatically added to the relevant channel, so conversations, documents, notes, and action items can be reviewed, indexed, and searched by the entire team.

Image of a Teams meeting with four participants.

Introducing calling features and meeting enhancements in Teams

Over the past six months, we’ve continued to enhance the communications capabilities in Teams, with new features like scheduled meetings, Outlook calendar integration, and meetings on mobile. Also, earlier this month, we began rolling out guest access—so you can use Teams to collaborate with people outside your company. In the coming months, we will begin adding calling features in Teams—including inbound and outbound calls to PSTN numbers, hold, call transfer, and voicemail.

We are also introducing new enhancements to Teams meetings, including audio conferencing (available in preview today)—enabling participants to join a Teams meeting by dialing a telephone number—and interoperability between Teams and Skype for Business, including universal presence, and messaging and calling interoperability.

This is just the beginning of a big wave of feature releases that will bring the core set of meetings and phone system capabilities into Teams.

We remain committed to bringing the familiar Skype experience into any and every meeting room. We have seen strong customer momentum with Skype Rooms Systems. Today, Lenovo announced they will bring to market a new Skype Room Systems device, Smart Hub 500, expanding on the current portfolio of Skype Room Systems with Logitech, Crestron, and Polycom. In addition, Polycom, Pexip, and Blue Jeans Networks will deliver cloud video interop capabilities within Teams. This adds to the existing video interop capabilities for Skype for Business delivered by Polycom’s RealConnect for Office 365 and Pexip’s Infinity Fusion product.

What’s next

Office 365 customers can take advantage of the capabilities in Microsoft Teams starting today. We are committed to providing visibility into the Teams product roadmap, so our customers can assess when Teams is right for them. We intend to make an updated roadmap for Teams available in October.

We plan to continue to offer and support Skype for Business in Office 365 and Skype for Business Server on-premises. For customers who are not yet ready to move their PBX and advanced calling capabilities to the cloud, we will release a new version of Skype for Business Server targeted for the second half of calendar year 2018.

Microsoft Teams and Skype for Business clients can be run side by side to evaluate and explore what’s best for your organization.

We encourage every Office 365 customer to begin using Teams today. Office 365 customers currently using Skype for Business can find guidance and resources on the intelligent communications page in the FastTrack portal.

—Lori Wright