
Announcing new AI and mixed reality business applications for Microsoft Dynamics – The Official Microsoft Blog

Today, I had the opportunity to speak to press and analysts in San Francisco about our vision for business applications at Microsoft. In addition, I had the privilege to make two very important announcements: the upcoming availability of new Dynamics 365 AI applications, and our very first mixed reality business applications: Dynamics 365 Remote Assist and Dynamics 365 Layout.

Our vision for business applications at Microsoft

We live in a connected world where companies are challenged every day to innovate so they can stay ahead of emerging trends and repivot business models to take advantage of new opportunities to meet growing customer demands.

To innovate, organizations need to reimagine their processes. They need solutions that are modern, enabling new experiences for how they can engage their customers while making their people more productive. They need unified systems that break data silos, so they have a holistic view of their business, customers and employees. They need pervasive intelligence threaded throughout the platform, giving them the ability to reason over data, to predict trends and drive proactive intelligent action. And with adaptable applications, they can be nimble, allowing them to take advantage of the next opportunity that comes their way.

Two years ago, when we introduced Dynamics 365, we started a journey to tear down the traditional silos of customer relationship management (CRM) and enterprise resource planning (ERP). We set out to reimagine business applications as modern, unified, intelligent and adaptable solutions that are integrated with Office 365 and natively built on Microsoft Azure.

With the release of our new AI and mixed reality applications we are taking another step forward on our journey to help empower every organization on the planet to achieve more through the accelerant of business applications. Specifically, today we are making the following announcements:

Dynamics 365 + AI

First, I am happy to announce the coming availability of a new Dynamics 365 AI offering — a new class of AI applications that will deliver out-of-the-box insights by unifying data and infusing it with advanced intelligence to guide decisions and empower organizations to take informed actions. And because these insights are easily extensible through the power of Microsoft Power BI, Azure and the Common Data Service, organizations will be able to address even the most complex scenarios specific to their business.

Dynamics 365 AI for Sales: AI can help salespeople prioritize their time to focus on deals that matter most, provide answers to the most common questions regarding the performance of sales teams, offer a detailed analysis of the sales pipeline, and surface insights that enable smarter coaching of sales teams.

Dynamics 365 AI for Customer Service: With Microsoft’s AI and natural language understanding, customer service data can surface automated insights that help guide employees to take action and can even leverage virtual agents to help lower support costs and enable delightful customer experiences, all without needing in-house AI experts and without writing any code.

Dynamics 365 AI for Market Insights: Helps empower your marketing, social media and market research teams to make better decisions with market insights. Marketers can improve customer relationships with actionable web and social insights to engage in relevant conversations and respond faster to trends.

To help bring this to life, today we released a video with our CEO, Satya Nadella, and Navrina Singh, a member of our Dynamics 365 engineering team, showing examples of ways we’re bringing the power of AI to customer service organizations.

Dynamics 365 + Mixed Reality

Our second announcement of the day centers on the work we are doing to bring mixed reality and business applications together.

Since the release of Microsoft HoloLens over two years ago, the team has learned a lot from customers and partners. The response to HoloLens in the commercial space has been overwhelmingly positive, supported by increasing demand and deployments from some of the world’s most innovative companies.

We recognize that many employees need information in context to apply their knowledge and craft: not only on a 2-D screen, but information and data in context, at the right place and at the right time, so employees can produce even greater impact for their organizations. Mixed reality is a technology uniquely suited to do exactly that.

This is a whole new kind of business application. And that’s precisely what we’re introducing today: Dynamics 365 Remote Assist and Dynamics 365 Layout.

Today, we also showcased for the first time how Chevron is deploying HoloLens to take advantage of Dynamics 365 mixed reality business applications.

Chevron is already achieving real, measurable results with its global HoloLens deployment. Previously, it had to fly an inspector from Houston to a facility in Singapore once a month to inspect equipment. Now it can perform just-in-time inspections using Dynamics 365 Remote Assist and can identify issues or provide approvals immediately.

In addition, remote collaboration and assistance have helped the company operate more safely and improve its work environment, serving as a connection point between firstline workers and remote experts while cutting down on travel and the risks that come with it.

Here is a peek into the work Chevron is doing with mixed reality:

Unlock what’s next with the Dynamics 365 October 2018 release

Next week at Microsoft Ignite and Microsoft Envision we’ll be in Orlando talking with thousands of customers, partners, developers, and IT and business leaders about our October 2018 release for Dynamics 365 and the Power platform that will be generally available Oct. 1. The wave of innovation this represents across the entire product family is significant, with hundreds of new capabilities and features.

We will have a lot more to talk about in the weeks and months ahead. We look forward to sharing more!


Announcing Azure Pipelines with unlimited CI/CD minutes for open source

With the introduction of Azure DevOps today, we’re offering developers a new CI/CD service called Azure Pipelines that enables you to continuously build, test, and deploy to any platform or cloud. It has cloud-hosted agents for Linux, macOS, and Windows, powerful workflows with native container support, and flexible deployments to Kubernetes, VMs, and serverless environments.

Microsoft is committed to fueling open source software development. Our next step in this journey is to provide the best CI/CD experience for open source projects. Starting today, Azure Pipelines provides unlimited CI/CD minutes and 10 parallel jobs to every open source project for free. All open source projects run on the same infrastructure that our paying customers use. That means you’ll have the same fast performance and high quality of service. Many of the top open source projects are already using Azure Pipelines for CI/CD, such as Atom, CPython, Pipenv, Tox, Visual Studio Code, and TypeScript – and the list is growing every day.

In the following, you can see Atom running parallel jobs on Linux, macOS, and Windows for its CI.

[Screenshot: Atom running parallel CI jobs on Linux, macOS, and Windows]
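For readers curious how a multi-platform pipeline like Atom’s is wired up, here is a minimal sketch of an azure-pipelines.yml that uses a matrix strategy to fan the same steps out across hosted Linux, macOS, and Windows agents. The npm steps and image names are illustrative assumptions, not Atom’s actual configuration:

```yaml
# Minimal multi-platform CI sketch (illustrative; not Atom's real pipeline).
trigger:
- master

strategy:
  matrix:
    linux:
      imageName: 'ubuntu-16.04'
    mac:
      imageName: 'macos-10.13'
    windows:
      imageName: 'vs2017-win2016'

pool:
  vmImage: $(imageName)

steps:
- script: npm install
  displayName: 'Install dependencies'
- script: npm test
  displayName: 'Run tests'
```

Each matrix entry becomes one of the parallel jobs described above, so all three operating systems build and test concurrently.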

Azure Pipelines app on GitHub Marketplace

Azure Pipelines has an app in the GitHub Marketplace so it’s easy to get started. After you install the app in your GitHub account, you can start running CI/CD for all your repositories.

[Screenshot: the Azure Pipelines app on the GitHub Marketplace]

Pull Request and CI Checks

When the GitHub app is set up, you’ll see CI/CD checks on each commit to your default branch and every pull request.

[Screenshot: CI/CD checks on a pull request]

Our integration with the GitHub Checks API makes it easy to see build results in your pull request. If there’s a failure, the call stack is shown as well as the impacted files.

[Screenshot: build results shown through the GitHub Checks API]

More than just open source

Azure Pipelines is also great for private repositories. It is the CI/CD solution for companies like Columbia, Shell, Accenture, and many others, and it’s used by Microsoft’s biggest projects, such as Azure, Office 365, and Bing. Our free offer for private projects includes a cloud-hosted job with 1,800 minutes of CI/CD a month, or you can run unlimited minutes of CI/CD on your own hardware, whether hosted in the cloud or on premises. You can purchase parallel jobs for private projects from Azure DevOps or the GitHub Marketplace.

In addition to CI, Azure Pipelines offers flexible deployments to any platform and cloud, including Azure, Amazon Web Services, and Google Cloud Platform, as well as any of your on-premises servers running Linux, macOS, or Windows. There are built-in tasks for Kubernetes, serverless, and VM deployments, and there’s a rich ecosystem of extensions for the most popular languages and tools. The Azure Pipelines agent and tasks are open source, and we’re always reviewing feedback and accepting pull requests on GitHub.

Join our upcoming live streams to learn more about Azure Pipelines and other Azure DevOps services.

  • Keynote: Watch our live Azure DevOps keynote on September 11, 2018 from 8:00 – 9:30 AM Pacific Time.

  • Live training: Join our live Mixer workshop with interactive Q&A on September 17, 2018 from 8:30 AM – 2:30 PM Pacific Time.

You can save the date and watch both live streams on our events page. There you’ll also find additional on-demand videos and other resources to help get you started.

I’m excited for you to try Azure Pipelines and tell us what you think. You can share your thoughts directly with the product team using @AzureDevOps, Developer Community, or the comments on this post.

Jeremy Epling

@jeremy_epling

Introducing Azure DevOps

Today we are announcing Azure DevOps. Working with our customers and developers around the world, it’s clear DevOps has become increasingly critical to a team’s success. Azure DevOps captures over 15 years of investment and learnings in providing tools to support software development teams. In the last month, over 80,000 internal Microsoft users and thousands of our customers, in teams both small and large, used these services to ship products to you.

The services we are announcing today span the breadth of the development lifecycle to help developers ship software faster and with higher quality. They represent the most complete offering in the public cloud. Azure DevOps includes:

Azure Pipelines

CI/CD that works with any language, platform, and cloud. Connect to GitHub or any Git repository and deploy continuously. Learn more >

Azure Boards

Powerful work tracking with Kanban boards, backlogs, team dashboards, and custom reporting. Learn more >

Azure Artifacts

Maven, npm, and NuGet package feeds from public and private sources. Learn more >

Azure Repos

Unlimited cloud-hosted private Git repos for your project. Collaborative pull requests, advanced file management, and more. Learn more >

Azure Test Plans

All-in-one planned and exploratory testing solution. Learn more >

Each Azure DevOps service is open and extensible. They work great for any type of application regardless of the framework, platform, or cloud. You can use them together for a full DevOps solution or with other services. If you want to use Azure Pipelines to build and test a Node service from a repo in GitHub and deploy it to a container in AWS, go for it. Azure DevOps supports both public and private cloud configurations. Run them in our cloud or in your own data center. No need to purchase different licenses. Learn more about Azure DevOps pricing.

Here’s an example of Azure Pipelines used independently to build a GitHub repo:

[Screenshot: Azure Pipelines building a GitHub repo]
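To make the scenario above concrete (building and testing a Node service from GitHub, then packaging it as a container for a cloud such as AWS), a minimal azure-pipelines.yml might look like the following sketch. The registry and image names are hypothetical, and registry authentication is omitted for brevity:

```yaml
# Illustrative pipeline for a Node service (registry/image names hypothetical).
trigger:
- master

pool:
  vmImage: 'ubuntu-16.04'

steps:
- script: |
    npm install
    npm test
  displayName: 'Build and test'

- script: |
    docker build -t myregistry/node-service:$(Build.BuildId) .
    docker push myregistry/node-service:$(Build.BuildId)
  displayName: 'Package and publish container image'
```

From there, a deployment step or release pipeline can pull the tagged image into the target environment.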

Alternatively, here’s an example of a developer using all Azure DevOps services together from the vantage point of Azure Boards.

[Screenshot: all Azure DevOps services viewed from Azure Boards]

Open Source projects receive free CI/CD with Azure Pipelines

As an extension of our commitment to provide open and flexible tools for all developers, Azure Pipelines offers free CI/CD with unlimited minutes and 10 parallel jobs for every open source project. With cloud-hosted Linux, macOS, and Windows pools, Azure Pipelines is great for all types of projects.

Many of the top open source projects are already using Azure Pipelines for CI/CD, such as Atom, CPython, Pipenv, Tox, Visual Studio Code, and TypeScript – and the list is growing every day.

We want everyone to have extremely high quality of service. Accordingly, we run open source projects on the same infrastructure that our paying customers use.

Azure Pipelines is also now available in the GitHub Marketplace, making it easy to get set up for your GitHub repos, open source or otherwise.

Here’s a walkthrough of Azure Pipelines:

Learn more >

The evolution of Visual Studio Team Services (VSTS) 

Azure DevOps represents the evolution of Visual Studio Team Services (VSTS). VSTS users will be upgraded into Azure DevOps projects automatically. For existing users, there is no loss of functionality, simply more choice and control. The end-to-end traceability and integration that has been the hallmark of VSTS is all there. Azure DevOps services work great together. Today is the start of a transformation, and over the next few months existing users will begin to see changes show up. What does this mean?

  • URLs will change from abc.visualstudio.com to dev.azure.com/abc. We will support redirects from visualstudio.com URLs so there will not be broken links.
  • As part of this change, the services have an updated user experience. We continue to iterate on the experience based on feedback from the preview. Today we’re enabling it by default for new customers. In the coming months we will enable it by default for existing users.
  • Users of the on-premises Team Foundation Server (TFS) will continue to receive updates based on features live in Azure DevOps. Starting with the next version of TFS, the product will be called Azure DevOps Server and will continue to be enhanced through our normal cadence of updates.

Learn more

To learn more about Azure DevOps, please join us:

  • Keynote: Watch our live Azure DevOps keynote on September 11, 2018 from 8:00 – 9:30 AM Pacific Time.

  • Live training: Join our live Mixer workshop with interactive Q&A on September 17, 2018 from 8:30 AM – 2:30 PM Pacific Time.

You can save the date and watch both live streams on our events page. There you’ll also find additional on-demand videos and other resources to help get you started.

We couldn’t be more excited to offer Azure DevOps to you and your teams. We can’t wait to see what amazing things you create with it.

PlayerUnknown’s Battlegrounds Full Product Release Now Available on Xbox One – Xbox Wire

Today, the Full Product Release (1.0) update for PlayerUnknown’s Battlegrounds (PUBG) released for new and existing owners across the Xbox One family of devices. This is a big moment for the PUBG Xbox community, now over nine million players strong, who have been an essential part of the development process since we first launched in Xbox Game Preview in December 2017. With the support of fans and the team at Microsoft, it’s been an incredible journey and we’re just getting started.

The Full Product Release comes with several exciting updates, including the Xbox One debut of the Sanhok Map, available today, along with Event Pass: Sanhok, which unlocks awesome rewards for leveling up and completing missions. The Sanhok Map is included with the Full Product Release 1.0 update, and Event Pass: Sanhok can be purchased in the Microsoft Store or the PUBG in-game Store beginning today. For additional details on all of the new features included in the Full Product Release update today and in the weeks ahead, click here.

While Full Product Release represents an exciting milestone for PUBG on Xbox One, it does not represent the end of the journey. The game will continue to be updated and optimized, and we have an exciting roadmap of new features and content ahead in the months to come, including the winter release of an all-new snow map.

The Full Product Release of PUBG for Xbox One is available for $29.99 USD digitally and as a retail disc version at participating retailers worldwide. If you already own the Xbox Game Preview version of PUBG on Xbox One you will receive a content update automatically today at no additional cost.

As shared previously, we’re also providing some special bonuses both to new players and those who have supported PUBG over the past nine months.

To enhance the ultimate PUBG experience on Xbox, fans can also look forward to the PlayerUnknown’s Battlegrounds Limited Edition Xbox Wireless Controller, which is now available for pre-order online at the Microsoft Store and starts shipping to retailers worldwide on October 30 for $69.99 USD.

Be sure to tune in to Mixer’s very own HypeZone PUBG Channel to catch the most exciting, down-to-the-wire PUBG action, which gives viewers the opportunity to discover streamers of all levels during the most intense moments of the game.

Whether you’re already a player or your chicken dinner hunt starts today – now is the best time to jump into PUBG on Xbox One!

VSAN hyper-converged users offer buying, implementing advice

LAS VEGAS — Today, VMware paints vSAN hyper-converged technology as a key piece of IT everywhere, from the data center to the cloud to the edge. But early vSAN customers remember when it was still a nascent concept and not fully proven.

At a customer panel at VMworld 2018, vSAN hyper-converged software users offered advice for buying and implementing what, in some cases, was still a suspect technology when they adopted it. The customers were split between running vSAN on integrated appliances, such as Dell EMC VxRail hardware, and buying it on servers as vSAN Ready Nodes. Either way, they faced similar buying decisions and implementation challenges.

Here is some of the advice offered for going down the road of vSAN hyper-converged and hyper-converged infrastructure (HCI) in general.

Start small and prove its value

Several of the vSAN hyper-converged customers said it was difficult to gain support originally for moving from a traditional three-tier architecture to HCI. It helped to start with a specific use case to prove the technology and then grow from there.

William Dufrin, IT manager of client virtualization engineering and architecture at General Motors, said the early case was virtual desktop infrastructure (VDI).

“In our environment, change is kind of rough,” Dufrin said. “We’re a large organization, and it could be difficult to make changes like vSAN instead of traditional storage.”

He said IT developers started using vSAN for VDI in 2014, “and in four years, we’ve seen a huge adoption rate inside the organization because of the values and the savings. It’s been stable, and performance has been phenomenal.”

Dufrin said General Motors now has around 10,000 virtual desktops running on a six-node cluster, with two fault domains for availability.

Mark Fournier, systems architect for the United States Senate Federal Credit Union in Alexandria, Va., said his credit union started with vSAN Ready Nodes in remote branches. The HCI implementation came around the time USSFCU began virtualizing in 2014.

“Going to vSAN was a challenge against some of the traditional technology we had,” Fournier said. “Even though we were virtualizing, we were still siloing off storage, compute and networking. To get into what seems to be the future, we upgraded our branches using vSAN Remote Office Branch Office licensing. That allowed us to implement hyper-converged architecture in our branches for a lot less money than we expected.”

Fournier said the credit union put Ready Nodes on all-flash blade servers in three branches. He said a four-node all-flash implementation in one branch is so fast now that some of his organization’s developers want to move workloads to the branch.

“With the new PowerEdge MX7000 from Dell, options for onboard storage are more flexible, and [it] allows us to bring vSAN out of the branches and into the data center now that management sees the benefit we get out of it,” Fournier said.

Think platform and relationships, and consider all options

The panelists said they did a lot of research before switching to HCI and picking a vendor. They evaluated products from leading HCI vendors, compared different offerings from the same vendor, and weighed HCI against traditional IT before making buying decisions.

Mariusz Nowak, director of infrastructure services at Oakland University in Rochester, Mich., said cost played a large role — as is often the case with educational institutions.

“I was sick and tired of replacing entirely every traditional storage array every few years and begging for new money, hundreds of thousands of dollars,” he said. “My boss, and everyone else, wasn’t happy to have to spend tons of money.”

Oakland University has been a VMware customer since 2005, and Nowak said he looked at early versions of vSAN hyper-converged software but felt it wasn’t ready for the university. After VMware added more enterprise features, such as stretched clusters, deduplication and encryption, Oakland installed the HCI software in 2017. It now has 12 vSAN hosts, with 400 guest virtual machines and 350 TB of storage on vSAN Ready Nodes running on Dell EMC PowerEdge servers.


“I choose Ready Nodes so I don’t have extra overhead,” Nowak said. “With VxRail, you have to pay more. With Ready Nodes, I can modify my hardware whenever I need, whether I need more capacity or more CPUs. Some HCI vendors will say, ‘This is the cookie-cutter node that you have to buy.’ We have more flexibility.”

Alex Rodriguez, VDI engineer at Rent-A-Center, based in Plano, Texas, said his company did a proof of concept (POC) with Dell EMC VxRail, Nutanix and SimpliVity — since acquired by Hewlett Packard Enterprise — when evaluating HCI in 2016. He said price and vendor relationships also figured in the final decision.

“When we did a POC, Nutanix won out,” he said. “But we saw a cost benefit with VxRail, and we decided to go in that direction because of our relationship with VMware. And each generation of this [vSAN] software has gotten a whole lot better. Performance is better and manageability is easy. You may find an application that’s better for one stack or another, but overall we think VxRail is a better platform.”

Divide and cluster

Several of the panelists suggested using clusters or stretched clusters with vSAN hyper-converged infrastructure to help separate workloads and provide availability.

Nowak said Oakland University installed 10 nodes in a stretched cluster across two campus data centers, with 10 Gigabit Ethernet uplinks to a witness site connecting them.

“For little cost, I have an active-active data center solution,” he said. “If we lost one data center, I could run almost my entire workload on another site, with no disruption. I can technically lose one site and shift my workload to another site.”

Rent-A-Center’s Rodriguez set up a four-node cluster with management applications and a 12-node cluster for VDI and other applications after installing Dell EMC VxRail appliances in 2016.

“We wanted to make sure we could manage our environment,” he said. “If we would’ve consolidated the management stack with the VDI stack and something happened, we would’ve lost control. Having segmentation gave us control.”

Bing helps you learn more about the news in less time

Being an informed consumer of the news is more challenging today than it used to be. We live in a busy world where dozens of headlines compete for our attention every day. On top of that, it’s difficult to know if you’re getting all sides of a story or just leaning into an echo chamber, and it can feel like a full-time job to seek out various points of view.

At Bing, we want to empower users to get an overview of the news in less time. That’s why we built Bing spotlight, which provides overviews of news topics right in the Bing search results when you search for major developing news stories.

Spotlight shows users the latest headlines, a rundown of how the story has developed over time, and relevant social media posts from people around the web. It also surfaces diverse perspectives on a given topic, so users can quickly get a well-rounded view before deciding which articles they want to click and read in depth.

Spotlight is currently available on Bing desktop and mobile web in the US.


Users’ trust in the news we present is of the utmost importance to Bing, and we’re committed to providing a well-rounded view of news from diverse, quality sources.

To start, Bing monitors millions of queries and news articles every day and identifies impactful stories that evolve over a period of weeks or months. We look at various user signals, such as queries and browser logs, and document signals from publishers, such as how many publishers cover a story, their angles, and how prominently they feature the story on their site.

For controversial topics, we show different viewpoints from high-quality sources in the Perspectives module. For a source to be considered high quality, it must meet the Bing News PubHub Guidelines, a set of criteria that favors originality, readability, newsworthiness, and transparency. Top-caliber news providers identify sources and authors, give attribution, and demonstrate sound journalistic practices such as accurate labeling of opinion and commentary.

Behind the scenes, we leverage our deep learning algorithms and web graphs of hundreds of millions of websites in the Bing index to identify top sources for national news, per category, query, or article. Our goal is to provide broader context for impactful stories, from politics to business to major disasters, and much more.
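As a very loose illustration of the idea of blending such signals into a single measure of a story’s impact, consider the toy function below. The signal names, scaling, and weights are invented for this sketch; Bing’s actual ranking relies on deep learning over far richer query, browser, and document signals:

```python
# Toy story-impact score over hypothetical publisher signals (not Bing's model).
def story_impact(num_publishers: int, query_volume: int,
                 avg_prominence: float) -> float:
    """Blend coverage breadth, user interest, and placement prominence (0-1)."""
    coverage = min(num_publishers / 100.0, 1.0)   # how many outlets cover it
    interest = min(query_volume / 10_000.0, 1.0)  # user queries about the story
    return 0.4 * coverage + 0.4 * interest + 0.2 * avg_prominence

# A widely covered, heavily queried story scores close to 1.0.
print(story_impact(num_publishers=85, query_volume=12_000, avg_prominence=0.7))
```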

To try the new experience, search for major news topics like self-driving cars on Bing.com, or find the latest spotlights on the Bing.com homepage carousel.

Providing different perspectives in our spotlight experience is part of a broader effort to help our users be more informed with various perspectives on a range of topics, from news to common health questions. We’re working hard to expand the range of topics covered by this approach, including the number of topics spotlight covers, to help you become better informed in less time. We hope you’re as excited about these updates as we are!

Dell EMC HCI and storage cloud plans on display at VMworld

LAS VEGAS — Dell EMC launched cloud-related enhancements to its storage and hyper-converged infrastructure products today at the start of VMworld 2018.

The Dell EMC HCI and storage product launch includes a new VxRail hyper-converged appliance, which uses VMware vSAN software. The vendor also added a cloud version of the Unity midrange unified storage array and cloud enhancements to the Data Domain data deduplication platform.

Dell EMC HCI key for multi-cloud approach?

Dell EMC is also promising synchronized releases of VxRail and the VMware vSAN software that turns PowerEdge servers into an HCI system, although the “synchronization” could take up to 30 days. Still, that’s an improvement over the six months or so it now takes for the latest vSAN release to make it to VxRail.


Like other vendors, Dell EMC considers its HCI a key building block for private and hybrid clouds. The ability to offer private clouds with public cloud functionality is becoming an underpinning of the multi-cloud strategies at some organizations.

Sam Grocott, senior vice president of marketing for the Dell EMC infrastructure solutions group, said the strong multi-cloud flavor of the VMworld product launches reflects conversations the vendor has with its customers.

“As we talk to customers, the conversation quickly turns to what we are doing in the cloud,” Grocott said. “Customers talk about how they’re evaluating multiple cloud vendors. The reality is, they aren’t just picking one cloud, they’re picking two or even three clouds in a lot of cases. Not all your eggs will be in one basket.”

Dell EMC isn’t the only storage vendor making its storage more cloud-friendly. Its main storage rival, NetApp, also offers unified primary storage and backup options that run in the cloud, and many startups focus on cloud compatibility and multi-cloud management from the start.

Grocott said Dell’s overall multi-cloud strategy is to provide a consistent operating model experience on premises as well as in private and public clouds. That strategy covers Dell EMC and VMware products. Dell EMC VxRail is among the products that tightly integrate VMware software with the vendor’s hardware.

“That’s what we think is going to differentiate us from any of the competition out there,” he said. “Whether you’re protecting data or storing data, the learning curve of your operating model — regardless of whether you’re on premises or off premises — should be zero.”

Stu Miniman, a principal analyst at IT research firm Wikibon, said Dell EMC is moving toward what Wikibon calls a True Private Cloud.

Wikibon’s 2018 True Private Cloud report predicts almost all enterprise IT will move to a hybrid cloud model dominated by SaaS and true private cloud. Wikibon defines true private cloud as completely integrating all aspects of a public cloud, including a single point of contact for purchase, support, maintenance and upgrades.

“The new version of the private cloud is, let’s start with the operating model I have in the public cloud, and that’s how I should be able to consume it, bill it and manage it,” Miniman said. “It’s about the software, it’s about the usability, it’s about the management layer. Step one is to modernize the platform; step two is to modernize the apps. It’s taken a couple of years to move along that spectrum.”

New Cohesity backup adds Helios SaaS management

Cohesity Inc. today released Helios, a SaaS application that works in conjunction with Cohesity DataPlatform to give IT administrators greater control over their consolidated secondary data.

Helios allows Cohesity backup customers to manage data under control of DataPlatform software, whether it is on premises or in public clouds. Helios enables customers to view and search secondary data, make global policy changes and perform upgrades through a single dashboard.

Helios requires an extra license separate from DataPlatform, based on the amount of data under management. Positioned as an add-on for DataPlatform users, it’s designed to enhance secondary data management with a slew of features, including some that utilize predictive analytics and machine learning.

With Helios, Cohesity is following the lead of rival Rubrik Inc., which launched its Polaris SaaS-based management last April. Cohesity and Rubrik sell scale-out, node-based secondary storage platforms that manage data on premises and in the cloud.

Raj Dutt, product marketing director at Cohesity, said one of Helios’ core goals is to simplify multicluster administration. The Cohesity backup SmartAssist feature suggests resource allocations across the environment based on service-level agreements set by the administrator. Using machine learning, Helios examines how an infrastructure is being used and suggests when to add resources or make adjustments. Helios will also allow its users to make peer comparisons by sharing anonymized metadata from other Cohesity customers.
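To make the resource-suggestion idea concrete, here is a toy sketch of the kind of trend analysis such a feature could perform: fit a line to recent capacity usage and estimate when a cluster will run out of headroom. The numbers are simulated, and this illustrates the general technique, not Cohesity’s implementation:

```python
# Toy capacity-trend forecast (simulated data; not Cohesity's actual code).
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(30)                               # last 30 days of samples
used_tb = 200 + 1.8 * days + rng.normal(0, 2, 30)  # simulated usage in TB

slope, intercept = np.polyfit(days, used_tb, 1)    # linear growth trend
capacity_tb = 350                                  # cluster's usable capacity

if slope > 0:
    days_left = (capacity_tb - (intercept + slope * days[-1])) / slope
    print(f"~{days_left:.0f} days until capacity; consider adding nodes")
else:
    print("usage flat or shrinking; no action suggested")
```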

Other features include global hardware health monitoring, pattern and password detection, video compression and machine learning to analyze how changes will impact clusters before they are rolled out.

[Screenshot: Helios brings multicluster management under one dashboard.]

Dutt said the difference between Helios and competitors, such as Dell EMC CloudIQ analytics and Rubrik Polaris, is “almost none of the [others] offer active management on a global scale.”

Although Helios is generally available today, Dutt said not all of its features will be ready to go right out of the gate. They will be rolled out as part of monthly releases of the core Cohesity backup software, with the expectation that all of the planned capabilities will be available by the end of 2018.

Cohesity backup checks the SaaS boxes

Edwin Yuen, senior analyst at Enterprise Strategy Group, said Helios fills the major requirements for SaaS-based management across clusters and clouds.


“Within systems management, you need to have three things,” he said. “One is inventory — you need to be able to know what you have out there and go and find it. No. 2, you need to have status — you need to know what’s going on with them. And three, you need to have actions — you need to actually be able to do something about them. A lot of tools don’t actually do that. … Helios does.”

Yuen also pointed out that many vendors are moving from simply selling their software licenses to SaaS-based, subscription models. “It’s often consumption-based, it’s a living service, you’ll get data updates so you’re not always waiting for another version,” Yuen said. “If you are going to manage across multiple destinations, that model does make a lot of sense.”

As more products offering assisted integration and optimization like the Cohesity backup software emerge in the multi-cloud management space, Yuen speculates there will be a growing demand for cross-platform, vendor-agnostic products. Helios can see and manage the metadata hosted on Microsoft Azure, Google and Amazon Web Services public clouds — as long as you’re running Cohesity DataPlatform.

“They’re experts in their storage and they’re adding a management layer on top of it,” Yuen said. “The question is are you going to be an expert in the management layer so that it doesn’t matter what storage you have? I think there’s going to be demand for this type of solution across the board for managing data.”

Scale-out Qumulo NAS qualifies latest Dell EMC PowerEdge servers

Qumulo today added a hardware option for its customers by qualifying its scale-out NAS software to run on Dell Technologies’ PowerEdge servers. That leaves open the possibility that Qumulo will gain customers on Dell EMC servers at the expense of Dell EMC Isilon’s clustered NAS platform.

Qumulo is nearly two years into an OEM deal with Dell EMC archrival Hewlett Packard Enterprise, which rebrands and sells Qumulo’s scale-out NAS software on its servers. There is no joint go-to-market agreement between Qumulo and Dell EMC, which is a NAS market leader; the qualification simply means customers can purchase PowerEdge hardware from their preferred Dell EMC resellers and install Qumulo NAS software on the box.

Dell qualified Qumulo NAS software to run on dual-socket 2U Dell EMC PowerEdge R740xd servers.

“There are a lot of customers who build private clouds on Dell hardware. We’re now in a position where they can choose our software to build their computing,” Qumulo chief marketing officer Peter Zaballos said.

Dell EMC’s 14th-generation PowerEdge servers are equipped with about 20% more NVMe flash capacity than the R730 models. One of the use cases cited by Dell EMC is the ability to use a single PowerEdge 14G node to power its IsilonSD Edge virtual NAS software, which competes with Qumulo storage.

Will Qumulo on PowerEdge compete with Dell EMC Isilon NAS?

The Qumulo File Fabric (QF2) file system scales to support billions of files and hundreds of petabytes. QF2 is available on Qumulo C-Series hybrid arrays, all-flash P-Series or preinstalled on HPE Apollo servers. Customers also may run it as an Elastic Compute Cloud instance to burst and replicate in AWS.

Qumulo NAS gear is sold mostly to companies in media and entertainment and other sectors with large amounts of unstructured data.

Zaballos said QF2 on PowerEdge isn’t a direct attempt to displace Isilon. The goal is to give Dell EMC shops greater flexibility, he said.

“We’re looking to build the biggest footprint in the market. Between Dell and HPE, that’s about 40% of the server market for data centers,” Zaballos said.

Qumulo competes mainly with Isilon and NetApp’s NAS products and has won customers away from Isilon. Pressure on traditional NAS vendors is also coming from several file system-based cloud startups, including Elastifile, Quobyte, Stratoscale and WekaIO.

Qumulo founders Peter Godman, Aaron Passey and Neal Fachan helped develop the Isilon OneFS clustered file system, which paved the way for Isilon’s initial public offering in 2006. EMC bought Isilon for $2.25 billion in 2010, and EMC itself became part of Dell through the Dell-EMC merger, which closed in 2016.

Qumulo CEO Bill Richter was president of the EMC Isilon division for three years. He joined Qumulo in 2016.

Greg Schulz, an analyst with Server StorageIO, based in Stillwater, Minn., likened the Qumulo-PowerEdge configuration to Dell EMC’s “co-opetition” OEM agreement with hyper-converged vendor Nutanix.

“Qumulo NAS has been focused on high-performance, big-bandwidth file serving, which may not play well in environments that have many smaller files and mixed workloads. That’s an area Isilon has adapted to over the years. The other obstacle is getting [beyond] large elephant-hunt deals into broader markets, and getting traction with Dell servers can help them fill gaps in their portfolio,” Schulz said.

Ron Pugh, vice president for Dell EMC OEM sales in North America, said it’s not unusual for potential competitors to rely on Dell hardware products.

“If you look deeply inside the Dell Technologies portfolio, some of our customers can be considered competitors. Our OEM program is here to be a building block for our customers, not to build competing products,” Pugh said.

Dell EMC also sells Elastifile cloud-based NAS on its servers and is an Elastifile strategic investor.

Qumulo: AI tests on P-Series flash

Qumulo this week also previewed upcoming AI enhancements to its P-Series that enable faster prefetching of application data in RAM. Those enhancements are due to roll out in September. Grant Gumina, a Qumulo senior product manager, said the initial AI enhancements will improve performance of the all-flash P-Series. Proofs of concept are under way with media customers, Gumina said.

“A lot of studios are using SANs to power primarily file-based workloads in each playback bay. The performance features in QF2 effectively means they can install a NAS for the first time and move to a fully Ethernet-based environment,” Gumina said.

File storage vendors NetApp and Pure Storage recently added flash systems built for AI, incorporating Nvidia hardware.

BP Logix BPM tool packs AI features in latest release

Low-code BPM development tools today already help developers simplify and speed up business process application development. The next step is to make those apps smarter.

To that end, BP Logix, a business process management (BPM) company in Vista, Calif., recently introduced version 5.0 of its Process Director that adds AI features to enable predictive analysis, enhanced UIs and journals for configurable collaboration.

Rather than present complex AI features, Process Director 5.0 offers a set of basic machine learning tools that the average app developer can use, such as point-and-click graphical interfaces that guide configuration processes and display analytics results, with no coding required.

Embedding intelligence into business applications requires specialized knowledge and teams of data scientists, said Charles Araujo, principal analyst for Intellyx, a consulting firm in Glens Falls, N.Y. Process Director 5.0’s blend of AI and low-code features brings predictive application processes to nontechnical users.

“The value Process Director 5.0 delivers is less about features, per se, and more about accessibility,” Araujo said.

AI inside

The AI tools inside Process Director 5.0 enable machine learning, sentiment analysis, capture and expression of dissimilar events and conditions in a single state, and configurable collaboration. The company also added UI features for iterative list search, calendar views, and inline HTML and text editing.

“AI and machine learning create prediction models that have been missing from BPM,” said Neil Ward-Dutton, research director for MWD Advisors, a U.K.-based IT consulting firm. With AI, the application learns from past history, identifies trends and makes recommendations for decisions.

As an example, Ward-Dutton pointed to how AI capabilities can help with a loan request by identifying factors that make the applicant and the loan’s purpose a low or high risk. Combined data mining and machine learning tools aggregate information about previous loan applications and current market conditions to help the loan officer make a decision.
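A bare-bones sketch of that kind of risk model, using scikit-learn on hypothetical historical loan data, is shown below. The file name, feature names, and example application are invented for illustration and are not BP Logix’s implementation:

```python
# Illustrative loan-risk model trained on hypothetical historical data.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

history = pd.read_csv("past_loan_applications.csv")  # hypothetical dataset
features = ["amount", "term_months", "applicant_income", "credit_score"]

X_train, X_test, y_train, y_test = train_test_split(
    history[features], history["defaulted"], test_size=0.2, random_state=0)

model = LogisticRegression()
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Score a new application; the predicted probability becomes the
# low/high-risk signal surfaced to the loan officer.
new_app = pd.DataFrame([{"amount": 25_000, "term_months": 60,
                         "applicant_income": 58_000, "credit_score": 640}])
risk = model.predict_proba(new_app[features])[0, 1]
print(f"estimated default risk: {risk:.0%}")
```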


Araujo said he sees businesses with reliable data on actions and outcomes adopting AI-enabled, predictive applications quickly and with good results. Developers can use that legacy data to build models that predict the behavior of application users who meet certain criteria and perform specific actions. With these functions, the tool recommends a best action and prioritizes the options presented to the user, so the application feels more intuitive or takes actions automatically.

Applying AI for nontechnical users, even with accessible tools, requires a change in traditional BPM project approaches. Araujo said project teams will have to think like a data scientist.

“Applying intelligence to applications requires imagination,” he said. “Developers need to think about application usage patterns and imagine ways to use predictive capabilities to meet users’ needs.”

“That’s not the way we’ve historically approached applications, particularly business-process-based ones,” Araujo added.

Process Director 5.0 is generally available, with versions for both cloud and on premises. In addition to AI and low-code/no-code development tools, the platform includes traditional BPM capabilities for compliance automation, process modeling, multifactor authentication and other standard BPM features.