
Qualtrics XM adds mobile, AI, information governance

Qualtrics XM added AI and information governance tools to its customer and employee experience measurement platform this week and gave its year-old mobile app an infusion of dashboards to put data into the hands of front-line workers on the go.

In some ways, the new features reflect the influence of SAP, which acquired Qualtrics for $8 billion a year ago. The new features, such as mobile dashboarding, likely reflect a step toward making Qualtrics data relevant and available to customer-facing employees who use other SAP applications, in addition to marketing and research teams, Constellation Research principal analyst Nicole France said.

Getting such data into the hands of front-line employees makes it more likely the data will be used effectively.

“Simply making these tools more widely available gets people more used to seeing this type of information, and it changes behaviors,” France said, adding that new features like mobile dashboards subtly get more people involved in using real-time performance metrics. “It’s doing it in almost a subliminal way, rather than trying to make it a quick-change program.” 

A number of Qualtrics competitors have also slowly added mobile dashboarding so employees can monitor reaction to a product, customer service or employee initiatives. But they’re all trying to find the right balance, lest it degrade employee experience or cause knee-jerk reactions to real-time fluctuations in customer response, Forrester Research senior analyst Faith Adams said.

Qualtrics XM mobile-app upgrades include dashboards to convey real-time customer response data to front-line employees responsible for product and service performance.

“It can be great — but it is also one that you need to be really careful with, too,” Adams said. “Some firms have noted that when they adopt mobile, it sometimes sets an expectation to employees of all levels that they are ‘always on.’”

Both France and Adams noted that the mobile app will help sales teams stay more plugged in to customer sentiment in their territories by getting data to them more quickly.

BMW, an early adopter of the new mobile app, uses it in dealerships to keep salespeople apprised of how individual customers feel about the purchasing process during the sale, and to prevent sales from falling through, according to Kelly Waldher, Qualtrics executive vice president and general manager.

AI and information governance tools debut

Qualtrics XM also added Smart Conversations, an AI-assisted tool to automate customer dialog around feedback. Two other AI features comb unstructured data for insights; one graphically visualizes customer sentiment and the other more precisely measures customer sentiment.

Prior to being acquired by SAP, Qualtrics had built its own AI and machine learning tools, Waldher said, and it will continue to invest in them strategically. That said, Qualtrics will likely add features based on SAP’s Leonardo AI toolbox down the road.

“We have that opportunity to work more closely with SAP engineers to leverage Leonardo,” Waldher said. “We’re still in the early stages of trying to tap into the broader SAP AI capabilities, but we’re excited to have that stack available to us.”

Also new to Qualtrics XM is a set of information governance features, which Waldher said will enable customers to better comply with privacy rules in both the U.S. and Europe. Qualtrics users will be able to monitor who is using data, and how, within their organizations.

“Chief compliance officers and those within the IT group can make sure that the tools that are being deployed across the organization have advanced security and governance capabilities,” Waldher said. “SAP’s global strength, their presence in Western Europe and beyond, has strongly reinforced the path [of building compliance tools] we were already on.”

The new features are included in most paid Qualtrics plans at no extra charge, though a few of the AI tools require different licensing plans.


New AWS cost management tool, instance tactics to cut cloud bills

LAS VEGAS — Amazon continuously rolls out new discounting programs and AWS cost management tools in an appeal to customers’ bottom lines and as a hedge against mounting competition from Microsoft and Google.

Companies have grappled with nasty surprises on their AWS bills for years, with the causes attributed to AWS’ sheer complexity, as well as the runaway effect on-demand computing can engender without strong governance. It’s a thorny problem, and solutions can take multiple forms.

To that end, the cloud giant released a number of new AWS cost management tools at re:Invent, including Compute Optimizer, which uses machine learning to help customers right-size their EC2 instances.

At the massive re:Invent conference here this week, AWS customers discussed how they use both AWS-native tools and their own methods to get the most value from their cloud budgets.

Ride-sharing service Lyft has committed to spend at least $300 million on AWS cloud services between the beginning of this year and the end of 2021.

Lyft, like rival Uber, saw a hockey stick-like growth spurt in recent years, going from about 50 million rides in 2015 to more than 350 million a few years later. But its AWS cost management needed serious work, said Patrick Valenzuela, engineering manager.

An initial effort to wrangle AWS costs resulted in a spreadsheet, powered by a Python script, that divided AWS spending by the number of rides given to reach an average figure. The spreadsheet also helped Lyft rank engineering teams according to their rate of AWS spending, which had a gamification effect as teams competed to do better, Valenzuela said in a presentation.
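Lyft did not publish the script itself, but the core calculation is straightforward to sketch. The Python fragment below is a hypothetical reconstruction, assuming a billing export with made-up `team` and `unblended_cost` columns and an illustrative ride count for the same period:

```python
# Hypothetical sketch of Lyft-style cost-per-ride reporting. The column
# names ("team", "unblended_cost") and the ride count are illustrative,
# not Lyft's actual data model.
import csv
from collections import defaultdict

RIDES_THIS_PERIOD = 350_000_000  # illustrative ride count for the same billing period

def cost_per_ride(cur_csv_path):
    """Divide total AWS spend by rides and rank teams by their spend."""
    spend_by_team = defaultdict(float)
    total = 0.0
    with open(cur_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            cost = float(row["unblended_cost"])
            spend_by_team[row["team"]] += cost
            total += cost

    print(f"Cost per ride: ${total / RIDES_THIS_PERIOD:.4f}")
    # Ranking teams by spend is what produced the gamification effect.
    for team, spend in sorted(spend_by_team.items(), key=lambda kv: -kv[1]):
        print(f"{team:20s} ${spend:12,.2f}")

# cost_per_ride("aws_cost_usage_export.csv")
```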

Within six months, Lyft managed to drop the AWS cost-per-ride figure by 40%. But it needed more, such as fine-grained data sets that could be probed via SQL queries. Other factors, such as discounts and the cost of AWS Reserved Instances, weren’t always reflected transparently in the AWS-provided cost usage reports used to build the spreadsheet.

Lyft subsequently built a second-generation tool that included a data pipeline fed into a data warehouse. It created a reporting and dashboard layer on top of that foundation. The results have been promising. Earlier this year, Lyft found it was spending 50% less on reads and writes for its top 25 DynamoDB tables, and it also cut spending related to Kubernetes container migrations by 50%.

 “If you want to learn more about AWS, I recommend digging into your bill,” Valenzuela said.

AWS cost management a perennial issue

While there are plenty of cloud cost management tools available in addition to the new AWS Compute Optimizer, some AWS customers take a proactive approach to cost savings rather than relying on historical analysis to spot and shed waste, as Lyft did in the example presented at re:Invent.

Privately held mapping data provider Here Technologies serves 100 million motor vehicles and collects 28 TB of data each day. Companies have a choice in the cloud procurement process — one being to force teams through rigid sourcing activities, said Jason Fuller, head of cloud management and operations at Here.

“Or, you let the builders build,” he said during a re:Invent presentation. “We let the builders build.”

Still, Here had developed a complex landscape on AWS, with more than 500 accounts that collectively spun up more than 10 million EC2 instances a year. A few years ago, Here began a concerted effort to adopt AWS Reserved Instances in a programmatic manner, hoping to squeeze out waste.

Reserved Instances carry contract terms of up to three years and offer substantial savings over on-demand pricing. Here eventually moved nearly 80% of its EC2 usage into Reserved Instances, which gave it about 50% off the on-demand rate, Fuller said.
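Those two figures compound into a blended rate. The arithmetic below is a simplified illustration of that effect, not Here’s actual bill, and it ignores instance mix and upfront payment terms:

```python
# Simplified illustration of how Reserved Instance coverage blends into an
# overall discount; real bills also depend on instance mix and payment terms.
ri_coverage = 0.80   # share of EC2 usage covered by Reserved Instances
ri_discount = 0.50   # RI discount relative to the on-demand rate

blended_cost = ri_coverage * (1 - ri_discount) + (1 - ri_coverage)
print(f"Blended EC2 cost: {blended_cost:.0%} of an all-on-demand bill")  # 60%
print(f"Overall savings:  {1 - blended_cost:.0%}")                       # 40%
```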

The results have been impressive. During the past three-and-a-half years, Here saved $50 million and avoided another $150 million in costs, Fuller said.

Salesforce is another heavy user of AWS. It signed a $400 million infrastructure deal with AWS in 2016 and the companies have since partnered on other areas. Based on its 2017 acquisition of Krux, Salesforce now offers Audience Studio, a data management platform that collects and analyzes vast amounts of audience information from various third-party sources. It’s aimed at marketers who want to run more effective digital advertising campaigns.

Audience Studio handles 200,000 user queries per second, supported by 2,500 Elastic MapReduce clusters on AWS, said Alex Estrovitz, director of software engineering at Salesforce.

“That’s a lot of compute, and I don’t think we’d be doing it cost-effectively without using [AWS Spot Instances],” Estrovitz said in a re:Invent session. More than 85% of Audience Studio’s infrastructure uses Spot Instances, which are made up of idle compute resources on AWS and cost up to 90% less than on-demand pricing.

But Spot Instances are best suited for jobs like Audience Studio’s, where large amounts of data get parallel-processed in batches across large pools of instances. Spot Instances are ephemeral; AWS can shut them down on brief notice when the system needs resources for other customer jobs. However, customers like Salesforce can buy Spot Instances based on their application’s degree of tolerance for interruptions.
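In practice, tolerating interruptions means watching for the warning AWS publishes through the instance metadata service shortly before reclaiming a Spot Instance. The sketch below is a minimal watcher, assuming it runs on the instance itself; the `drain_and_checkpoint()` hook is hypothetical:

```python
# Sketch of a Spot interruption watcher. The metadata endpoint is where AWS
# publishes the roughly two-minute interruption warning; drain_and_checkpoint()
# is a hypothetical hook for handing unfinished batch work back to a queue.
import time
import requests

METADATA_URL = "http://169.254.169.254/latest/meta-data/spot/instance-action"

def drain_and_checkpoint():
    """Hypothetical hook: stop taking new work and checkpoint partial results."""
    print("Draining batch workers and checkpointing to durable storage...")

def watch_for_interruption(poll_seconds=5):
    while True:
        resp = requests.get(METADATA_URL, timeout=2)
        if resp.status_code == 200:
            # A 200 response means AWS has scheduled a stop/terminate action.
            print("Interruption notice:", resp.json())
            drain_and_checkpoint()
            return
        time.sleep(poll_seconds)  # a 404 means no interruption is pending
```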

Salesforce has achieved 48% savings overall since migrating Audience Studio to Spot Instances, Estrovitz said. “If you multiply this over 2,500 jobs every day, we’ve saved an immense amount of money.”


SageMaker Studio makes model building, monitoring easier

LAS VEGAS — AWS launched a host of new tools and capabilities for Amazon SageMaker, AWS’ cloud platform for creating and deploying machine learning models; drawing the most notice was Amazon SageMaker Studio, a web-based integrated development environment (IDE).

In addition to SageMaker Studio, the IDE for building, using and monitoring machine learning models, the other new AWS products aim to make it easier for non-expert developers to create models and to make those models more explainable.

During a keynote presentation at the AWS re:Invent 2019 conference here Tuesday, AWS CEO Andy Jassy described five other new SageMaker tools: Experiments, Model Monitor, Autopilot, Notebooks and Debugger.

“SageMaker Studio along with SageMaker Experiments, SageMaker Model Monitor, SageMaker Autopilot and SageMaker Debugger collectively add lots more lifecycle capabilities for the full ML [machine learning] lifecycle and to support teams,” said Mike Gualtieri, an analyst at Forrester.

New tools

SageMaker Studio, Jassy claimed, is a “fully integrated development environment for machine learning.” The new platform pulls together all of SageMaker’s capabilities, along with code, notebooks and datasets, into one environment. AWS intends the platform to simplify SageMaker, enabling users to create, deploy, monitor, debug and manage models in one environment.

Google and Microsoft have similar machine learning IDEs, Gualtieri noted, adding that Google plans for its IDE to be based on DataFusion, its cloud-native data integration service, and to be connected to other Google services.

SageMaker Notebooks aims to make it easier to create and manage open source Jupyter notebooks. With elastic compute, users can create one-click notebooks, Jassy said. The new tool also enables users to more easily adjust compute power for their notebooks and transfer the content of a notebook.

Meanwhile, SageMaker Experiments automatically captures input parameters, configuration and results of developers’ machine learning models to make it simpler for developers to track different iterations of models, according to AWS. Experiments keeps all that information in one place and introduces a search function to comb through current and past model iterations.

AWS CEO Andy Jassy talks about new Amazon SageMaker capabilities at re:Invent 2019.

“It is a much, much easier way to find, search for and collect your experiments when building a model,” Jassy said.
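The SageMaker Experiments SDK handles this automatically, but the underlying idea — record each run’s parameters and metrics, then search across runs — can be shown with a short conceptual sketch in plain Python (this illustrates the pattern, not the SageMaker API):

```python
# Conceptual illustration of experiment tracking, not the SageMaker Experiments API.
from dataclasses import dataclass, field

@dataclass
class Run:
    name: str
    params: dict = field(default_factory=dict)   # e.g. learning rate, max depth
    metrics: dict = field(default_factory=dict)  # e.g. validation accuracy

class Experiment:
    def __init__(self):
        self.runs = []

    def log(self, name, params, metrics):
        self.runs.append(Run(name, params, metrics))

    def search(self, metric, minimum):
        """Find past runs whose recorded metric clears a threshold."""
        return [r for r in self.runs if r.metrics.get(metric, 0) >= minimum]

exp = Experiment()
exp.log("run-1", {"learning_rate": 0.1}, {"val_accuracy": 0.87})
exp.log("run-2", {"learning_rate": 0.01}, {"val_accuracy": 0.91})
print([r.name for r in exp.search("val_accuracy", 0.9)])  # ['run-2']
```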

As the name suggests, SageMaker Debugger enables users to debug and profile their models more effectively. The tool collects and monitors key metrics from popular frameworks, and provides real-time metrics about accuracy and performance, potentially giving developers deeper insights into their own models. It is designed to make models more explainable for non-data scientists.

SageMaker Model Monitor also tries to make models more explainable by helping developers detect and fix concept drift, which refers to the evolution of data and data relationships over time. Unless models are updated in near real time, concept drift can drastically skew the accuracy of their outputs. Model Monitor constantly scans the data and model outputs to detect concept drift, alerting developers when it detects it and helping them identify the cause.
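A drift check can be as simple as comparing live input statistics against a baseline captured at training time. The sketch below illustrates that comparison conceptually; it is not Model Monitor’s implementation, and the feature names are invented:

```python
# Conceptual drift check: compare live feature statistics against a training
# baseline and flag features that have shifted beyond a threshold.
import statistics

def drift_report(baseline, live_values, threshold=3.0):
    """baseline maps feature -> (mean, stdev) captured at training time."""
    drifted = {}
    for feature, values in live_values.items():
        mean, stdev = baseline[feature]
        live_mean = statistics.fmean(values)
        z = abs(live_mean - mean) / stdev if stdev else float("inf")
        if z > threshold:
            drifted[feature] = round(z, 2)
    return drifted

baseline = {"basket_size": (3.2, 1.1), "session_minutes": (7.5, 2.0)}
live = {"basket_size": [3.1, 3.4, 2.9], "session_minutes": [15.2, 14.8, 16.1]}
print(drift_report(baseline, live))  # {'session_minutes': ~3.9}
```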

Automating model building

With Amazon SageMaker Autopilot, developers can automatically build models without, according to Jassy, sacrificing explainability.

Autopilot is “AutoML with full control and visibility,” he asserted. AutoML is essentially the practice of automating machine learning model development.

The new Autopilot module automatically selects the correct algorithm based on the available data and use case and then trains 50 unique models. Those models are then ranked by accuracy.

“AutoML is the future of ML development. I predict that within two years, 90 percent of all ML models will be created using AutoML by data scientists, developers and business analysts,” Gualtieri said.


“SageMaker Autopilot is a must-have for AWS, but it probably will help” other vendors also, including such AWS competitors as DataRobot because the AWS move further legitimizes the automated machine learning approach, he continued.

Other AWS rivals, including Google Cloud Platform, Microsoft Azure, IBM, SAS, RapidMiner, Aible and H2O.ai, also have automated machine learning capabilities, Gualtieri noted.

However, according to Nick McQuire, vice president at advisory firm CCS Insight, some of the new AWS capabilities are innovative.

“Studio is a great complement to the other products as the single pane of glass developers and data scientists need and its incorporation of the new features, especially Model Monitor and Debugger, are among the first in the market,” he said.

“Although AWS may appear late to the game with Studio, what they are showing is pretty unique, especially the positioning of the IDE as similar to traditional software development with … Experiments, Debugger and Model Monitor being integrated into Studio,” McQuire said. “These are big jumps in the SageMaker capability on what’s out there in the market.”

Google also recently released several new tools aimed at delivering explainable AI, plus a new product suite, Google Cloud Explainable AI.


Tableau Foundation, Splash fight for clean water in schools

Data is one of the tools being used in the fight for clean water in schools in Ethiopia and India, and the Tableau Foundation is helping provide both the funds and technology that turn information into a vehicle for change.

At the recently completed Tableau Conference 2019 in Las Vegas, the vendor revealed a $1 million grant through the Tableau Foundation to support Splash in its effort to provide safe water, sanitation and hygiene programs to urban schools throughout Addis Ababa, Ethiopia, and Kolkata, India, by 2023.

Through its foundation, Tableau is also providing Splash with free licenses to use its platform, access to Tableau’s network of partners, and free tech support.

Splash, meanwhile, is a global nonprofit organization working to ensure safe water, sanitation and proper hygiene for children living in urban poverty throughout Asia and Africa.

The Tableau Foundation had already been connected with Splash for three years, providing the group with small grants over that time. The foundation made the $1 million commitment to Splash a few months before the conference.

In total, the Tableau Foundation and Splash are hoping to reach approximately a million children — more than 450,000 in Addis Ababa and more than 450,000 in Kolkata — over a five-year period. And once success is proved in those cities, Splash hopes to broaden its reach to other cities throughout Ethiopia and India.

“What it means is the kids will have lower rates of diarrheal diseases so they actually can attend schools and improve education outcomes,” said Neal Myrick, global head of the Tableau Foundation. “Girls who are going into puberty can get help during their menstrual periods and they don’t have to skip school during that time.

“It’s a health-related thing that has direct education outcomes, and then the broader community takes advantage of the clean water systems that are installed in the schools,” he continued.

A dashboard displays Splash’s projected schedule for providing clean water to more than 400,000 children in Addis Ababa, Ethiopia.

What data can do

Currently, Splash is providing water, sanitation and hygiene access to 16% of the schools in Addis Ababa and 19% of the schools in Kolkata.

Data will play a major role in helping Splash reach its goal of expanding that reach to approximately 1 million children by 2023, said Eric Stowe, founder and CEO of Splash.

Data will be a means of convincing municipalities to allow Splash to come into their schools, and of convincing potential donors to contribute the funds Splash needs to continue its work.

“It’s hard to get a government to move on something,” Stowe said. “Moving a city forward is hard unless you have good data for them to look at — you’re not going to get the minister of education in a city to act with Excel spreadsheets. You have to show them how you’re moving the needle.”


Before using Tableau, Splash had stories of individual people and schools it could tell, but those anecdotes didn’t tell the full story.

They didn’t demonstrate what Splash could do at scale; for example, what Splash could do for 500,000 children instead of just 500.

“We’re using Tableau to show big data sets in meaningful ways that we’ve never been able to do before,” Stowe said. “We need to show governments we can be cost-effective at scale so they allow us to go into other cities.”

Splash’s stack

In 2011, Splash began publishing its data online, showing its work on an internally designed website called ProvingIt.

Proving it, however, proved to be cumbersome for Splash.

Not only was the data entry process difficult, so were the data management and data preparation, which required manual manipulation.

Splash needed something better.

“It was unwieldy,” Stowe said. “There were redundant data entries, and it wasn’t pulling data from a controlled resource.”

With the help of Tableau, Splash was able to completely redesign its analytics stack from data entry through insight, and do so in a mere 10 weeks.

Splash now uses CommCare to collect data on the ground, Salesforce as its customer relationship management system, Microsoft Azure and Snowflake for data warehousing, Alteryx to clean its data and Tableau to visualize it. Meanwhile, it uses Mapbox as its mapping tool.

Coincidentally, Salesforce and Tableau are in the process of combining forces after Salesforce acquired Tableau in June for $15.7 billion.

Beyond Splash

Splash is one of about two dozen nonprofit organizations and projects the Tableau Foundation is helping with both funding and technology.

Others include a project called Community Solutions that uses software training and financial support to combat homelessness in 50 U.S. cities; Visualize No Malaria, which is attempting to eliminate malaria in Zambia by 2021; and projects to battle climate change.

Most, according to Myrick, are like the Tableau Foundation’s commitment to Splash in that they involve more than simply contributing money to a cause. Although Tableau said at its 2018 user conference that it planned to commit $100 million from 2018 to 2025 to philanthropic works, when it teams with an organization, it does so with a clearer goal and timeframe in mind.

Visualize No Malaria, for example, represents a $4 million commitment from the Tableau Foundation since 2014, and in the five years since the project began, better data leading to faster action has contributed to a 90% decline in malaria deaths and an 80% decline in malaria cases in Zambia’s Southern Province, Myrick said.

“Now, when a clinic director is looking at a dashboard, she is seeing the number of active cases discovered, people tested, people treated, and that data are no more than a week old,” Myrick said. “That lets her better allocate and target her scarce resources. Now Zambia is committed to roll that process out nationwide, and they’re also rolling into Senegal and Ethiopia.”

Next for Splash

Now that Splash has secured a $1 million grant from the Tableau Foundation, as well as $8 million from the Ethiopian government, and is in the process of trying to secure $4 million from India’s government — part of $45 million it eventually hopes to raise — it can move beyond the beginning phase of its projects in Addis Ababa and Kolkata.

According to Stowe, while 2019 was focused on pilot projects — starting small — 2020 will focus on developing a model so that school projects will go as smoothly as possible. The next three years will then focus squarely on reaching the goal of providing clean water, sanitation and hygiene education to more than 1,500 schools and nearly a million children.

Data, meanwhile, will continue to be Splash’s tool for carrying out its mission.

“We need to be able to show and convince governments of the cost efficiencies and the direct impact we can make, and the increase we can provide in education for girls in particular,” Stowe said. “We also need to show schools where they land on the spectrum compared to other schools, if they’re the gold standard or if they’re laggards.”

He added that data is important when meeting with donors as well.

“We need to show that the funds are being used effectively,” Stowe said, “and that we’re on schedule or ahead.”

The look of the data, however, differs depending on who is viewing it.

Different dashboards are designed for different audiences — governments, schools, donors, even finance teams — taking advantage of what is perhaps Tableau’s greatest strength.

“We need really good data,” Stowe said, “that is convincing as hell.”


Redis Labs eases database management with RedisInsight

The robust market of tools to help users of the Redis database manage their systems just got a new entrant.

Redis Labs disclosed the availability of its RedisInsight tool, a graphical user interface (GUI) for database management and operations.

Redis is a popular open source NoSQL database that is also increasingly being used in cloud-native Kubernetes deployments as users move workloads to the cloud. Open source database use is growing quickly, according to recent reports, as organizations look for flexible, open systems to meet a range of needs.

Among the challenges often associated with databases of any type is ease of management, which Redis is trying to address with RedisInsight.

“Database management will never go out of fashion,” said James Governor, analyst and co-founder at RedMonk. “Anyone running a Redis cluster is going to appreciate better memory and cluster management tools.”

Governor noted that Redis is following a tested approach, by building out more tools for users that improve management. Enterprises are willing to pay for better manageability, Governor noted, and RedisInsight aims to do that.

RedisInsight based on RDBTools

The RedisInsight tool, introduced Nov. 12, is based on the RDBTools technology that Redis Labs acquired in April 2019. RDBTools is an open source GUI for users to interact with and explore data stores in a Redis database.


Over the last seven months, Redis added more capabilities to the RDBTools GUI, expanding the product’s coverage for different applications, said Alvin Richards, chief product officer at Redis.

One of the core pieces of extensibility in Redis is the ability to introduce modules that contain new data structures or processing frameworks. A module could include time series or graph data structures, for example, Richards explained.

“What we have added to RedisInsight is the ability to visualize the data for those different data structures from the different modules,” he said. “So if you want to visualize the connections in your graph data for example, you can see that directly within the tool.”
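As a rough illustration, a few redis-py commands are enough to seed data that a GUI like RedisInsight could then browse and visualize. The snippet below is a sketch that assumes a local Redis server, with the graph step additionally assuming the RedisGraph module is installed; it is not part of RedisInsight itself:

```python
# Sketch: seed a local Redis instance with data that a GUI such as
# RedisInsight could then browse and visualize. Assumes redis-py and a Redis
# server on localhost; the GRAPH.QUERY call additionally assumes the
# RedisGraph module is loaded.
import redis

r = redis.Redis(host="localhost", port=6379)

# Core data structures work on any Redis server.
r.hset("user:1", "name", "Ann")
r.zadd("leaderboard", {"Ann": 42, "Bob": 17})

# Graph data requires the RedisGraph module.
r.execute_command(
    "GRAPH.QUERY", "social",
    "CREATE (:Person {name:'Ann'})-[:KNOWS]->(:Person {name:'Bob'})",
)
```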

RedisInsight overview dashboard

RDBTools is just one of many different third-party tools that exist for providing some form of management and data insight for Redis. There are some 30 other third-party GUI tools in the Redis ecosystem, though lack of maturity is a challenge.

“They tend to sort of come up quickly and get developed once and then are never maintained,” Richards said. “So, the key thing we wanted to do is ensure that not only is it current with the latest features, but we have the apparatus behind it to carry on maintaining it.”

How RedisInsight works

For users, getting started with the new tool is relatively straightforward. RedisInsight is a piece of software that needs to be downloaded and then connected to an existing Redis database. The tool ingests all the appropriate metadata and delivers the visual interface to users.

RedisInsight is available for Windows, macOS and Linux, and also available as a Docker container. Redis doesn’t have a RedisInsight as a Service offering yet.

“We have considered having RedisInsight as a service and it’s something we’re still working on in the background, as we do see demand from our customers,” Richards said. “The challenge is always going to be making sure we have the ability to ensure that there is the right segmentation, security and authorization in place to put guarantees around the usage of data.”


Google cloud network tools check links, firewalls, packet loss

Google has introduced several network monitoring tools to help companies pinpoint problems that could impact applications running on the Google Cloud Platform.

Google launched this week the first four modules of an online console called the Network Intelligence Center. The components for monitoring a Google cloud network include a network topology map, connectivity tests, a performance dashboard, and firewall metrics and insights. The first two are in beta, and the rest are in alpha, which means they are still in the early stages of development.

Here’s a brief overview of each module, based on a Google blog post:

— Google is providing Google Cloud Platform (GCP) subscribers with a graphical view of their network topology. The visualization shows how traffic is flowing between private data centers, load balancers, and applications running on computing environments within GCP. Companies can drill down on each element of the topology map to verify policies or identify and troubleshoot problems. They can also review changes in the network over the last six weeks.

— The testing module lets companies diagnose problems with network connections within GCP or from GCP to an IP address in a private data center or another cloud provider. Along with checking links, companies can test the impact of network configuration changes to reduce the chance of an outage.

— The performance dashboard provides a current view of packet loss and latency between applications running on virtual machines. Google said the tool would help IT teams determine quickly whether a packet problem is in the network or an app.

— The firewall metrics component offers a view of the rules that govern the security software. The module is designed to help companies optimize the use of firewalls in a Google cloud network.

Getting access to the performance dashboard and firewall metrics requires a GCP subscriber to sign up as an alpha customer. Google will incorporate the tools into the Network Intelligence Center once they reach the beta level.


Microsoft Power Platform adds chatbots; Flow now Power Automate

More bots and automation tools went live on the Microsoft Power Platform, Microsoft announced today. In formally introducing them, Microsoft said the tools will make data sources flow within applications like SharePoint, OneDrive and Dynamics 365, and create more efficiencies with custom apps.

The more than 400 capabilities added to the Microsoft Power Platform focus on expanding its robotic process automation potential for users, as well as new integrations between the platform and Microsoft Teams, according to a blog post by James Phillips, corporate vice president of business applications at Microsoft.

Some of those include robotic process automation (RPA) tools for Microsoft Power Automate, formerly known as Flow, which makes AI tools easier to add into PowerApps. Also newly available are tools for creating user interfaces in Power Automate.

AI Builder adds a point-and-click means to fold common processes such as forms processing, object detection and text classification into apps — processes commonly used for SharePoint and OneDrive content curation.

Microsoft is adding these tools, as well as new security features to analytics platform Power BI, in part to coax customers who remain on premises into the Azure cloud, said G2 analyst Michael Fauscette.

PowerApps reduce the development needed to create necessary connections between systems in the cloud, such as content in OneDrive and SharePoint with work being done in Dynamics 365 CRM, Teams and ERP applications.

Microsoft Power Automate, a low-code app-design tool, is the new version of Flow.

Chatbots go live

Also announced as generally available at Microsoft Ignite are Power Virtual Agents, do-it-yourself chatbots on the Microsoft Power Platform.

They’ll likely first be used by customer service teams on Dynamics 365, said Constellation Research analyst R “Ray” Wang, but they could spread to other business areas such as human resources, which could use the bots to answer common questions during employee recruiting or onboarding.


While some companies may choose outside consultants and developers to build custom chatbots instead of making their own on the Microsoft Power Platform, Wang said some will try building them internally. Large call centers employing many human agents and running on Microsoft applications would be logical candidates for piloting new bots.

“I think they’ll start coming here to build their virtual agents,” Wang said. “[Bot] training will be an issue, but it’s a matter of scale. If an agent is costing you $15 an hour and the chatbot 15 cents an hour … it’s all about call deflection.”

Microsoft Power Platform evolves

PowerApps, which launched in late 2015, originally found utility with users of Microsoft Dynamics CRM who needed to automate and standardize processes across data sets inside the Microsoft environment and connect to outside platforms such as Salesforce, said Gartner analyst Ed Anderson.

Use quickly spread to SharePoint, OneDrive and Dynamics ERP users, as they found that Flow — a low-code app-design tool — enabled the creation of connectors and apps without developer overhead. Third-party consultants and developers also used PowerApps to speed up deliverables to clients. Power BI, Power Automate and PowerApps together became known as the Microsoft Power Platform a year ago.

“PowerApps are really interesting for OneDrive and SharePoint because it lets you quickly identify data sources and quickly do something meaningful with them — connect them together, add some logic around them or customized interfaces,” Anderson said.


Microsoft acquires Mover.io to ease OneDrive migrations

Microsoft plans to add cloud migration tools to aid SharePoint on-premises migrations to the OneDrive cloud with its Monday acquisition of Mover.io, an eight-year-old Canadian startup specializing in the self-service mass migration of enterprise files.

Mover.io, acquired for an undisclosed sum, also provides cross-cloud migrations from a dozen OneDrive file-sharing cloud competitors, including Box, Dropbox, Egnyte and Google Drive. Microsoft continues to support SharePoint on-premises, but the company has not said how long it will continue to do so, leaving room for speculation among users and experts.

Mover.io, acquired a month after Microsoft bought data-migration vendor Movere, will join several file-migration tools and services already on the Microsoft cloud platform, including FastTrack and the SharePoint Migration Tool. Users also have a choice of several other third-party tools to do the job, including ShareGate and Metalogix, which support file migrations to OneDrive.

Microsoft could, theoretically, poach customers from competing cloud file-management systems such as Box with the Mover.io migration tools. But the real OneDrive migration target customer for the Mover.io tools is Microsoft’s SharePoint on-premises base, said Deep Analysis founder Alan Pelz-Sharpe.

Enterprise-scale file migrations from on-premises servers to the cloud pose challenges of maintaining file directory structure as well as access and security policies, Pelz-Sharpe said. SharePoint enterprise migrations in particular can be even thornier because it was designed for front-line office workers to set up ad-hoc file-sharing sites with little IT assistance.

The fact that SharePoint’s been around for nearly two decades, pre-dating widespread cloud adoption, compounds the issue. Pelz-Sharpe described one of his clients, a utility company, whose SharePoint on-premises footprint has grown over the years to 12,000 SharePoint sites.

“They have no idea what is in them, and no idea what to do with them,” Pelz-Sharpe said. “These things can be complex. It’s a recognized problem, so the more experience, skills and tools Microsoft can bring to help, the better.”

Specifics about Mover.io features integrating with the Microsoft 365 platform will come next month, said Jeff Teper, Microsoft corporate VP for Office, SharePoint and OneDrive, in a blog post announcing the acquisition.


AIOps tools mature as cloud infrastructure grows

Enterprise IT interest in AIOps tools has grown in 2019, as reflected in the latest features in DevOps monitoring tools, but advanced IT automation still hasn’t caught on beyond the bleeding edge.

Two AIOps vendors have reported sales strong enough to propel them to IPOs this month: first, Dynatrace re-entered the New York Stock Exchange Sept. 10 after five years of ownership by a private equity firm. DevOps monitoring competitor Datadog is expected to launch an IPO this week. A third cloud-native monitoring company, New Relic, already publicly traded, reported faltering sales numbers earlier this year and underwent C-suite upheaval as a result, but plowed ahead with fresh features for the New Relic One suite at its user conference this week.

IT industry analysts speculate that the New Relic One development process, which began in late 2017 and integrated IP from multiple acquisitions, may have caused the business setback for the company. It remains to be seen whether those efforts will pay off in an expanded customer base for New Relic.

New Relic One refreshes perspectives on IT monitoring data

For existing New Relic users, New Relic One contains important improvements that will address their needs in the future. Most importantly, New Relic One unified dashboard views that had previously been fragmented, and added the ability to search across multiple domains, whether user accounts or public clouds.

“New Relic One gave us a nice common interface, where we used to only be able to search data within user sub-accounts,” said Joshua Biggley, senior enterprise monitoring engineer at Cardinal Health, a healthcare services and products company based in Dublin, Ohio. Cardinal Health has used New Relic since 2016, and rolled out New Relic One in production when it became generally available.

This week, New Relic added several feature updates, including support for third-party data sources, agentless as well as agent-based monitoring deployments, log monitoring, tools to build programmable monitoring apps on the platform and new AIOps capabilities.

For Cardinal Health, log monitoring data and information from third-party sources such as time-series databases and open source distributed tracing tools will add more dimensions of context to the centralized DevOps monitoring interface, Biggley said. He and his colleagues plan to use the programmability features to build new monitoring databases and assess a broader range of relationships between pieces of monitoring data.

“You can say, ‘These servers have eight CPUs each, and your workload average is 10,’ but you don’t know what that means if you don’t know how many CPUs are actually configured,” Biggley said.

AIOps will mature alongside DevOps

New Relic has previously offered data analytics, which it calls Applied Intelligence, in its products, but this week’s New Relic One updates add AIOps features such as expanded alert reduction and automated creation of notifications and workflows in third-party tools such as PagerDuty, ServiceNow and Slack.
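Under the hood, alert reduction generally comes down to collapsing raw events that share a fingerprint within a time window, so one incident produces one notification rather than hundreds. The sketch below illustrates that idea generically; it is not New Relic’s implementation:

```python
# Conceptual alert-reduction sketch: collapse raw events that share a
# fingerprint (host + check name) within a time window into one incident.
from collections import defaultdict

WINDOW_SECONDS = 300

def correlate(events):
    """events: list of dicts with 'host', 'check' and epoch 'ts' keys."""
    incidents = defaultdict(list)
    for e in sorted(events, key=lambda e: e["ts"]):
        fingerprint = (e["host"], e["check"], e["ts"] // WINDOW_SECONDS)
        incidents[fingerprint].append(e)
    return incidents

events = [
    {"host": "web-1", "check": "cpu", "ts": 1000},
    {"host": "web-1", "check": "cpu", "ts": 1030},
    {"host": "db-1", "check": "disk", "ts": 1100},
]
for key, group in correlate(events).items():
    print(key, f"{len(group)} raw event(s) -> 1 notification")
```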

This type of AI-driven IT automation has been a hot topic in IT ops circles since 2017, but it has taken until this year for AIOps products to fully mature, and it will still take more time before IT shops are ready to put them to widespread use.

“Event correlation and alert reduction are the unicorn everyone’s chasing,” Biggley said. “But people tend to be afraid of automation, and it all depends on data — garbage in, garbage out.”

Biggley said he wants to clarify the specific goals he hopes to achieve with AIOps automation before he dives in.

“You can apply machine learning to anything, but should you?” he said.


Industry-wide, enterprises are adopting tools with AI built in: a Q2 2019 Forrester Research survey found that 51% of global infrastructure decision makers have already adopted, or are in the process of implementing, AI- and machine learning-enabled systems, with another 21% stating that they plan to adopt those technologies in the next year.

However, the percentage of companies that have achieved AIOps automation in production using such tools is unclear at this point. AIOps early adopters have gained advantages from alert reduction through such tools, but not without having to work through data quality issues, and some remain skeptical of their ability to deliver incident auto-remediation.

The DevOps monitoring maturation process among enterprises actually tends to make things worse before they get better, said Nancy Gohring, analyst at 451 Research.  

“As companies reorganize into DevOps teams that both develop and operate microservices, performance actually gets worse for a while, because there are too many tools and unclear responsibilities,” Gohring said. “Eventually, organizations form a more centralized observability team, reduce the number of tools they use, and application performance improves.”

Only once organizations get past the initial chaotic stage of DevOps adoption can they proceed to AIOps automation that reduces the manual intervention DevOps monitoring tools require, Gohring said. Such tools also provide the most value for complex cloud-native architectures, such as container-based microservices, and most enterprises haven’t yet widely adopted such infrastructures in production.

Dynatrace predicts future of NoOps

Not everyone shares Gohring’s outlook on the pace of AIOps adoption. Dynatrace, for example, resumed its status as a publicly traded company with a focus on advanced IT automation, and the prediction that many of its customers will soon get to NoOps, where systems resolve IT incidents with no human intervention.

“When customers see what we’ve achieved with Dynatrace and NoOps, they see that it’s possible,” said Dynatrace co-founder and CTO Bernd Greifeneder. “We’ve heard a lot about NoOps being a dumb idea that will never work, but I can invite you to our own labs to see it.”

451 Research’s Gohring is skeptical that NoOps will ever become mainstream in enterprises.

“Some future phase will look a lot closer to NoOps,” she said. “But that’s far down the road, hazy and ill-defined. We’re taking steps toward it, but it’s unclear if it’s achievable.”


Gen Z in the workforce both want and fear AI and automation

For Gen Z in the workforce, AI and automation are useful and time-saving tools, but also possible threats to job security.

Typically characterized as the demographic born between the mid-1990s and mid-2000s, Generation Z is the first generation to truly grow up exclusively with modern technologies such as smartphones, social media and digital assistants.

Many Gen Z-ers were still young when they first experienced Apple’s Siri, released in 2011, and then Amazon’s Alexa, introduced in 2014 alongside the Amazon Echo.

The demographic as a whole tends to have a strong understanding of the usefulness of AI and automation, said Terry Simpson, technical evangelist at Nintex, a process management and automation vendor.

Gen Z in the workforce

Most Gen Z employees have confidence in AI and automation, Nintex found in a September 2019 report based on a survey of 500 current and 500 future Gen Z employees. Some 88% of the survey takers said AI and automation can make their jobs easier.

This generation understands AI technology, Simpson said, and its members want more of it in the workplace.

“For most organizations, almost 68 percent of processes are not automated,” Simpson said. Automation typically replaces menial, repetitive tasks, so lack of automation leaves those tasks to be handled by employees.

Gen Z wants more automation in the workplace, even as they fear it could affect job security.

For Gen Z in the workforce, a lack of automation can be frustrating, Simpson said, especially when Gen Z-ers are so used to the ease of digital assistants and automated programs in their personal lives. Businesses generally haven’t caught up to the AI products Gen Z-ers are using at home, he said.

Yet, even as Gen Z-ers have faith that AI and automation will help them in the workplace, they fear it, too.

Job fears

According to the Nintex report, 57% of those surveyed expressed concern that AI and automation could affect their job security.

“A lot of times you may be a Gen Z employee that automation could replace what you’re doing as a job function, and that becomes a risk,” Simpson said.


Still, he added, automation can help an organization as a whole, and can ease the employees’ workloads.

“Everybody says I don’t want to lose my job to a robot, and then Outlook tells you to go to a meeting and you go,” said Anthony Scriffignano, chief data scientist at Dun & Bradstreet.

Jobs that can be easily automated may eventually be given to an automated system, but AI will also create jobs, Scriffignano said.

As a young generation, Gen Z-ers may have less to fear than other generations, however.

Younger generations are coachable and more open to change than the older generations, Scriffignano said. They will be able to adapt better to new technologies, while also helping their employers adapt, too.

“Gen Z have time in their career to reinvent themselves and refocus” their skills and career goals to better adapt for AI and automation, Scriffignano said.
