Tag Archives: data

U.S. facility may have best data center PUE

After years of big gains, the energy efficiency of data centers has stalled. The key data center efficiency measurement, power usage effectiveness, is not improving. It even declined a little from last year.

The reason may have to do with the limits of the technology in use by the majority of data centers.

Improving data center PUE “will take a shift in technology,” said Chris Brown, chief technical officer of Uptime Institute LLC, a data center advisory group in Seattle. Most data centers — as many as 90% — continue to use air cooling, which isn’t as effective as water in removing heat, he said.

But one data center has made the shift in technology: the National Renewable Energy Laboratory (NREL) in Golden, Colo. NREL’s mission is to advance energy-related technologies such as renewable power, sustainable transportation, building efficiency and grid modernization. Its supercomputing data center deploys technologies that help it achieve a very low data center PUE.

The technologies include cold plates, which use liquid to draw waste heat away from the CPU and memory. The data center also has a few rear-door heat exchangers, which are fitted to the rear of server racks. Hot exhaust air from the servers passes over water-carrying coils in the exchanger, which cool the air before it enters the data center room.

“The closer you get to rack in terms of picking up that waste heat and removing it, the more energy efficient you are going to be,” said David Sickinger, a researcher at NREL’s Computational Science Center.

Data center efficiency gains have stalled

NREL uses cooling towers to chill the water, which can be as warm as 75 degrees Fahrenheit and still cool the systems. The cooler and drier climate conditions of Colorado help. NREL doesn’t have mechanical cooling, which includes chillers. 

Because of the increasing power of high-performance computing (HPC) systems, “that has sort of forced the industry to be early adopters of warm water liquid cooling,” Sickinger said. 

The lowest possible data center PUE is 1, which means that all the power drawn goes to the IT equipment. NREL is reporting that its supercomputing data center PUE is 1.04 on an annualized basis. The NREL HPC data center has two supercomputers in a data center of approximately 10,000 square feet.

“We feel this is sort of world leading in terms of our PUE,” Sickinger said.

Something else that NREL believes sets it apart is its reuse of the waste heat energy. The lab uses it to heat offices and for heating loops under outdoor patio areas to melt snow.

More than 10 years ago, the average PUE as reported by Uptime was 2.5. That has since improved: by 2012, the average data center PUE was 1.65. It continued to improve slightly but has since leveled off. In 2019, the average data center PUE ticked up to nearly 1.7.

“I think as an industry we started to get to about the end of what we can do with the way we’re designing today,” Brown said. He believes in time data centers will look at different technologies, such as immersion cooling, which involves immersing IT equipment in a nonconductive liquid.

Improvements in data center PUE add up. If a data center has a PUE of 2, it is using 2 megawatts of power to support 1 megawatt of IT load. But if a data center can lower the PUE to 1.6, then 1.6 megawatts is being used by the facility, providing a savings of about 400 kilowatts of electrical energy, Brown said.
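
As a back-of-the-envelope check on that arithmetic, the short PowerShell sketch below (hypothetical figures, used purely as a calculator) computes total facility draw from the PUE definition and the savings Brown describes:

# PUE = total facility power / IT equipment power, so total draw = IT load x PUE
$itLoadMW  = 1.0    # IT load in megawatts
$pueBefore = 2.0
$pueAfter  = 1.6
$drawBefore = $itLoadMW * $pueBefore             # 2.0 MW of total facility power
$drawAfter  = $itLoadMW * $pueAfter              # 1.6 MW of total facility power
$savedKW    = ($drawBefore - $drawAfter) * 1000
"Total draw falls from $drawBefore MW to $drawAfter MW, saving $savedKW kW"   # 400 kW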

Data centers are becoming major users of electricity in the United States. They account for nearly 2% of all U.S. electrical use.

In a 2016 U.S. government-sponsored study, researchers reported that data centers accounted for about 70 billion kWh of electricity use in 2014 and were forecast to reach 73 billion kWh in 2020. That estimate has not been updated, according to energy research scientist Jonathan Koomey, one of the authors of the study.

Koomey, who works as an independent researcher, said it is unlikely the estimates in the 2016 report have been exceeded much, if at all. He’s involved in a new independent research effort to update those estimates.

NREL is working with Hewlett Packard Enterprise to develop AI algorithms specific to IT operations, also known as AIOps. The goal is to develop machine learning models and predictive capabilities to optimize the use of data centers and possibly inform development of data centers to serve exascale computing, said Kristin Munch, NREL manager of the data, analysis and visualization group.

National labs generally collect data on their computer and data center operations, but they may not keep this data for a long period of time. NREL has collected five years’ worth of data from its supercomputer and facilities, Munch said.

Gartner Names Microsoft a Leader in the 2019 Enterprise Information Archiving (EIA) Magic Quadrant – Microsoft Security

We often hear from customers about the explosion of data, and the challenge this presents for organizations in remaining compliant and protecting their information. We’ve invested in capabilities across the landscape of information protection and information governance, inclusive of archiving, retention, eDiscovery and communications supervision. In Gartner’s annual Magic Quadrant for Enterprise Information Archiving (EIA), Microsoft was named a Leader again in 2019.

According to Gartner, “Leaders have the highest combined measures of Ability to Execute and Completeness of Vision. They may have the most comprehensive and scalable products. In terms of vision, they are perceived to be thought leaders, with well-articulated plans for ease of use, product breadth and how to address scalability.” We believe this recognition represents our ability to provide best-in-class protection and deliver on innovations that keep pace with today’s compliance needs.

This recognition comes at a great point in our product journey. We are continuing to invest in solutions that are integrated into Office 365 and address information protection and information governance needs of customers. Earlier this month, at our Ignite 2019 conference, we announced updates to our compliance portfolio including new data connectors, machine learning powered governance, retention, discovery and supervision – and innovative capabilities such as threading Microsoft Teams or Yammer messages into conversations, allowing you to efficiently review and export complete dialogues with context, not just individual messages. In customer conversations, many of them say these are the types of advancements that are helping them be more efficient with their compliance requirements, without impacting end-user productivity.

Learn more

Read the complimentary report for the analysis behind Microsoft’s position as a Leader.

For more information about our Information Archiving solution, visit our website and stay up to date with our blog.

Gartner Magic Quadrant for Enterprise Information Archiving, Julian Tirsu, Michael Hoeck, 20 November 2019.

*This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Microsoft.

Gartner does not endorse any vendor, product, or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally, and is used herein with permission. All rights reserved.

Author: Steve Clarke

Tableau Foundation, Splash fight for clean water in schools

Data is one of the tools being used in the fight for clean water in schools in Ethiopia and India, and the Tableau Foundation is helping provide both the funds and technology that turn information into a vehicle for change.

At the recently completed Tableau Conference 2019 in Las Vegas, the vendor revealed a $1 million grant through the Tableau Foundation to support Splash in its effort to provide safe water, sanitation and hygiene programs to urban schools throughout Addis Ababa, Ethiopia, and Kolkata, India, by 2023.

Through its foundation, Tableau is also providing Splash with free licenses to use its platform, access to Tableau’s network of partners, and free tech support.

Splash, meanwhile, is a global nonprofit organization working to ensure safe water, sanitation and access to proper hygiene to children living in urban poverty throughout Asia and Africa.

The Tableau Foundation had already been connected with Splash for three years, giving the group small grants during that time. The foundation made the $1 million commitment to Splash a few months before the conference.

In total, the Tableau Foundation and Splash are hoping to reach approximately a million children — more than 450,000 in Addis Ababa and more than 450,000 in Kolkata — over a five-year period. And once success is proved in those cities, Splash hopes to broaden its reach to other cities throughout Ethiopia and India.

“What it means is the kids will have lower rates of diarrheal diseases so they actually can attend schools and improve education outcomes,” said Neal Myrick, global head of the Tableau Foundation. “Girls who are going into puberty can get help during their menstrual periods and they don’t have to skip school during that time.

“It’s a health-related thing that has direct education outcomes, and then the broader community takes advantage of the clean water systems that are installed in the schools,” he continued.

A dashboard displays Splash’s projected schedule for providing clean water to more than 400,000 children in Addis Ababa, Ethiopia.

What data can do

Currently, Splash is providing water, sanitation and hygiene access to 16% of the schools in Addis Ababa and 19% of the schools in Kolkata.

Data will play a major role in helping Splash reach its goal of expanding that reach to approximately 1 million children by 2023, said Eric Stowe, founder and CEO of Splash.

Data will be a means of convincing municipalities to allow Splash to come into their schools, and of convincing potential donors to contribute the funds Splash needs to continue its work.

“It’s hard to get a government to move on something,” Stowe said. “Moving a city forward is hard unless you have good data for them to look at — you’re not going to get the minister of education in a city to act with Excel spreadsheets. You have to show them how you’re moving the needle.”

Before using Tableau, Splash had stories of individual people and schools it could tell, but those anecdotes didn’t tell the full story.

They didn’t demonstrate what Splash could do at scale; for example, what Splash could do for 500,000 children instead of just 500.

“We’re using Tableau to show big data sets in meaningful ways that we’ve never been able to do before,” Stowe said. “We need to show governments we can be cost-effective at scale so they allow us to go into other cities.”

Splash’s stack

In 2011, Splash began publishing its data online, showing its work on an internally designed website called ProvingIt.

Proving it, however, proved to be cumbersome for Splash.

Not only was the data entry process difficult, so were the data management and data preparation, which required manual manipulation.

Splash needed something better.

“It was unwieldy,” Stowe said. “There were redundant data entries, and it wasn’t pulling data from a controlled resource.”

With the help of Tableau, Splash was able to completely redesign its analytics stack from data entry through insight, and do so in a mere 10 weeks.

Splash now uses CommCare to collect data on the ground, Salesforce as its customer relationship management system, Microsoft Azure and Snowflake for data warehousing, Alteryx to clean its data, and Tableau to visualize the data. Meanwhile, it uses Mapbox as its mapping tool.

Coincidentally, Salesforce and Tableau are in the process of combining forces after Salesforce acquired Tableau in June for $15.7 billion.

Beyond Splash

Splash is one of about two dozen nonprofit organizations and projects the Tableau Foundation is helping with both funding and technology.

Others include a project called Community Solutions that uses software training and financial support to combat homelessness in 50 U.S. cities; Visualize No Malaria, which is attempting to eliminate malaria in Zambia by 2021; and projects to battle climate change.

Most, according to Myrick, are like the Tableau Foundation’s commitment to Splash in that they involve more than simply contributing money to a cause. Tableau said at its 2018 user conference that it planned to commit $100 million to philanthropic works from 2018 to 2025, but when it teams with an organization, it does so with a specific goal and timeframe in mind.

Visualize No Malaria, for example, represents a $4 million commitment from the Tableau Foundation since 2014. In the five years since the project began, better data has led to faster action, and Myrick said there has been a 90% decline in malaria deaths and an 80% decline in malaria cases in Zambia’s Southern Province.

“Now, when a clinic director is looking at a dashboard, she is seeing the number of active cases discovered, people tested, people treated, and that data are no more than a week old,” Myrick said. “That lets her better allocate and target her scarce resources. Now Zambia is committed to roll that process out nationwide, and they’re also rolling into Senegal and Ethiopia.”

Next for Splash

Now that Splash has secured a $1 million grant from the Tableau Foundation, as well as $8 million from the Ethiopian government and is in the process of trying to secure $4 million from India’s government — part of $45 million it eventually hopes to raise — it can move beyond the beginning phase of its projects in Addis Ababa and Kolkata.

According to Stowe, while 2019 was focused on pilot projects — starting small — 2020 will focus on developing a model so that school projects will go as smoothly as possible. The next three years will then focus squarely on reaching the goal of providing clean water, sanitation and hygiene education to more than 1,500 schools and nearly a million children.

Data, meanwhile, will continue to be Splash’s tool for carrying out its mission.

“We need to be able to show and convince governments of the cost efficiencies and the direct impact we can make, and the increase we can provide in education for girls in particular,” Stowe said. “We also need to show schools where they land on the spectrum compared to other schools, if they’re the gold standard or if they’re laggards.”

He added that data is important when meeting with donors as well.

“We need to show that the funds are being used effectively,” Stowe said, “and that we’re on schedule or ahead.”

The look of the data, however, differs depending on who is viewing it.

Different dashboards are designed for different audiences — governments, schools, donors, even finance teams — taking advantage of what is perhaps Tableau’s greatest strength.

“We need really good data,” Stowe said, “that is convincing as hell.”

New Salesforce Customer 360 aims to unify data sources

SAN FRANCISCO — Salesforce’s new customer data and identity platform released this week connects data across sales, service, marketing and other departments within an organization to provide users with a single ID for each customer.

The platform, Salesforce Customer 360 Truth (also called Single Source of Truth and SSOT by Salesforce executives), is considered by many to be a customer data platform (CDP) — though that is only one component of the new platform.

Whatever one calls it, Salesforce Customer 360 Truth resembles competing CDPs from Oracle, Adobe and Microsoft in functionality. Customers, analysts and partners said the new feature bundle solves a problem endemic to many CX platforms: reconciling customer IDs and data across sales, marketing, e-commerce and customer service platforms to a golden record.

Features in Customer 360 Truth — which come at no extra charge to Salesforce subscribers — include customer data management, identity management and privacy tools that are available now. Salesforce plans to release a unified customer profile, the most significant feature, in 2020.

The capabilities, taken together, will not only aggregate updated customer data into one base like a CDP, but will be able to go further than CDPs, which are typically used by marketing and advertising teams. Customer 360 Truth features can route that data and push actions and personalizations to sales, service and support teams dealing with an individual customer, the company said.

Customer data will be structured on the Cloud Information Model open standard modeled by MuleSoft and developed jointly by AWS, Genesys and Salesforce for The Linux Foundation.

It’s all long overdue, IDC analyst Neil Ward-Dutton said.

“Salesforce should have done this five years ago. If anything’s surprising, it’s that they have managed to not have it until now, because many customers have more than one cloud and there’s been no easy way to get a view of individual, unique IDs,” Ward-Dutton said. “It’s instructive that it’s free. I think that’s the only thing they could do. If they would have charged for this, it would have been a really bad mistake.”

Customer 360 Truth’s Data Manager connects customer data across Salesforce clouds and non-Salesforce systems and matches that data to individual customers to create a common profile and issue a single Salesforce ID.

Customers, partners take stock

Customers generally reacted positively to the news of Salesforce Customer 360 Truth at Dreamforce here this week, with some wondering what data and process housecleaning outside of Salesforce will be required to use the tools.

That’s the case for e.l.f. Cosmetics, CIO and CTO Ekta Chopra said. Her company runs Salesforce marketing, sales, service and e-commerce clouds, and also is an early adopter of Salesforce’s order management system, processing about a million transactions a year. While Customer 360 Truth features look promising, her company will have to evaluate how to manage different types of profiles such as customers versus wholesalers.

“We have to make sure we’re gathering all that data in the best possible way,” Chopra said. “We’re not just a direct-to-consumer business.”

Hyland Software is both a Salesforce customer and a partner, with its OnBase enterprise content management system integration available on Salesforce’s AppExchange. Salesforce Customer 360 Truth is a move in the right direction to reconcile conflicting customer data, but the process will always require a mix of different vendors’ tools to nail it all down, said Ed McQuiston, Hyland executive vice president and chief commercial officer.

“There is no one, ubiquitous platform that gives you 360,” McQuiston said. “Salesforce plays a critical part for us in terms of understanding the customer, our interactions, et cetera. But we use our own product with it, because I want to see the contracts we have, the support information. I want that complete view.”

Patrick Morrissey, general manager of Salesforce partner Upland Software, said he thinks Customer 360 features will help Salesforce customers use Upland’s Altify revenue management tools more effectively.

“Customer revenue optimization intersects quite nicely with Customer 360,” Morrissey said. “The problem is that the data and processes don’t connect. The vision that Salesforce has around Customer 360 is fantastic, because it brings the data together for the customer and reduces friction.”

CDPs can only go so far

Salesforce might not call Customer 360 Truth a CDP because its capabilities extend beyond what competing CDPs do, said Joe Stanhope, analyst at Forrester Research, who watches the technology closely.

“Salesforce was talking quite a bit about CDPs in the early iterations of Customer 360,” Stanhope said. “But I think, over time, the scope evolved and expanded. Ultimately, Customer 360 is about more than a CDP, and even more than just marketing. Customer 360 is the key to enabling the Salesforce ecosystem with data.”

Donna Kidwell, CTO of Arizona State University’s EdPlus online learning arm, sees the Salesforce tools as a good start to wrangling sprawling data. Her team is building a blockchain ledger to track accomplishments of the university’s learners, who include students pursuing degrees, professionals earning certifications, high schoolers attending camps and others who interact in some way with the university.

The ambitious project involves Salesforce CRM data and Salesforce Blockchain as a spoke of a much larger wheel that ultimately will enable data sharing across educational institutions and employers.

CDPs in general — and Salesforce Customer 360 Truth in particular — may help consolidate data that can be fed into the ledger at some point in the future. But ultimately, managing customer data across learning systems, HR applications, Salesforce and other contributing systems is a much larger problem than a CDP can solve.

“I’m definitely tracking the CDPs,” Kidwell said. “I’m hopeful that Salesforce will ease some of those concerns, but I can’t imagine they’ll be the single source. There’s not going to be a single source of truth. We’re actually going to need data strategies, and our technologies will help implement those strategies.”

Clumio extends support to AWS EBS with $135M funding bump

Clumio plans to use its $135M Series C funding to expand its data protection scope.

The backup-as-a-service startup came out of stealth in August with protection for VMware on premises, VMware Cloud on AWS and native AWS services. Clumio this week disclosed its latest funding round. Clumio’s Series A and B rounds raised $11M and $40M, respectively, so the latest $135M injection marks a substantial funding boost for the company. Sutter Hill Ventures and Altimeter Capital led the new funding round.

Clumio CEO Poojan Kumar said the data protection vendor will build out its services platform eventually to cover every data source, starting with Amazon Elastic Block Store (EBS). The Amazon EBS support was rolled out in beta this week, and Clumio is working on support for Microsoft Azure and Google Cloud Platform.

Clumio’s Amazon EBS support is expected to become generally available before the end of 2019.

Clumio’s mission is to do with backup what Salesforce has done with customer relationship management (CRM) by delivering backup as a service entirely through the cloud. All processes involved with backup, such as deduplication, access management, encryption, backup scheduling and resource allotment, would be handled through the service.

Kumar said as customers’ infrastructures expand beyond their data centers, keeping their data protected has grown more complicated. Clumio is targeting customers who don’t want to devote IT resources to keeping up with that complexity.

“Customers are saying, ‘I want to stop the business of doing this myself,'” Kumar said.

Kumar said this is especially true with cloud-native data. He said customers have told him that data generated from a cloud application shouldn’t then be replicated to a data center in order to back it up. He said most customers want to lower or remove their data center footprint.

Clumio now protects AWS EBS data in addition to VMware Cloud on AWS.

Investor interest in backup

This has been a banner year for funding for backup vendors, with four vendors alone pulling in more than $1 billion. Veeam received $500 million and Rubrik pulled in $261 million in January, and Druva raised $130 million in June.

Christophe Bertrand, senior analyst at IT analysis firm Enterprise Strategy Group, said Clumio nearly tripled its previous funding, although it competes in a crowded backup market, because of its “born-in-the-cloud-ness.” Cloud-native service offerings, subscription pricing and the lack of on-premises investment are all seen as the direction of where backup is currently headed, so investors are buying in now.

“Investors are looking at this as where the market is going,” Bertrand said. “Winning Best of Show at VMworld probably helped, too.”

Bertrand was quick to note that Clumio faces a tough competitive field, however.  Druva and Carbonite — now  part of OpenText — have provided cloud-based backup for years, and long-time backup vendor Commvault added its Metallic SaaS backup service in October. Most of the other large backup vendors can also protect data in the cloud.

Bertrand expects Clumio to invest in extending its geographical reach and bolstering its partner ecosystem. As for product development, he expects Clumio to support more applications, hypervisors besides VMware vSphere and even use cases beyond backup.

Kumar said the Clumio roadmap calls for adding features such as security, container support, customer access keys and bandwidth throttling. He said Clumio will also develop support for new workloads, expand its channel strategy and add to its engineering team.

“They have to do these things quickly to gain traction,” Bertrand said. “It’s going to become a very contested space.”

Azure Quickstart Center – Microsoft Garage

It started with data. Data showed customers were experiencing difficulties onboarding to Azure, not knowing how to get started, with murky understanding of all the tools and services available to them. Ayesha Ghaffar, a data scientist who analyzed these and many other customer onboarding experiences, saw an opportunity to improve the existing process. She put together a proposal of a new way to acclimate customers to Azure. Ayesha began meeting with a few product teams about her idea and everyone felt it was worthwhile to pursue – but what she found was resources were thin and other commitments took priority. “I just wanted to make this idea a reality, but I was a data scientist and I didn’t know how on my own. I needed help to make this happen.” The answer came in the 2017 company-wide Hackathon. “A week before the Hackathon started, I thought, ‘I’m just going to write an email and put it out there. This is the problem, this is how we can solve it, and this is the impact.’ I sent it to everyone.”

Soon, employees from around the world – designers, user researchers, developers, program managers – wanted to join her Hackathon project. They found a common passion for the customer experience. Peilin Nee shared her experience as part of the hack team. “Designing with empathy begins with yourself. This was my first time working with 15 others coming from different disciplines who have never met before. What amazed me is how we put aside our differences and work towards our shared dream – provide new Azure users a better getting-started experience and ultimately increase trial users’ adoption rate. And, we did just that in two and a half days.”

Azure Quickstart Center

For their hack project, called Azure Launchpad, they created an extension in the Azure Portal that served as a hub for key getting-started resources and personal recommendations for customers to achieve their cloud goals. Their efforts were noticed by the head of the Azure Portal product, Leon Welicki, who was able to secure a Portal developer, Asher Garland, to continue to work on the project. While Ayesha was still a data scientist, she worked with Asher on a private preview version of Quickstart Center for the Ignite 2017 conference.

After the positive feedback they received from the private preview released for the conference, they were invited to present their work to Scott Guthrie, who later mentioned it in one of his employee meetings. Because Ayesha had moonlighted as a product manager for the hack project, the Portal team eventually hired her to work on it full time as the PM in November 2017.

At Build 2018, Azure Quickstart Center shipped in public preview and became available to all customers through the Azure Portal. In 2019, Azure Quickstart Center partnered with the FastTrack team and Azure Docs to introduce Azure setup guides in the Quickstart Center to help customers deal with more complex scenarios like governance and migration based on the Microsoft Cloud Adoption Framework for Azure.

“The Cloud Adoption Framework provides the One Microsoft voice for cloud adoption. Azure Quickstart Center has made that voice actionable by injecting pointed and relevant guidance into the customer experience. The user now has prescriptive instructions at the point of execution. Together, these assets create a seamless integration of strategy and tactical execution,” shared Brian Blanchard, Sr. Director, Cloud Adoption Framework.

The Hackathon helped the team get feedback and continues to be a way to showcase the value of their ideas and projects. For Ayesha, “Hackathon is one of my favorite things at Microsoft – it helped me land a job as a PM, get visibility on the solution, and I learned how to solve customer problems in depth.” For the 2019 Hackathon, Ayesha and team (along with FastTrack team) iterated on features they could add that customers have said they wanted. “The goal is to democratize knowledge and put it in these custom guides. Our superpower is enabling our customers and partners to find their own superpowers. Cloud can be complicated, we want to enable customers and partners to simplify it for their organizations by creating their own custom guides for their users.”

Author: Microsoft News Center

Alteryx analytics platform focuses on ease of use

In its effort to make data analysis accessible to data consumers of all levels, ease of use is a key tenet of the Alteryx analytics platform.

Alteryx, founded as SRC LLC in 1997 and based in Irvine, Calif., provides a suite of four tools designed for both data scientists and data analysts to manage and interpret data. Included in the Alteryx analytics platform are Alteryx Connect, Alteryx Designer, Alteryx Promote and Alteryx Server.

The vendor’s most recent update was Alteryx 2019.3, which was released in October. Alteryx 2019.4 is scheduled for release in December.

The October release included more than 25 new and upgraded features, with ease of use at the core of many, according to Ashley Kramer, senior vice president of product management at Alteryx.

Among them are features to help reduce the number of clicks it takes to get to the point of visualizing data to make analytic decisions. Data Profiling enables the user to visualize their data as they work with it, and Alteryx introduced a way to work with data in an Excel-like format at the bottom of the screen as well.

“What we focused on in 2019.3 are really features we call You Features, which are features the customers had been asking for,” Kramer said. “Some were features people probably didn’t even notice, but some were those ones that people discover and call us crying, ‘Thank you,’ because it makes [them more efficient] as they’re building out their workloads.”

The December update to the Alteryx analytics platform will again center around ease of use, with an enhanced mapping tool part of the upgrade.

An Alteryx Designer dashboard shows the analytic workflow from data input through analysis.

Another piece will be low-code and no-code modeling assistance for developers.

“Alteryx has done a good job riding the market trends,” said Dave Menninger, analyst at Ventana Research. “They rode the data preparation wave to fuel their IPO and now they are riding the [augmented intelligence/machine learning] wave. The company remains focused on the needs of their customers, [and that] has enabled them to succeed despite some areas of the product that are not quite as modern as some of their competitors.”

He added that ease of use is a strength of the Alteryx analytics platform, with the qualifier that Alteryx has traditionally been aimed at users with a high degree of data literacy.

“Alteryx is easy to use for the personas they target — data-literate analysts preparing data for others and analyzing it for themselves,” Menninger said. “They are not a general-purpose BI tool.”

That said, Alteryx is looking to make its product more accessible to citizen data scientists.

The vendor has a partnership with Tableau, for example, and the Alteryx analytics platform can be used in concert with Tableau or any other vendor’s business intelligence tools. In addition, Alteryx, like other vendors in the analytics industry, is focused on increasing data literacy as data analysis becomes a greater part of everyday business.

A particular focus for Alteryx, as evidenced by the low-code and no-code modeling assistance that will be included in the December release, is tools to aid developers.

Looker included a kit for developers in its most recent update, and Yellowfin followed suit with its own set of developer tools in its November release.

“We now have in beta what we’re calling Assisted Modeling, which is basically a guided utility to walk the analyst who doesn’t have any coding skills through the model-building process,” Kramer said. “They have a data problem, they want to do something like predict, but they don’t know what kind of model they want — they might not even know they need a model — so they can use this and we’ll walk them through.”

Alteryx’s intent in two recent acquisitions — ClearStory Data in April and Feature Labs in October — was to make the development process more accessible to those without information technology backgrounds.

“They are making the data preparation tasks more interactive with the ability to visualize the data at various steps in the process,” Menninger said. “With the acquisition of ClearStory Data earlier this year, they can provide data profiling and automatically infer relationships in the data.”

Feature Labs, meanwhile, helps designers add AI and machine learning capabilities.

“There are many tedious and time-consuming steps to creating AI/ML models,” Menninger said. “If you can automate some of those steps, you can produce more accurate models because you can explore more alternatives and you can produce more models because you’ve saved time via the automation.”

The Alteryx analytics platform will continue to focus on ease of use, with cloud migration and AI features such as recommendations and guided experiences part of the roadmap, according to the vendor.

“Organizations are trying to be data-centric and build an organization based on analytics,” said Zak Matthews, Alteryx’s sales engineering manager, “and that’s hard to do if the tools aren’t easy to use.”

SAP Data Warehouse Cloud may democratize data analytics

SAP Data Warehouse Cloud is the latest entrant designed to democratize data analytics.

SAP Data Warehouse Cloud is a data warehouse as a service product that became generally available in October. It connects a variety of data sources but enables organizations to decide where and how they want to keep and access the data.

It includes some features that should make it attractive to business users, particularly for those already running SAP systems, but SAP Data Warehouse Cloud faces a crowded market. Still, it’s a product SAP clearly needs to remain competitive in the data warehouse and data analytics game, according to one analyst.

Making data analytics more applicable and accessible to the business is the main goal of SAP Data Warehouse Cloud, said Dan Lahl, SAP vice president of product marketing.

“In SAP Data Warehouse Cloud we’ve added new things to the data warehouses that allow customers to define their own data sets, and either virtualize or pull in that data so they can make decisions on exactly the data they want to look at,” Lahl said. “Usually you get five guys with four spreadsheets and everybody’s arguing over who’s got the data and the decision of record. SAP Data Warehouse Cloud will either virtualize or move data, specifically to the information the customer needs, and then the business user decides what dashboards or what reports they want to use on that data. It’s getting to business decisions more quickly for business users.”

One feature, Spaces, enables business users to organize data in ways that are aligned with their business or process needs.

“Data warehouses used to take a long time to build. It was very expensive, it took a number of people to build them, and by the time you got to build the reports that you wanted, nobody wanted them anymore,” Lahl said. “This is kind of the antithesis of that, getting the business user to look and access the data they want and then build reporting and visualizations off of that.”

SAP Data Warehouse Cloud also includes SAP Analytics Cloud, which provides built-in reporting and dashboard capabilities.

Opening up analytics

Velux Group, a global manufacturer of windows based in Horsholm, Denmark, has been evaluating SAP Data Warehouse Cloud as a key part in the evolution of its data analytics program. The company has 27 production sites in 10 countries, sales offices in 40 countries, and 11,500 employees.

Velux Group is a longtime SAP customer, running SAP ECC for ERP and BusinessObjects BI and Business Warehouse (BW) for data analytics and storage. However, the company has run into some limitations with BusinessObjects and is looking to overcome those limitations with SAP Data Warehouse Cloud, said Andreas Madsen, senior data and analytics partner at Velux.

Velux is in the process of developing a new business model to sell more directly to consumers, in addition to the traditional channel of installers and resellers.

“Normally, it’s the installers and dealers that actually sell our windows — we sell through them. So we don’t know that much about our end users, but that’s changing as we’re moving into the online market as well,” Madsen said.

Direct selling requires more flexibility than what Velux’s current systems offer.

“There’s a transition in what we need to know about our end users and how we use data,” he said. “And although we have a very good, well-functioning ECC system, BW, and some classic reporting, it’s really hard to navigate in, it’s really hard to explore data when everything is structured in the data warehouse.”

The ultimate strategic goal is for Velux to become more customer-centric, and Madsen said that in order to do this, the company needs a more open data platform that enables business users to link the data as needed.

“There is a world outside of our corporate enterprise systems. We cover all the business processes in Velux, but if you look at marketing they might have 50 or 60 different databases that that they use data from, they need to be able to cater to that data as well,” Madsen said. “It’s important data, but it’s not something that we are in control of, so we need to give them a platform where they can operate and then combine that with the enterprise data.”

SAP’s needed reaction to data analytics market

SAP Data Warehouse Cloud is a necessary evolution of SAP’s HANA-based data analytics approach, especially given the crowded, competitive analytics market, said Dana Gardner, president and principal analyst with Interarbor Solutions LLC.

“There are lot of companies out there like Qlik, Tableau, and others that are making inroads and we’re seeing more of this analytics as a service and the democratization of data,” Gardner said. “SAP is recognizing that they need to compete better in all aspects of data analytics and not just the enterprise systems of record integration part of it. But it needs to be noted that SAP is reacting to the market rather than making the market.”

The SAP Data Warehouse Cloud approach focuses on the democratization of data analytics and makes it simple and automated behind the scenes, so that more business users get the types of analytics they need based on their business role and what work they are doing, Gardner said.

“You don’t want to make data analytics just available to data scientists. It’s time to break down the ivory tower,” he said. “The more people that use more analytics in your organization, the better you’re going to be as a company, so it’s important that SAP gets aggressive and out on front on this.”

How PowerCLI automation brings PowerShell capabilities to VMware

VMware admins can use PowerCLI to automate many common tasks and operations in their data centers and perform them at scale. Windows PowerShell executes PowerCLI commands via cmdlets, lightweight commands that each perform a single, specific function.

Automation can help admins keep a large, virtualized environment running smoothly. It helps with resource and workload provisioning. It also adds speed and consistency to most operations, since an automated task should behave the same way every time. And because automation can guide daily repetitions of testing, configuration and deployment without introducing the same errors that a tired admin might, it aids in the development of modern software as well.

PowerShell provides easy automation for Windows environments. VMware admins can also use the capabilities of PowerShell, however, with the help of VMware’s PowerCLI, which uses PowerShell as a framework to execute automated tasks on VMware environments.

PowerShell and PowerCLI

In a VMware environment, PowerCLI provides automation and management at scale through the PowerShell framework, more quickly than working through a GUI. PowerCLI functions as a command-line interface (CLI) tool that “snaps into” PowerShell, which executes its commands through cmdlets. PowerCLI cmdlets can manage infrastructure components such as High Availability, Distributed Resource Scheduler and vMotion, and can perform tasks such as gathering information, powering VMs on and off, and altering workloads and files.
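
As a minimal sketch of what that looks like in practice — assuming the VMware.PowerCLI module from the PowerShell Gallery and a reachable vCenter server (the server name below is a placeholder) — an admin installs the module once, connects, and starts issuing cmdlets:

# One-time setup: pull the PowerCLI module down from the PowerShell Gallery
Install-Module -Name VMware.PowerCLI -Scope CurrentUser

# Connect to vCenter (placeholder name); prompts for credentials
Connect-VIServer -Server vcenter.example.com

# List every VM with its power state and the ESXi host it runs on
Get-VM | Select-Object Name, PowerState, VMHost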

PowerShell cmdlets follow a verb-noun structure: the verb defines an action to take, and the noun defines the object on which to perform that action. Parameters provide additional detail and specificity to PowerShell commands. In a single line of code, admins can enact mass changes to an entire VMware environment.
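
As an illustration of that anatomy (the VM name and memory size below are made up):

Set-VM -VM "db01" -MemoryGB 16 -Confirm:$false
# Set                        = the verb, the action to take
# VM                         = the noun, the object acted on
# -VM, -MemoryGB, -Confirm   = parameters that pick the target VM, set its memory and suppress the confirmation prompt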

Common PowerCLI cmdlets

You can automate vCenter and vSphere using a handful of simple cmdlets.

With just five cmdlets, you can execute most major vCenter tasks. To obtain information about a VM — such as a VM’s name, power state, guest OS and ESXi host — use the Get-VM cmdlet. To modify an existing vCenter VM, use Set-VM. An admin can use Start-VM to start a single VM or many VMs at once. To stop a VM use Stop-VM, which simply shuts down a VM immediately, or Stop-VMGuest, which performs a more graceful shutdown. You can use these cmdlets to perform any of these tasks at scale across an entire data center.
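
A short, hedged sketch of those cmdlets in use — the VM names and folder are placeholders, and it assumes an existing Connect-VIServer session:

# Report name, power state and host for every VM in a (hypothetical) folder
Get-VM -Location "Production" | Select-Object Name, PowerState, VMHost

# Give an existing VM more resources
Set-VM -VM "app01" -NumCpu 4 -MemoryGB 32 -Confirm:$false

# Start every VM that is currently powered off
Get-VM | Where-Object { $_.PowerState -eq "PoweredOff" } | Start-VM

# Graceful guest OS shutdown versus immediate power-off
Stop-VMGuest -VM "app01" -Confirm:$false
Stop-VM -VM "app02" -Confirm:$false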

You can also automate vSphere with PowerCLI. One of the most useful cmdlets for vSphere management is Copy-VMGuestFile, which enables an admin to copy files and folders between a local machine and a vSphere VM. Admins can add a number of parameters to this cmdlet to fine-tune its behavior. For example, -GuestCredential supplies credentials for authenticating to the VM’s guest OS, and -GuestToLocal reverses the flow of information, copying files from the guest VM back to the local machine.
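
A hedged example of both directions — the paths, VM name and credentials are placeholders, and the cmdlet requires VMware Tools running in the guest:

# Credentials for the guest OS, passed via -GuestCredential
$guestCred = Get-Credential

# Push a config file from the admin workstation into the guest VM
Copy-VMGuestFile -Source "C:\build\app.config" -Destination "C:\app\" `
    -VM "app01" -LocalToGuest -GuestCredential $guestCred

# Reverse the flow: pull a log file from the guest back to the local machine
Copy-VMGuestFile -Source "C:\app\logs\app.log" -Destination "C:\collected\" `
    -VM "app01" -GuestToLocal -GuestCredential $guestCred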

Recent updates to PowerCLI and PowerShell

PowerCLI features over 500 separate commands, and the list is only growing. In June 2019, VMware released PowerCLI 11.3, which added 22 new cmdlets for HCX management and support for opaque networks, additional network adapter types and high-level promotion of instant clones.

PowerShell is more than simply PowerCLI, of course. In May 2019, Microsoft released the first preview of PowerShell 7, which includes several new APIs from the .NET Core 3.0 runtime. At the PowerShell summit in September 2019, Microsoft announced several other developments to PowerShell programming.

PowerShell now works with AWS serverless computing, which enables you to manage a Windows deployment without managing a Windows Server machine. So, you can run PowerShell code in response to serverless events — such as an image being placed in an AWS Simple Storage Service bucket — and use it to, for example, convert that image to multiple resolutions.

PowerShell also offers a module called Simple Hierarchy in PowerShell (SHiPS). An admin can use SHiPS to build a hierarchical file system provider from scratch and bypass the normal complexity of such a task. SHiPS reduces the amount of code it takes to write a provider module from thousands of lines to around 20.
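
A minimal sketch of what a SHiPS-based provider can look like — the module, class and drive names are made up, and it assumes the SHiPS module has been installed from the PowerShell Gallery:

# MyShips.psm1 -- a toy SHiPS provider module
using module SHiPS

class MyItem : SHiPSLeaf
{
    MyItem([string]$name) : base($name) { }
}

class MyRoot : SHiPSDirectory
{
    MyRoot([string]$name) : base($name) { }

    # Whatever is returned here appears as the children of the drive root
    [object[]] GetChildItem()
    {
        return @([MyItem]::new('config'), [MyItem]::new('logs'))
    }
}

# Usage from a PowerShell session:
#   Import-Module SHiPS; Import-Module .\MyShips.psm1
#   New-PSDrive -Name MyData -PSProvider SHiPS -Root 'MyShips#MyRoot'
#   Get-ChildItem MyData: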

Microsoft to apply CCPA protections to all US customers

Microsoft is taking California’s new data privacy law nationwide.

The software giant this week said it will honor the California Consumer Privacy Act (CCPA) throughout the United States. When the CCPA goes into effect on Jan. 1, 2020, companies in California will be required to provide people with the option to stop their personal information from being sold, and will generally require that companies are transparent about data collection and data use.

The CCPA applies to companies that do business in California, collect customers’ personal data and meet one of the following requirements: have annual gross revenue of more than $25 million; buy, receive, sell or share personal data of 50,000 or more consumers, devices or households for commercial purposes; or earn 50% or more of their annual revenues from selling consumers’ personal data.

Julie Brill, Microsoft’s corporate vice president for global privacy and regulatory affairs and Chief Privacy Officer, announced her company’s plans to go a step further and apply the CCPA’s data privacy protections to all U.S. customers — not just those in California.

“We are strong supporters of California’s new law and the expansion of privacy protections in the United States that it represents. Our approach to privacy starts with the belief that privacy is a fundamental human right and includes our commitment to provide robust protection for every individual,” Brill wrote in a blog post. “This is why, in 2018, we were the first company to voluntarily extend the core data privacy rights included in the European Union’s General Data Protection Regulation (GDPR) to customers around the world, not just to those in the EU who are covered by the regulation. Similarly, we will extend CCPA’s core rights for people to control their data to all our customers in the U.S.”

Brill added that Microsoft is working with its enterprise customers to assist them with CCPA compliance. “Our goal is to help our customers understand how California’s new law affects their operations and provide the tools and guidance they will need to meet its requirements,” she said.

Microsoft did not specify when or how it will apply the CCPA’s protections for all U.S. customers. In recent years the company has introduced several privacy-focused tools and features designed to give customers greater control over their personal data.

Fatemeh Khatibloo, vice president and principal analyst at Forrester Research, said Microsoft has an easier path to becoming CCPA compliant because of its early efforts to broadly implement GDPR protections.

“They’re staying very true to all the processes they went through under GDPR,” she said. “CCPA has some differences with GDPR. Namely, it’s got some requirements to verify the identity of people who want to exercise their rights. GDPR is still based on an opt-in framework rather than an opt-out one; it requires consent if you don’t have another legal basis for processing somebody’s data. The CCPA is still really about giving you the opportunity to opt out. It’s not a consent-based framework.”

Khatibloo also noted that Microsoft was supportive of the CCPA early on, and that Brill, who formerly served as commissioner of the U.S. Federal Trade Commission under the Obama administration, has a strong history on data privacy.

“She understands the extensive need for a comprehensive privacy bill in the U.S., and I think she also understands that that’s probably not going to happen in the next year,” Khatibloo said. “Instead of waiting for a patchwork of laws to turn up, I think she’s taking a very proactive move to say, ‘We’re going to abide by this particular set of rules, and we’re going to make it available to everybody.’ The other really big factor here is, who wants to be the company that says its New York customers don’t have the same rights that its California customers do?”

Rebecca Herold, an infosec and privacy consultant as well as CEO of The Privacy Professor consultancy, argued that while CCPA does a good job addressing the “breadth of privacy issues for individuals who fall under the CCPA definition of a ‘California consumer,'” it falls short in multiple areas. To name a few criticisms, she pointed out that it doesn’t apply to organizations with under $25 million in revenue, it does not apply to all types of data or individuals such as employees, and that many of its requirements can come across as confusing.

But Herold said Microsoft’s move to apply CCPA for all 50 states makes sense and it’s something she recommends to her clients when consulting on new regional regulations. “When looking at implementing a wide-ranging law like CCPA, it would be much more simplified to just follow it for all their millions of customers, and not try to parse out the California customers from all others,” she said via email. “It is much more efficient and effective to simplify data security and privacy practices by treating all individuals within an organization’s database equally, meeting a baseline of actions that fit all legal requirements across the board. This is a smart and savvy business leadership move.”

Mike Bittner, associate director of digital security and operations for advertising security vendor The Media Trust, agreed that Microsoft’s move isn’t surprising.  

“For a large company like Microsoft that serves consumers around the world, simplifying regulatory compliance by applying the same policies across an entire geography makes a lot of sense, because it removes the headaches of applying a hodgepodge of state-level data privacy laws,” he said in an email. “Moreover, by using the CCPA — the most robust U.S. data privacy law to date — as the standard, it demonstrates the company’s commitment to protecting consumers’ data privacy rights.”

Herold added that the CCPA will likely become the de facto data privacy law for the U.S. in the foreseeable future because Congress doesn’t appear to be motivated to pass any federal privacy laws.

Brill appeared to agree.

“CCPA marks an important step toward providing people with more robust control over their data in the United States,” she wrote in her blog post. “It also shows that we can make progress to strengthen privacy protections in this country at the state level even when Congress can’t or won’t act.”

Senior reporter Michael Heller contributed to this report.
