Tag Archives: Tableau

Tableau 2020.1 unveiled for beta testing

Though a couple of weeks still remain in 2019, Tableau is turning its attention to next year, and on Wednesday rolled out Tableau 2020.1 for beta testing.

Though not yet available to the general public, the beta version of Tableau 2020.1 includes 21 features.

Among them is an update to Explain Data, an augmented intelligence product built directly into Tableau that uses statistical algorithms to analyze data and then explain what is driving specific data points. The update aims to improve the performance of Explain Data, first unveiled in Tableau 2019.3, for wide data sets, and includes refined models to help customers derive deeper insight from their data.

In addition, Tableau 2020.1 includes Dynamic Parameters, which saves users the cumbersome task of republishing a workbook every time the underlying data changes by updating parameters automatically. It also includes enhanced map-building capabilities, an add-on to Tableau Data Management that will speed up the process of getting to the right data, and improved connectors to Salesforce and Snowflake.

Despite the array of updates and new offerings, the 21 features included in the beta version of Tableau 2020.1 are modest improvements rather than major new capabilities, analysts said.

“It’s all organic growth, incremental improvements,” said Boris Evelson, principal analyst at Forrester. “[Tools like] Explain Data have been a core feature of leading enterprise BI platforms for a while now.”

Similarly, Wayne Eckerson, founder and principal consultant of Eckerson Group, noted that the platform contains upgrades but said they are not innovative new features that will force other vendors to react.

Tableau 2020.1, just released for beta testing, includes an update to Explain Data, an AI tool from the vendor that attempts to explain the reasons behind data points.

“There are a lot of incremental improvements,” he said, “and there’s more movement to Tableau Server and [Tableau Online] to achieve parity with Tableau Desktop.”

One feature not included in Tableau 2020.1 is a low-code data modeling tool.

Tableau, which is based in Seattle, revealed on its website that it plans to provide new data modeling capabilities that will allow customers to analyze data without having to learn advanced database concepts or write custom SQL code.

The capability, however, is only in the alpha testing stage at this point.

“That could be interesting,” Eckerson said. “I suspect it’s a semantic layer, which Tableau has never really had. That would be big news. They need that to keep up with Power BI, which is one of its key differences with Tableau.”

Though not a specific feature, something else not evident in Tableau 2020.1, at least not in any obvious way, is the influence of Salesforce's acquisition of Tableau.

Tableau 2020.1 marks Tableau’s second platform update since Nov. 5, when Salesforce and Tableau finally received regulatory approval to proceed with their merger and were allowed to begin working together. But the first update, Tableau 2019.4, came just a day after the companies were freed from their regulatory holdup, so they never had a chance to join forces on it.

Five weeks passed between the lifting of regulatory restrictions and the beta release of Tableau 2020.1, still not enough time for Salesforce and Tableau to collaborate significantly on technology.

The only mention of Salesforce among the 21 features in Tableau 2020.1 is the improved connector.

“I’ve seen no indications of Tableau and Salesforce doing any integrations as of yet,” Evelson said, “so this is all business as usual for Tableau.”

Tableau analytics platform upgrades driven by user needs

LAS VEGAS — Tableau revealed a host of additions and upgrades to the Tableau analytics platform in the days both before and during Tableau Conference 2019.

Less than a week before its annual user conference, the vendor released Tableau 2019.4, a scheduled update of the Tableau analytics platform. And during the conference, Tableau unveiled not only new products and updates to existing ones, but also an enhanced partnership with Amazon Web Services to help users move to the cloud and a new partner network.

Many of the additions to the Tableau analytics platform have to do with data management, an area Tableau only recently began to explore. Among them are Tableau Catalog and Prep Conductor.

Others, meanwhile, are centered on augmented analytics, including Ask Data and Explain Data.

All of these enhancements to the Tableau analytics platform come in the wake of Salesforce's acquisition of Tableau, announced last June. The deal closed on Aug. 1, but the two companies were kept from working together until just last week by a regulatory review in the United Kingdom examining what effect their combination would have on competition.

In a two-part Q&A, Andrew Beers, Tableau’s chief technology officer, discussed the new and enhanced products in the Tableau analytics platform as well as how Tableau and Salesforce will work together.

Part I focuses on data management and AI products in the Tableau analytics platform, while Part II centers on the relationship between Salesforce and Tableau.

Data management has been a theme of new products and upgrades to the Tableau analytics platform — what led Tableau in that direction?

Andrew Beers: We’ve been about self-service analysis for a long time. Early themes out of the Tableau product line were putting the right tools in the hands of the people that were in the business that had the data and had the questions, and didn’t need someone standing between them and getting the answers to those questions. As that started to become really successful, then you had what happens in every self-service culture — dealing with all of this content that’s out there, all of this data that’s out there. We helped by introducing a prep product. But then you had people that were generating dashboards, generating data sets, and then we said, ‘To stick to our belief in self-service we’ve got to do something in the data management space, so what would a user-facing prep solution look like, an operationalization solution look like, a catalog solution look like?’ And that’s what started our thinking about all these various capabilities.

Along those lines, what’s the roadmap for the next few years?

Beers: We always have things that are in the works. We are at the beginning of several efforts — Tableau Prep is a baby product that’s a year and a half old. Conductor is just a couple of releases old. You’re going to see a lot of upgrades to those products and along those themes — how do you make prep easier and more approachable, how do you give your business the insight into the data and how it is being used, and how do you manage it? That’s tooling we haven’t built out that far yet. Once you have all of this knowledge and you’ve given people insights, which is a key ingredient in governance along with how to manage it in a self-service way, you’ll start to see the Catalog product grow into ideas like that.

Are these products something customers asked for, or are they products Tableau decided to develop on its own?

Beers: It’s always a combination. From the beginning we’ve listened to what our customers are saying. Sometimes they’re saying, ‘I want something that looks like this,’ but often they’re telling us, ‘Here is the kind of problem we’re facing, and here are the challenges we’re facing in our organization,’ and when you start to hear similar stories enough you generalize that the customers really need something in this space. And this is really how all of our product invention happens. It’s by listening to the intent behind what the customer is saying and then inventing the products or the new capabilities that will take the customer in a direction we think they need to go.

Shifting from data management to augmented intelligence, that’s been a theme of another set of products. Where did the motivation come from to infuse more natural language processing and machine learning into the Tableau analytics platform?

Beers: It’s a similar story here, just listening to customers and hearing them wanting to take the insights that their more analyst-style users got from Tableau to a larger part of the organization, which always leads you down the path of trying to figure out how to add more intelligence into the product. That’s not new for Tableau. In the beginning we said, ‘We want to build this tool for everyone,’ but if I’m building it for everyone I can’t assume that you know SQL, that you know color design, that you know how to tell a good story, so we had to build all those in there and then let users depart from that. With these smart things, it’s how can I extend that to letting people get different kinds of value from their question. We have a researcher in the NLP space who was seeing these signals a while ago and she started prototyping some of these ideas about how to bring natural language questioning into an analytical workspace, and that really inspired us to look deeply at the space and led us to think about acquisitions.

What’s the roadmap for Tableau’s AI capabilities?

Beers: You’re going to see these AI and machine learning-style capabilities really in every layer of the product stack we have. We showed two [at the conference] — Ask Data and Explain Data — that are very much targeted at the analyst, but you’ll see it integrated into the data prep products. We’ve got some smarts in there already. We’ve added Recommendations, which is how to take the wisdom of the crowd, of the people that are at your business, to help you find things that you wouldn’t normally find or help you do operations that you yourself haven’t done yet but that your community around have done. You’re going to see that all over the product in little ways to make it easier to use and to expand the kinds of people that can do those operations.

As a technology officer, how fun is this kind of stuff for you?

Beers: It’s really exciting. It’s all kinds of fun things that we can do. I’ve always loved the mission of the company, how people see and understand data, because we can do this for decades. There’s so much interesting work ahead of us. As someone who’s focused on the technology, the problems are just super interesting, and I think with the way tech has been developing around things like AI and machine learning, there are just all kinds of new techniques that are available to us that weren’t mainstream enough 10 years ago to be pulling into the product.

Tableau analytics platform gets AI, data management upgrades

LAS VEGAS — Improved augmented intelligence and data preparation capabilities are at the core of new updates to the Tableau analytics platform.

The Seattle-based vendor unveiled the enhancements on Wednesday at its annual user conference here.

Tableau revealed an upgrade to Ask Data, the Tableau analytics platform’s natural language processing tool that was introduced in February. Ask Data is now able to interpret more complex questions than it could before, and it can now do year-over-year geospatial comparisons, according to the vendor.

Tableau evolving its platform

In addition, with the recent introduction of Tableau Catalog and an upgrade to Prep Conductor, users of the Tableau analytics platform will be able to prepare data within Tableau itself rather than having to cleanse it in another product before importing it.

Finally, Tableau added Metrics, a mobile-first tool that will enable users to monitor key performance indicators, to the analytics platform.

The product moves unveiled at the conference show that Tableau is continuing to evolve its popular analytics and business intelligence platform by adding features to help end users do more with self-service, said Doug Henschen, principal analyst at Constellation Research.

“With great self-service freedom comes great responsibility to do a better job of data governance,” Henschen said. “Tableau Catalog gives Tableau customers a data access and data lineage tracking aid to help users spot the data they should be using to help avoid recreating data sets and analyses that already exist or that could easily be extended to cover new use cases.”

The host of upgrades comes a day after Tableau revealed an enhanced partnership agreement with Amazon Web Services, Modern Cloud Analytics, designed to help Tableau's many on-premises users migrate to the cloud.

A user looks at the new features

Meanwhile, one of the self-service customers Henschen alluded to is the University of Michigan, which, with 50,000 employees and 48,000 students, has nearly 100,000 potential users.

While it hasn’t yet taken advantage of the burgeoning data management capabilities of the Tableau analytics platform, the school is interested in Tableau’s natural language processing capabilities.

But with nearly 100,000 potential users — from hospital staff to the history department — nothing is as simple as choosing to use one BI tool within an overall system and eschewing another.

“Our data is relatively simple enough that we don’t need to constantly pivot or join a number of different things from a number of different places together,” Matthew Pickus, senior business intelligence analyst, said of Michigan’s decision to not yet employ tools like Tableau Catalog and Prep Conductor. “We try and keep the system as enterprise as possible.”

Christopher Gardner, business intelligence senior analyst at the University of Michigan, added that the potential cost of using the data preparation tools, given the number of users across the university, is a constraint.

That said, because data across the university’s myriad departments is often kept by each department according to that department’s own method — creating data silos — data standardization is something that could be on the horizon at Michigan, the school’s BI analysts said.

“It’s starting to get talked about a little more, so it may be something we start investigating,” Gardner said.

Bringing analytics to end users

“Some of the data management tools will become much more needed in the future,” Pickus added. “We’re just trying to figure out the best way to approach it. It’s going to become more important. Tableau reaching down not just how to visualize your data but how to help you manage and organize your data across all the sets is going to be very helpful in the future.”

NLP, meanwhile, is something Michigan’s IT leaders see as a way to make analytics more accessible to its employees and students.

A GIF shows Tableau’s Ask Data, its natural language processing tool, in action.

But Gardner and Pickus said they want more from NLP tools than the tools are currently capable of providing, whether they are part of the Tableau analytics platform or any other BI vendor's suite.

“Our executives are very interested in it,” said Gardner. “They’re looking for ways to make data more accessible to users who aren’t familiar with reporting tools. To us it’s kind of frustrating, because we’ve got the reporting tools. Let’s take it a step further, and instead of just reporting let’s start doing analysis and start getting trends.”

Perhaps that’s next for the Tableau analytics platform.

Tableau BI gets Extensions API in version 2018.2 update

Amid positive early reviews, Tableau announced the general availability of Tableau 2018.2, an extensive upgrade that amplifies the scope of Tableau BI tools with expanded analytics functions and a more streamlined dashboard.

The update comes a few days after Tableau released the beta version of 2018.3 that further simplifies the user interface and enables users to more easily consolidate different sources of data.

The general release version of 2018.2 brings a range of notable changes and new capabilities to Tableau BI tools, introducing customized third-party functionality to the self-service analytics and data visualization platform.

First released in beta form in April, Tableau 2018.2 is now formally available, enabling nonbeta users to take advantage of several new features, including automatic mobile layouts and Spatial Join, which joins disparate data sources on a shared spatial attribute.

Probably the two most significant features the release adds to Tableau BI tools are Dashboard Extensions and Tableau Services Manager.

Drag-and-drop dashboard extensions

The Extensions API essentially opens the platform to both first-party and third-party developers and users, allowing them to create and share their own dashboard extensions with different functionalities.

“It’s really exciting to see what the community is able to do and also the creativity of folks to take self-service analytics to the next level,” said Francois Ajenstat, chief product officer at Tableau.

Introducing third-party extensions to a dashboard is a drag-and-drop process, and the new Extensions Gallery enables users to browse and select extensions made by Tableau partners. For example, the feature could let users who might not be able to design, say, a predictive analytics model on their own simply drag and drop one into a dashboard.
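
For developers, an extension is essentially a web page that communicates with the hosting dashboard through Tableau's JavaScript-based Extensions API. The TypeScript sketch below shows the minimal shape of that conversation; it assumes the Extensions API script is already loaded on the page and omits the .trex manifest file that tells Tableau where the page lives, so treat it as an illustration rather than a complete extension.

// Minimal dashboard extension sketch. Assumes the Tableau Extensions API
// script has been loaded on the page, exposing a global `tableau` object.
declare const tableau: any; // provided at runtime by the Extensions API library

async function listWorksheets(): Promise<void> {
  // Establish communication with the dashboard hosting this extension.
  await tableau.extensions.initializeAsync();

  // Enumerate the worksheets on the dashboard the extension was dropped into.
  const dashboard = tableau.extensions.dashboardContent.dashboard;
  for (const worksheet of dashboard.worksheets) {
    console.log(`Worksheet on this dashboard: ${worksheet.name}`);
  }
}

listWorksheets().catch(err => console.error('Extension failed to initialize', err));

The dashboard only grants the page access to its contents after initializeAsync resolves, which is why all reads happen inside the async function; from the user's point of view, none of this is visible beyond dragging the Extension object onto a dashboard and pointing it at the manifest.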

The Extensions API and several other recent dashboard design enhancements will be welcomed by Tableau users, said Jen Underwood, founder and principal analyst of Florida-based Impact Analytix.

It could “open up a new world of possibilities for augmented analytics, machine learning, statistics, advanced analytics, workflow and other types of apps to integrate directly within Tableau,” Underwood said.

The other standout feature of the new release of 2018.2 is Tableau Services Manager, which allows Tableau Server administration to be done completely through the browser, and generally tries to make server management simpler and faster.

New update enters beta

The beta release of Tableau 2018.3 brings its own expanded capabilities to Tableau BI tools, including dashboard navigation buttons in Tableau Desktop, transparent worksheet backgrounds and a mixed content display in Tableau Server and Online that can show all of a user’s content on the same page.

Heatmaps, a new mark or chart type for Tableau, are expected to be added to the beta in a future update, Tableau said in a blog post.

While Tableau did not say when 2018.3 would be officially released, in keeping with the company's quarterly schedule it can likely be expected to leave beta this fall. Betas can see numerous tweaks and adjustments before the official release, and even “fundamental changes,” depending on customer feedback or Tableau's own observations, Ajenstat said.

Seeking to simplify

The new Tableau BI capabilities introduced in the updates are indicative of Tableau’s business mission, “to help people see and understand data,” Ajenstat said.

Future updates, he said, will likely be aimed at making “analytics easier for everyone,” and will incorporate smart capabilities with tools like AI, machine learning and natural language processing, in part due to the organization’s recent acquisition of MIT AI startup Empirical Systems.

Tableau’s announcements came as competitor Qlik, which research and advisory firm Gartner regularly ranks highly alongside Tableau and Microsoft’s Power BI, announced the acquisition of enterprise data management startup Podium Data. According to Qlik, the move will increase the company’s ability to compete with Tableau.

Tableau acquisition of MIT AI startup aims at smarter BI software

Tableau Software has acquired AI startup Empirical Systems in a bid to give users of its self-service BI platform more insight into their data. The Tableau acquisition, announced today, adds an AI-driven engine that’s designed to automate the data modeling process without requiring the involvement of skilled statisticians.

Based in Cambridge, Mass., Empirical Systems started as a spinoff from the MIT Probabilistic Computing Project. The startup claims its analytics engine and data platform are able to automatically model data for analysis and then provide interactive and predictive insights into that data.

The technology is still in beta, and Francois Ajenstat, Tableau’s chief product officer, wouldn’t say how many customers are using it as part of the beta program. But he said the current use cases are broad and include companies in retail, manufacturing, healthcare and financial services. That wide applicability is part of the reason why the Tableau acquisition happened, he noted.

Catch-up effort with advanced technology

In some ways, however, the Tableau acquisition is a “catch-up play” on providing automated insight-generation capabilities, said Jen Underwood, founder of Impact Analytix LLC, a product research and consulting firm in Tampa. Some other BI and analytics vendors “already have some of this,” Underwood said, citing Datorama and Tibco as examples.

Empirical’s automated modeling and statistical analysis tools could put Tableau ahead of its rivals, she said, but it’s too soon to tell without having more details on the integration plans. Nonetheless, she said she thinks the technology will be a useful addition for Tableau users.

“People will like it,” she said. “It will make advanced analytics easier for the masses.”

Tableau already has been investing in AI and machine learning technologies internally. In April, the company released its Tableau Prep data preparation software, with embedded fuzzy clustering algorithms that employ AI to help users group data sets together. Before that, Tableau last year released a recommendation engine that shows users recommended data sources for analytics applications. The feature is similar to how Netflix suggests movies and TV shows based on what a user has previously watched, Ajenstat explained.
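
To give a concrete sense of what fuzzy clustering means in a data preparation context, the TypeScript sketch below groups string values whose edit (Levenshtein) distance falls under a small threshold, so that variants like "California" and "Califronia" land in the same group. It illustrates the general technique only; it is not Tableau Prep's actual algorithm, and the function names are made up for the example.

// Classic dynamic-programming edit distance between two strings.
function editDistance(a: string, b: string): number {
  const dp: number[][] = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      const cost = a[i - 1] === b[j - 1] ? 0 : 1;
      dp[i][j] = Math.min(dp[i - 1][j] + 1, dp[i][j - 1] + 1, dp[i - 1][j - 1] + cost);
    }
  }
  return dp[a.length][b.length];
}

// Greedily assign each value to the first group whose representative is close enough.
function fuzzyGroup(values: string[], maxDistance = 2): string[][] {
  const groups: string[][] = [];
  for (const value of values) {
    const home = groups.find(g => editDistance(g[0].toLowerCase(), value.toLowerCase()) <= maxDistance);
    if (home) home.push(value); else groups.push([value]);
  }
  return groups;
}

console.log(fuzzyGroup(['California', 'california', 'Califronia', 'Texas', 'texas']));
// -> [ [ 'California', 'california', 'Califronia' ], [ 'Texas', 'texas' ] ]

Production implementations tend to use richer similarity signals, such as pronunciation or common-character fingerprints, and thresholds tuned per field, but the grouping step looks broadly like this.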

Integration plans still unclear

Ajenstat wouldn’t comment on when the Tableau acquisition will result in Empirical’s software becoming available in Tableau’s platform, or whether customers will have to pay extra for the technology.

Empirical CEO Richard Tibbetts on its automated data modeling technology.

“Whether it’s an add-on or how it’s integrated, it’s too soon to talk about that,” he said.

However, he added that the Empirical engine will likely be “a foundational element” in Tableau, at least partially running behind the scenes, with a goal that “a lot of different things in Tableau will get smarter.”

Unlike some predictive algorithms that require large stores of data to function properly, Empirical’s software works with “data of all sizes, both large and small,” Ajenstat said. When integration does eventually begin to happen, Ajenstat said Tableau hopes to be able to better help users identify trends and outliers in data sets and point them toward factors they could drill into more quickly.

Augmented analytics trending

Tableau’s move around augmented analytics is in line with what Gartner pointed to as a key emerging technology in its 2018 Magic Quadrant report on BI and analytics platforms.

Various vendors are embedding machine learning tools into their software to aid with data preparation and modeling and with insight generation, according to Gartner. The consulting and market research firm said the augmented approach “has the potential to help users find the most important insights more quickly, particularly as data complexity grows.”

Such capabilities have yet to become mainstream product requirements for BI software buyers, Gartner said in the February 2018 report. But they are “a proof point for customers that vendors are innovating at a rapid pace,” it added.

The eight-person team from Empirical Systems will continue to work on the software after the Tableau acquisition. Tableau, which didn’t disclose the purchase price, also plans to create a research and development center in Cambridge.

Senior executive editor Craig Stedman contributed to this story.

Hyper engine aims to give enterprise Tableau analytics a boost

Tableau is continuing its focus on enterprise functionality, rolling out several new features that the company hopes will make its data visualization and analytics software more attractive as an enterprise tool to help broaden its appeal beyond an existing base of line-of-business users.

In particular, the new Tableau 10.5 release, launched last week, includes the long-awaited Hyper in-memory compute engine. Company officials said Hyper will bring vastly improved speeds to the software and support new Tableau analytics use cases, like internet of things (IoT) analytics applications.

The faster speeds will be particularly noticeable, they said, when users refresh Tableau data extracts, which are in-memory snapshots of data from a source file. Extracts can reach large sizes, and refreshing larger files took time with previous releases.

“We extract every piece of data that we work with going to production, so we’re really looking forward to [Hyper],” Jordan East, a BI data analyst at General Motors, said in a presentation at Tableau Conference 2017, held in Las Vegas last October.

East works in GM’s global telecom organization, which supports the company’s communications needs. His team builds BI reports on the overall health of the communications system. The amount of data coming in has grown substantially over the year, and keeping up with the increasing volume of data has been a challenge, he said.

Extracting the data, rather than connecting Tableau to live data, helped improve report performance. East said he hopes the extra speed of Hyper will enable dashboards to be used in more situations, like live meetings.

Faster extracts mean fresher analytics

The Tableau 10.5 update also includes support for running Tableau Server on Linux, new governance features and other additions. But Hyper is getting most of the attention. Potentially, faster extract refreshes mean customers will refresh extracts more frequently and be able to do their Tableau analytics on fresher data.

“If Hyper lives up to demonstrations and all that has been promised, it will be an incredible enhancement for customers that are struggling with large complex data,” said Rita Sallam, a Gartner analyst.

Sallam’s one caveat was that customers who are doing Tableau analytics on smaller data sets will see less of a performance upgrade, because their extracts likely already refresh and load quickly. She said she believes the addition of Hyper will make it easier to analyze data stored in a Hadoop data lake, which was typically too big to efficiently load into Tableau before Hyper. This will give analysts access to larger, more complex data sets and enable deeper analytics, Sallam said.

Focus on enterprise functionality risky

Looking at the bigger picture, though, Sallam said there is some risk for Tableau in pursuing an enterprise focus. She said moving beyond line-of-business deployments and doubling down on enterprise functionality was a necessary move to attract and retain customers. But, at the same time, the company risks falling behind on analytics functionality.

Sallam said the features in analytics software that will be most important in the years ahead will be things like automated machine learning and natural language querying and generation. By prioritizing the nuts and bolts of enterprise functionality, Tableau hasn’t invested as much in these types of features, Sallam said.

“If they don’t [focus on enterprise features], they’re not going to be able to respond to customers that want to deploy Tableau at scale,” Sallam said. “But that does come with a cost, because now they can’t fully invest in next-generation features, which are going to be the defining features of user experience two or three years from now.”