Tableau Lays Out Vision at Pivotal Time for Company

Austin, Texas, recently hosted Tableau's largest user conference yet, with over 13,000 people attending live across multiple city blocks and another 10,000 joining virtually.

It’s a pivotal time for the company, once a start-up and now a $700-million-plus market leader. Adam Selipsky, formerly of Amazon Web Services, recently took over as CEO, with co-founder Christian Chabot transitioning to chairman of the board. Having initially thrived in a market with only a handful of data discovery offerings, Tableau is now competing in a much more crowded modern BI and analytics market. (See our note: Tableau Adds Data Federation and Advanced Analytics, but Amid a More Competitive Modern BI Market)

Here are the five biggest themes for what the vendor plans to deliver over the next three years:


Tableau continues to improve on its core capabilities of visual analytics by adding things like multiple chart layers, automatic insights, and intelligent mouse tips. The mouse tip or hover-over (because, hey, that could be a tap on a phone) is more than just data on a data point; it may display things like variance analysis. Automatic insights are charts and findings that are automatically created as a user investigates the data. For example, the chart at right shows CO2 emissions versus GDP. By lassoing the bottom-left points, the blue charts appear as automatic insights, created by the software, not the user. It shows that most of the points were in Africa and their relation to internet usage and mobile phone penetration. Based on initial demos, these don’t appear to be as smart as, say, Salesforce BeyondCore or IBM Watson Analytics, but it’s certainly a step in that direction. And we look forward to testing!

In addition, Tableau will add natural language to its visualizations. The language understands context, so it’s more intuitive than straight keyword search. For example, a user can ask about the “largest earthquakes in California,” and a slider automatically appears to let the user refine magnitude. Or, in the screenshot, “houses near Ballard” — where “near” becomes a slider for distance.

I love this! And in comparing notes with customer Netflix, they see this as huge too – way better than sifting through dozens of tabs in a workbook. (See our Hype Cycle for Business Intelligence and Analytics, 2016, for more info on how search and natural language are impacting the BI and analytics market.)


Earlier this year, Tableau acquired the Hyper technology, a columnar, in-memory engine. It plans to replace the Tableau Data Extract with Hyper in 2017, improving both the performance and scale of current TDEs. In the keynote, the vendor showed data from news events – the GDELT project – with over 400 million rows of data. Visualizing 3 billion rows of weather data was equally fast, with new data ingested in seconds as the visualization refreshed.


The biggest news here is Project Maestro, in which Tableau plans to release a self-service data preparation product. It will be tightly integrated with the Tableau Data Server, but can also be used stand-alone. The degree to which this disrupts existing partnerships with vendors such as Alteryx, Trifacta, Paxata, and Datawatch (to name a few – see our note Market Guide for Self-Service Data Preparation) depends on capabilities, price point, and differentiation – for both Tableau’s new product and for the respective partners.

And yet, it’s once again the other little things that are a big deal for self-service in the enterprise. For example, the ability to flag a data set as certified, or for a user to add their own calculation to a sanctioned data model.


Support for hybrid connectivity from Tableau Online to live on-premises data sources is the biggest news here. For now, customers must push data to the cloud in the form of a TDE.

Tableau has long been data source agnostic, has gradually become cloud agnostic, and also announced it will soon run on the Linux operating system.


Discussion threads are coming to Tableau, and, at long last, so is KPI alerting. The workflow for this seems to be nicely implemented. A new recommendation engine will also show a user potential datasets and workbooks that relate to their existing analysis.

Timing for all these things was not particularly specific, although some will arrive in 2017.

While the biggest announcements came during the morning keynote, the developers on stage were once again a conference highlight. It was interesting to me that four of the five developers were female, bucking the industry statistic of less than 25% female engineers. I had to chuckle that the biggest applause was for the ability to evenly distribute visuals across the page (like in PowerPoint); it’s the little things that can be such a pain!

I was bummed that I only had time to attend one customer session, but it was an excellent one by Honeywell. Their journey to self-service BI and analytics is fairly typical, from IT initially refusing to fund the software investment to now recognizing that self-service has cut IT data delivery time in half. Luckily for me, sessions were recorded.

(About the author: Cindi Howson is a research vice president at Gartner. This post originally appeared on her Gartner blog, which can be viewed here)