As we start our farewell tour of the analyst world, Amalgam Insights had the opportunity to attend Tableau Conference, which has consistently been one of the key events for data and analytics throughout my analyst career. The Tableau Conference that influenced me most on a personal level was in 2013, when then-CEO Christian Chabot gave an inspiring speech on the role of discovery. It was so long ago that Tableau was just starting to get a Mac-native version and had just launched a data connection interface!
But even then, Tableau stood for a fundamental transformation in business intelligence and data discovery, one that would end up upending the definition of the business intelligence market and seeing monolithic billion-dollar revenue companies fall to Tableau's vision. It is one of the few times in my career that I have seen the analyst industry unanimously agree that a market needed to be effectively redefined. And this redefinition came from the understanding that the data analyst job was changing from the order-taking report builder role I had held early in my career to that of a data discoverer, contextualizer, and storyteller. That role has continued to evolve to this day, and every analytics vendor has had to describe its ability to empower the data analyst.
Contextualizing the Background for Tableau Conference 2025
We face another fundamental shift in the data analyst world, this time driven by AI. Over the past three years, the emergence of generative AI and related agentic AI capabilities has created a new interface for computing and brought terms such as retrieval augmented generation, vibe coding, and agentic analytics into the technical vernacular. In this light, vendors have sought to reassure the market that AI is here to help the data analyst and that no jobs will be lost.
Here at Amalgam Insights, I want to provide a more realistic perspective. There are jobs in the data analyst world that are focused on basic dashboard creation and basic data cleansing. These jobs may have been steady and reliable for many years through the last era of the data analyst and the rise of the data preparation solutions that emerged over the past decade. But those jobs are going to disappear as agents start taking over basic capabilities. And I believe this is a trend that will hold across analytic platforms and solutions. Let’s be honest about what AI can do.
At the same time, AI struggles with maintaining attention across multi-stage tasks, lacks the creativity to look outside its training data or to proactively find new data, and still shows weaknesses in both strategy and storytelling when working outside of a generic context. All of this provides context for Tableau Conference 2025 and the big announcement of Tableau Next, which offers some potential answers for the future of the data analyst as well as the future of Tableau as a Salesforce-based solution.
Understanding Tableau Next
Tableau CEO Ryan Aytay kicked off Tableau Conference 2025 with an honest appeal to the data analyst on the future of Tableau and the commitment that exists to supporting Tableau as a set of solutions as well as a community. Since taking over as CEO of Tableau two years ago, Aytay has faced the interesting challenge of integrating Tableau with Salesforce in a way that takes advantage of Salesforce’s massive investments in data and AI without losing the magic that has made Tableau one of the most influential technologies and tech communities of the 21st century.

Tableau Next, which is currently available as part of the Tableau+ SKU, has been developed as three interrelated sets of technologies used to help support a more agentic use of analytics. In practice, this means that human analysts will have greater access to AI to handle the busywork of data, while business users will have greater access to natural language-based querying of data.

The first of the three sets of technical capabilities is the foundational layers used to support the data. Tableau describes these layers as an open data layer, a semantic layer, a visualization layer, and an action layer. For many long-time Tableau users, there has never really been a need to think about much more than the semantic and visualization layers, but, as we’ve already established, the times are a-changin’. The most interesting aspect of these layers to me is the reuse of Salesforce platform capabilities to expand Tableau’s functionality. That sentence alone probably needs some explaining.
So, when most people first hear of integration with Salesforce, they think of integration with Salesforce CRM or some combination of Sales Cloud, Service Cloud, or maybe Marketing Cloud. That is not what is meant here. Others with knowledge of Salesforce may think this refers to Heroku or some sort of application development platform. But that is not what is meant here, either. Rather, the type of integration I am referring to here is between Tableau and specific functional aspects of the Salesforce Platform.
For instance, the open data layer that Tableau is providing to support access to external data platforms like Databricks and Snowflake uses Salesforce Data Cloud to provide real-time access to a wide variety of data lakes, data sources, and business applications. This capability is something Tableau needed to build to improve data analyst access to data and avoid unnecessary imports and exports. Zero-copy data access is always preferable when possible. The open data layer does not completely eliminate the need to duplicate and transfer data, but it does reduce this need while allowing for greater data access and orchestration.
The action layer is a reuse of another long-time Salesforce platform capability, the process automation capability of Flow. This use of Flow is really interesting to me because it brings greater process automation directly into Tableau, and using the action layer will be less clunky than the current use of Tableau External Actions. Despite Tableau’s dominant market position, it has not been seen as a leading data workflow automation solution. It is no secret that the data analyst will be asked both to support the AI-based creation of commoditized work and to take more responsibility for automating the contextualization and insight creation associated with new data.
Tableau Semantics is an additional step toward supporting context. Data context is currently scattered across data warehouses, data catalogs, ETL tools, and even the retrieval augmented generation jobs used for AI models. Although the initial version of Tableau Semantics is focused on Tableau Next, Tableau Cloud, Tableau Server, and the Salesforce platform, which includes access to the recently announced Salesforce Agentforce, future versions are expected to connect Tableau Semantics to third-party semantic layers, which is where I believe the real value will lie. Having a full enterprise semantic layer under the umbrella of the solution that is often the first tool to uncover big strategic insights from data would be extremely helpful.
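To make the semantic layer concept concrete, here is a minimal, purely hypothetical sketch of what centralizing a metric definition looks like; the types, field names, and metric below are illustrative assumptions, not Tableau Semantics syntax. The point is that one shared definition can serve dashboards, agents, and RAG pipelines instead of being redefined in every tool.

```typescript
// Hypothetical illustration only: not Tableau Semantics syntax.
// One shared metric definition that dashboards, agents, and RAG jobs
// can all consume, so the metric means the same thing everywhere.

interface MetricDefinition {
  name: string;
  description: string;  // human- and agent-readable context
  sql: string;          // canonical calculation
  dimensions: string[]; // approved ways to slice the metric
  owner: string;        // who to ask when the number looks wrong
}

const netRevenueRetention: MetricDefinition = {
  name: "net_revenue_retention",
  description:
    "Revenue from existing customers this period divided by revenue from the same customers in the prior period.",
  sql: "SUM(current_period_arr) / NULLIF(SUM(prior_period_arr), 0)",
  dimensions: ["region", "segment", "product_line"],
  owner: "finance-analytics@example.com",
};

// A BI tool, an agent, or a RAG pipeline reads the same record rather
// than re-deriving the metric from raw tables.
console.log(`${netRevenueRetention.name}: ${netRevenueRetention.description}`);
```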
Currently, the visualization layer of Tableau Next is probably the area that will get the most scrutiny from veteran users, especially as Tableau Next is designed to be an agentic solution while classic Tableau is still most optimized for providing the widest variety of visual capabilities. Tableau Next’s visualizations shine in terms of performance and in supporting real-time data in a composable and API-friendly manner. Visualizations in Tableau Next are focused on being application components, which is important as the data analyst will be held increasingly responsible for sharing data outputs across every imaginable channel, including reports, dashboards, visualizations, workflow automations, APIs, applications, and agents. The less time analysts have to spend making visualizations app- and API-ready, the better.
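For readers who have not embedded a visualization before, the sketch below shows what "app-ready" looks like today using the classic Tableau Embedding API v3 web component. The dashboard URL and element IDs are placeholders, and Tableau Next’s exact component model has not been detailed here, so treat this as a reference point rather than a Tableau Next example.

```typescript
// Sketch: dropping a classic Tableau viz into an application as a web
// component via the Tableau Embedding API v3. Assumes the page has loaded
// the Embedding API script from your Tableau site (which registers the
// <tableau-viz> custom element). The URL and element IDs are placeholders.

const viz = document.createElement("tableau-viz");
viz.setAttribute(
  "src",
  "https://YOUR-TABLEAU-SITE/views/SalesOverview/RegionalDashboard" // placeholder
);
viz.setAttribute("toolbar", "hidden"); // keep it app-like rather than report-like

// The visualization now behaves like any other component in the host app.
document.getElementById("revenue-panel")?.appendChild(viz);
```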
Tableau Next Launches Agentic Analytic Skills
The second area of Tableau Next that analysts will notice is the set of pre-built agentic analytics skills, most of which are scheduled to come out in June 2025. Data analysts have understandably been concerned about agentic and AI capabilities entering their work, as there has been a lot of press over the past three years about AI taking away technical jobs and even pressure from employers to avoid hiring employees to do work that AI can do. So, what is Tableau Next’s AI intended to do?
At Tableau Conference, Tableau Chief Product Officer Southard Jones offered some guidance on areas where Tableau Next would provide additional context for data analysts, especially in areas that data analysts typically either do not enjoy or are unable to keep up with because of the overwhelming volume of potential requests.

Data Pro is an agentic assistant for data preparation and quality that aligns with the Tableau brand promise of supporting data analysts. If the average data analyst rated their work on a 1-5 scale, I am sure that data prep and quality tasks would rank a consistent 1 as the bane of any good analyst’s goal of making data insights shine. Even those concerned about AI will likely welcome any agentic help with prep, cleansing, and transformation tasks.
Tableau’s second agentic capability is Concierge, which answers natural language data requests and is aimed at the business user.
Although the data analyst can use Concierge to look over large reams of data and get some guidance on how to describe the data in human language, the primary intention here seems to be helping the average business user create, organize, and leverage basic data charts and visualizations. It will be interesting to see both how these queries end up leading to more complex requests for data analyst support and whether this insight creation ends up producing a massive volume of outputs that need to be curated and rationalized by… of course, the data analyst.
Although I am generally a fan of giving more people the self-service access to data that they seek, I do wonder if there will be unexpected challenges from giving people who lack data fluency the ability to explore data and rapidly create insights that lead to “top 3 opportunities,” “top 4 opportunities,” “top 5 opportunities,” and “top 6 opportunities” all ending up in a departmental dashboard, notebook, or holding area of some sort that needs to be cleaned up. Perhaps we will be trading data prep for analytic debt and cleanup in the near future as the cost of getting sales the data they tactically need in real time? As long as the business value is there, this is not really a problem so much as a bit of a redefinition of the data analyst’s role. Both Data Pro and Concierge are scheduled for a June 2025 Generally Available launch.
The third Tableau agentic capability, Inspector, is not scheduled for June but for later in 2025. Inspector is designed to constantly monitor data for changes and then provide alerts related to those changes. This is a capability I believe will be very interesting to combine with the Tableau action layer, as an alert based on either an IT problem or a customer service problem could quickly set off a series of root-cause analysis visualizations, data pings, agentic descriptions, or a rapid response team. Inspector also extends Tableau’s functionality to become more proactive and automated, fitting with the data analyst’s increasing need to orchestrate responses while designing data-driven explanations of the truth.
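To illustrate the alert-to-action pattern described above, here is a purely hypothetical sketch; none of the types or functions below are Tableau or Salesforce APIs, and the names and thresholds are assumptions chosen for the example.

```typescript
// Hypothetical sketch of routing an Inspector-style alert into follow-up
// actions. Not a Tableau or Salesforce API; types, names, and thresholds
// are illustrative assumptions.

interface MetricAlert {
  metric: string;     // e.g. "open_support_cases"
  change: number;     // observed change vs. baseline, as a fraction
  dimension: string;  // where the anomaly is concentrated
  detectedAt: string; // ISO timestamp
}

function routeAlert(alert: MetricAlert): string[] {
  const actions: string[] = [];

  // Always capture the anomaly for later root-cause analysis.
  actions.push(`generate root-cause views for ${alert.metric} by ${alert.dimension}`);

  // Escalate only when the swing is large enough to matter.
  if (Math.abs(alert.change) > 0.2) {
    actions.push("notify the rapid-response channel");
    actions.push("trigger the remediation workflow in the action layer");
  }
  return actions;
}

// Example: open support cases jump 35% in one region.
routeAlert({
  metric: "open_support_cases",
  change: 0.35,
  dimension: "region",
  detectedAt: new Date().toISOString(),
}).forEach((action) => console.log(action));
```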
Finally, Tableau is including both internal and external marketplaces to help Tableau users share and reuse existing data and analytic assets. Over time, internal marketplaces may end up superseding some of the need for static dashboards as marketplace subscriptions or selections allow for greater customization of data connections and data assets. The ability to build valuable data products on an external marketplace could be interesting as well, depending on whether the customer has data that is commercially valuable.
Recommendations for Current and Potential Tableau Customers
Overall, the Tableau Next offering demonstrates both Tableau’s ability to leverage capabilities from the Salesforce platform and its desire to help Tableau data analysts as they are asked to take on more and different activities based on their data and analytic capabilities. Based on these capabilities, Amalgam Insights provides the following suggestions.
First, look at these capabilities as part of an exploration of how your data analyst job will change. Automation will take away some of your dashboarding and reporting responsibilities. Look at Tableau Next as a starting point to see whether there are opportunities to become more of a virtuoso graphic visualizer working off open-ended questions or whether data analyst skills can be used to help orchestrate enterprise analytic outputs and requests more intelligently. I’ll be going deeper into the demands on the next-generation data analyst in a separate piece, but this is the time to look at your 100-day plan to figure out what kind of data analyst you want to be and the 1,000-day plan to get from here to there.
Second, look at the new capabilities across the data layer, semantics, and especially the action layer. Data analysts will increasingly be asked to provide more programmatic and automated access to analytics. Although some of this access will be handled through self-service agents, there will still be demands to manually define actions and to translate agentic and generative AI requests into auditable actions and workflows. See how the Tableau action layer compares to existing data and process automation solutions in your organization and determine how much overlap may exist from a functional perspective.
Third, Amalgam Insights believes this is a good time to look at Concierge in terms of how it will fundamentally change analytic usage in your organization. Concierge is both a massive opportunity and a risk, as is any technology that can potentially be used by half of your employees without a lot of training. The UX for Concierge is the same natural language that we have all learned in our generative AI experiments. But Amalgam Insights still recommends a careful multi-stage approach with an intermediate stage of testing with trusted users before opening up Concierge to everyone. The concern here is less about whether end users will get what they want and more about ensuring that there are no unintended consequences for the data and analytics teams. Still, an approach like Concierge is necessary for business data to have its intended impact on the majority of employees. We have learned that even Tableau cannot turn the majority of employees into data analysts in most companies, so Tableau has taken the next step of making data easier to query and explore.
Tableau Next introduces a new set of capabilities for the data analyst portfolio as data responsibilities continue to expand. The integration of Salesforce platform capabilities into Tableau should be seen as a net positive, as the data and agentic capabilities that Tableau Next is inheriting are strategically important to the whole of Salesforce. The biggest challenge that Tableau analysts now face is figuring out how to reconfigure their job responsibilities to reflect what can now be automated. From a practical perspective, analysts will likely need to proactively identify and learn the “value-added” activities that should fill the additional time that AI and automation make available. Amalgam Insights offers these recommendations as a preview of our upcoming video “The Future of the Data Scientist.”
