The Reports Of AI Replacing BI & Analytics Are Greatly Exaggerated
How Tableau Plans To Redefine The Field
Self-service, low-code, and automation tools were supposed to have killed BI and analytics years ago. Yet here we are in 2024, and demand for Data Analysts is second only to demand for AI Product Managers. Last week, I sat down with Tableau's Chief Product Officer, Southard Jones, to discuss what's next for BI and analytics.
He and Salesforce are all in on AI, but data still consumes his vision. As we talked, one question got louder and louder. AI will bring self-service, no-code analytics into reality. Jones talked about data and analytics being available everywhere without needing to work in Tableau. So, is Tableau about to go away? His answer will surprise you.
Tableau is reinventing itself to stay relevant and deliver value for another decade. In the process, it’ll save analytics and BI. Jones’s vision puts a new type of data at the center of AI. Models are increasingly commoditized, and data is emerging as the competitive advantage.
If Tableau's CPO is right, BI Engineers will become Knowledge Graph Engineers, and Analysts will transform into Business Ontologists. Tableau's past is data management and visualization. It's handing those workflows to AI agents to focus on higher-value work. Its future is knowledge and expertise management.
How Agents Surpass LLMs To Take Over BI & Analytics
You won't see the level of disruption until you start using agents. Most of us, myself included, are experimenting with ChatGPT or one of the copilot-style options to get a sense of what's next. I didn't realize how much they limited my thinking.
Agents do work in ways that traditional digital apps can't come close to. Digital apps are constrained by business logic, and their data access is narrowly defined. That's how they deliver reliable functionality. The same business logic guardrails and data access controls can constrain agents, making them more predictable.
Agents serve a broad range of customer and user intents. Salesforce demoed a Saks Fifth Avenue customer service agent. The customer called in, and the agent answered immediately. It didn’t provide a list of options. The agent asked, “How can I help you?”
The “customer” said they bought an item that was too small and wanted to exchange it for the next size up.
The agent confirmed the item in question and the new size with the customer. It gave them an estimated shipping arrival date.
The customer said they needed it sooner, and the agent offered in-store pickup at the closest location to them.
The customer agreed, and that was it.
That's not how LLMs or copilots are designed to work. What does the shift from copilots to agents mean for BI and analytics workflows? The self-service concept comes to life if we build an agent with access to all the business's customer data.
Agents Will Take Over Some Analytics & BI Work
Southard Jones knows the magnitude of changes coming to BI and analytics. In the past, users would go to Tableau for their data visualization needs. The platform was a one-stop shop for access to data and insights. However, agents change all that.
All the agent needs is access to data, and a single pipeline is reusable by multiple agents. The rest of the workflows that used to be the domain of Tableau, BI Engineers, and Data Analysts will increasingly be managed by the agents. That’s an amazing step forward for users. They get real-time access to insights without needing the data team to build each product and dashboard that supports them.
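The "single pipeline reusable by multiple agents" idea can be sketched in a few lines. This is a hypothetical illustration, not a Tableau or Salesforce API: the `CustomerDataPipeline` class and the two agent lookup functions are invented names showing how one governed data-access layer, with field-level controls, can serve several agents instead of each team building its own extract.

```python
# Hypothetical sketch: one governed data-access layer reused by multiple agents.
# All names here are illustrative, not real Tableau/Salesforce APIs.

class CustomerDataPipeline:
    """A single pipeline: one place to enforce access rules and data quality."""

    def __init__(self, records):
        self._records = records

    def query(self, customer_id, fields):
        record = self._records.get(customer_id)
        if record is None:
            return None
        # Field-level access control: each agent sees only the fields it asks for
        # (and, in a real system, only the fields it is authorized to see).
        return {f: record[f] for f in fields if f in record}


pipeline = CustomerDataPipeline({
    "c1": {"name": "Ada", "last_order": "boots", "size": "7"},
})

# Two different agents reuse the same pipeline instead of owning separate copies.
def support_agent_lookup(pid, customer_id):
    return pid.query(customer_id, ["last_order", "size"])

def analytics_agent_lookup(pid, customer_id):
    return pid.query(customer_id, ["name"])

print(support_agent_lookup(pipeline, "c1"))    # {'last_order': 'boots', 'size': '7'}
print(analytics_agent_lookup(pipeline, "c1"))  # {'name': 'Ada'}
```

The design point is that governance lives in one place: if the data team changes an access rule or fixes a quality issue in the pipeline, every agent downstream inherits the fix.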
However, will the data team hand this over to users and trust agents to deliver insights? If agents make mistakes, users make decisions with bad data. Customers get the wrong order information, or the wrong item is shipped.
Reliability Is A Massive Impediment
Expectations for accuracy and reliability are high. It's no longer enough to be right 65% of the time. Jones says that Tableau's insights and Einstein's prescriptive capabilities must reach 95% accuracy for BI and analytics agents to gain wider acceptance.
Here’s an example of why that’s the case. I use Grammarly to edit everything I write. It does an excellent job of preventing the mistakes that make me sound unprofessional or outright ignorant. It also makes a lot of unnecessary suggestions, and some suggestions are wrong. Grammarly is an improvement over other editing tools, but it leaves much to be desired because it hasn’t hit the 95% accuracy threshold yet.
Trust leads to autonomy. Agents will be given ownership of more activities as they prove themselves. Trust won't happen overnight, so Agentforce is built with visibility into every step of the agent's process, including explanations of why the agent did what it did. Salesforce understands that trust must be earned, and its AI product strategy includes trust-building design tenets.
That’s a significant weakness of copilots and LLMs. Gemini provides links to the websites that contribute to Google’s generative search summaries. We trust the search results because we can verify them. ChatGPT and other copilots aren’t built with visibility into their complex reasoning. We don’t know when it’s wrong because it doesn’t either.
Agentic reliability is improving, and built-in transparency is winning users over. The shift is coming workflow by workflow, one improvement at a time. Customers expect access to data where they work, and their tolerance for app hopping has plummeted. Many visualizations will be replaced by conversations. Data will be a ubiquitous partner and a feature of every application.
How Will Tableau Save BI & Analytics?
Reliability is a function of data, not the models themselves. AI is data, but most data aren’t useful for model training. Southard Jones calls this ‘machine data.’ Businesses have gathered machine data for decades and struggled to turn it into reliable models. Data needs business and customer context to fill the gap. That’s where Jones sees BI Engineering and Data Analyst roles evolving.
Every business has its own language and domain expertise that create a unique topology. The best way to capture it is with a knowledge graph. Tableau’s CPO sees BI Engineers owning the conversion process and transitioning data warehouses into knowledge graphs.
Data Analysts have significant business knowledge and customer context. In Jones’s view, they are best positioned to be Business Ontologists who define the business’s topology and language. They define the structure and connections in the knowledge graph.
That's what Southard Jones sees as Tableau's future. It is the development environment that supports the transition from BI to knowledge graphs. Tableau is a centralized platform that distributes data products across the business, supporting people, model training, and agents.
The purpose of data gathering is to bring new information into the business. Knowledge graphs are effective data models to document the business’s domain expertise, continuously add to it, and distribute it anywhere it can help improve business outcomes. That’s what makes data a novel asset class. Once executives treat it that way, the platform that extracts the most value from data will become critical infrastructure.
An interesting, hype-free take on Tableau's road map, thanks Vin. It sounds like, to prepare for the future, every analyst should be over-indexing on the business side of their job right now, not the nitty-gritty of the tools. I'm curious how this will integrate with the data engineers and analytics engineers who maintain vast and complex databases and data models.
"Data Analysts have significant business knowledge and customer context."-- not so sure about this. Business Analysts do though, but they're an extinct role which is not necessarily replaced by DA.