Enterprising Insights, Episode 24- Tableau Conference, Industry News, and an Enterprise Apps Rave

In this episode of Enterprising Insights, The Futurum Group’s Enterprise Applications Research Director Keith Kirkpatrick discusses Tableau Conference 2024 and recent earnings news from major enterprise application vendors. He then closes out the show with the Rant or Rave segment, where he picks one item in the market and either champions or criticizes it.

You can grab the video here and subscribe to our YouTube channel if you’ve not yet done so.

Listen to the audio below:

Disclaimer: The Enterprising Insights podcast is for information and entertainment purposes only. Over the course of this podcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we do not ask that you treat us as such.


Keith Kirkpatrick: Hello everybody. I’m Keith Kirkpatrick, Research Director with The Futurum Group, and I’d like to welcome you to Enterprising Insights. It’s our weekly podcast that explores the latest developments in the enterprise software market and the technologies that underpin these platforms, applications and tools. This week I’m going to recap yet another conference I attended, Tableau Conference 2024, which was held in sunny San Diego. Then I’m going to mention some brief industry news from the past couple of weeks, and then of course, close out the show with my rant or rave segment where I pick one item in the market and I will either champion or criticize it. So without any further ado, let’s get right into it.

So as I mentioned, I just got back from the Tableau Conference. Tableau, of course, if you’re not aware, is a provider of data visualization software. So if you think about data visualization, instead of looking at a static Excel table or a chart or whatever, it’s actually a visualization of the data, and these can take many, many different forms, and it really is pretty amazing if you look at the different ways that data can be presented. Tableau is a software vendor that actually provides that functionality. Now, the Tableau Conference is really focused in on what they consider to be their core users, which are data analysts, and really the idea here is for Tableau to announce some of the new product enhancements, talk a little bit about their strategy moving forward, and really give the analysts a forum to show off what they’ve done over the past year or so with the product.

That really comes in the form of a nice little segment they did called Iron Viz, which is set up like the Iron Chef competition, where you have three data analysts up on stage who have to create three visualizations in a short amount of time, and then a judging panel picks the winner. That was actually really interesting. I’m sure they have the results somewhere up on their site. So anyway, what I wanted to talk about here were some of the specific product announcements that they made at the conference, and then get into some of the other key takeaways I had from the event that I found particularly interesting. Talking about a few of the highlights of the product announcements that I thought were interesting: one of the features is this local file save within the Tableau Desktop public edition. This is a really interesting feature. I’m really surprised they never had this in the past, but basically it allows users to save and work with data on their desktop locally rather than having to have it up on a server edition where anyone could see what’s going on with it before a visualization is complete.

This is pretty interesting because if you think about a worker who is perhaps trying to create a visualization, they may be trying certain things out, maybe the visualization won’t look very good as they’re working on it, but they don’t necessarily want to have that shared with the entire community or basically the entire work group up on the server because sometimes when you put something out there like that, if people see it, they might make comments and it might skew how things are going. So this is an interesting feature because this allows the users to develop visualizations and then reveal or share them with stakeholders on their schedule rather than while they’re in development. So very interesting. Another one is these new Viz extensions. These are basically visualization extensions that allow users to create custom visualizations directly from the Marks card more quickly and enable data analysts to tell a story through data with less friction and effort. Essentially, they’re almost like templates that really help data analysts create interesting visualizations more quickly with less effort.

This dovetails with this whole trend around trying to reduce the amount of friction, repetitive work, and really grunt work associated with manipulating data, which ultimately saves time, makes people more productive, and hopefully allows them to focus in on the things that are most important to them, which is really telling a story with that data. Now, there are a few other interesting announcements that they made around data and data governance. There’s something they call a data cockpit, which allows Tableau administrators to get answers to frequently asked questions about who’s contributing to the file, what data is being used, whether or not the data has been certified, and other attributes. It’s essentially a check to make sure that you know all of the attributes of the data being incorporated into a specific visualization, which is particularly important if there are multiple contributors or if you’re not sure exactly where that data came from.

What else was interesting? If people aren’t aware, Tableau was bought by Salesforce, I believe back in 2019. So Salesforce and Tableau have been trying to come together a bit more in terms of providing features and functionality to allow for a more streamlined experience. And what was announced at the conference is Einstein Copilot for Tableau Prep, which is the data prep functionality or application, and for sentiment analysis. This is really interesting because it allows a more natural interaction with these applications to make it easier for data analysts to use the software. And then of course they made some other, I don’t want to say minor, but definitely more granular updates that I thought were interesting. One of them was something called spatial parameters, which basically allows geocoding at the address level, allowing data to be assigned to a specific geographic location. This is pretty interesting because it allows a much deeper and more granular level of analysis to be performed on data, which can then be represented within a visualization.

So if you’re thinking about, let’s say, a geo analysis of a particular attribute, you want to see where there is a certain prevalence of X, Y, Z variables. You can see that at a very, very local level, which is very interesting, particularly as we get more granular with data in terms of collection and utilization. Let’s see here. Google Fonts, this is actually pretty interesting because this is support for Google Fonts that will work on Tableau Cloud and on the web, and really what it does is it allows a much greater incorporation of a brand’s own guidelines for look and readability, because the Google Fonts that are out there and are used by a lot of brands are now available on Tableau Cloud. So you don’t have that thing where you create a visualization that does not look like it was created by a particular organization because the fonts don’t match. So that’s pretty interesting. What else here? Another really interesting thing Tableau is trying to do, and there’s really a big thing going on here: if you think about Tableau and data and visualizations, it’s great to be able to go into Tableau and look at a really interesting visualization, but ultimately it’s a pain because if you’re working in something else and you want to get something, you need to click out of that app and then go into Tableau.

Well, not so much anymore. They’ve been really trying to focus on a way to allow users of other applications to basically pull in data visualizations from Tableau to actually be used in the flow of work, and one of the announcements they made was an integration with Microsoft Teams. So users will now be able to embed Tableau Pulse insights within Teams. I think that’s a pretty big announcement, and the reason I say that is because if you look at it, one of Tableau’s biggest competitors is Microsoft’s Power BI, and there is this thing in the market where you would love for all of your customers to migrate over to your platform entirely, and that’s just never going to happen. You’re going to have organizations that, for whatever reason, are pretty much standardizing on Teams, but they may have some users who really like to use Tableau because it’s powerful, because you can do all sorts of interesting integrations with data, particularly if it’s held within, let’s say, Salesforce Sales Cloud, but maybe they prefer the collaboration platform of Microsoft Teams. So you’re not going to necessarily get them off Teams, but you still want to support them in the flow of work by making all of this other data available within that collaboration platform.

So I think we’re going to see a lot more of that as we move forward. So those are just some of the new features that were announced that resonated with me. Obviously I’m not a data visualization analyst, but those are what seem to be interesting in terms of dovetailing with some of the bigger trends in the workplace, which is really trying to make it easier for people to interact with data wherever they are, in whatever applications they prefer, as opposed to making them go outside of that. So I wanted to just briefly touch on a few other things here, just general takeaways from the event. I think the biggest thing that really resonated with me is the high level of enthusiasm from its dedicated, loyal base of data analysts. We sat in the keynote on the first day, and whenever a new feature was announced, you had just this crazy enthusiastic cheering for these new features. Now, yes, some of it started to become a little bit of who can cheer the loudest or make the biggest little joke here or there, but you really can’t deny the fact that users really do like the product, and they really do appreciate the fact that Tableau does try to listen to its users in terms of incorporating new functionality or features.

Now that being said, Tableau management has said, hey, we listen to everything, but we won’t necessarily include everything users ask for, either for technical reasons or because it doesn’t fit into their strategy. But I think it’s clear that Tableau does have this loyal and dedicated base of users. And that will help them, particularly if you think about organizations and how they purchase software; having that loyal base of users pounding the table saying, we want this application, does help influence purchasing. Now, the challenge, of course, is translating that enthusiasm from the user level up to the CIO level, or whoever is involved with the purchasing. That’s where it’s going to really come down to Tableau being able to effectively tell their story, translate this enthusiasm among their user base, and link that to productivity benefits, ROI, and all of those more traditional business metrics which are used to determine software purchasing. Now, another takeaway I got here is that Tableau certainly is trying to incorporate new technology across the platform, but they seem to be doing it in a very measured and conservative way. In talking to some users at the event, some of them felt that Tableau moved a little bit more slowly than they would’ve liked.

They would love to see more innovation, more quickly, than Tableau is comfortable delivering. I think there was a good reason why Tableau was doing that. Ultimately, if you think of an enterprise software product, you can’t have a buggy or glitchy product when it’s being used by enterprises that are really depending on that information. And that means it takes time to pilot new features, it takes time to test them, it takes time to red team them to make sure that you can’t break them either intentionally or accidentally from a security standpoint. So I think it is a prudent approach by Tableau to take a more measured approach to rolling out new features. It might frustrate some users, but ultimately I think that’s the right way to go. I know we’re in an AI arms race right now where everybody wants to be the first to say, we have this new feature and we’re this close to going GA. But I think that, particularly when it comes to a product that is used for critical business analysis, it makes sense to take a more conservative approach to rolling out these new features. And obviously a lot of what we were told in terms of new product features and enhancements is under NDA, so I can’t really get into that, but I will say that management does understand that they need to deliver innovations to the market and to their own customers, and they are working on it.

Now, I alluded to this next point earlier, and really it revolves around embracing a more open technology ecosystem and being able to provide insights from this data, and provide these visualizations, within the flow of work, even if that flow does not include Tableau or Salesforce applications. I think this strategy is going to become the norm, where it isn’t going to matter what application you choose to work in for most of your daily tasks. You’re eventually going to get to the point where if you need data and it happens to be held in a different location or a different application, you will be able to access it through an API. I think that is going to become the standard as we move forward, because customers are going to demand it. This whole expansion of the technology stack is one thing in terms of making sure that you’re utilizing applications at the right level and managing the security issues, but it really comes down to the end user and productivity. If I have to cycle through eight different applications to get the data I need, that’s not efficient; you’re wasting time switching between apps and also potentially introducing errors, because if you’re not able to automatically pull that data, there are transcription errors, or you could just be putting it in the wrong place, what have you. It’s not an efficient way to do it.

And in this day and age, with the tools that we have, we can certainly do a better job of basically incorporating data and control within whatever application the user chooses to use most of the time. So I think this is a good strategy for Tableau to continue with. I would hope that other vendors are taking a similar approach as well. Now, I think the last point is pretty specific to Tableau. If you take a look at Tableau as a data visualization provider and you compare it against some of their competitors, whether you think of Oracle or Microsoft or whoever else is out there, Tableau is, generally speaking, positioned as a premium offering in terms of the license cost. Obviously, when you start to get into enterprise licenses, the pricing starts to vary a bit more, but if you were just to look casually, you’d go, oh, for a single-user license for, let’s say, Power BI, you can even use it for free initially, or it’s a relatively inexpensive $20 per user per month license, versus Tableau, which is significantly higher. Now, the issue of course is that that has created the impression, rightly or wrongly, that Tableau is relatively expensive in the marketplace, particularly for folks who want a data visualization platform but aren’t necessarily power users.

So Tableau really needs to demonstrate the real-world business value that can be derived from their platform, the use of all of their tools, and really the ecosystem around Tableau; that’s what is going to really tell the value story. It isn’t necessarily just about looking at the single-user license cost. It’s not about just saying one costs this much and one costs that much. They really have to dive into some of the other aspects, which can include things like support. I did speak with management, who mentioned that over the last year, year and a half, the company has really tried to refocus its efforts on providing a more tiered approach to support, to basically match the different types of users that they have with different support offerings, in an effort to really ensure that there’s customer success with Tableau. And I think that is what is going to really help them counter the perception that, oh, Tableau is very expensive. And I’ve had many conversations with people about this; you’ll have shops or organizations where they have a couple of Tableau licenses and a bunch more of whatever the other competitors are, and really what Tableau needs to do to expand is talk about the overall value that is being derived from the platform, both as an integrated application within Salesforce and as one that is able to integrate very easily with a number of different types of companies.

And we’re talking everything from its direct competitors like Microsoft, as well as some of the big cloud vendors out there who have data lakes where a lot of data is stored. So it’s certainly possible for them to do it. We’ll just have to see how they approach messaging moving forward. So with that, I will say it was a great conference. It was my first Tableau conference, and it was really interesting to see, again, the enthusiasm from not only the users but also from management, which I think has put them on the right path moving forward. Okay. Now I’d like to talk a little bit about some of the other news in the market. As everyone is aware, over the past couple of weeks, some very large SaaS application vendors have released earnings, and we can talk about ServiceNow, SAP, Microsoft, Google, OpenText, you name it. They’ve all pretty much done a couple of things. One is strong top-line growth for most of these companies in most of their divisions. What’s driving that? Well, certainly AI, or the promise of AI, and also the move to the cloud is driving earnings, simply because organizations are realizing that to deploy these new technologies, it is more efficient, generally speaking, to have some sort of a cloud infrastructure in place.

Now, I will say there is a bit of a caveat here in that a couple of these companies have issued future guidance that might be a little bit lower than what the street had been expecting, and we’re talking for late 2024 going into 2025. And I think there are a couple of concerns there. Obviously, there’s the continuing economic picture of high inflation, and I think there’s a bit of a worry about how that’s going to impact these companies’ abilities to continue on that same growth trajectory. There’s also the concern, I think rightfully so, that comes down to whether these companies are able to translate the enthusiasm in pilot programs with AI and initial use cases into real-world production value from AI use cases over the next quarter, two quarters, half a year, what have you. Because that’s really when they’re going to actually start generating revenue, when we start to see mass adoption of generative AI within these platforms. And it can be within the overall platform itself, in terms of more licenses being purchased, or it can be more of a tiered thing, where you have your basic license and you’re buying an add-on AI assistant component, which obviously generates revenue.

And then of course, the third thing is looking at this model that some companies are starting to adopt, which is more of a consumption-based approach, where you have a token-based system in which X number of tokens equals this type of generative AI process or function versus another function, which might use more or less. And the idea is that it almost works sort of like a retainer or a deposit account, where you have X number of credits and you basically draw down off of that, and that is how it is billed. And the idea, of course, is to more closely tie revenue to the actual cost of deploying generative AI. So there’s certainly a little bit of concern there as we move ahead. I think the key for all of these organizations is that they need to focus not necessarily on our AI can do this, or our technology is better than their technology; it has to be around how they are actually delivering business innovation and business benefits. And that means looking at real-world metrics: okay, by using this AI I’ve saved this amount of time or this amount of money, or my agents were this much more productive, or my knowledge workers were able to accomplish this much more because of AI.

I think those are the stories that need to be told, and that will continue to convince companies to invest in these types of products as we move forward, because at a certain point, there’s not going to be any difference between these vendors in terms of what generative AI can actually do. So with that, hopefully we will continue to see strength in the market as we move forward. Now, I’d like to wrap up the coverage of the news today and, of course, close out our show with the rant or rave segment. And this week I actually have a rave. One of the things that I heard about, not only at the Tableau Conference, but at the Oracle event I went to, the Ibuya event I went to, the Adobe event I went to, and even the Zendesk event that I went to, and these are just the most recent events I’ve been at in the last two or three weeks, is that these organizations are really highlighting the need for their end customers to have a really solid data management, data governance, and data observability strategy. So what are we talking about here? Well, we all know that LLMs are really powerful, but organizations are only going to be successful with generative AI if they have data that is clean, that is labeled, that is properly segmented, and that is ultimately available for use so the LLMs can actually act upon it.

So if you think about it, that means making sure that the data metrics are in place to measure the impact of an LLM. What actually are you trying to accomplish, and does that actually help you? Does that get you to that goal? Certainly, it means making sure that all of the data has the proper metadata so it can be labeled, identified, and used where it needs to be. And then of course, looking at reporting; that is a big issue for many, many companies, making sure that you know what’s happening at all times with that data, how it’s being used, what’s the impact, what is the input, and what is the output of using generative AI, or really any AI, on that data. So I think vendors are really talking a lot about that now. And the reason is because they’re getting to the point where most of them are realizing that, while some of them may have their own LLMs, most are taking more of an open approach to LLMs, and really customers are not going to see the benefit of generative AI unless the AI is acting upon the customer’s data, and unless that data is accurate and can actually hold insights that are able to be extracted by the LLMs.

So really, I think we’re starting to see both the vendors and of course, end user organizations focus on data as an enabler of generative AI performance and making sure that they are doing everything they can to make sure that their data strategy is sound before they try to actually implement generative AI. And many of these companies, many of these vendors are actually working with them through their professional services division to help them along that process. And I think that is going to be key to seeing this widespread adoption of generative AI. So that’s all the time I have today. So I want to thank everyone for joining me here on Enterprising Insights. I’ll be back again next week with another episode focused on the happenings within the enterprise application market, CX and EX market and collaboration space as well. So thank you all for tuning in and be sure to subscribe, rate and review this podcast on your preferred platform. Thanks, and we’ll see you next time.

Author Information

Keith has over 25 years of experience in research, marketing, and consulting-based fields.

He has authored in-depth reports and market forecast studies covering artificial intelligence, biometrics, data analytics, robotics, high performance computing, and quantum computing, with a specific focus on the use of these technologies within large enterprise organizations and SMBs. He has also established strong working relationships with the international technology vendor community and is a frequent speaker at industry conferences and events.

In his career as a financial and technology journalist he has written for national and trade publications, including BusinessWeek, Investment Dealers’ Digest, The Red Herring, The Communications of the ACM, and Mobile Computing & Communications, among others.

He is a member of the Association of Independent Information Professionals (AIIP).

Keith holds dual Bachelor of Arts degrees in Magazine Journalism and Sociology from Syracuse University.
