Throwing away BI is a Bad Idea
With all of the vibe analytics, Text2SQL, and AI-for-analytics offerings out there, people are thinking that BI's days are numbered. Yoni Leitersdorf, our CEO & Co-Founder, begs to differ.
If you doom-scroll through LinkedIn for more than a couple of minutes, you’re bound to encounter a post about BI being dead and AI replacing it. It’s especially true if you’re in the data and analytics field.
However, contrary to the AI enthusiasts out there yelling into the ether, I would like to share a different opinion: BI is (hopefully, probably) not going anywhere soon.
Yes, people are building data chatbots.
People are playing around with semantic layers for AI.
We’ve even heard of vibe analytics.
But… the reality is that AI still has a hard time making sense of the data and visualizing it correctly, and that will remain true for the foreseeable future. It's still just too hard. (AI is still under-delivering.)
It's even worse than that: by handing a business user the "loaded gun" of an unchecked data chatbot, you risk them shooting themselves in the foot. Or shooting you in the foot (when they present an AI-generated chart at your exec meeting).
In a recent conversation with a large retailer, we were told they had brought such an AI tool in, only to find that business users routinely sent its results to analysts to review (effectively, to reverse engineer) before relying on them.
That AI tool has since been removed from their environment.

BI’s next act
Many enterprises have invested millions of dollars, if not tens of millions, in their BI: building dashboards, improving dashboards, ignoring old dashboards and then building new ones in their stead. All of that effort essentially codifies the organization's core metrics and the analytics around them.
We would be foolish to throw that away. There's a gold mine there!
The correct approach would be (a rough sketch follows this list):
Pull metadata out of your current BI platform: the definitions of the dashboards and reports, the transformation/semantic layer, the data sources, the activity log, and so on.
Use that metadata to identify the high-quality dashboards, and serve them to business users in response to natural language queries.
Use the same metadata to learn more about the underlying data sources themselves (assets within the cloud data warehouse, for example), and use that knowledge to guide querying of those sources, both through Text2SQL and manually.
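To make the second step concrete, here is a minimal, illustrative sketch, not production code: given dashboard metadata exported from a BI platform, rank dashboards by a crude quality signal derived from the activity log, then match a natural language question to the best candidates. The metadata shape, field names, and scoring below are hypothetical placeholders; a real implementation would read the platform's actual activity log and use semantic search or an embedding model rather than token overlap.

```python
from dataclasses import dataclass


@dataclass
class Dashboard:
    name: str
    description: str
    views_last_90_days: int   # hypothetical field, derived from the BI activity log
    owner_is_active: bool     # stale ownership is a common "low quality" signal


def quality_score(d: Dashboard) -> float:
    """Crude proxy for 'high quality': actively used and actively owned."""
    return d.views_last_90_days * (1.0 if d.owner_is_active else 0.25)


def relevance_score(question: str, d: Dashboard) -> float:
    """Token-overlap relevance; swap in embeddings/semantic search in practice."""
    q_tokens = set(question.lower().split())
    d_tokens = set((d.name + " " + d.description).lower().split())
    return len(q_tokens & d_tokens) / max(len(q_tokens), 1)


def answer_with_existing_bi(question: str, dashboards: list[Dashboard], top_k: int = 3):
    """Return the best existing dashboards instead of generating a brand-new chart."""
    ranked = sorted(
        dashboards,
        key=lambda d: relevance_score(question, d) * quality_score(d),
        reverse=True,
    )
    return ranked[:top_k]


if __name__ == "__main__":
    catalog = [
        Dashboard("Weekly Revenue by Region", "revenue, region, weekly trend", 420, True),
        Dashboard("Churn Overview", "customer churn rate by cohort", 180, True),
        Dashboard("Old Revenue Report (2021)", "revenue by region", 3, False),
    ]
    for d in answer_with_existing_bi("what is revenue by region this week?", catalog):
        print(d.name)
```

The point of the sketch: the answer to a business question is often an existing, trusted dashboard, and the BI metadata is what lets you find it.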
This way, you are leveraging your investment in your BI instead of wasting it. You drive up the ROI while also adopting the newest capabilities AI has to offer.
In other words, it’s not PowerBI/Tableau vs Snowflake Cortex/Databricks Genie. It’s both together for the win.
But Yoni… BI platforms have terrible APIs
Our experience at Solid with the BI platforms' APIs is that they seem to have an internal conflict: on the one hand, they offer APIs because customers expect them to. On the other hand, they are afraid those APIs will be used to replace them (for example, a migration project that uses the old BI's API to extract its definitions). The public APIs are good, but not good enough… and I think that's on purpose.
So, we invest a monumental amount of development time in leveraging both public and not-so-public APIs, reverse-engineering their metadata structures, and other shenanigans to make this work. Is it glorious work? Far from it. But someone needs to do it, right?
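As a taste of what the public-API side of that work looks like, here is a hedged sketch of pulling workbook and sheet definitions out of one BI platform (Tableau) via its Metadata API GraphQL endpoint. The endpoint path and X-Tableau-Auth header come from Tableau's public documentation, but the exact GraphQL fields available depend on your server version, so treat the query below as an assumption to verify; the server URL and token are placeholders. Every other platform (Power BI, Looker, and friends) needs its own connector, which is exactly where the development time goes.

```python
import requests

TABLEAU_SERVER = "https://your-tableau-server.example.com"  # placeholder
AUTH_TOKEN = "token-from-your-sign-in-flow"                 # placeholder

# Field names are an assumption; check them against your server's GraphQL schema.
QUERY = """
{
  workbooks {
    name
    projectName
    sheets { name }
  }
}
"""


def fetch_workbook_metadata() -> dict:
    """POST a GraphQL query to Tableau's Metadata API and return the data payload."""
    resp = requests.post(
        f"{TABLEAU_SERVER}/api/metadata/graphql",
        headers={"X-Tableau-Auth": AUTH_TOKEN},
        json={"query": QUERY},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]


if __name__ == "__main__":
    for wb in fetch_workbook_metadata()["workbooks"]:
        print(wb["projectName"], "/", wb["name"])
```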
There are several people on our development team who wake up at night, covered in sweat, after a nightmare about LookML, DAX, Tableau's semantic layer, or some other vendor-invented "modeling language". What I'm proposing in this blog post is easier said than done, but it's definitely possible.
My point here is… if we do it, if one or more software vendors do it, then there’s a pot of gold at the end of the rainbow for us all.