⏰ Reading Time - 9 minutes ⏰
“The future analyst won’t write SQL, they’ll configure the AI analyst.”
Let that sink in.
It’s a fundamental shift in how data work will be done.
Last week, I promised to go deeper into what I'm actually building, where things are breaking down, and why the real bottleneck isn't what you think it is.
So, let's get into it, shall we?
Simple: I want to take you inside the process of building a functional AI analytics agent, one that could change how you analyze data in your team or business.
And I’ll show you why a GPT wrapper isn't the real answer.
You’ll learn:
In the last issue, I outlined the general direction of where this is going. Here's a quick recap:
Step 1 – Reactive Agent
Answers business questions on demand. Think: "What was revenue last week?" or "How did the email funnel perform this month?"
Step 2 – Proactive Agent
Constantly monitors business goals and initiatives. It tracks whether you’re hitting targets and surfaces risks before they turn into problems.
Step 3 – Autonomous Agent
Takes action. For example, it might pause a losing landing page test or reallocate spend between ad creatives, without you lifting a finger.
Right now, I’m building Step 1: a reactive agent that works.
And trust me, even that’s not as plug-and-play as people claim.
Wobby's pitch: AI Analysts that deliver business-ready insights straight from your data warehouse, right in Slack or Teams. Self-serve analytics, trusted results.
I connected my own Data Action Mentor BigQuery Data Warehouse and started building my first agents.
The goal:
No more ad hoc SQL.
Just ask questions about my data in Slack and get deterministic, high-quality answers.
The Wobby team sees a future where the data analyst role disappears and merges into the analytics engineering role, which will be responsible for feeding the agents clean data and metadata and for configuring them.
The idea is to build domain-specific agents. You configure them with rules and data access, and then ask questions in Slack or Teams.
Meet my sessions agent:
Its job: Analyze web funnel performance for sales of my Masterclass "From dashboard factory to strategic partner."
The Instructions window contains general context on how the AI agent should interpret and respond to analysis tasks.
Next, you can restrict the agent to specific datasets and tables. I decided to let it laser-focus on the sessions table in my DWH's data-marts layer.
You can also decide whether your agent is allowed to execute custom SQL queries. If this toggle is switched on, the agent can write its own queries based on your question and the metadata it can access. I switched this off, as 90% of the custom queries were completely off (and my DWH has very clean metadata 😉).
If the switch is "off", the agent can only perform queries based on what it knows from its Knowledge Base.
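The toggle can be sketched as a simple branch. This is a hypothetical illustration of the behavior described above, not Wobby's actual code; the query names and SQL are invented:

```python
# Hypothetical sketch of the custom-SQL toggle.
# With the toggle off, the agent can only run queries it already
# knows from its Knowledge Base -- the deterministic path.

APPROVED_QUERIES = {  # stands in for the agent's Knowledge Base
    "weekly_revenue": "SELECT SUM(revenue) FROM datamarts.sessions",
}

def generate_sql(question: str) -> str:
    """Placeholder for the LLM call; non-deterministic in practice."""
    return f"-- SQL generated from: {question}"

def answer(question_key: str, allow_custom_sql: bool = False) -> str:
    """Return the SQL the agent is allowed to execute for a question."""
    if question_key in APPROVED_QUERIES:
        return APPROVED_QUERIES[question_key]  # always the same answer
    if allow_custom_sql:
        return generate_sql(question_key)      # LLM-written, may be off
    raise ValueError(f"No approved query for {question_key!r}")
```

With the toggle off, an unknown question fails loudly instead of producing a plausible-but-wrong query, which is exactly the trade-off I made.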
Here, you can provide two types of knowledge. One of them is contextual descriptions of your business context and vocabulary. For example, I defined my different types of funnels:
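To make that concrete, here is a minimal sketch of what such vocabulary entries could look like once structured. The funnel names and descriptions are made up for illustration, not my actual definitions:

```python
# Illustrative only: business vocabulary as structured entries
# instead of free-floating prose in a text field.

FUNNEL_VOCABULARY = {
    "masterclass_funnel": "Sessions that land on the masterclass page "
                          "and end in a checkout.",
    "newsletter_funnel": "Sessions that end in a newsletter subscription.",
}

def build_context() -> str:
    """Concatenate vocabulary entries into the context the agent sees."""
    return "\n".join(f"{name}: {desc}"
                     for name, desc in FUNNEL_VOCABULARY.items())
```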
Sounds clean. But here’s where things started to fall apart.
Despite all the prep and the very tightly defined use case and data access, my Wobby agent didn’t behave deterministically.
Even when I blocked it from writing its own SQL, results were inconsistent. And when I tried expanding the context to help it out more, two big issues emerged:
Why?
Because Wobby doesn’t currently support a real semantic layer. All that business logic and interpretation is stored in isolated bits of context.
In short:
You can’t get deterministic answers if your agent is guessing what your data means every time.
I'm convinced:
AI Agents won’t work reliably without a semantic layer. One that’s written and maintained in code.
The current Wobby-agent approach breaks because:
→ The agent has no structured way to understand your data
→ All logic lives in disconnected text fields
→ Updating it at scale is a nightmare
If your agent doesn’t know what “revenue,” “active user,” or “checkout conversion” mean in a precise, reusable way, then you’ll always be stuck babysitting it.
Which defeats the point.
That’s where I’m heading next.
Right now, I’m testing Connecty AI, a tool that helps build and maintain a semantic layer using AI. It promises to:
This matters, because defining and maintaining a semantic layer manually is:
Connecty's goal is to build the world's first fully autonomous Day 0 semantic layer.
Many data teams playing with AI agents are missing the most important piece: common, foundational understanding.
Without a semantic layer, your agent is just a fancy autocomplete that makes pretty dashboards but doesn’t understand what it's showing.
Here’s what I’ve learned:
If you’re serious about building analytics agents that don’t just look cool but actually help your business, the next step is clear:
You need to start investing in a real semantic layer.
I’ll go deeper into how I’m testing Connecty and what that setup looks like in the next newsletter.
Until then:
Ask yourself: Are you ready to build AI analytics agents?
Or will you build master hallucinators?
See you next time!
Sebastian
P.S.: This is not a sponsored post. I'm sharing my neutral, unbiased observations.
Subscribe for weekly tips on building impactful data teams in the AI-era
Data Action Mentor Masterclass: 🏭 From dashboard factory to strategic partner ♟️
A digital, self-paced masterclass for experienced data professionals who want to work on high-leverage projects (not just dashboards). 📈
Free content to help you on your journey to create massive business impact with your data team and become a trusted and strategic partner of your stakeholders and your CEO.
We build 10X, AI-first data teams. Together.
A curated community for ambitious data leaders who generate outsized business impact (and outsized career growth) by building the AI-powered 10X data team of the future. For the price of less than $1 per day.
You'll get expert content, hype-free conversations, and curated 1:1 matchmaking with forward-thinking professionals in the data & AI space.