⏰ Reading Time: 7 minutes ⏰
“Everything should be made as simple as possible. But not simpler.”
- often attributed to Einstein
In analytics, I see two extremes over and over again.
On one end, there’s lazy analytics.
On the other end, there’s massive overengineering.
And if I’m being honest… I’ve personally done both.
Early in my career, I built churn prediction models with dozens of features, training data pipelines, model evaluation frameworks - the whole machine learning circus.
But I’ve also seen companies define churn like this:
“If a customer hasn’t bought anything in the last 90 days, they are churned.”
No analysis. No validation. Just a nice round number.
Both approaches miss the point.
And the lesson I learned over the years is simple:
Great analytics usually lives in the 80/20 middle.
Data teams are full of smart people.
And smart people love solving complex problems.
But here’s the thing I sometimes struggle to admit:
Business impact isn’t driven by intellectual elegance.
It’s driven by ROI.
The goal of analytics isn’t to build the most sophisticated model.
The goal is to create the largest possible business impact with the least amount of resources.
And one of the most important levers in that equation is simple:
Don’t overbuild.
Let me tell you about a mistake I made years ago.
I was working on customer churn prediction. Naturally, my instinct was:
“Let’s build a model.”
So I went down the classic data science path.
We collected dozens of variables.
Then we trained models.
Tested different algorithms.
Iterated.
Weeks of work.
The result?
A model that predicted churn reasonably well.
But then something funny happened.
We compared the model to a simple rule-based heuristic.
Something like:
Customers who have been inactive longer than their typical purchase cycle are at risk.
And guess what.
The simple heuristic performed just as well as the complex model.
At a fraction of the cost and time.
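A heuristic like this fits in a handful of lines. Here's a minimal sketch - the toy data, the median-gap definition of "typical purchase cycle," and the 1.5× grace factor are all illustrative assumptions, not the exact rule we used:

```python
from datetime import date

# Purchase history per customer: {customer_id: [purchase dates, oldest first]}
# Toy data for illustration.
purchases = {
    "alice": [date(2024, 1, 1), date(2024, 2, 1), date(2024, 3, 1)],
    "bob": [date(2024, 1, 1), date(2024, 1, 15), date(2024, 6, 1)],
}

def at_risk(dates, today, grace=1.5):
    """Flag a customer whose current inactivity exceeds their typical
    purchase cycle (median gap between purchases) times a grace factor."""
    if len(dates) < 2:
        return False  # not enough history to estimate a cycle
    gaps = sorted((b - a).days for a, b in zip(dates, dates[1:]))
    typical_cycle = gaps[len(gaps) // 2]  # median gap in days
    inactive_days = (today - dates[-1]).days
    return inactive_days > grace * typical_cycle

today = date(2024, 7, 1)
flags = {cid: at_risk(ds, today) for cid, ds in purchases.items()}
print(flags)
```

Note the design choice: the median gap makes the "typical cycle" robust to one unusually long pause, which a mean would not be.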
That was a humbling moment.
Not because machine learning is useless - it absolutely isn’t.
But because I had violated a very important rule.
I skipped the 80/20 step.
Now let’s look at the other side of the spectrum.
Many companies define churn with a fixed cutoff - say, 30, 60, or 90 days of inactivity.
Why those numbers?
Usually: "Because that's what we used at my last company." Or gut feeling.
This is what I call lazy analytics.
It’s easy. It’s fast.
But it’s also wrong most of the time.
Different businesses have very different purchasing patterns.
For some products, 30 days of inactivity is perfectly normal.
For others, 30 days means the customer is long gone.
And even for the same product, the differences can be huge across markets, customer segments, and so on.
So how do we improve this without building a complex machine learning model?
Here’s a simple, data-driven method I often use.
For every number of inactive days N, look at your historical data and compute the share of customers who never purchased again after reaching N days of inactivity. Then plot that share against N.
Now something interesting appears.
The curve tells you, for any inactivity duration, how likely it is that the customer is gone for good.
[Chart: probability of not returning vs. days since last purchase - the actual chart we used]
This gives you two useful markers:
Soft churn → 60% probability of not returning (after 32 days)
Hard churn → 80% probability of not returning (after 65 days)
These thresholds are data-driven, tailored to your business, and cheap to compute.
And you can refine them further by segment.
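This kind of threshold analysis can be sketched in plain Python. The spell data below is a toy assumption: each entry is an inactivity spell that either ended with a purchase or was still open on the analysis date. The 0.6 and 0.8 targets mirror the soft/hard churn markers above:

```python
# Each inactivity spell is (days, returned):
#   - ended with a purchase  -> (gap between purchases in days, True)
#   - still open at analysis -> (days since last purchase, False)
# Toy data; the real version would be built from your order table.
spells = [(5, True), (12, True), (15, True), (20, True), (30, True),
          (35, True), (40, False), (50, False), (70, False), (90, False)]

def prob_no_return(spells, d):
    """Among spells that reached at least d days of inactivity,
    the share that (so far) never ended with a purchase.
    Open spells shorter than d are excluded - we don't know their fate yet."""
    reached = [returned for days, returned in spells if days >= d]
    if not reached:
        return None
    return sum(1 for returned in reached if not returned) / len(reached)

def first_day_above(spells, target, max_days=365):
    """Smallest inactivity duration at which the no-return
    probability crosses the target (e.g. 0.6 soft, 0.8 hard)."""
    for d in range(1, max_days + 1):
        p = prob_no_return(spells, d)
        if p is not None and p >= target:
            return d
    return None

soft = first_day_above(spells, 0.6)  # soft-churn threshold in days
hard = first_day_above(spells, 0.8)  # hard-churn threshold in days
print(soft, hard)
```

On real data you would read the markers off the full curve and, as noted above, refine them by segment. (Strictly speaking, still-open spells are censored observations; a survival-analysis library handles that properly, but this approximation is usually close enough for setting thresholds.)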
Congratulations.
You just upgraded from lazy analytics to top-tier customer analytics - without writing a single line of machine learning code.
There’s a deeper psychological pattern here.
Data professionals are usually very smart (at least that's what I like to think of myself 😅).
And smart people often enjoy solving difficult technical problems.
There’s nothing wrong with that.
But complexity can become a trap.
Complex solutions can be powerful, and they’re fun to build.
But they often come with massive costs: longer development time, ongoing maintenance, and more things that can break.
And in fast-growing companies and competitive environments, that’s a problem.
Because complexity consumes the one resource many companies don’t have (especially in today's AI-first world): time.
Over the years, I’ve developed a simple mental rule.
Whenever someone proposes a complex analytics solution, I ask:
Can we achieve at least 80% of the value with 10-20% of the effort?
And very often the answer is yes.
Here are a few other examples.
Instead of Data Mesh
→ allow business users controlled access to a BigQuery data foundation via Google Sheets.
Instead of marketing mix modeling
→ help marketing teams understand real customer journeys across channels.
Instead of perfect unit economics
→ get contribution margin per unit right and approximate the rest.
Instead of custom recommender systems
→ manually build cross-sell logic for the top products that generate most of your revenue.
Is this perfect?
No.
But it often delivers 80% of the value with 20% of the complexity.
And that’s exactly what companies need in competitive market environments.
Whenever I face an analytics problem, I go through three steps.
1️⃣ Start with the simplest possible heuristic
2️⃣ Validate whether it gets you most of the way
3️⃣ Only add complexity if the simple solution clearly fails
Most teams skip step one. I certainly did.
I used to jump straight into building the “perfect” solution.
But the best analytics teams I know follow a different philosophy:
Build less. Deliver more.
Lazy analytics produces bad decisions.
Overengineered analytics produces no decisions.
The real impact lies somewhere in between.
The sweet spot is the 80/20 middle.
Start simple.
Validate quickly.
And only add complexity when it’s truly justified.
Because the goal of analytics isn’t sophistication.
The goal is ROI.
Cheers,
Sebastian