Hi,
In my previous emails, I've been exploring the hidden mistakes that keep data scientists from advancing their careers. We've discussed the "skills collection trap" and the danger of becoming an order-taker rather than a strategic partner.
Today, I'm addressing the third career-limiting mistake: building impressive proof-of-concepts that never actually solve business problems.
Mistake #3: Building Shiny POCs That Never Make It To Production
"Most analytics and AI projects fail because operationalization is only addressed as an afterthought." - Gartner
After learning so many impressive data science skills, it's only natural to want to put them to use. And when those exciting opportunities aren't
landing in your lap because you're focussed on technical skills instead of business problems (mistake #1) and you're waiting for orders instead of uncovering needs (mistake #2), there's a predictable next step...
You build a proof-of-concept (POC) to showcase your technical prowess.
From cutting-edge AI projects to complex neural network/random forest ensembles, the technical sophistication buried in Jupyter notebooks on data scientists' laptops would impress even the most seasoned practitioners.
But these projects rarely make it to production - instead, they become digital ghosts that contribute nothing to your career advancement or the company's bottom line.
Why?
Because many POC projects suffer from two fatal flaws:
First, they don't actually solve any pressing business problems -
they solve technically interesting problems instead. They're the equivalent of the portfolio projects data scientists are advised to build when looking for their first job.
Second, even when they do address real business needs, they often require significant operational changes that the organisation is unwilling or unable to implement. The solution might be brilliant, but if it
demands rebuilding entire workflows and processes, it's unlikely to ever see the light of day.
I know a data scientist who recently ran into this exact problem. He had built a sophisticated anomaly detection model in a Jupyter notebook that, if put into production, he was sure would reduce the number of defective items by 15%.
But he couldn't get the business support he needed because implementing it would require changes to three different systems and retraining an entire department. After months of work, the model just went to waste - along with his opportunity to demonstrate value to leadership.
Eric Siegel highlights the disconnect between model development and production in The AI Playbook:
"An ML project is a business endeavour, not simply a technical one that can be handed off to data scientists to take on alone. After all, a model is going
to directly change business operations, so the project requires a wholly collaborative process driven by business needs." (p.42)
This disconnect doesn't just waste time and resources - it also actively holds back your career.
Every impressive
model that stays trapped in your laptop is another missed opportunity to advance your career.
But how can a data scientist bridge this gap between technical possibility and business reality?
What to Do Instead?
Instead of building POCs, try building minimum viable products (MVPs) that prioritise implementation over technical sophistication.
Start with the simplest possible solution to a real business problem and focus on getting it into stakeholders' hands as fast as you can - even if that means delivering insights by email, in an Excel spreadsheet, or by writing numbers on a Post-it note and sticking it to their computer.
A high school maths teacher and aspiring data scientist I know exemplified this approach perfectly. He identified a specific data need among faculty, developed a simple but valuable report, and then
emailed it to his fellow staff on the first day of each month, directly from his computer. No fancy deployment, no complex infrastructure - just direct delivery of insights that solved a real business problem.
This approach accomplishes two critical things: it confirms whether your solution addresses a real business need and it demonstrates value right away - both essential for
career advancement.
Once the business need is validated and stakeholders have experienced the initial value, you can iterate and enhance your solution based on actual stakeholder feedback, building the case for more substantial organisational change as the business impact becomes undeniable.
This is exactly the sort of collaborative approach Siegel advocates, but with the minimum of friction along the way.
A simple solution might not seem so impressive to your data science peers, but it is infinitely more valuable than a sophisticated one that never leaves your laptop.
Watch for next week's email, where I'll cover the final and perhaps most critical mistake: Letting Your Wins Die In Silence - the reason why your great work might go unnoticed by those who make promotion decisions.
Have you ever built an impressive technical solution that never made it into production?
Reply and share your experience.
Talk again soon,
Dr Genevieve Hayes.
P.S. If you missed the previous emails in this series, or if you prefer to read the complete article now rather than waiting for the weekly instalments, you can access it in its entirety HERE.