Led Jobs-To-Be-Done creation: captured insights across 21 past research studies and our 300+ feature requests, ran CAB workshops to understand customer needs, and spoke weekly with our internal users and SMEs. From there I mapped our 4 core and secondary user roles using the JTBD framework, creating 8 job stages for each.
A clear, evidence-based roadmap; faster discovery cycles (measuring time savings); design insights now shared across teams using engineering tools (GitHub/Cursor); and stakeholders actively seeking research inputs on customer conversations that feed back into our innovation stream of work.
From "Can we build it?" to "Should we build it?" – proving that in the age of AI, knowing what to build is design's strategic value.
Helping farmers make fast decisions in the field
Recording Health Treatments offline
With AI enabling us to build faster than ever, the question shifted from "Can we build it?" to "What should we build?"
In an engineer-focused business in the age of AI, where product roles are evolving, I needed, as a Design Product Lead, to show that design's role goes far beyond making a product look good: it's about deeply understanding what our customers need and guiding the effort to build the right thing, the thing that makes a difference.
Chasing opinions instead of evidence, fragmenting effort across teams, adding complexity to already-overloaded users, and losing credibility as a design function.
Replace idea-led planning with opportunity-led discovery. Know what to build, why, and in what order – backed by customer evidence, not internal assumptions.
• Led Jobs-To-Be-Done, Outcome-Driven Innovation (opportunity scoring), and opportunity mapping (21 studies, 18 stakeholders), creating artefacts that guide our discovery and roadmap creation.
• Built an interactive JTBD website using Cursor + GitHub so all teams can access insights.
• Established customer validation processes to usability-test designs and innovation concepts with our Customer Advisory Board (14 key customers, quarterly).
• Made customer needs visible to everyone at Datapay, including engineering leadership, to help guide the teams to research-led innovation.
Product Leaders, Customers, My Design Team, Engineering, Data, Compliance, Customer Success, and Go To Market.
When I joined FIQ, the IQ app only had a basic mob move record.
When I left FIQ, the IQ app had 11 new recordings, full Tasks and Diary features, and app notifications, among other enhancements.
The approach:
Use Jobs-to-be-Done and Outcome-Driven Innovation to turn innovation from guesswork into science.
"Innovation is not about ideas – it's about systematically addressing unmet customer needs." Tony Ulwick, creator of ODI
Synthesised existing insights to understand where we'd already heard customer pain.
Reviewed all 21 past research projects (including the 5 projects I've led since joining Datapay).
• Reporting & payroll verification (6/6 sessions): Exception-first, operational efficiency. This is a broken control layer, not a missing feature.
• Integration reliability & automation (5/6 sessions, implied in the 6th): Stop manual workarounds.
• Multi-position & cross-company management (5/6 sessions): Handle complexity at scale.
"Reporting is how enterprise customers prove payroll is correct — when it fails, everything fails"
"Clients have even stopped asking... they know feature requests go nowhere."
Our roadmap was driven by one-off requests, not patterns.
High-level summary of the research synthesis.
Some internal teams are feeling the pressure more than others. This is a real opportunity for product and customer support to work together on solutions (part of our plan).
🧑🤝🧑 Employees. Get paid right. 💬
Employees want the right amount of pay on time, every time.
Confidently approve pay-affected changes for their team without holding up the pay run.
Pay employees correctly, compliantly, and on time. Every time.
Confirm payroll is accurate, compliant, and properly controlled with evidence.
Get the payroll data I need, in a format I can use, without creating work for the Payroll Professionals and without delay.
Our Market Canvas Summary. The Payroll Ecosystem Diagram to the right.
"Identify issues that need attention" (Opportunity score: 6/3/9)
Payroll professionals are cognitively overloaded – manually checking thousands of values, nervous about missing something subtle, unsure where to start when things go wrong.
"I haven't got the ability to check everything."
Cognitive overload from manually checking thousands of values
Nervousness about approving pay they don't fully understand
Uncertainty about root causes of unusual values
Fear of missing something subtle that affects employee pay
Frustration when the system can detect an anomaly but can't explain it
Stress when anomalies are caught late and require last-minute fixes
HTML ANOMALY DETECTION
JTBD insights were hidden in Confluence documentation. Product, engineering, and design all needed access, but PDFs and slide decks don't scale.
I built an interactive JTBD website using Cursor and GitHub – the same tools our engineers use.
• Browse our customers' JTBD in detail, at whatever depth suits them: learn who our core and secondary customers are and what they need, view our key opportunities, or delve into the deep insights for all 8 job stages.
• Search by job step, pain point, or opportunity score.
• Link directly to specific insights in planning docs.
• See desired outcomes, success criteria, emotional jobs, and supporting quotes.
Using engineering tools to share our customer insights let us deliver insights at speed without spending manual effort on slide decks. The HTML insights are easily updated as we validate our Job Maps further and uncover new opportunities for different customer segments.
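The site's search behaviour (by job step, pain point, or opportunity score) amounts to filtering a structured list of insights. A sketch of that idea, with illustrative field names and sample data rather than the actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Insight:
    job_step: str            # one of the 8 job stages
    pain_point: str
    opportunity_score: float
    quote: str               # supporting customer quote

def search(insights, job_step=None, keyword=None, min_score=0.0):
    """Filter insights by job step, pain-point keyword, and minimum score."""
    return [
        i for i in insights
        if (job_step is None or i.job_step == job_step)
        and (keyword is None or keyword.lower() in i.pain_point.lower())
        and i.opportunity_score >= min_score
    ]

# Sample data (illustrative)
insights = [
    Insight("Identify issues", "Cognitive overload from manual checks", 9,
            "I haven't got the ability to check everything."),
    Insight("Verify pay", "Anomalies caught late need last-minute fixes", 7, ""),
]
high_priority = search(insights, min_score=8)
```

Because the data is structured rather than locked in a deck, the same records can drive the browse, search, and deep-link views the site exposes.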
Validate upfront → iterate carefully → build
Iterate rapidly with AI → validate with customers → build with confidence
Was: Sketches/presentations (slow). Now: AI + rapid generation (fast)
What's our role now?
Was: 2–3 directions. Now: 5+ variations instantly
How do we evaluate at scale?
Was: User research upfront. Now: Iterate → validate
When do we research?
Was: Detailed specs. Now: Working prototype + direction
What does done look like?
Depending on solution complexity and risk (high complexity or a new ODI piece), we do all of the above before engineering builds anything. We de-risk with speed, not more planning. If the solution is low complexity and risk, we let our new AI Design System generate solutions and quickly validate at the end.
I didn't just enable my team to adopt AI tools. I redesigned our discovery operating model around them, then taught the team and other Product Leaders how to work this way.
• Test concepts before engineering invests.
• Validate that high-scoring opportunities resonate in practice.
• Surface new pain points we hadn't mapped yet.
• Build trust and show customers we're listening.
• Different UX patterns for surfacing errors
• Trust signals (explain-why, variance views, tolerance bands)
• Automation boundaries (when agents act vs. suggest)
• Explainability standards (what, why, and how to reverse)
• Explainability is non-negotiable – Payroll professionals need to know why something is flagged, not just what.
• Configurability – experienced Payroll Professionals know when the numbers look off for a specific pay run; they want a system that can learn from them and help train their less experienced team members.
• Transparency builds trust – show the logic, not just the answer.
• Reversibility is essential – users need to override AI suggestions safely.
• Spend more time with the anxious designer, sharing industry insights regularly.
• Explain that we control what design's value is – it's about judgment, craft, framing, and velocity, not just execution.
• Model using AI tools openly (Cursor, Claude, Loveable)
• Share my own feelings – I'm navigating the same uncertainty
AI doesn't replace designers who can think strategically – it sorts them. Designers who can frame problems, validate with customers, and move fast with AI are more valuable than ever.
• The Lead Product Manager creates our Roadmap with the above insights.
• I own the ODI opportunity landscape, showing where needs are most underserved.
• I create Solution Trees that force breadth before depth and document trade-offs.
• Opportunity map that sequences bets into themed roadmap tracks.
• AI-assisted concept patterns with guardrails, explainability, and human-in-the-loop principles.
• Interactive JTBD website (built in Cursor/GitHub) for cross-team access that is easily updated.
↑ Stakeholder Alignment. CAB confidence scores increase, "Solves my real problems".
↑ Design Team Capability. Designers using AI tools independently, multiplying team velocity.
↑ Design Value & Customer Voice. Insights shared, stakeholders are now inviting research to customer conversations and other roles understand what our customers need.
• Roadmap clarity. Fewer prioritisation reversals, less cross-team debate, clearer decisions.
• Fewer "why are we doing this?" questions in planning.
• Teams using JTBD language in discussions ("This serves the 'identify anomalies' job")
• Stakeholders are reaching out to involve research in customer conversations (previously didn't happen).
• Research summaries shared get active engagement with replies like: "This is exactly what I'm hearing from clients"
• CAB feedback shifted from "Can you add [feature]?" to "This actually solves my real problems"
• Time-to-confidence: How fast users go from detection → decision
• Exception resolution rates: % of anomalies resolved without manual investigation
• Variance explanation completeness: Are users getting the "why"?
• Cost-to-serve reductions: Less support, fewer errors, faster pay cycles
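The first two metrics above are simple ratios and durations; a minimal sketch of how they might be computed (function names and units are illustrative assumptions, not our production analytics):

```python
def exception_resolution_rate(resolved_without_investigation: int,
                              total_anomalies: int) -> float:
    """Share of anomalies resolved without manual investigation (0.0-1.0)."""
    if total_anomalies == 0:
        return 0.0
    return resolved_without_investigation / total_anomalies

def time_to_confidence(detected_at: float, decided_at: float) -> float:
    """Elapsed time (e.g. minutes) from anomaly detection to a pay decision."""
    return decided_at - detected_at

# Example: 45 of 60 anomalies auto-resolved; decision made 15.5 min after detection
rate = exception_resolution_rate(45, 60)      # 0.75
elapsed = time_to_confidence(10.0, 25.5)      # 15.5
```

Tracking these per pay run makes the "faster detection → decision" claim measurable rather than anecdotal.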
IQ stickiness increased by 17.4% in 12 months.
When jobs lead, role debates quieten. Product, engineering, and business can all understand "Help payroll professionals spot errors faster" without arguing about features.
Scored outcomes beat loud opinions. It's hard to argue with "This scores 9/10 and that scores 5/10"; we knew where to focus our agentic efforts.
Designers who can frame problems, validate with customers, and move fast with AI are more valuable than ever. Designers who just execute are at risk.
Agentic value is proportional to trust. Users need to know why AI is suggesting something, not just what it suggests.
In a business/engineering-led company, sharing research insights, making customers visible, and showing velocity is how design earned its seat.
My team needs to see me navigating the same uncertainty, using the same tools, and proving the same value.
Building the JTBD site in Cursor/GitHub wasn't just efficient – it signalled that design operates in this new way of working.
The offline Diary enables farmers to make quick management decisions based on data.
Recording a task from the map and diary had the highest adoption rate.