Datapay is a legacy product. We were tasked with building a roadmap for a greenfield experience and exploring the agentic, headless domain. The product had hundreds of competing ideas and lacked a structured way to prioritise based on customer needs.
What I did: 
Led Jobs-To-Be-Done creation, capturing insights across 21 past research studies, our 300+ feature requests, CAB workshops, and weekly conversations with internal users and SMEs to deeply understand customer needs. From there I mapped our 4 core user roles and secondary roles using the JTBD framework, creating 8 job stages for each.
This enabled me to create our Outcome-Driven Innovation discovery process, redesigning discovery around AI tools and rapid AI prototyping to validate early concepts with our Customer Advisory Board (14 key customers).
The outcome: 
A clear, evidence-based roadmap; faster discovery cycles (measuring time savings); design insights now shared across teams using engineering tools (GitHub/Cursor); and stakeholders actively seeking research inputs on customer conversations that feed back into our innovation stream of work.
The shift: 
From "Can we build it?" to "Should we build it?" – proving that in the age of AI, knowing what to build is design's strategic value.
Want the full story? Keep reading. ↓

The Challenge
Datapay is a payroll platform serving 1 in 6 New Zealanders. We support medium-to-large organisations through deep configurability, modern integrations, and AI-led innovation.
We had many plausible directions for new value, but no structured way to compare them. Internally: 300+ feature requests from 6 teams, all equally "urgent". Externally: Customers wanted less friction and more confidence – not more features.
Strategically:
With AI enabling us to build faster than ever, the question shifted from "Can we build it?" to "What should we build?"
The real challenge:
In an engineering-focused business in the age of AI, where product roles are evolving, I needed as Design Product Lead to show that design's role goes far beyond making a product look good: it's about deeply understanding what our customers need and guiding the effort to build the right thing that makes a difference.
Risks without disciplined discovery:
Chasing opinions instead of evidence; fragmenting effort across teams; adding complexity to already-overloaded users; losing credibility as a design function.
Our goal:
Replace idea-led planning with opportunity-led discovery. Know what to build, why, and in what order – backed by customer evidence, not internal assumptions.
My Role, Team and Partners
As Research & Design Lead I owned the discovery, redesigning our process for the AI era, and proving design's strategic value.
What I did:
• Led Jobs-To-Be-Done, Outcome-Driven Innovation (opportunity scoring), and opportunity mapping (21 studies, 18 stakeholders), creating the artefacts that guide our discovery and roadmap creation.
• Built an interactive JTBD website using Cursor + GitHub so all teams can access insights.
• Built customer validation processes to usability test and validate innovation concepts with our Customer Advisory Board (14 key customers, quarterly).
• Made customer needs visible to everyone at Datapay, including engineering leadership, to help guide the teams to research-led innovation.
Team & partners: 
Product Leaders, Customers, My Design Team, Engineering, Data, Compliance, Customer Success, and Go To Market. 


What I Did
Part 1: Building a Customer-Centred Roadmap (JTBD & ODI)

The approach: 
Use Jobs-to-be-Done and Outcome-Driven Innovation to turn innovation from guesswork into science.


"Innovation is not about ideas – it's about systematically addressing unmet customer needs." – Tony Ulwick, creator of ODI

Step 1: Reviewed 21 past research studies
Synthesised existing insights to understand where we'd already heard customer pain.

Understanding all 21 past research projects (including the 5 I led since joining Datapay)

Step 2: Listening to our Internal Stakeholders 
This includes talking to 6 internal teams, 18 participants in total, to understand what our customers are saying is an underserved or missing job from our legacy product. As the Research & Design Lead, I ran this research project alone and synthesised all insights, focusing on our target customers (high employee counts and high complexity). 
I also analysed 300+ feature requests to understand what customers are asking to be fixed, and worked with our Customer Research GM to understand past market research.
Top themes:
• Reporting & payroll verification (6/6 sessions): exception-first, operational efficiency. Today this is a broken control layer, not a missing feature.
• Integration reliability & automation (5/6 sessions, implied in 6): stop manual workarounds.
• Multi-position & cross-company management (5/6 sessions): handle complexity at scale.
What our stakeholders told us:
"Reporting is how enterprise customers prove payroll is correct — when it fails, everything fails"
"Clients have even stopped asking... they know feature requests go nowhere."
Insight: 
Our roadmap was driven by one-off requests, not patterns.

High-level summary of the research synthesis.

Some internal teams are feeling the pressure more than others – a real opportunity for product and customer support to work together on solutions (part of our plan).

Step 3: Built & Validated Our Market Canvas
Mapped our payroll ecosystem: Employees, Pay-Impact Approvers, Payroll Professionals, Auditors, and Secondary Roles (Finance, HR, Legal).
For each role, we defined the core job:
🧑‍🤝‍🧑 Employees. Get paid right. 💬
Employees want the right amount of pay on time, every time.

Pay-Impact Approvers. Approve it right. 💬
Confidently approve pay-affecting changes for their team without holding up the pay run.

Payroll Professionals. Do it right. 💬
Pay employees correctly, compliantly, and on time. Every time.

Payroll Auditors. Prove it's right. 💬
Confirm payroll is accurate, compliant, and properly controlled with evidence.
Secondary Roles. Give me the answers I need. 📊
Get the payroll data I need, in a format I can use, without creating work for the Payroll Professionals and without delay.

Payroll is an ecosystem of data flowing between our core roles, with safeguards to ensure payroll is correct. Looking at these core users in more detail made it obvious why our design discovery is always complex: for the Payroll Professional to have a good experience, all upstream and downstream roles also need a good experience. If we fail at one critical job step for one role, we may fail at our entire offering.

Our Market Canvas Summary. The Payroll Ecosystem Diagram to the right.

Step 4: Created Job Maps for Each Role​​​​​​​
For Payroll Professionals, we mapped 34 job steps across 8 stages and scored each on Importance vs. Satisfaction (ODI methodology).
This gave us 28 high-opportunity areas – unmet needs where customers struggle most.
Opportunity Score formula (ODI):
Importance + max(Importance − Satisfaction, 0)
For example, Importance 6 and Satisfaction 3 gives 6 + (6 − 3) = 9.
Highest-scoring opportunities (9/10):
Identify errors, anomalies, and unusual patterns (Confirm stage, step 4.2)
Monitor data and integration changes during processing (Monitor stage, 6.3)
Monitor recurring issues or patterns across pay cycles (Monitor stage, 6.5)
Deliver accurate payroll outputs / reporting (Conclude stage, 8.1)
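The scoring above can be sketched in a few lines. The Importance/Satisfaction ratings below are illustrative placeholders, not our actual survey data:

```python
# Minimal sketch of ODI opportunity scoring (Ulwick's formula).
def opportunity_score(importance: float, satisfaction: float) -> float:
    """Opportunity = Importance + max(Importance - Satisfaction, 0)."""
    return importance + max(importance - satisfaction, 0)

# Hypothetical ratings for three of the job steps named above.
job_steps = {
    "Identify errors, anomalies, and unusual patterns (4.2)": (6, 3),
    "Monitor data and integration changes (6.3)": (6, 3),
    "Deliver accurate payroll outputs / reporting (8.1)": (6, 3),
}

# Rank job steps from most to least underserved.
ranked = sorted(
    ((step, opportunity_score(i, s)) for step, (i, s) in job_steps.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for step, score in ranked:
    print(f"{score:>4.1f}  {step}")
```

Taking the max with zero means overserved steps (satisfaction above importance) don't score negatively; they simply fall back to raw importance.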
[ASSET 4: Job Map visual summary – the 8-stage flow diagram]
Deep dive: Anomaly Detection & Explainable Payroll
One of the highest-scoring opportunities for the Payroll Professional was:
"Identify issues that need attention" (Importance 6, Satisfaction 3 → Opportunity 9)
The pain:
Payroll professionals are cognitively overloaded – manually checking thousands of values, nervous about missing something subtle, unsure where to start when things go wrong.
What customers told us:
"I haven't got the ability to check everything."
"Tell me what I need to focus on first."
"Users don't know where to start when something goes wrong."
Emotional jobs:
• Cognitive overload from manually checking thousands of values
• Nervousness about approving pay they don't fully understand
• Uncertainty about root causes of unusual values
• Fear of missing something subtle that affects employee pay
• Frustration when the system can detect an anomaly but can't explain it
• Stress when anomalies are caught late and require last-minute fixes
Validated in our quarterly Customer Advisory Board with hand-picked customers, helping us build the new product for them. The proofs of concept were well received, and we gathered valuable insights to iterate on and add to the backlog in our roadmap.
[ASSET 5: ODI table screenshot – the anomaly detection example with scores, pains, and solutions]
HTML anomaly detection prototype
Step 5: Built an interactive JTBD website (using Cursor + GitHub)
The challenge: 
JTBD insights were hidden in Confluence documentation. Product, engineering, and design all needed access, but PDFs and slide decks don't scale.
My solution: 
I built an interactive JTBD website using Cursor and GitHub – the same tools our engineers use.
Now anyone can:
• Browse our customers' JTBD in detail – who our core and secondary customers are and what they need, our key opportunities, and the deep insights for all 8 job stages.
• Search by job step, pain point, or opportunity score.
• Link directly to specific insights in planning docs.
• See desired outcomes, success criteria, emotional jobs, and supporting quotes.
Why this matters:
Using engineering tools to share customer insights lets us deliver insights at speed without spending manual effort on slide decks. Our HTML insights are easily updated as we validate our Job Maps further and uncover new opportunities for different customer segments.
It has helped me show that design is about more than creating deliverables: we are evolving with the new tools and stepping into the strategic space, building credibility and increasing our velocity.
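As a simplified sketch of how such a site's underlying data can be structured and filtered – the field names and records here are illustrative assumptions, not our actual schema:

```python
# Illustrative JTBD insight record and a simple opportunity filter,
# similar in spirit to what powers the site. Field names are assumptions.
from dataclasses import dataclass

@dataclass
class JobStepInsight:
    stage: str                 # one of the 8 job stages, e.g. "Confirm"
    step: str                  # e.g. "4.2 Identify errors and anomalies"
    opportunity_score: float   # ODI score for this step
    pains: list[str]           # observed pain points
    quotes: list[str]          # supporting customer quotes

def top_opportunities(insights, min_score=8.0):
    """Return insights at or above a score threshold, highest first."""
    hits = [i for i in insights if i.opportunity_score >= min_score]
    return sorted(hits, key=lambda i: i.opportunity_score, reverse=True)

# Hypothetical records for illustration only.
insights = [
    JobStepInsight("Confirm", "4.2 Identify errors and anomalies", 9.0,
                   ["Manually checking thousands of values"],
                   ["Tell me what I need to focus on first."]),
    JobStepInsight("Monitor", "6.5 Monitor recurring issues", 9.0, [], []),
    JobStepInsight("Prepare", "1.1 Gather pay inputs", 5.0, [], []),
]
for hit in top_opportunities(insights):
    print(hit.step, hit.opportunity_score)
```

Keeping insights as structured records rather than slides is what makes search by job step, pain point, or score possible.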
[ASSET 7: Screenshot of the GitHub JTBD website – annotated to show key features like search, job stages, opportunity scores]
Part 2: Redesigning Discovery in the AI Era
The shift: Traditional discovery meant spending weeks designing 2–3 directions, then validating. With AI prototyping (Cursor, Claude, Lovable), we can now generate 5+ variations in days.
This flipped our process:
Before: validate upfront → iterate carefully → build
After: iterate rapidly with AI → validate with customers → build with confidence
[ASSET 8: Triple diamond diagram – BEFORE (traditional) and AFTER (AI-shifted process)]

What this means in practice:
Ideation
Was: Sketches/presentations (slow). Now: AI + rapid generation (fast)
What's our role now?
Exploration
Was: 2–3 directions. Now: 5+ variations instantly
How do we evaluate at scale?
Validation
Was: User research upfront. Now: Iterate → validate
When do we research?
Handoff
Was: Detailed specs. Now: Working prototype + direction
What does done look like?

[ASSET 9: AI-Generated Triple Diamond diagram – the new process flow with AI builds, internal review, customer validation, iterate/live]
The unlock:
Depending on solution complexity and risk (if high complexity and a new ODI piece), we action all of the above before engineering builds anything. We de-risk with speed, not more planning. If the solution is low complexity & risk, we let our new AI Design System create solutions, and we quickly validate at the end.
My role in this shift:
I didn't just enable my team to adopt AI tools. I redesigned our discovery operating model around them, then taught the team and other Product Leaders how to work this way.
In the age of AI, problem framing becomes a crucial human skill. With all decisions found in our validated JTBD, we turn insights into data-rich, outcome-driven clarity – fast. This ensures AI accelerates delivery by focusing on the right opportunities first.
Part 3: Validating with the Customer Advisory Board
Every 3 months, I run workshops with our Customer Advisory Board – 14 key customers who represent our target complexity and scale.
How I use CAB sessions:
• Test concepts before engineering invests.
• Validate that high-scoring opportunities resonate in practice.
• Surface new pain points we hadn't mapped yet.
• Build trust and show customers we're listening.
For the anomaly detection opportunity, we tested:
• Different UX patterns for surfacing errors
• Trust signals (explain-why, variance views, tolerance bands)
• Automation boundaries (when agents act vs. suggest)
• Explainability standards (what, why, and how to reverse)
What we learned:
• Explainability is non-negotiable – payroll professionals need to know why something is flagged, not just what.
• Configurability – experienced payroll professionals know when the numbers look off for a specific pay run; they want a system that can learn from them and help train their less experienced team members.
• Transparency builds trust – show the logic, not just the answer.
• Reversibility is essential – users need to override AI suggestions safely.
These insights from our regular sessions shape our agentic design principles and feed directly into our specs.
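As a sketch of the "tolerance bands with explain-why" pattern we tested – the 15% band and the record fields are illustrative assumptions for this sketch, not our actual rules:

```python
# Illustrative tolerance-band anomaly check with an explain-why message.
# The 15% band and field names are assumptions, not Datapay's real spec.
def check_pay_variance(employee, current, baseline, tolerance=0.15):
    """Flag a pay value outside +/- tolerance of its baseline, with a reason."""
    if baseline == 0:
        return {"employee": employee, "flagged": True,
                "why": "No baseline pay to compare against."}
    variance = (current - baseline) / baseline
    flagged = abs(variance) > tolerance
    if flagged:
        why = (f"Net pay changed {variance:+.0%} vs. the previous cycle "
               f"(tolerance band is +/-{tolerance:.0%}).")
    else:
        why = "Within tolerance."
    # Reversibility: the system only suggests; a human confirms or overrides.
    return {"employee": employee, "flagged": flagged, "why": why}

print(check_pay_variance("E1042", current=4830.0, baseline=3500.0))
```

The point of the pattern is that every flag carries a human-readable "why" and remains a suggestion, matching the explainability and reversibility principles above.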
[ASSET 10: CAB workshop photo or slide from the JTBD presentation to the company – shows the process in action]
Part 4: Leading the Team Through Change
The SaaS industry is shifting. AI is changing what designers do, and my team feels it. One designer is excited and leaning into AI tools unprompted; another is anxious about their evolving role and losing control.
How I'm coaching through this:
• Spend more time with the anxious designer, sharing industry insights regularly.
• Explain that we control what design's value is – it's about judgment, craft, framing, and velocity, not just execution.
• Model using AI tools openly (Cursor, Claude, Lovable).
• Share my own feelings – I'm navigating the same uncertainty.
The message:
AI doesn't replace designers who can think strategically – it sorts them. Designers who can frame problems, validate with customers, and move fast with AI are more valuable than ever.
[ASSET 11: Optional – photo of team working together, or a screenshot of a design critique/workshop]
The Solutions At Datapay
The output isn't a single feature – it's a decision system:
• I created and validated the detailed Job Maps for our 4 core roles (plus 1 secondary role), with the Cross-Role Job Map as the end artefact.
• The Lead Product Manager creates our roadmap from these insights.
• I own the ODI opportunity landscape, showing where needs are most underserved.
• I create Solution Trees that force breadth before depth and document trade-offs.
• An opportunity map that sequences bets into themed roadmap tracks.
• AI-assisted concept patterns with guardrails, explainability, and human-in-the-loop principles.
• An interactive JTBD website (built in Cursor/GitHub) for cross-team access, easily updated.
• A reimagined discovery process optimised for AI speed + customer validation.
Now, any new idea can be traced to: the job it serves, the outcome it improves (owned by me), why it's been prioritised and how we'll measure success (co-owned with the Lead Product Manager).
The Impact
 
We are currently tracking:
↓ Discovery time. Old vs. new process, showing velocity gains from deep upfront insights and AI tools across all product roles (PM, PA, Design and Engineering).
↑ Stakeholder alignment. CAB confidence scores increasing: "Solves my real problems."
↑ Design team capability. Designers using AI tools independently, multiplying team velocity.
↑ Design value & customer voice. Insights are shared, stakeholders now invite research into customer conversations, and other roles understand what our customers need.
↑ Roadmap clarity. Fewer prioritisation reversals, less cross-team debate, clearer decisions.
  
Early signals (before full metrics are available):
Roadmap clarity:
• Fewer "why are we doing this?" questions in planning.
• Teams using JTBD language in discussions ("This serves the 'identify anomalies' job").
Design visibility:
• Stakeholders are reaching out to involve research in customer conversations (this previously didn't happen).
• Shared research summaries get active engagement, with replies like: "This is exactly what I'm hearing from clients."
Customer validation:
• CAB feedback shifted from "Can you add [feature]?" to "This actually solves my real problems."

Post-launch, we'll measure:
• Time-to-confidence: how fast users go from detection → decision.
• Exception resolution rates: % of anomalies resolved without manual investigation.
• Variance explanation completeness: are users getting the "why"?
• Cost-to-serve reductions: less support, fewer errors, faster pay cycles.

[ASSET 12: Simple visual showing metrics framework – "Measuring" column with placeholders for future data]

What I Learned & How I Led
JTBD unifies language
When jobs lead, role debates quieten. Product, engineering, and business can all understand "Help payroll professionals spot errors faster" without arguing about features.
ODI de-risks direction
Scored outcomes beat loud opinions. It's hard to argue with "This scores 9/10, and that scores 5/10." We knew where to focus our agentic efforts.
AI doesn't replace strategic designers – it sorts them
Designers who can frame problems, validate with customers, and move fast with AI are more valuable than ever. Designers who just execute are at risk.
Explainability is table stakes
Agentic value is proportional to trust. Users need to know why AI is suggesting something, not just what it suggests.
Design value must be visible
In a business/engineering-led company, sharing research insights, making customers visible, and showing velocity is how design earned its seat.
Leading through change requires transparency
My team needs to see me navigating the same uncertainty, using the same tools, and proving the same value.
Using engineering tools builds credibility
Building the JTBD site in Cursor/GitHub wasn't just efficient – it signalled that design operates in this new way of working.

The discovery operating model (JTBD → ODI → AI prototyping → CAB validation) is now our default path for our new big bets.
Design isn't only making experiences that make our customers' lives easier – we're shaping what gets built, why, and how, accelerating how fast we learn.

Summary of Required Assets:
[ASSET 1] Past research summary visual
[ASSET 2] Internal stakeholder insights infographic (top 3 themes)
[ASSET 3] Market Canvas diagram (4 roles + core jobs)
[ASSET 4] Job Map visual (8-stage flow)
[ASSET 5] ODI table screenshot (anomaly detection example)
[ASSET 6] Solution tree visual
[ASSET 7] GitHub JTBD website screenshot (annotated)
[ASSET 8] Triple diamond BEFORE diagram
[ASSET 9] AI-shifted process diagram (new version)
[ASSET 10] CAB workshop photo or presentation slide
[ASSET 11] Optional: Team photo or workshop
[ASSET 12] Metrics framework visual




Work completed whilst at Datapay