Datapay is a payroll platform serving 1 in 6 New Zealanders, supporting medium‑to‑large organisations through deep configurability, modern integrations, and AI‑led innovation. Datapay is also a legacy platform: we are tasked with defining the greenfield traditional SaaS experience and sequencing the roadmap, whilst also exploring the agentic headless domain and looking ahead to an agentic generative UI experience for our customers. The product had hundreds of competing ideas, driven by AI possibilities, with no structured way to prioritise work based on real customer needs.
What I did: 
I led the creation of our Jobs‑To‑Be‑Done framework, synthesising insights from 21 past research studies, 300+ feature requests, and continuous weekly conversations with internal users and SMEs. This work informed our regular research activities, where we validated our roadmap ideas and ensured we grounded discussions in actual customer needs.
From this foundation, I mapped our four core roles and supporting secondary roles using the JTBD framework, defining eight job stages for each role.
This enabled me to establish an Outcome‑Driven Innovation discovery process and redesign Datapay’s discovery approach. I introduced a range of AI tools across the workflow and used rapid AI prototyping to validate early concepts with our Customer Advisory Board (CAB), involving 14 key target customers.
The outcome: 
• A clear, evidence‑based roadmap.
• Faster discovery cycles, with measurable time savings.
• Design insights shared across teams using engineering-native tools (GitHub, Cursor).
• Stakeholders proactively seeking research input from customer conversations, feeding directly into our innovation stream.
The shift: 
From “Can we build it?” to “Should we build it?” — demonstrating that, in the age of AI, knowing what to build is design’s strategic value.
Want the full story? Keep reading. ↓

Early concepts validated and prioritised through Jobs‑To‑Be‑Done and Outcome‑Driven Innovation. We focus on confidence, clarity, and decision‑making over feature volume.

Payroll professionals care deeply about employees and value solutions that reduce cognitive load — a core insight that reshaped our discovery focus.

The Challenge
Despite the scale and ambition of our greenfield project, we didn't have a clear way to decide where to focus to create meaningful value for our customers. Instead, we had:
• Internally: 300+ feature requests across six teams, all competing and all labelled “urgent”.
• Externally: Customers were asking for less friction and more confidence — not more features.
Strategically:
With AI enabling us to build faster than ever, the question shifted from "Can we build it?" to "What should we build that adds value?"
The real challenge:
In an engineer‑focused organisation, where product roles are evolving alongside AI, I needed to demonstrate that design’s value goes far beyond making products look good.
As a Design Lead, my role is to deeply understand what customers actually need, and to guide the project towards building the right things, in the right order: the things that genuinely make a difference.
Risks without disciplined discovery:
• Chasing opinions instead of evidence
• Fragmenting effort across teams
• Adding complexity to already overloaded users
Our goal:
Replace idea-led planning with opportunity-led discovery. Know what to build, why, and in what order – backed by customer evidence, not internal assumptions.

Payroll Professionals work under intense, deadline‑driven pressure. Our JTBD work highlighted consistent stress points during pay periods and clarified where our product should actively guide users and reduce cognitive load.

Payroll Professional JTBD — Monitor (Job Stage 6.3): Monitoring data and integrations to ensure payroll confidence, accuracy, and continuity.

My Role, Team and Partners
As Research & Design Lead, I owned discovery end‑to‑end — redesigning our discovery approach for the AI era and demonstrating design’s strategic value in shaping what we build.
What I did:
• Led Jobs‑To‑Be‑Done and Outcome‑Driven Innovation (opportunity scoring), synthesising insights from 21 research studies and 18 stakeholders into clear artefacts that guide discovery and roadmap decisions.
• Shipped an interactive JTBD website using Cursor and GitHub, making customer insights accessible and usable across the organisation.
• Established customer validation processes to usability‑test and validate innovation concepts with our Customer Advisory Board (14 key customers, quarterly).
• Made customer needs visible across Datapay — including engineering leadership — to shift teams towards research‑led, evidence‑based innovation.
Team & partners: 
Product Leadership, Customers, Design, Engineering, Data, Compliance, Customer Success, and Go‑to‑Market.

Conversations with internal teams informed our understanding of what Datapay already does well — and where payroll professionals still struggle to feel confident and in control.

Conversations with our existing customers are a favourite part of my role.

What I Did
Part 1: Building a Customer-Centred Roadmap (JTBD & ODI)
The approach: 
Use Jobs‑To‑Be‑Done and Outcome‑Driven Innovation to move innovation from opinion‑based guesswork to a repeatable, evidence‑driven system.
"Innovation is not about ideas – it's about systematically addressing unmet customer needs."
Tony Ulwick, creator of JTBD & ODI

Step 1: Reviewed 21 past research studies
I synthesised existing qualitative and quantitative research to understand where we had already heard customer pain, and where themes were repeating but not yet actioned.
Rather than treating past research as historical artefacts, I reframed it as live input into our discovery system, grounding future decisions in proven customer evidence instead of starting from scratch.

Insights synthesised from 21 past research projects (including five I led at Datapay), highlighting consistent themes around accuracy, trust, and the consequences of getting payroll wrong.

Step 2: Listening to Our Internal Stakeholders 
This step focused on working closely with Datapay’s internal teams to understand where customers experience missing or underserved jobs within the legacy product.
I spoke with six internal teams (18 participants in total), focusing specifically on feedback from our target customers — organisations with high employee volumes and high operational complexity.
As Research & Design Lead, I ran this research independently and synthesised all insights end‑to‑end. In parallel, I analysed 300+ feature requests to identify recurring customer pain, and partnered with our Customer Research GM to incorporate insights from previous market research.
Top themes:
• Reporting & payroll verification (6/6 sessions)
Customers need exception‑first workflows and operational efficiency. The issue isn’t a missing feature — it’s a broken control layer.
• Integration reliability & automation (5/6 sessions, implied in the sixth)
Customers are using manual workarounds where automation should be trustworthy.
• Multi-position & cross-company management (5/6 sessions)
Complexity increases significantly at scale, and existing flows struggle to support it.
What our stakeholders told us:
"Reporting is how enterprise customers prove payroll is correct — when it fails, everything fails."
"Clients have even stopped asking... they know feature requests go nowhere."
Key insight: 
Our roadmap was driven by isolated requests, not recognisable patterns of unmet customer needs.

A high‑level synthesis showing what customers value most (accuracy, compliance, service quality) and where confidence consistently breaks down — particularly across reporting, integrations, and automation.

Feedback from customer‑facing teams highlights increasing pressure and reactive workflows, revealing a growing gap between customer expectations and current product behaviour — and a clear opportunity for product, design, and support to work differently.

Step 3: Built & Validated Our Market Canvas
I mapped the full payroll ecosystem across our core and secondary roles: Employees, Pay‑Impact Approvers, Payroll Professionals, Auditors, and supporting roles across Finance, HR, Legal, and IT.
For each role, we defined the core job they are trying to get done:
🧑‍🤝‍🧑 Employees. Get paid right. 💬
Employees want the right amount of pay on time, every time.
✔️ Pay-Impact Approvers. Approve it right. 💬
Confidently approve pay-affected changes for their team without holding up the pay run.
🧾 Payroll Professionals. Do it right. 💬
Pay employees correctly, compliantly, and on time. Every time.
🔎 Payroll Auditors. Prove it's right. 💬
Confirm payroll is accurate, compliant, and properly controlled with evidence.
👥 Secondary Roles. Give me the answers I need. 📊
Get the payroll data I need, in a format I can use, without creating work for the Payroll Professionals and without delay.
Payroll is an ecosystem of data flowing between these roles, with safeguards at critical points to ensure payroll is correct. Mapping this ecosystem made it clear why design discovery in payroll is inherently complex.
For Payroll Professionals to have a good experience, all upstream and downstream roles must also succeed. If we fail at a critical job step for even one role, the entire payroll experience — and our product’s value — is at risk.

A visual representation of payroll as an interconnected system, showing how employees, approvers, payroll professionals, auditors, and secondary roles rely on controlled data flows and safeguards.

Enjoying the deep validation work with SMEs, spending hours asking “why” to refine roles, job boundaries, and dependencies — uncovering critical insights that shaped the market canvas.

Step 4: Created Job Maps for Each Role
I created Job Maps for each core role and collaborated with SMEs who have experience across multiple payroll offerings in NZ & AU. We scored each job step on importance vs current satisfaction. This produced our opportunity scores, forming the foundation of our Outcome‑Driven Innovation process. The higher the score, the greater the unmet customer need, and the stronger the signal for prioritisation.
Opportunity score formula: Importance + (Importance – Satisfaction), where a negative difference counts as zero.
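The scoring step can be sketched in a few lines of Python. This is an illustrative sketch, not Datapay’s actual tooling: apart from Confirm 4.2 (scored importance 6 / satisfaction 3 later in this section), the job‑step names and importance/satisfaction splits below are hypothetical, and the zero floor on the difference follows Ulwick’s standard formulation, so over‑served steps aren’t rewarded.

```python
# Illustrative ODI opportunity scoring (hypothetical data, not Datapay tooling).
# Formula: opportunity = importance + max(importance - satisfaction, 0).
# High importance + low satisfaction = strong signal of an unmet need.

def opportunity_score(importance: float, satisfaction: float) -> float:
    """Return the ODI opportunity score; negative gaps are floored at zero."""
    return importance + max(importance - satisfaction, 0)

# Hypothetical SME scores per job step: (importance, satisfaction).
# Confirm 4.2 matches the worked example in this case study (6 + 3 = 9).
job_steps = {
    "4.2 Identify issues that need attention": (6, 3),
    "6.3 Monitor data and integration changes": (6, 3),
    "2.1 Gather pay-affecting inputs": (7, 6),
    "7.4 Archive pay run records": (4, 6),  # over-served: scores just 4
}

# Rank job steps by unmet need, highest opportunity first.
ranked = sorted(
    ((step, opportunity_score(imp, sat)) for step, (imp, sat) in job_steps.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for step, score in ranked:
    print(f"{score:>4.1f}  {step}")
```

Ranking every step this way is what turns scattered scores into the opportunity landscape: the list reads top‑down as a prioritisation queue.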
Example: Payroll Professional Job Mapping
For Payroll Professionals, we mapped 34 job steps across eight job stages. This surfaced 28 high‑opportunity areas — all representing unmet needs where they currently struggle to perform their core role.
While many Payroll Professional jobs scored highly on importance (payroll cannot be wrong), the opportunity emerged from low satisfaction, not low relevance. This distinction was critical: we weren’t solving minor problems — we were addressing essential work that wasn’t being adequately supported.
Highest-scoring opportunities (9/10):
• Identify issues that need attention. Confirm stage, step 4.2.
• Monitor data and integration changes during processing. Monitor stage, step 6.3.
• Monitor recurring issues or patterns across pay cycles. Monitor stage, step 6.5.
• Deliver accurate payroll outputs/reporting. Conclude stage, step 8.1.

A visual representation of the full Payroll Professional job map, showing how cognitive load and risk accumulate across stages, particularly during the Confirm, Execute, and Monitor stages.

Example of Confirm 4.2: Identify issues that need attention, illustrating how ODI scoring connects unmet needs, emotional jobs, and evidence to prioritised product opportunities.

Deep dive: Anomaly Detection & Explainable Payroll
One of the highest-scoring opportunities for the Payroll Professional was:
"Identify issues that need attention"
Opportunity score: Importance 6 / Satisfaction 3 / Opportunity 9
The pain:
Payroll Professionals operate under constant cognitive load. Verifying that a pay run is correct is both high‑risk and high‑frequency. It requires manually checking thousands of values, often under time pressure. They worry about missing something that seems small, and when issues arise, they are unsure where to start, or what truly matters.
What customers told us:
"I haven't got the time to check everything."
"Tell me what I need to focus on first."
"Users don't know where to start when something goes wrong."
Emotional jobs:
😵‍💫 Cognitive overload from manually checking thousands of values
 😬 Nervousness about approving pay they don't fully understand
😖 Uncertainty about the root causes of unusual values
😰 Fear of missing something subtle that affects employee pay
😤 Frustration when the system can detect an anomaly but can't explain it
😓 Stress when anomalies are caught late and require last-minute fixes
This opportunity clearly pointed towards the need for exception‑first workflows, anomaly detection, and explainable payroll — not more data, but better guidance and trust at the moment it matters most.
Step 5: Built an Interactive JTBD Website (using Cursor + GitHub)
The challenge: 
Our JTBD insights were effectively hidden in Confluence. Product, engineering, and design all needed access — but PDFs, slide decks, and static pages don’t scale or stay relevant. Insights were being created, but they weren’t being used.
The solution: 
I built an interactive JTBD website using Cursor and GitHub — the same tools our engineers already work with. This allowed customer insights to live alongside product and engineering work, rather than as separate design artefacts.
Now, anyone across Datapay can:
• Browse customer JTBD at the level of detail that suits them — from core and secondary roles, to key opportunities, to deep insights across all eight job stages. Most importantly for Customer Success, it gives them a way to flag when they’re hearing something different.
• Search by job step, pain point, or opportunity score.
• View desired outcomes, success criteria, emotional jobs, and supporting customer quotes in one place.
Why this matters:
Sharing insights through engineering‑native tools removed the overhead of maintaining decks and enabled us to move at speed. Our HTML‑based insights are lightweight to update as we continue validating job maps and uncovering new opportunities across customer segments.
During my time at Datapay, I was promoted from Design Lead to Research & Design Lead. A core objective of this role shift was to ensure customer insight wasn’t owned by design alone, but was visible and actionable across the entire organisation.
This work helped reframe design’s role — moving beyond deliverables to strategic enablement — increasing credibility, adoption, and overall discovery velocity.

JTBD's highest strategic focus areas.

High-level overview of the Payroll Professional's Job Map.

Part 2: Redesigning Discovery in the AI Era
AI removes the old constraints of discovery. Where we once explored a few ideas slowly, AI prototyping tools (e.g. Cursor or Claude Code) now let us generate and test many viable directions quickly. This shifts discovery from polishing a small set of concepts to rapidly exploring the solution space and converging with evidence.
Old model: Research → 2–3 concepts → validate upfront → build
New model when simple: Feature spec (no design) → build with AI Design System → validate code → live
New model when complex: Iterate rapidly with AI → validate with customers → build with confidence
What's Changed:
Ideation: From sketches and decks to fast, AI‑generated working concepts
Exploration: From limited directions to broad, assumption‑driven variation
Validation: From late-stage testing to continuous iterate‑and‑learn
Research: Guides and sharpens exploration rather than gating it
Handoff: Prototypes and clear direction instead of abstract specs
Discovery is complete when we have:
• A validated problem framing
• An evidence-backed direction
• Key risks surfaced and reduced
• Reusable patterns fed back into the system
The unlock:
Depending on solution complexity and risk (high complexity, or a new ODI piece), we action all of the above before engineering builds anything. We de-risk with speed, not more planning. If the solution is low complexity and low risk, we let our new AI Design System create solutions, then quickly validate at the end.
My role in this shift:
I redesigned our discovery operating model for an AI‑first environment — going beyond tools to change how teams explore, decide, and validate. I focus where human judgment matters most: problem framing, decision clarity, and determining the right moment to validate with customers.
When insights are grounded in validated JTBD, AI becomes a multiplier. We move quickly from insight to clear, outcome‑driven decisions — accelerating delivery by investing in the right opportunities first.

BEFORE: Traditional Discovery when I arrived at Datapay

AFTER: Outcome‑Driven Innovation discovery, used when the AI Design System can’t handle a complex or new feature

Part 3: Validating with the Customer Advisory Board
I run quarterly workshops with our Customer Advisory Board (CAB) — 14 customers representing our target scale and complexity — to validate discovery decisions before we commit to build.
CAB sessions are used to pressure‑test early concepts, confirm that high‑scoring opportunities hold up in real workflows, and surface risks or unmet needs we haven’t yet mapped. The goal isn’t consensus — it’s evidence strong enough to make confident trade‑offs.
Example Of What We Tested
For the anomaly detection opportunity, we explored multiple solution directions with customers using real payroll scenarios. The focus was on testing assumptions early:
• Different UX patterns for surfacing anomalies
• Trust signals (explain-why, variance views, tolerance bands)
• Automation boundaries (when agents act vs. suggest)
• Explainability and safe-reversal standards

Reporting was the biggest pain point uncovered in the previous CAB session. From there, we interviewed 10 customers with exploratory questions, then created four concepts from our synthesised insights. This is Concept 1: Anomaly Detection.

Concept 1 feedback. We gathered a value vote for every concept and asked each participant to write their name on their stickies, so we could hypothesise why certain feedback was given depending on their role and company’s needs.

Concept 2: Payrun Checklist

Concept 2: Feedback

What We Learned
Several themes were consistent across roles and experience levels:
• Explainability is non-negotiable – payroll professionals need to understand why something is flagged, not just what.
• Configurability matters – experienced payroll professionals want systems that learn from them and support less experienced team members.
• Transparency builds trust – showing the logic matters as much as the output.
• Reversibility is essential – AI recommendations must be safe and easy to override.
These insights directly shape our agentic design principles and feed into product specifications. We're continuously learning what our customers need and how the software needs to support them and their teams.

CAB prioritisation and voting outcome

🗳️ Concept Voting, Decision & Outcome
After reviewing all four concepts in the session, participants placed:
• 8 dots on Concept 1: Anomaly Detection
• 5 dots on Concept 2: Centralised Checklist
• No dots on Concepts 3 or 4
Voting confirmed Anomaly Detection as the strongest first cut. It received the highest dot count and value score, indicating clear alignment with how customers actually operate, and showed that participants prioritised concepts by balancing strategic impact with day‑to‑day workflow fit.
Following CAB, we refined all concepts based on detailed, role‑specific feedback.

After CAB we created a final prototype based on our detailed feedback, ready to build and test in live code.

Part 4: Leading the Team Through Change
The SaaS industry is shifting. AI is changing what designers do, and my team feels it. One designer is excited and leaning into AI tools unprompted; another is anxious about their evolving role and losing control.
My role here isn’t to push adoption. It’s to build confidence, clarity, and psychological safety while expectations are changing.
How I'm Coaching Through This:
• Spending more time with the anxious designer, sharing industry insights regularly.
• Explaining that we control what design's value is – it's about judgment, craft, framing, and velocity, not just execution.
• Modelling the use of AI tools openly (Cursor, Claude, Lovable).
• Sharing my own uncertainty and learning curve, rather than presenting false confidence.
The Message I Reinforce
AI doesn't replace designers who can think strategically – it sorts them. Designers who can frame problems, validate with customers, and move fast with AI are more valuable than ever.

Leading the Design Team through change. Regular team discussions mapping what changes with AI and what remains human‑led.

The Solutions At Datapay
The output of this work isn’t a single feature — it’s a decision system that connects customer needs to confident product choices. I created and now own the foundation of that system:
 • Validated Job Maps for our four core roles, synthesised into a cross‑role job map.
 • An ODI opportunity landscape that makes underserved outcomes visible.
 • Solution Trees and opportunity sequencing to ensure breadth before commitment.
 • AI‑assisted concept patterns with guardrails, explainability, and human‑in‑the‑loop principles.
 • An interactive JTBD site (Cursor + GitHub) to make insights accessible and easy to evolve.
Using these inputs, the Lead Product Manager builds the roadmap with clear traceability from job → outcome → priority → success measure.
This has become our default discovery operating model, optimised for AI‑enabled speed and customer validation.
The Impact

What We're Seeing Now
• ↓ Reduced discovery time through clearer upfront framing and AI‑accelerated exploration.
• Stakeholder alignment across Product, Design, and Engineering on what matters and why.
• Increased design capability, with designers independently using AI tools.
• Clearer roadmap decisions, with fewer reversals and less cross‑team debate.
Early signals
• Planning conversations now reference jobs and outcomes, not features.
• Stakeholders proactively involve research in customer conversations.
• CAB feedback has shifted from “add this feature” to “this solves my real problem”.
Post-launch Measures
We’ll track time‑to‑confidence, exception resolution rates, explainability completeness, and cost‑to‑serve reduction.

What I Learned & How I Led
JTBD creates shared language
Leading with jobs reduces opinion‑based debate and aligns teams around outcomes.
ODI de-risks direction
Scored outcomes made prioritisation clearer and de‑risked where we invested in agentic AI.
AI increases the premium on judgment
Designers who can frame problems, validate with customers, and move fast with AI are more valuable than ever.
Explainability drives trust
Agentic systems only work when users understand why something is happening.
Design value must be visible
Sharing insights, showing momentum, and making customer needs tangible is how design earns influence.
Leading through change requires openness
I modelled uncertainty, learning, and tool adoption so the team could build confidence through change.
The JTBD → ODI → AI prototyping → CAB validation loop is now how we approach our largest bets.
Design isn’t just shaping experiences — it’s shaping what we build, why it matters, and how quickly we learn.
Work completed whilst at Datapay
Back to Top