Outcomes: A collaborative retrospective on our discovery process from start to finish; refining our research and design process and communicating it to the Design Team, senior stakeholders, and the whole company.
Reviewing the Discovery Process 
I reached out to our Agile Coach to help facilitate a workshop among the primary stakeholders involved in the Discovery process. During the meeting, we realised that the current process was overly complex, with conflicting flows and several opportunities for misalignment. It took us a full day to thoroughly document the entire process from start to finish.
We took the time to agree on key roles and responsibilities in our discovery process. As a Design Lead, I am responsible for both research and design. Therefore, I took some time to refine the documentation process for research, making it more streamlined and scalable based on the specific problem we were trying to solve.
Design Discovery update
It's important to recognise that different projects have varying levels of complexity and unknowns, and therefore require different amounts of time and resources. To address this, I developed a design discovery matrix that helps our design team determine the right amount of learning, designing and usability testing needed for each project. This ensures that we can efficiently allocate our resources and provide the best possible results for our farmers.
These guidelines are not set in stone, and not every project will fit into these categories or follow the same formula. First, I shared this chart and the different levels of the process with the design team. As a team, we started jotting down our design discovery research toolkit. We recognised that every problem requires different research tools, and we all contributed to our toolbox. By doing this, we were able to get more creative and vary our research methods.
I updated the process and informed the Product Owners about their areas of accountability. 
As the design advocate at FarmIQ, during our Seasonal Planning I updated the whole company on our new discovery matrix. I also highlighted the areas where my team needs help and consideration: the two scrum teams are delivering work at a faster pace than ever before, and even with the discovery matrix, the pressure is still on for design.
Currently, I am working on a process to incorporate Dev Mode into the Scrum Teams' workloads to reduce the time spent on Design Sign-Off for the Designer and Developer involved. We have seen a significant decrease in time needed from the designers on every Jira ticket completed with Dev Mode so far.
Building on our releases
The problem
I often describe the responsibilities of the product design role to my company as wearing four different hats. One of these hats is the 'Looking Ahead' hat, which is mostly worn by me. As a team, we stay informed about industry trends and keep an eye on our competitors in our sector. I work to develop a high-level vision for our design work, which guides us in our day-to-day projects.
We spend most of our time wearing our 'Discovering' hat. In this phase, we identify a problem or opportunity, learn about it from internal staff and customers, design a solution, and usability test it with customers. This process is iterative and continues until we have a solution that solves our customers' problem. Our approach is agile, and we refine the process based on the project. We divide the work into Jira tickets, refine it with the Scrum Team, and prepare it for them to pick up.
We switch to our 'Design Support' hat when work is actively in progress within the Scrum Team. During this phase, we keep a close eye on the work to ensure that it matches our designs, and we advise on good user experience if any edge cases or questions arise. This is our quality assurance process. Once the design work is signed off, it is released.
Our final focus area is where we wear our 'Building on Releases' hat. During this phase, we assess the effectiveness of our product releases and document customer feedback. If the feedback is deemed important, we incorporate it back into our discovery process. However, when things get hectic, we tend to neglect this responsibility, as there is no structured process in place for Designers or Product Owners to document or review feedback. This issue was raised by several teams during our company-wide seasonal planning. Seeing an opportunity to bring value to my company and foster better collaboration across multiple teams and tools, I took it upon myself to start investigating our current methods of gathering customer feedback.
Understanding how we currently get customer feedback 
I began by speaking with four key roles to gain an understanding of our current process and documentation: the CS Team Lead, Head of Customer Ops, Senior Marketing, and Product Owners. 
Our Product Owners use MixPanel to track user engagement within our applications, which gives us an overview of usage patterns. However, this data alone does not help us to understand our customers' motivations and needs.
Our Customer Success representatives document all enhancement requests when they speak with farmers who provide feedback on our applications. However, we know that only 8% of these customers are calling about the mobile app, and they are mostly Farm Owners/Managers. As a result, we receive little to no feedback from our Worker personas or about the mobile app.
Marketing has started conducting feedback surveys on our product releases so that we can establish a baseline response rate for email surveys.
The Solution
Based on my research, I hypothesise that we can gather more accurate and valuable customer feedback by asking within our applications immediately after a specific action is completed. Our goal is to receive both quantitative and qualitative responses, which will help us analyse an enhancement and evaluate its value and effort without having to rely on costly one-on-one interviews.
I presented a solution for an in-product survey tool to the Head of Product and Product Owners, who became advocates for the tool and initiated a SPIKE to investigate tools that met my criteria. Once the SPIKE was completed, I arranged meetings with sales representatives from our two preferred tools to discuss their features and pricing.
Using the information gathered, I created Lean Business Decision documentation and included eight key stakeholders to provide input on the value and benefits of the survey tool. I presented the documentation to the Head of Product and successfully got provisional approval for the budget, pending confirmation that our customers would engage with the survey tool as expected.
Other things implemented
Only the Product Owners and Head of Product were aware of the research findings
Customer research is most valuable when everyone has a shared understanding of our customers' needs, their jobs-to-be-done, and current pain points.
When I first joined FarmIQ, research was being conducted by Designers and Product Owners, but there was no established process to share the findings with the wider teams. To ensure that the entire company could understand the reasoning behind our product decisions and design solutions, our design team needed to start sharing all of the valuable data and insights we gained from our research.
I believe it is very valuable to observe users' feedback firsthand and watch them interact with our product, whether live or through designs. To achieve this, we have started inviting FarmIQ staff from various departments to join our 1-on-1 research sessions with our farmers. For those who can't attend, I created a Slack channel called 'Things Our Farmers Say', where we share the highlights and key takeaways after each session. This has given our design team a greater voice and brought the entire company along for the ride.
Work completed whilst at FarmIQ