2.2 Implement

2.2.1 Objective

Now that you know what direction you will take, it’s time to roll up your sleeves and start implementing. You need to prepare your product roadmap, make decisions about feature priorities and dependencies, and plan releases. In agile software development, a release is a deployable software package that is the culmination of several iterations, and it can be made before the end of an iteration.

As we use agile processes, the plan is NOT sacred: priorities may shift, and you should be ready to pivot as you go and as you learn more.

Prepare design documents and use them to create wireframes and high-fidelity prototypes, which you can use to conduct UI and UX testing with selected participants (users). Building prototypes and testing them with users, for example through usability testing, gives you the chance to improve and correct issues earlier in the process, avoiding unnecessary engineering time. During this phase you’ll also create (or continue developing) a design system, typically managed through a design system manager (DSM). A design system is the single source of truth for the UI/UX and design principles of the product; it groups all the elements that allow a team to design and develop the product and its functionalities.

As we use agile methodologies – instead of a waterfall (linear) product management approach – you’ll break your work into smaller iterations and frequent releases. This means that during the development of the product, you'll be cyclically moving from the ‘Implement’ stage to the ‘Release’ stage and back again.

2.2.2 What you will be doing

2.2.3 Steps and tools you can use

a. Prepare the product roadmap *🥇

  • Identify a backlog of features and functionalities for the product (as best you can guess at this moment). You may want to structure features by release, or you can start with blocks/modules of the system and later prioritise each feature into releases.

  • Prioritise features and dependencies. Review what you identified for the MVP and make any changes that may be required. Do the same for other releases.

  • Ideally you will be able to transfer the backlog into a timeline or estimate a list of releases.

  • We recommend using Productboard to manage your product roadmap. You can access Catalpa’s account (request access from the Head of Products or your Production Manager) and see examples of product roadmaps for Bero, Jerasaun Foun and others.

* Note: A product roadmap is not a Gantt chart (a popular project management tool that we at Catalpa sometimes use for project work plans); rather, it is a visualisation of our product development plan at a specific point in time. If a product’s priorities change or if there is a need to pivot the product, the product roadmap will change along with it.

b. Define releases and prepare the design document *🥇

  • Based on the product roadmap, start by listing priority features for the next release. An agile epic is a body of work that can be broken down into specific tasks called ‘user stories’. Epics are a helpful way to organise your work and to create a hierarchy of implementation priorities and dependencies. An epic can last for weeks or months, depending on the complexity and number of functionalities that will be developed under it. An epic is usually defined in a design document containing user stories that describe the features - and the value - you will be building under that unit of work as you build your product incrementally. See how we do epic selection and planning in our handbook.

  • Now it’s time to provide more detail for the design and engineering team. For this you’ll use a design document (you may have already started a draft in step 2.1.3 for the MVP, or you may need to start one now). In either case, we have a design document template ready for you to use. A design document should include a summary of the focus of this unit of work - what you want to achieve by the end of that ‘epic’.

  • The design document will also include details on the features to be designed and developed. Your product team needs to understand how a feature works and what criteria must be met to say that a feature is ‘ready’ to be deployed (or released). At Catalpa we usually use ‘user stories’ to put the user – the person – at the centre of the action. We add ‘conditions of satisfaction’ to detail how we want the functionality to work. In the handbook, you will find some guides to help you write a good user story.

  • Share the design document with the team and ask for feedback. Work on any adjustments required based on feedback and organise a meeting with the product team (engineers, designers, quality assurance) to review it together. In this meeting it is important to:

    • 1) Have a shared understanding of the purpose and objectives of this unit of work, and clearly understand the value this unit of work will bring to the product,

    • 2) Identify what features will be developed and what criteria must be met to consider it ‘done’,

    • 3) Identify dependencies between features and development tasks,

    • 4) Estimate the level of effort required to develop each user story (you can use whatever scale works best for your team; at Catalpa we frequently use ‘day’, ‘days’, ‘week’, ‘weeks’), and

    • 5) Answer the outstanding questions in the design document.

Note: If you are lucky enough to have a Production Manager on your team, they can help you get started with the Scrum methodology, helping you prepare documents, manage the development board, and ensure that the development team is coordinated and communicating.
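To make the ‘user story plus conditions of satisfaction’ structure concrete, here is a minimal sketch in Python. The story content and field names below are hypothetical, not taken from a real Catalpa design document; it simply illustrates the kind of completeness check a team might apply before a story goes into epic planning:

```python
# Hypothetical user story with conditions of satisfaction,
# plus a check that it is complete enough to plan and estimate.

story = {
    "as_a": "community health worker",
    "i_want": "to save a patient record while offline",
    "so_that": "I can keep working in areas without connectivity",
    "conditions_of_satisfaction": [
        "Record is stored locally when the device is offline",
        "Record syncs automatically once a connection is available",
        "User sees a clear indicator of sync status",
    ],
    "estimate": "days",  # Catalpa-style scale: day / days / week / weeks
}

def is_ready_for_planning(story):
    """A story is plannable if it names the user, the goal and the value,
    and has at least one condition of satisfaction."""
    return (
        all(story.get(key) for key in ("as_a", "i_want", "so_that"))
        and len(story.get("conditions_of_satisfaction", [])) > 0
    )

print(is_ready_for_planning(story))
```

The three fixed fields mirror the classic ‘As a …, I want …, so that …’ template, while the conditions of satisfaction become the checklist the team reviews when deciding whether a feature is ‘done’.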

c. Design the UI and UX of the product, starting with the MVP *🥇

  • The design team (or individual designers) will work on the wireframes for any/all user interfaces required for the epic. This should be as collaborative an activity as possible. Designers will use the design document as a basis, but having opportunities to sketch and discuss solutions with the broader product team is ideal. Product leads may act as a counterpart here, but engineers and others can be brought into the process as well.

  • Wireframes should be shared and tested by the team before moving to a hi-fi prototype. This gives everyone a chance to review the design assumptions, any constraints and inform the UI/UX before moving to the next stage. Here is an example of wireframes for a new feature on Bero Content Management System.

  • High-fidelity prototypes - ideally with some level of UX explanation, such as call-to-action (CTA) behaviours, use of gestures, etc. - should be prepared by designers, then reviewed and approved by the Product Lead. It is always a good idea to ask engineers and project managers/team leads to explore the prototype and provide recommendations or feedback. Find here an example of a high-fidelity prototype for the Bero login and self account creation feature.

  • Not always relevant, but for larger projects or core products it is recommended that a design system manager (DSM) is developed. This will help others - designers, frontend developers, … - consult it as guidance when developing any new screen based on existing components. It is a great tool for keeping the product UI/UX consistent across time and new incremental releases. See how we are using a DSM for Openly.

d. Test and iterate *🥇

  • High-fidelity UI/UX prototypes allow us to test the product, both internally and with real users. This is a great tool and it is highly recommended that you use it. Testing early and frequently gives you the opportunity to iterate and improve, and increases the likelihood that the final product will work for our users.

  • Prepare and conduct design feedback sessions to gather user and partner feedback on UI/UX solutions

    • You can use the ‘I like’, ‘I wish’, ‘What if’ exercise (see the reference above in 2.1.3 d - Prototype, under 2.1 Ideate)

  • Prepare and conduct usability testing sessions:

    • Define what questions you want to answer during the usability testing - what are you testing?

    • Define type of session, methodology, tools and script. Make sure the prototype is suitable for what you will be testing, considering:

      • Whether the session will be moderated (in person or remote) or unmoderated

      • Tools needed to run the sessions and collect feedback

      • What device participants will use (smartphones, tablets, laptops, others?)

      • What artefacts will be used for the testing (hi-fi prototypes on InVision, low-fi prototypes)

      • Select participants based on resources, time, level of digital literacy and availability. This article from Nielsen Norman Group suggests 5 as the ideal number of participants for usability testing and presents good reasons to back it up. The exact number of users can change depending on when the data reaches saturation.

      • Some examples:

        • For Estrada, we conducted in-person usability testing, facilitated by a member of our team. Participants accessed a hi-fi prototype on InVision using a laptop. The facilitator asked participants to perform a number of tasks. Screen and voice were recorded using QuickTime’s screen recording feature. The facilitator acted as an observer, taking notes on participants’ reactions, comments and behaviour.

        • For Bero, we conducted self-tests (unmoderated). Participants were given a link to access a tool (Lookback) on their mobile phone and ran the test when they were available. The tool gave them access to a staging environment as well as instructions and tasks. It captured participants’ audio and video, and the sessions were later reviewed and analysed by the product lead and designer.

    • Conduct testing sessions - these can be in-person, online/remote, facilitated or done asynchronously (separately at different times). Find the method that works best for your team, for the participants and for what you are testing. Collect feedback on UI, UX, text/language, content, and accessibility.

    • Include accessibility aspects as part of your testing of prototypes. You may explore accessibility questions such as: visual elements contrast, font size, readability of icons, ability to underline and highlight links, ability to add alternative text, and other accessibility aspects that are relevant to your project.

    • Do an analysis of the results and summarise findings from the sessions.

    • Let’s see in action how we conducted usability testing for Estrada MVP:

      • Script for the testing session - this was used by both the design team (to prepare the functional prototype) and by the facilitator conducting the session.

      • Hi-fidelity prototype - that was created to fit the specific exercises and script of the usability test.

      • The testing was conducted at the Directorate of Roads in Timor. We invited 6 future users of Estrada to participate in the test - one at a time.

      • The exercises were facilitated by our product lead, using a pre-written script.

      • Collection of feedback and results was done using two methods at the same time: 1) a second facilitator was observing and taking notes, 2) screen and voice recording during the test - participants were encouraged to talk through the process - which can be helpful to pick up on insights post-testing.

      • After completing the testing, we gathered feedback and found common issues/feedback. These were then shared with the design team and incorporated in the final design of the product.

    • Let’s have a look at another example, in this case usability testing for Bero offline sync improvements. Here we used an A/B approach to test two different UI/UX options for the same functionality:

      • Script - on the original GH issue for the task of usability testing

      • Functional prototypes of 2 UI/UX solutions for the offline sync functionality were prepared: 1) current state, and 2) new proposal

      • We conducted the test remotely and asynchronously using Lookback. This means the exercise was done by the participant without the presence or facilitation of someone from our team

      • Collection of feedback was done automatically on the Lookback platform

      • A post analysis was prepared and results for each task were presented for each of the UI/UX prototypes being tested

      • Resulting Issue for Frontend implementation informed by the usability testing results

  • Provide a summary of findings and recommendations to the product team, both engineers and designers.

  • Review the prototype and iterate, based on what you learned and observed during the testing. You can test again as many times as you wish - or as many times as your deadline and time frame allow.

  • Here are the slides from the Tutorial Tuesday that was conducted on usability testing @ Catalpa
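The Nielsen Norman Group’s ‘5 participants’ guidance mentioned above comes from a simple diminishing-returns model: if a single participant uncovers, on average, a proportion p of the usability problems (Nielsen’s published estimate, averaged across projects, is p ≈ 0.31), then n participants are expected to uncover 1 − (1 − p)^n of them. A small sketch of the arithmetic; treat the 0.31 figure as an assumption, since your own product may differ:

```python
# Expected share of usability problems uncovered by n test participants,
# using the model: problems_found(n) = 1 - (1 - p)^n.
# p = 0.31 is Nielsen's published average; real values vary by product.

def problems_found(n, p=0.31):
    return 1 - (1 - p) ** n

for n in (1, 2, 3, 5, 10):
    print(f"{n:2d} participants -> ~{problems_found(n):.0%} of problems found")

# Around n = 5 the curve flattens (roughly 84% of problems found), so
# additional participants in a single round add relatively little; it is
# usually better to run another round of testing after iterating.
```

This is why several small rounds of testing, each followed by design iteration, tend to surface more issues than one large session with many participants.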
