
Usability Test Report - High Fidelity

  • Leif Bessa
  • Mar 28
  • 4 min read

Updated: May 30

Using the Maze platform, we conducted the Second and Third Rounds of Usability Testing with the aim of evaluating aspects of the user experience in our application, including ease of navigation and the intuitiveness of the proposed functionalities.


Second Round


Our second test focused on two specific tasks in the app.


Task 1

Find the grocery expenses in the food category


Task Objective: Validate the expense flow. Evaluate the ease with which users can find and view specific expense details within the categories.


Success: Find the grocery expenses and view their details.


Result:

[Maze results for Task 1]

Test Notes

Many users were confused about which button to start with and took paths that did not lead to success, as shown in the screens below, where the click hotspots are highlighted; the last image shows only the user's clicks.

[Screenshots of the flow screens with click hotspots; the last image shows only the user's clicks]

Other problems found and noted:

[Annotated screenshots of the other issues found]

Redesign


It became clear that the designed flow was not intuitive for the user. To improve usability, we redesigned the “Monthly Expenses” flow to make it more intuitive and reduce cognitive load. We changed buttons, colors, and components, and reworded copy that could cause confusion, as shown below:

[Redesigned “Monthly Expenses” flow screens]

Task 2

Create a new payment reminder


Task Objective: Check the intuitiveness of the process of creating a new reminder and how easy it is to locate this reminder in the general reminder list.

Success: Create a new reminder and locate it in the reminder list.


Results:

[Maze results for Task 2]

Test Notes


  • To calculate the success rate, the direct and indirect success counts were added together, checking whether each participant reached the calendar screen and clicked the create-reminder button (a short sketch of this calculation follows the images below);

  • For the error rate, every click on a button leading to other screens or features of the app unrelated to the task was counted as a task error;

  • The main friction point, of medium severity, was that some users tried to access/create reminders through the header, via the button next to the privacy eye, as seen in the images below.

[Screenshots of the clicks made by Participant 1, Participant 2 and Participant 3]
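As a concrete illustration of how these two rates were derived, here is a minimal sketch in Python. The session records and their values are hypothetical and exist only to make the arithmetic explicit; Maze reports these numbers directly, and its real export format differs.

```python
# Illustrative only: hypothetical session data, not Maze's actual export format.
sessions = [
    {"participant": 1, "outcome": "direct",   "off_task_clicks": 0, "total_clicks": 4},
    {"participant": 2, "outcome": "indirect", "off_task_clicks": 2, "total_clicks": 7},
    {"participant": 3, "outcome": "gave_up",  "off_task_clicks": 3, "total_clicks": 5},
]

# Success rate: direct and indirect successes added together, over all participants.
successes = sum(s["outcome"] in ("direct", "indirect") for s in sessions)
success_rate = successes / len(sessions)

# Error rate: clicks on buttons/screens unrelated to the task, over all clicks made.
off_task = sum(s["off_task_clicks"] for s in sessions)
error_rate = off_task / sum(s["total_clicks"] for s in sessions)

print(f"Success rate: {success_rate:.0%}")  # 67% for this sample data
print(f"Error rate:   {error_rate:.0%}")    # 31% for this sample data
```

In the actual report these figures come straight from Maze's dashboard; the sketch only spells out the definitions used above.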

Redesign


On the reminders screen, we made a simple but important change to address a point of confusion that appeared frequently in testing.

[Redesigned reminders screen]

Third Round


We decided to run a third round of testing on the redesign of Task 1 to validate it. For Task 2, we did not consider a third round necessary: the adjustment is small and does not interfere with the navigation flow, which had already been validated.


Success: Locate the expense at the Fair and view its details. Validate the redesign.


Considered an error: Clicks on buttons that access other screens unrelated to completing the task.


Task instructions: You opened your Minha Grana Organizada app and want to find a purchase made with your credit card. Imagine that you made a purchase at the Fair and need to check the amount. How would you do it?


Results:

[Maze results for the third round]

Test Notes


  • Participants: Test 2 had more participants than Test 3, which may mean a greater diversity of user profiles was covered in the earlier round;

  • Success Rate: the higher success rate suggests that the changes implemented were more effective in meeting users' needs;

  • Usability Score: the score increased, suggesting improvements in the user experience between tests;

  • Abandonment Rate: the lower rate is a positive indicator that users encountered less difficulty throughout the task;

  • Direct Success Rate and Task Time: the new test had a slightly higher direct success rate and a slightly shorter average task time, reflecting better efficiency in completing the task;

  • ❌ Error Rate: on the other hand, the error rate increased, which may indicate that although more users completed the task, they made more mistakes along the way. Errors were concentrated on Screen 1 of the flow (Home), which points to two possibilities: a problem with the task instruction (possible participant bias) or with the findability of this screen, suggesting improvements are needed in this area;

  • ✅ Frictionless Flow: Screens 2, 3, 4 and 5 maintained a frictionless and error-free flow for most users, indicating better usability and findability in the new proposal;

  • Exploratory Navigation: Many participants clicked on points that Maze considered incorrect, exploring the flow as if they were using a new application. Despite this, they completed the task correctly. The prototype file could have been simpler, with fewer clickable areas, to better measure errors and limit excessive navigation. Still, button click patterns common to other interfaces facilitated intuitive navigation.


Final Conclusions


Overall, the new test demonstrated significant improvements compared to the previous one. The higher success rate, higher usability score, lower abandonment rate, and faster task time suggest that the implemented changes were effective in improving the user experience. However, the slightly higher error rate points to specific areas where further analysis could still improve the usability and overall experience of the application.

"Usability is about people and how they understand and use things, not about technology." (Steve Krug)

This perspective was validated by the test results, revealing the importance of understanding the user experience in optimizing the product interface and functionality.






 
 