EarthRanger is a software platform that integrates historical conservation data, assisting protected area managers, ecologists, and wildlife biologists in making informed operational decisions for wildlife conservation.
EarthRanger Software Solution
After meeting with our sponsor, the Allen Institute for AI's EarthRanger team, we learned that little usability testing had been done on the app. This presented a great opportunity for us to focus on improving the Android app experience.
Focus on enhancing the EarthRanger Android mobile app experience.
EarthRanger Team
Participants
Ideal Participants
Age: Mid- to late twenties
Education: Low to mid-level
Digital literacy: Low
Work environment: Very remote regions, away from family for extended periods.
Language skills: Limited English proficiency, speaks Portuguese, Spanish, French, or Swahili
Technological skills: Limited
Device preference: Android users
Hard to Recruit Ideal Participants
The app is used worldwide, so initially, we aimed to recruit participants closely matching our ideal profile.
However, due to challenges in finding such participants, we adjusted our requirements as follows:
Age: Mid- to late twenties
Experience: Has patrol experience
Device experience: Some experience with Android devices
EarthRanger Persona - Rangers
EarthRanger: Global Usage
Usability Test Method
In-person Moderated Usability Test
Our usability test used the contextual inquiry method to gain insights into how users completed tasks with the EarthRanger app in their natural environment.
Task 1
Live Tracking
Start and stop live tracking.
Task 2
Create Report
Start a routine patrol, start a report, save report.
Task 3
Save Draft
Create a report for a fire, save as draft, find the draft, save report.
Task 4
Create Report Offline
Turn on airplane mode, create a report for a fire, and save report.
Task 5
Save Draft Offline
With no internet connection, interact with saved drafts and pending sync.
Data Collection
Qualitative Data
Why & How to Fix
Behavior
Attitudes
Directly observe how participants use the EarthRanger app to complete tasks
Quantitative Data
How Many & How Much
Task Completion Rate
SEQ (Single Ease Question) score
System Usability Scale (SUS) Scores
Time on Task
Affinity Mapping
Task Completion Rate
SEQ (Single Ease Question) Score
Some participants found tasks difficult at first but rated them easy once they understood how to complete them.
System Usability Scale (SUS) Scores
The maximum SUS score was 92.5 and the minimum was 55. The average SUS score was 70, which is above the commonly cited benchmark of 68 and falls in the acceptable range.
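For context, each participant's SUS score comes from a fixed formula over the ten questionnaire items. The sketch below assumes the standard SUS questionnaire and is illustrative only, not our actual analysis tooling.

```kotlin
// Minimal SUS scoring sketch, assuming the standard 10-item questionnaire
// rated 1 (strongly disagree) to 5 (strongly agree).
fun susScore(responses: List<Int>): Double {
    require(responses.size == 10) { "SUS uses exactly 10 items" }
    val contributions = responses.mapIndexed { index, answer ->
        if (index % 2 == 0) answer - 1 // odd-numbered items (1, 3, 5, 7, 9): score - 1
        else 5 - answer                // even-numbered items (2, 4, 6, 8, 10): 5 - score
    }
    return contributions.sum() * 2.5   // scale the 0-40 raw total to 0-100
}

fun main() {
    // Illustrative responses only, not an actual participant's answers.
    println(susScore(listOf(4, 2, 5, 1, 4, 2, 5, 1, 4, 2))) // prints 85.0
}
```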
Findings
Finding #1
Severity
#Catastrophic
Users w/ Issue: 5
Users had difficulty identifying how to save a draft or submit a report.
“folder is not a draft.”
– Participant 5
“I’m curious about this folder button.”
– Participant 6
Opportunity
Make it clear which CTAs are primary and which are secondary, and what happens when the user taps them.
Finding #2
Severity
#Major
Users w/ Issue: 1
Participants had no way to cancel a report once it was started.
“Folder icon at bottom right is unclear. Once I clicked it, it saved draft, but can’t unsaved. Not reversible.”
– Participant 2
“Clicked 3 times back to exit out. No clear exit flow.”
– Participant 2
Opportunity
Prompt users with options as they exit unsubmitted reports.
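One way such a prompt could work in a standard Android app is an exit confirmation dialog with explicit save, discard, and keep-editing options. The sketch below is illustrative only; the labels and callbacks are hypothetical, not EarthRanger's actual code.

```kotlin
import android.content.Context
import androidx.appcompat.app.AlertDialog

// Illustrative exit prompt for an unsubmitted report; saveDraft and discardReport
// are hypothetical callbacks, not part of the EarthRanger app.
fun confirmExit(context: Context, saveDraft: () -> Unit, discardReport: () -> Unit) {
    AlertDialog.Builder(context)
        .setTitle("Leave this report?")
        .setMessage("Your report has not been submitted.")
        .setPositiveButton("Save draft") { _, _ -> saveDraft() }
        .setNegativeButton("Discard") { _, _ -> discardReport() }
        .setNeutralButton("Keep editing", null) // simply dismisses the dialog
        .show()
}
```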
Finding #3
Severity
#Major
Users w/ Issue: 1
One user had difficulty navigating to find Saved Drafts.
“…but after saving finding the saved draft can be hard.”
– Participant 3
Opportunity
Provide visual indicators, such as a notification with a red badge or a toast animation, to guide users to the location of Saved Drafts.
Finding #4
Severity
#Major
Users w/ Issue: 1
During offline tasks, participants found it difficult to confirm whether reports were saved and did not understand the terminology used to verify report status.
“I can say it’s saved but other apps usually show me where it is saved. …Finding the saved report can be hard. If I submitted it, why has it gone under pending sync?”
– Participant 5
“I cannot determine if I saved the report successfully.”
– Participant 6
Opportunity
Provide additional system feedback and documentation.
Display a toast notification confirming that the report has been saved and moved to “Pending Sync” for synchronization once online (see the sketch after this list).
Provide a link to the “Reports” page where users can view “Pending Sync” reports.
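As one concrete illustration of this recommendation, a Material Snackbar can combine the save confirmation with a shortcut to the Reports page. The view and navigation callback below are hypothetical and not the EarthRanger implementation.

```kotlin
import android.view.View
import com.google.android.material.snackbar.Snackbar

// Illustrative confirmation after saving a report offline; `anchor` and `openReports`
// are hypothetical, not EarthRanger code.
fun showSavedOfflineFeedback(anchor: View, openReports: () -> Unit) {
    Snackbar.make(anchor, "Report saved and moved to Pending Sync", Snackbar.LENGTH_LONG)
        .setAction("View Reports") { openReports() } // links to the Reports page
        .show()
}
```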
There are 5 more opportunities in the Usability Report.
Identified and prioritized usability issues, providing actionable recommendations for improving the user experience.
Successfully pivoted the participant group to work around a hard-to-recruit profile and identified common usability issues in the EarthRanger app.
The study provided both qualitative and quantitative data, aligning with feedback from the EarthRanger Team’s real users.
Next Time
Test More Specific Report Types
Expand the variety of report types tested to gain insights into a wider range of user interactions.
Test with Diverse Cultural Backgrounds
Test with participants from diverse cultural backgrounds to understand how different users interpret and interact with the app’s icons.
Test Tracking and Patrolling in Background
Evaluate the app’s performance when users track and patrol in the background while multitasking.
Takeaways
Overcoming Challenges: Unlike other groups, which mostly ran online usability studies, our group ran in-person tests. This was challenging because we had to set up laptop recording and screen recording on Android devices. In two tasks, participants had to navigate the app without an internet connection. These challenges taught us the importance of thorough preparation before a usability test.