

Improving the User Experience of Abl’s Tumble Summary for Building a School Master Schedule

Introduction

Abl’s mission is to systematically modernize school scheduling and operations to remove barriers to postsecondary success. Abl’s Scheduler is one of the tools used to turn scheduling, resource allocation, and academic programming into a strategic lever for transformation.


When building a master schedule, counselors and other educators rely on two summary reports, powered by third-party software integrated into Abl’s app, to measure progress toward a complete schedule. Users reported slow loading times, periodic crashes, incomplete or unhelpful data visualizations, and the inefficiency of tracking progress across two separate reports. This made for a poor user experience during an already hectic time for schedulers, created a heavier workload for our support staff, who supplemented the reports with their own knowledge and strategies when needed, and limited engineering’s ability to solve technical problems because the reports relied on outside technology.


In this case study, I outline the steps we took to improve the user experience for our customers and reduce support work for our internal staff by building our own summary report.

Problem statement

Customers were threatening to churn because of the difficulties they faced when using the two summary reports during active scheduling. These reports are crucial tools for measuring progress and flagging problem areas in a schedule. Users found themselves unable to make progress due to slow loading times, frequent crashes, confusing data visualizations, information the reports did not provide, and uncertainty about which report to check for what they were looking for. Engineering was often unable to help with technical issues because the reports were third-party software.

User Research

To address these issues, I conducted user research to better understand the problems to solve. I interviewed users to learn how the reports were used to measure progress toward a complete schedule and to gather feedback on the issues they commonly faced. In addition to the most pressing need, a report that simply didn’t crash, the research revealed other areas for improvement:

  • More intuitive data visualizations. Several graphs and charts within the report were difficult to understand and did not suggest a clear action.

  • Inaccurate or broken data visualizations. Some charts and graphs were not functioning as intended, or were misleading.

  • Missing data. We identified information users needed that the report did not provide.

  • Inefficient workflow. Information used to measure progress for building a schedule was contained in two very similar reports, resulting in users having to check two different places as they worked.

  • Unclear next steps. Building a master schedule is a complicated task. The previous report presented data but offered no help in acting on it. Users were left to interpret what they were seeing and to figure out what to do about it (and where) on their own.

Design solutions

Based on the insights gathered from user research, we implemented the following design solutions:

Action-oriented design:

The new report pairs data visualizations with clear calls-to-action that let users efficiently understand the data and make their next move. Pathways lead from the report to the relevant areas of the scheduling tool, so a user can identify a problem, get where they need to go in a single click, and see feedback on their progress when they return to this step.


We included tips and links to help articles that anticipate the scheduling issues users most commonly face. Instead of being left to discern the right next step, the user is guided toward accomplishing their goals.

[Image: Action-oriented design]
Additional information to close the data gap, and improved data visualizations:

We added data visualizations that the original report lacked, giving users insight in areas where they previously had none.


We reviewed the preexisting data visualizations and edited them for clarity and intuitiveness. Where necessary, we chose different graph types to present the information. We included interactive elements to surface additional insight without overcrowding the page, and we optimized the design for accessibility by being mindful of our color and type choices.

[Image: Full school Tumble progress]
[Image: Seat density]
Improved workflow efficiency:

By taking the crucial elements from the two reports and combining them into one, we eliminated a superfluous step in the scheduling workflow. Together with the design improvements to the report’s content, this reduced clicks and mental load for users as they schedule.


We designed the report to present information in the order users need it as they move through the phases of scheduling, with the most frequently used data at the top and the data best suited for fine-tuning lower down.

[Image: Coaching]
Improved loading times and reduced incidents of crashing:

Building our own summary report instead of relying on outside software gives our team far more ability to assist should the report have a technical issue.


We were mindful of load times throughout development. We pressure-tested frequently with large schools (and therefore large amounts of data) to identify which visualizations or data sets took a long time to load or raised the risk of the report crashing. We designed around these limitations so that users would not be interrupted in their work by the page crashing or by unreasonable waits for the information they need.

Evaluation

We evaluated the success of our design solution through user testing, customer feedback, and usage monitoring with an analytics tool. Our findings showed that the data visualizations were easy to understand and act on. Load times have remained low, and, unlike before, our engineers can intervene if a technical issue arises. Users visit this page frequently during scheduling season and have reported a better experience and less confusion as they work through their schedule-building.

Conclusion

This case study highlights the importance of understanding user needs and experiences when designing software tools. By conducting user research and identifying pain points, we developed a new and improved summary report that addressed the challenges our users faced. The new report’s action-oriented design, additional data visualizations, and streamlined workflow resulted in a better user experience and reduced support work for internal staff. We evaluated the design through user testing and customer feedback, which showed a significant improvement in user satisfaction and efficiency.

[Image: Tumble Summary report]