ChMS Funds

In Brief

Church360, a web-based church management system (ChMS), is a flexible tool used to assist churches in accomplishing their ministry goals. As part of an ongoing overhaul of its design and system, I designed a new interface for tracking incoming donations, established and utilized a team usability testing process, advocated for Hotjar to be added to direct future improvements, and collaborated with the agile development team to successfully launch this new view. 

Duration

Fall 2021

Scope

Church management system

Tools

Adobe XD, HTML/CSS (Tailwind), GitHub

Team

Lead designer, working with 2-3 developers

Role

Research, prototyping, usability testing, design, front end coding

The Problem

As one piece of Church360, "Funds" lets churches track incoming donations to particular funds, e.g. a women's ministry or a mission trip.

In Voice of the Customer feedback, users requested a way to merge funds into one another, change the order in which funds appear, and view archived funds. 

Since our target persona is typically a volunteer with many other tasks on their plate, I also wanted to provide more organization to simplify and reduce visual load.

These improvements tie back to business goals of increased customer retention and conversion, as well as success for our users' ministries.

Solution

Strategy

To empathize with our users, I utilized personas based on users' own descriptions of their primary roles at church and how they use technology in their ministry.

I did an informal heuristic review of the existing Funds page with top tasks in mind and researched how other ChMS products handled similar functionality. I then sorted through customer feedback and evaluated possible improvements to top tasks.

Funds wireframe 1
Funds wireframe 2
"Merge Funds" wireframe
A few wireframes from the "Funds" user testing flow

Design

I created wireframes for the base view, addressing users’ needs and frustrations. Because there were a few new functions (merging funds, rearranging funds), I created a number of iterations and flows that I ran past my colleagues for feedback.

Observing Zoom usability test
Observing Zoom usability test (participants blurred out for privacy)

Testing and Iteration

I established a script and scheduled moderated remote testing sessions over Zoom with 5 in-house users, giving each user 7 small "missions" to complete in Maze. I acted as facilitator while my teammate observed, and together we recorded and documented the qualitative feedback from each test.

After each session, we discussed our observations and findings and what to adjust in the design. I made improvements where we saw users encounter issues. 

For example, we noticed users couldn't easily discover the new "Archived Funds" area, which led us to revisit how to better distinguish it from the rest of the page. We wound up breaking "Archived Funds" out into its own section below, with a header of its own to increase visibility.

Delivery

I walked stakeholders and developers through the designs and addressed any technical or UX concerns. Next, I coded the views in HTML and Tailwind CSS and delivered them to the developers. Afterward, I worked with the devs to ensure accuracy in both desktop and mobile views while maintaining our agile process and timeline.
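As a rough illustration of the kind of markup I handed off, the separated "Archived Funds" section looked something like the sketch below. The class names and structure here are a hypothetical reconstruction using standard Tailwind utilities, not the shipped production code.

```html
<!-- Hypothetical sketch of the separated "Archived Funds" section.
     Tailwind utility classes approximate the real styling; the fund
     name and "Restore" action are illustrative placeholders. -->
<section class="mt-8">
  <!-- Dedicated header, added to make the section easier to discover -->
  <h2 class="text-lg font-semibold text-gray-700">Archived Funds</h2>
  <ul class="mt-2 divide-y divide-gray-200">
    <li class="flex items-center justify-between py-3">
      <span class="text-gray-500">Example Fund</span>
      <button class="text-sm text-blue-600 hover:underline">Restore</button>
    </li>
  </ul>
</section>
```

Keeping the archived list in its own `section` with a visible header was the direct outcome of the usability finding above: users needed a clear visual break between active and archived funds.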

Funds view screenshot
Final "Funds" view

Outcome

I successfully launched the new design with the functionality stakeholders and users had requested. Having advocated for Hotjar to be added, we plan to review recorded sessions to identify continuous improvement opportunities going forward.

Our internal processes were also refined. We established a new design critique meeting to gather feedback from stakeholders and team members more efficiently and effectively by separating them from the larger team meetings.

Establishing and utilizing a process for remote usability testing on this project helped our team validate our design decisions with data and choose where to make improvements. 

I created a guideline for when to utilize extensive testing methods and when to scale back in order to stay on track. This has helped me continue to advocate for UX internally by staying flexible, offering more or less extensive research and testing options depending on the scope of the project.

Takeaways