A two-study usability research program evaluating filtering interface designs for Christian shoppers on mobile — comparing prototype variations to identify which approach best supported how ministry users add, view, and remove filters when browsing a large faith-based resource catalog.
Lifeway's product catalog includes thousands of resources across multiple formats, translations, age groups, and series. For ministry staff who know what they're looking for, filtering is essential. But research from persona interviews and behavioral analytics had revealed that filtering behavior was inconsistent and often frustrating — particularly on mobile, where small group leaders do most of their browsing.
The design team was exploring a new filtering interface that introduced "filter tags" — visible, removable pill-shaped labels showing which filters were active above the search results. Before shipping to production, the team needed to validate whether Christian shoppers actually understood and could use the new pattern.
These questions matter because getting filtering wrong on a large faith-based catalog directly impacts whether church administrators and small group leaders can find the right resources at all.
The research was conducted in two sequential studies using UXPin interactive prototypes on mobile web. Both studies used the same participant profile — Christian users familiar with Lifeway who had purchased resources online within the past year.
Study C6480S142 — Comparative prototype study (17 participants). Participants completed one task across three prototype variations: No Filter Tags, Scrollable Filter Tags, and Stacked Filter Tags. The task required removing an applied filter. A final questionnaire asked participants to compare prototypes on ease, speed, and overall preference.
Study C6480S143 — Two-task validation study (15 participants). This study tested the recommended filter tag design with two tasks: adding a filter and removing a filter without using the Filter & Sort button. A final questionnaire assessed whether users could identify which filters were applied, what the tags meant, and which method they preferred.
The two-study structure wasn't originally planned — it was the result of identifying a methodological flaw in the first study mid-analysis and designing a correction before presenting findings to stakeholders.
In the validation study (S143), all participants successfully completed both the adding and removing filter tasks. 80% (12 of 15) rated adding a filter as easy, and 67% (10 of 15) rated removing via tags as easy, a clear improvement over the ambiguous results of the first study.
When shown a product listing with multiple filters applied, 87% (13 of 15) correctly identified which filters were active from the tag display, validating that the visual pattern clearly communicated filter state to Christian shoppers.
When not directed to use a specific method, 53% (8 of 15) used the filter tags, and 73% (11 of 15) said they preferred the tags over the Filter & Sort menu for viewing active filters on their mobile device.
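With only 15 participants, these percentages carry wide uncertainty, which is worth keeping in mind when reading the results. As an illustrative sketch, the snippet below computes 95% Wilson score intervals for the reported S143 proportions; the raw counts are back-computed from the rounded percentages under the assumption that n = 15 for every question, and are not part of the original report.

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a binomial proportion k/n."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (max(0.0, center - half), min(1.0, center + half))

# Counts back-computed from the rounded percentages, assuming n = 15
# for each question (illustrative only, not from the original report).
results = {
    "Adding a filter rated easy": 12,    # 80%
    "Removing via tags rated easy": 10,  # 67%
    "Identified active filters": 13,     # 87%
    "Used tags unprompted": 8,           # 53%
    "Preferred tags": 11,                # 73%
}

for label, k in results.items():
    lo, hi = wilson_interval(k, 15)
    print(f"{label}: {k}/15 = {k/15:.0%} (95% CI {lo:.0%}-{hi:.0%})")
```

Even the strongest result (87% filter identification) has an interval stretching down to roughly 62%, which is one reason the recommendation keeps the Filter & Sort button alongside the tags rather than replacing it.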
The Filter & Sort button and filter tags serve complementary needs for ministry users. Users who discovered tags found them faster; users who didn't defaulted to the button. Keeping both avoids any single point of failure in the filtering experience.
One participant captured the tag experience well: "The little pill thing showed CSB applied to the results plus the number of products listed went down as another indicator the selection was made." That's exactly the comprehension signal the design was aiming for.
Validated research that gives development teams confidence to ship is exactly what a usability study from Transformed Works delivers — including the rigor to catch and correct methodological problems before findings reach stakeholders.
The first study (S142) produced results that appeared to favor "No Filter Tags" — but a methodological issue emerged: the task design allowed participants to complete the task using the Filter & Sort button on all three prototypes, meaning the tags were never actually tested in isolation. The apparent winner was an artifact of the study design, not genuine user preference.
Rather than ship based on flawed data, the team designed a follow-up study with a task that explicitly required tag interaction. Study S143 corrected the bias, produced reliable data, and gave the development team a defensible recommendation backed by genuine user behavior.
This is what good research practice looks like — not just executing studies, but evaluating the quality of findings and designing corrections when needed. If your team needs that level of rigor, let's talk.
A filter tag design validated through corrected, bias-free usability testing — with a clear recommendation to implement tags alongside the Filter & Sort menu for a more flexible and comprehensible filtering experience for Christian shoppers on mobile.