In a previous article about the format of information architecture (IA) work, I shared some IA activities I do during a project:
- Stakeholder interviews
- Analytics review
- User interviews
- IA and content audit
- Findings summary
- New information architecture (sitemaps, content modeling)
- Content wireframes for key pages
- Content testing plan and findings
- Final report
Today, I’m sharing a case study related to the 8th activity mentioned above, content testing. I sometimes refer to content testing as in-page information architecture, which is a process that addresses the question: how do we structure a single page to make the information on it simpler to find and understand?
AI Policy: I personally write each draft and final copy on this website. All content reflects my own thinking, ideas, style, and craft. I do not use AI such as ChatGPT or other LLMs to generate articles. Occasionally, I ask AI (such as Formalizer or Equativ) to summarize or re-state my own ideas and may restructure sections based on the response.
Content testing case study
My role
- Led content discovery and planning for a core workflow redesign, ensuring the right activities were prioritized to uncover users’ needs
- Collaborated with product managers and engineers
Defining terms
Fellowships are funded academic (or professional) opportunities that usually last 2-5 years. For example, UNESCO offers more than 450 fellowships to “continue your studies, pursue a research topic or set up an innovative project.”
Fellowship recipients are called fellows. They’re selected based on their potential and prior achievements. Contrary to popular belief, you wouldn’t greet them by saying “hi fellows.”
According to UserTesting, “content testing determines whether your target audience can find, understand, and comprehend your content. Done well, content testing exactly pinpoints to which words, phrases, and content people respond. It starts early in the UX process and reoccurs whenever new content is implemented.”
Context
I worked with a company we’ll call FG, which stands for Fellowships Galore.
FG is a tech company and its research organization offers PhD fellowships. FG has funded more than 200 students. Fellows receive 2 years of paid tuition, an annual salary, and the opportunity to work on state-of-the-art technology.
My approach
My goal was to redesign the fellowship application process to fit users’ mental models and support FG’s goals, drawing on research findings, analytics insights, and content best practices.
After talking to stakeholders, I identified these main goals:
- Reduce the number of “unqualified” emails received – FG received thousands of questions about the fellowship application process via email. Most of these questions were already answered on the website dedicated to the fellowship program. This created an issue for FG staff who had to dedicate hours every day to responding to inquiries.
- Reduce the volume of unqualified applications
Hypotheses
FG received thousands of already-answered questions via email, which meant the fellowship program pages weren’t meeting user needs. My initial hypotheses about why:
- important answers lived in the Frequently Asked Questions (FAQ) section, but people weren’t looking for them there
- information on deadlines and application documents used vague, ambiguous language
Analytics
I reviewed web analytics data which included looking at:
- traffic acquisition – how were people finding the site and the fellowship program specifically?
- page visits – how many people were finding the site? Which were the most common initial entry points? Which pages were visited the most?
- search terms – what were people typing on search engines to get to this site? What were people searching for on the site itself?
- time spent on page
- device breakdown
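To make this review step concrete, here’s a minimal sketch of how an analytics export could be summarized to answer those questions. It assumes a hypothetical CSV with page, visits, source, and search_term columns; this is illustrative, not a real analytics schema.

```python
# Sketch: summarize a hypothetical analytics CSV export.
# Columns assumed (not a real schema): page, visits, source, search_term.
import csv
from collections import Counter

def summarize(path, top=5):
    pages = Counter()    # page visits: which pages are seen the most?
    sources = Counter()  # traffic acquisition: how do people find the site?
    queries = Counter()  # on-site search terms: what are people looking for?
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pages[row["page"]] += int(row["visits"])
            sources[row["source"]] += int(row["visits"])
            if row.get("search_term"):  # skip rows with no search query
                queries[row["search_term"]] += 1
    return pages.most_common(top), sources.most_common(top), queries.most_common(top)
```

Even a rough tally like this surfaces the most common entry points and the questions people are already trying to answer through search.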
Crafting the content testing plan
Based on analytics data, conversations with stakeholders, and content best practices (like “don’t hide important information in FAQs”), I put together 2 different versions of the fellowship program page. I used Userfeel, a user-testing tool that offers unlimited screener questions (for example, you can exclude people who don’t have a Master’s Degree), moderated/unmoderated tests, and assistance with participant recruitment.
Four participants completed tasks in the first version; another four completed the same tasks in the second. Tasks included finding:
- the deadline for applications
- who was eligible to apply
- what people need to include in their application
- if someone studying Applied Statistics could apply
Research participants had to rate the process of finding this information from 1 (Simple) to 5 (Complex) and explain why they chose that rating. I could see their screens and hear them explain their actions or expectations (“Oh, I was expecting B to be in D, but it’s actually in Y”).
If you’re wondering, I included the fourth question to get a better understanding of how people in unrelated fields were navigating the site. Was it clear to biology students that they weren’t eligible?
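To show how those ratings could be compared across the two versions, here’s a minimal sketch. The numbers below are illustrative placeholders, not the study’s real data: 1 means Simple, 5 means Complex, one rating per participant per task.

```python
# Sketch: compare mean task-difficulty ratings across two page versions.
# Ratings are hypothetical placeholders (1 = Simple, 5 = Complex).
from statistics import mean

ratings = {
    "find the application deadline": {"v1": [2, 1, 3, 2], "v2": [1, 1, 2, 1]},
    "find eligibility criteria":     {"v1": [4, 5, 3, 4], "v2": [2, 1, 2, 2]},
}

def compare(ratings):
    """Return the mean difficulty per version for each task (lower is simpler)."""
    return {
        task: {version: round(mean(scores), 2) for version, scores in versions.items()}
        for task, versions in ratings.items()
    }
```

With only four participants per version, these means are directional signals to pair with the think-aloud comments, not statistically significant results.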
Content testing findings and outcomes
Navigation – Before

Navigation – After

- We had a term in the in-page navigation called Fellowship Details. This term seemed like an afterthought to participants. They didn’t realize this was the bread and butter of application requirements. Details didn’t have enough information scent. This item was renamed to Eligibility Criteria.
- We had a term in the in-page navigation called Application Dates. Participants completed the “Find the deadline for applications” task accurately and quickly. This tells us this phrase is effective.
- We had a term in the in-page navigation called Available Fellowships. Participants completed the “Find if someone studying Applied Statistics could apply” task accurately and quickly. This tells us this phrase is effective. However, we also rewrote About, the program intro, and the Available Fellowships section to specify that these fellowships were aimed at tech-adjacent students. We wanted to ensure that would be obvious to anyone who would spend a few seconds on the site.
- After adding How to Apply to the in-page navigation and rewriting copy to be clear and actionable (here’s what your application must include, here are the steps involved), participants were able to complete the “What do people need to include in their application?” task accurately and quickly.
- Funding information generated a lot of interest so we added Funding details in the in-page navigation.
- Crucial fellowship application answers (how to apply, who’s eligible) lived under FAQs. This was the last item on the page and in the in-page navigation. Most participants didn’t look there. We removed the FAQs section, incorporating frequent questions like “Is this open to internationals?” and “Do you provide funding for undergrads?” under relevant sections like Eligibility Criteria and Funding details.
- The motion design team created a how-to video outlining the steps across different platforms (the website, email, SurveyMonkey).
Content testing results
I’m thrilled to report that we reached our stated goals. The fellowship program manager said they were really happy with the work after launching the new and improved version.
Not receiving and replying to thousands of emails freed up valuable time and mental energy to focus on qualified candidate applications.
I enjoyed this project and helping hard-working students access more funding and opportunities.
Before I say au revoir, I can’t possibly overstate just how much Erika Hall’s Just Enough Research book has impacted my approach when it comes to research.
“There are many, many ways of classifying research, depending on who is doing the classification. Researchers are always thinking up more classifications. Academic classifications may be interesting in the abstract, but we care about utility, what helps get the job done.”
Whether you’re new to user research or an expert who appreciates conciseness and jokes, I highly recommend buying her book.
Her description of research really resonated with me: “Research is simply systematic inquiry. You want to know more about a particular topic, so you go through a process to increase your knowledge.”
Framed this way, research doesn’t sound intimidating at all. I feel excited to dive in despite not being a trained researcher. What do I want to know about the topic I’m researching and how can I acquire this knowledge?