Case Study

Wiley Reader
Study Sessions



Overview

Objective

Design a one-size-fits-all question player that can be plugged into multiple products, including a courseware experience and an eBook.

Key Requirements

1

Give it a refresh. The existing question player (below) featured a patchwork of disjointed, outdated UI with features that had been added piecemeal over time. It sorely needed a UI update to bring it up to speed with the rest of our product ecosystem.

The existing question player needed a UI update to bring it up to speed with the rest of our products

2

Address key user pain points. Users (in our case, higher ed students) had long complained of:

  • Long, unwieldy questions that were difficult to view and interact with

  • Difficulty comparing their answer to the correct answer

3

Make it flexible. One of the key requirements was to build a tool that would work across a variety of different products. In the short term, it would need to work both in a standalone courseware experience for assessments, and in a vertically scrolling eBook for in-chapter knowledge checks.

Bonus Challenge!

*

Navigate complex, often conflicting requirements from multiple product managers who own the different products the question player will be plugged into

Outcomes

Collaborated with Product Management, Learning Design, and Engineering to design and deliver a (nearly) one-size-fits-all question player UI that could be used across multiple products and platforms, from online exams to eBooks


User Benefits: Easier to use, more pleasing UI and overall user experience


Business Benefits: A single, simplified experience means less engineering overhead


Side benefit: a unified experience across our tools

Gathering Requirements

Gathering business needs and requirements is typically a straightforward part of the process, but because the player was to be used in multiple products, we had a web of interdependencies to unravel. Navigating this and ensuring all stakeholders' needs and perceived needs were addressed was ultimately one of the most challenging aspects of the project.


Requirements were oftentimes in direct conflict. One example:

Research

Uncovering User Pain Points

I had heard for some time about the mounting student pain points with the existing question player in WileyPLUS. I reached out to my Customer Success team to learn about the most common customer complaints. The biggest, most consistent complaints were:

  • Difficult to view and interact with long scrolling questions

  • Challenging for users to compare their answer to the correct answer


Competitive Review

Next, I started a thorough competitive review. I looked at all of Wiley's major competitors in the higher ed space, but also took a close look at more current, consumer-facing products, like Duolingo. The former provided a good benchmark for where we were versus the market, and sparked some ideas for quick wins over the competition. The latter gave me a reminder of how much more pleasing we could make the UI.

Wireframes

The main objectives of the wireframe stage:

A

Explore the Components. The biggest benefit of wireframes is having a chance to look at all of the components needed and to play around with layouts.

B

Review Question Player States. Sketch all states of the player (default, answer input, answer submitted, and all the forms of results feedback).

C

Evaluate Environments. Examine the two primary environments the player would live in to understand the unique needs of each.

Exploring the Components & States

Wireframes provided a quick means of looking at the various components that would comprise the player. These included:

  • question stem (the question itself)

  • answer input area

  • submit/check answer button

  • results feedback

  • correct answer (provided in formative assignments when students get the answer incorrect)

  • answer explanation



Answer Feedback

Requirements called for three types of answer feedback: results (correct/incorrect), the correct answer (in the event the user’s answer is incorrect), and answer explanations. 

The big question: how to show the correct answer? One PM felt strongly that it should be consistent across all question types. Lo-fi designs quickly demonstrated this would not be feasible.

It's easy to show the correct answer for simpler question types. But what about longer, more complex question types? The question UI doesn't always provide a natural place to mark one answer incorrect and another correct. This mock-up of a Fill in the Blank question helped demonstrate that:


Key Takeaway 1:

While it was our ideal state to be consistent, we recognized a need for a unique solution for certain question types. We would pick this back up during design.

Comparing the two environments

Lo-fi mock-ups of the Courseware experience (left) vs the eBook experience (right)

  1. Courseware Experience

  • Question player appears on static screen; Question always primary content on screen

  • Desktop first (students don’t take exams on mobile)

  • Sole requirement is presenting the question on the page; navigation, progress, etc. handled outside of question player

  2. eBook Experience

  • Question embedded in long scrolling page of learning content

  • No navigation through questions; must be supported within question player

  • When user submits answer, UI expands to show results, explanation, then changes size again when user goes to next question - how to make this elegant and not jumpy?

Evaluating the states of the eBook experience

Key Takeaway 2:

The two environments had different navigation needs. The assessment experience already had navigation between questions in the UI; this wasn’t the case with the eBook.


This realization was the first inflection point at which we understood that product management's desire for a one-size-fits-all component might not be possible: the player would need to be dynamic to work elegantly in both places.

Design

Layout

By the time we'd finished the wireframes, the layout was all but decided.


UI Considerations

As mentioned above, one goal was to give the player a UI refresh. We explored a number of UI options, and landed on an experience that:

  • remained austere (students should be focused on the exam, not the UI)

  • made it more pleasing to interact with (the big outlined hit states seen in Multiple Choice questions are a good example)

  • made it very easy to view correct answer feedback and compare it to their own answer


Key Challenge

Picking up Answer Feedback: 

Our lo-fi mock-ups had shown that we couldn't use a single type of answer feedback for all questions. So how could we design feedback that would accommodate the unique needs of each question type, making it easy for the user to compare their answer to the correct answer, without making the different types of feedback jarring or confusing?


We explored two general options: 

  1. Replicate the question with the correct answer below the student’s answer - Ultimately decided this was too repetitive and simply too much to look at. Additionally, it required a lot of scrolling back and forth to compare your answer to the correct answer for longer questions.

  2. Swap between your answer and the correct answer. While this seems like an obvious solution in hindsight, the big concern was that students wouldn't see the correct answer immediately upon submitting their answer: they would have to discover the correct answer option and click it. Swapping would make it easier to compare the two answers, but would users expect it after the precedent set by the simpler question types? It was important that students get their feedback right away and not miss it. This would be a key item to review in our usability test.


Usability Test

Goal

The primary goal of the usability test was to validate results and correct answer feedback.


Key Questions

  • Do participants see the correct answer?

  • Do participants quickly understand if their answer is correct or incorrect?

  • Is it easy to compare their answer to the correct answer?

  • Do participants find it confusing to see feedback in two different formats?

Method

  • 15 participants. Age 19-35. 8 male, 7 female.

  • Unmoderated via usertesting.com

  • Participants asked to complete a brief quiz using a Figma prototype and explain their actions and what they see as they do

Results

  • 15 total participants (8 male, 7 female)

  • 86% noticed and selected the correct answer tab immediately

  • 100% understood feedback for simpler questions immediately