BUSINESS CASE

An online textbook publisher required an internal database that would allow textbook editors to create and manage student assessments for upload into Learning Management Systems (LMS) such as Blackboard.

 

The Problem

Users create assessments in Word documents of various formats, which are then converted into another format for upload into a learning management system. The process is time-consuming, heavily manual, and lacks organization and standardization.

How can we create a database that improves user productivity, organizes and standardizes content, and provides seamless integration into external repositories?

 

Constraints

Stakeholders were only willing to commit a very small budget to this internal site, so more advanced features that users requested, such as team collaboration tools and automated assessment creation based on selected parameters, were excluded from the MVP and the initial rounds of enhancements.

 

PROJECT SCOPE

  • Map existing format data and XML tags to new format for seamless conversion

  • Create a way for users to create and manage multiple projects

  • Develop a standardized format for creating new questions to match LMS specs

  • Construct models for each question and answer type to match LMS specs

  • Enable users to create and manage questions with import and export function

  • Provide advanced search functionality

  • Allow multiple user types with appropriate permissions
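The first scope item, mapping legacy data and XML tags onto the new LMS format, can be sketched as a dictionary-driven tag-renaming pass. This is an illustrative assumption only: the project's actual legacy and LMS tag names are internal, so `TAG_MAP` and the sample markup below are hypothetical.

```python
import xml.etree.ElementTree as ET

# Hypothetical tag map: legacy editor tags -> LMS-style tags.
# The real tag names and target schema were internal to the project,
# so these entries are illustrative only.
TAG_MAP = {
    "question": "item",
    "prompt": "itemBody",
    "choice": "simpleChoice",
    "answer": "correctResponse",
}

def convert(legacy_xml: str) -> str:
    """Rename known legacy tags to their LMS equivalents,
    leaving unknown tags, text, and attributes untouched."""
    root = ET.fromstring(legacy_xml)
    for el in root.iter():  # iter() visits the root and every descendant
        el.tag = TAG_MAP.get(el.tag, el.tag)
    return ET.tostring(root, encoding="unicode")

legacy = "<question><prompt>2 + 2 = ?</prompt><choice>4</choice></question>"
print(convert(legacy))
# -> <item><itemBody>2 + 2 = ?</itemBody><simpleChoice>4</simpleChoice></item>
```

A real conversion would also have to remap attributes and nested answer structures, but keeping the rules in one mapping table makes the conversion easy to extend as new question types are standardized.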

 

DESIGN OBJECTIVE

  • An organized and less complicated way to create assessments and manage projects

  • Visually appealing dashboard that keeps users engaged

  • Simplified processes that save users time

  • Seamless and efficient import and export functions

RESULTS

The database improved productivity significantly, enabling users to create more than ten assessments in an eight-hour day, compared with the previously measured average of four. Standardizing the format to meet LMS specifications allowed for seamless uploads with an error rate below 1%.


MY ROLE

UX Designer, Requirements Development, User Stories, MVP, User Research

 

METHODOLOGIES

Requirements Analysis, Persona Development, Card Sorting, Task Analysis, User Journeys and Flows, Sketching, Wireframes, Observational Task-Based Analysis, A/B Testing, Web Analytics, Prototyping, Iterative User Testing, and Now/Next/Later Feature Prioritization

 

TOOLS

Pencil and Paper, Sketch, Balsamiq, Confluence, Excel

 


RESEARCH & ANALYSIS

After defining the scope and creating the initial technical requirements, I began user research and persona development to identify user behaviors, needs, and pain points when creating and managing an assessment. Luckily, my users were in-house, so I was able to meet with them regularly to discuss their pain points and their ideas for improving their workflow.

 


PERSONAS

During my interviews I discovered that editors and managers would be the two types of users for whom we would build our solution.

 


SKETCHING & WIREFRAMES

In developing the solution, my process began with very low-fidelity sketches in Balsamiq, followed by low-fidelity prototypes, just to see whether editors could complete basic processes. My goal was to understand and simplify the process for creating an assessment and to focus on the MVP with low-fidelity iterations, avoiding time wasted on features and functions that users did not yet need.

First Iteration


Second Iteration

Wireframes

Third Iteration

Wireframes

SITE MAP


 


USER JOURNEY/FLOW

I visualized the basic user journey for creating a question for my user persona, Carla. She can either create a question immediately or view her catalogue of questions before creating a new one.

 


PROTOTYPING & USABILITY TESTING

We designed a web prototype to begin usability testing with low-fidelity wireframes. I created four task scenarios and conducted observational testing with ten in-house editors, watching them complete each task while they thought aloud and asking follow-up questions about their actions. We also ran A/B testing with two variations of the dashboard: one a database input screen, the other a dashboard accessed through a web portal.

From the online testing, we gathered key analytics on error messaging, the best placement for navigation, and which processes were unclear to users. Key outcomes from the first round of testing were that users needed better navigation, a way to cancel processes, and a way to see when an assessment was last updated. We added tabs for each main section, a "cancel" button and "back" option on every screen, and a time and date stamp indicating when the last update was made.

 


HIGH FIDELITY MOCKUPS

Selected mockups showing the look and feel of the dashboard and forms.
