CTAT

Challenge

Research the cause of low usage of our client's product CTAT, and improve it to increase its adoption in higher education settings.

Outcome

A tested and partially built redesign of CTAT's website and tutorials.

Role & Contributions

  • Design lead
  • User research
  • UX design
  • Visual design

Teammates

  • Grace Guo (design researcher)
  • Chris Feng (design researcher)
  • Siting Jin (developer)

Overview

What is CTAT?

The Cognitive Tutor Authoring Tools (CTAT) is a tool suite for creating online intelligent tutors. CTAT allows instructors to easily design problem interfaces and provide contextual feedback to students as they work through problems.

CTAT website's current home page.

Redesign Components

Our deliverables included a redesigned website, a tutor gallery and a series of tutorials intended for inexperienced CTAT users. We focused on guiding users to explore and learn about cognitive tutors, because we found through research that better understanding of CTAT led to greater interest in learning (and potentially adopting) the software.

The site's style and language aim to be approachable and soothing. The tutor gallery lets newcomers see and interact with tutors made with CTAT, demonstrating what the software can do. The tutorials teach users how to use CTAT and what its benefits are.

Navigation from the home page to other pages.

Research

Key Findings

Through competitive analysis and interviews with professors and other stakeholders, we learned about problem-writing processes and useful features for educational technologies. CTAT is an extremely powerful tool, but we identified three main problems facing both new and experienced users:

  • Technical limitations: CTAT's bare-bones interface creates an unintuitive, frustrating user experience and makes the software feel unappealing to use.
  • Difficulty learning and troubleshooting the software: the lack of comprehensive online documentation or an online community is a substantial barrier for users who run into problems.
  • Misunderstanding of what CTAT does: new users struggle to see the advanced feedback CTAT can provide, and instead find the software overly complicated and too much work to learn.

It's not that other tools are better than CTAT. But for the types of problems some professors need, CTAT is too much, and they won't take time to learn it.
- Director at the Eberly Center for Teaching Excellence & Educational Innovation

Iterations & Testing

Design Goals

We categorized the main pain points from our research into three stages of use. Recognizing that we had little control over CTAT's user experience and existing code base, we opted to focus on informing users about the software and teaching them to use it, deriving our design goals:

  • To help users understand the value of CTAT (pre-use)
  • To redesign tutorials for better onboarding (initial use)

Lo-Fi Prototype

Initially, we focused on designing new tutorials. CTAT's current tutorial is text-heavy and difficult to navigate, but we hypothesized that effective tutorials would give users a better understanding of CTAT, which our research showed increased their willingness to learn more.

We tested two tutorial formats using paper prototypes: 1) text-based instructions with screenshots, and 2) spoken instructions with a live demonstration of each step. We asked users to follow our tutorials to build a tutor interface for a simple math problem, then asked what they understood CTAT to be after completing the task.

Above: Text tutorial and demo tutorial prototypes

Mid-Fi Prototype

We found that there were advantages and disadvantages to each format. Text provided better explanations, but demonstrations were easier to follow. In the mid-fi stage, we worked to find a good balance between the two formats for better learning and retention.

Users click on the area indicated by text to trigger an animation and explanation of the step.

User Testing Results

Compared to CTAT's current text-and-image tutorial, our step-by-step, more visual format felt more approachable to testers. We organized the feedback we received to identify specific areas to improve in our next round.

In addition, we realized that the tutorial alone was not enough to educate users about CTAT. We decided to build a website around the tutorials to give users context before they start. The combined feedback led to the final iterations shown at the top of the page.

The [tutorial] made me want to know more. If it only uses a few things to make something decent, I want to know what else it can do.
- Undergrad Teaching Assistant at Carnegie Mellon University

Other Process

As the designer on the team, I spent some time developing a new design system for the site and tutorials, exploring components and layouts, and working on other visual nuances.

Typography & Icons

Highlighting Components

Layout
Learning Outcomes

  • Used open-ended research to determine a design direction
  • Balanced design decisions with the client's acceptance criteria
  • Practiced usability testing and iterative design

Reflection

Though we received positive feedback, our limited time frame prevented us from fully testing the design's effect on CTAT adoption. Our testing showed that the site redesign and new tutorials make it easier for new users to understand CTAT's capabilities, but that understanding may not translate into increased adoption over time.

Some possible next steps for this project might include:

  • Deploying and evaluating the long-term effect of the new website and its features on CTAT adoption. A good indication of tutorial effectiveness could be a correlated upward trend between views and tutors created.
  • Creating a greater range of tutorials. Users who are considering CTAT for its superior flexibility may want to learn how to use a wider range of features. Tutorials that can make more complex features more learnable would appeal to these users and help them create effective tutors for a diverse range of domains.