Giving life to a new concept

Making Live Lectures More Interactive with AI

EngagED was a concept-stage project aiming to solve a big classroom problem: student disengagement during live lectures. To do that, the team set out to design a platform that would use AI and lecture recordings to generate real-time quizzes.

Client

EngagED

Date

June 2025

Industry

Ed-Tech

Scope of work

Website Design

Product Design

MVP Ideation

The Challenge

With each new generation, professors face increasingly disengaged students, worn down by long theoretical lectures and outdated teaching methods. EngagED's solution was to use AI to transform passive listening into interactive learning. When they contacted us, their concept was already fully defined, so our main focus was finding a way to build it without adding any friction to the teaching experience.

Research

As mentioned before, the stakeholders' initial concept was already defined: an online platform that records university lectures and uses AI to generate real-time questions based on those recordings. So we went straight to talking with potential users.

The goal.

• Understand the current experience: deepen our understanding of teachers' and students' current experiences.

• Test the initial concept: understand whether this product could fit professors' lectures and different types of classrooms and topics.

Methodology.
• In-depth interviews with professors
• Short surveys for students
Key insights.

Professors' Insights

Takeaways.

Our research confirmed what the stakeholders had stated from the beginning, while the concept testing shed light on two main concerns professors had about incorporating an AI tool into their classrooms.

• Traditional methods fail to engage today's visual, digital-native students: they get easily distracted during long theoretical lectures. Polls and quizzes, on the other hand, are seen as good additions that help students stay focused and engaged and improve their learning.

• Professors are open to incorporating real-time tools into their classrooms, but are concerned about:

  • Letting the AI control the classroom flow; more specifically, interrupting the class.

  • Not being able to review the AI-generated questions beforehand.

In conclusion, professors are constantly looking for ways to keep students' attention and engagement during lectures. They are open to including tech tools to do so, but not to losing control over their class flow and content.

Defining the problem

Problem Statement

How might we help teachers keep their students engaged without losing control of their classroom flow and content to the AI?

Benchmark

We analyzed quiz-generation tools and AI note-taking apps to assess:

• Features
• UI/UX patterns
• Interaction flows
• Visual style

Ideation

The first step of our ideation process was to prioritize features using the MoSCoW method: since we were working toward an MVP, we wanted to focus our attention only on the "must have" features.


Afterwards, to get rid of blank-page syndrome, we ran a Crazy 8s exercise to get ideas out quickly.

We then focused on the main features and ideated a basic flow that gives professors full control over the AI tool during their lectures:


  1. The AI won't generate a quiz from the recording until the professor actively initiates question generation.

  2. The professor can give the AI basic instructions before generating the quiz by selecting from a few available options (kept fast and easy so the class is not interrupted): type of questions, difficulty, number of questions, etc.

  3. Once the quiz is generated, the professor can edit each question individually if they want to. There are several ways to do so, to make editing as easy and fast as possible: let the AI regenerate a question, edit it directly, duplicate a question, etc.
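To make this flow tangible, here is a minimal sketch of what the professor-facing contract could look like. Every name in it (QuizService, QuizSettings, the individual methods) is hypothetical, since EngagED was still at the concept stage:

```ts
// Hypothetical sketch of the professor-controlled flow described above;
// all names are illustrative, as the product never went past concept stage.

type QuizSettings = {
  questionType: "multiple-choice" | "true-false" | "open-ended";
  difficulty: "easy" | "medium" | "hard";
  questionCount: number;
};

type Question = {
  id: string;
  prompt: string;
  options?: string[]; // only used for multiple-choice questions
};

interface QuizService {
  // Steps 1 and 2: nothing is generated until the professor explicitly
  // initiates it, and the preset options travel with that request.
  generateQuiz(recordingId: string, settings: QuizSettings): Promise<Question[]>;

  // Step 3: every question stays editable before the quiz is launched.
  regenerateQuestion(questionId: string): Promise<Question>;
  editQuestion(questionId: string, newPrompt: string): Promise<Question>;
  duplicateQuestion(questionId: string): Promise<Question>;
}
```

The key design choice is that the AI never acts on its own: every call originates from an explicit professor action.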

Design process

Drawing on the benchmark, we created our first wireframes. Here you can see how our main priority was designing a platform that a professor could easily navigate in the middle of a lecture.

design critique.

The design critique session with colleagues provided valuable feedback and confirmed that the overall direction was solid. The visual and interaction consistency across screens was well received, which validated many of the initial design choices. At the same time, the discussion surfaced some opportunities for improvement:

  • Flexibility in quiz creation: participants suggested adding the option to manually create questions, in addition to those automatically generated by the AI.

  • Clarity of actions: the floating button to generate quizzes was perceived as confusing, highlighting the need for a more intuitive interaction pattern.

This helped shape the next iteration, ensuring both usability and adaptability were better addressed.

defining the style.

Original AI-generated palette (blue + orange) used on the marketing website.

When we started the project, the client was using colors suggested by an AI tool for their marketing site (blue and orange). In one of our early conversations, they mentioned they were open to rethinking the palette, since the initial choices hadn’t been guided by a design strategy.


Given time constraints, and since redefining the brand wasn't the main purpose of our collaboration, we focused on making small but meaningful adjustments that aligned better with the product experience.

Refined palette with a darker blue for focus and a vibrant yellow for energy and clarity.

We introduced a darker, more stable blue as the primary color to create contrast and a visual language that feels more serious, focused, and confident: qualities that teachers look for in a reliable work tool. For the secondary color, we replaced the orange with a bright, vibrant yellow, chosen for its ability to bring energy, clarity, and optimism, reinforcing the product's role in improving engagement and motivation in the classroom.


These refinements resulted in a palette that is both professional and uplifting, striking a balance between reliability and inspiration.

Iterating on the Hi-Fi Prototype.

After incorporating feedback from the initial design critique, we moved from wireframes directly into a high-fidelity prototype. With this version, we conducted a usability test, as well as another design critique, to validate the design decisions and uncover potential pain points.

The test provided insights into how professors interacted with the platform, highlighting both strengths and areas for improvement. Based on these findings, we iterated on the prototype.

In the following section, I’ll present a side-by-side comparison of the previous design vs. the updated design.

Simplifying Navigation with a Lesson-Centered Structure

  1. The Home dashboard did not provide clear value. It presented recordings and uploads grouped by day, but professors expressed that this structure didn't reflect how they actually teach. In practice, they may not have any recordings yet, or they might have multiple lessons within the same day, meaning many recordings and documents all mixed together. This made the daily grouping inconsistent and difficult to navigate. More importantly, during class professors need to act quickly: either start a new recording or access a specific previous lesson (for example, to review or evaluate a past session).


  1. The "Recordings" section's naming and iconography were misleading. These suggested that users would only find audio files, when in reality, each “recording” also contained related materials such as transcripts, summaries, and uploaded documents. This mismatch between labeling, iconography, and content confused users.


To address these issues, we restructured the information architecture and introduced a lesson-centered navigation model. The "Home" page was removed, and the former "Recordings" section was reframed as "Lessons". Each lesson now functions as a container (represented by a folder icon) that organizes all associated materials: recordings, PDFs, presentations, transcripts, and summaries.


This change reduced cognitive load and improved navigation by aligning the structure more closely with professors' mental models of how they organize their teaching materials.
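To illustrate the lesson-as-container idea, here is a minimal sketch of how the navigation data could be shaped; the type names and fields are assumptions made for clarity, not the product's actual data model:

```ts
// Illustrative sketch: a Lesson groups every artifact a professor needs,
// instead of scattering recordings and documents across daily groups.
// All names and fields are assumptions, not the real implementation.

type Material =
  | { kind: "recording"; url: string; durationSec: number }
  | { kind: "transcript"; text: string }
  | { kind: "summary"; text: string }
  | { kind: "document"; name: string; url: string }; // PDFs, slides, etc.

type Lesson = {
  id: string;
  title: string; // e.g. "Week 3: Thermodynamics"
  materials: Material[];
};
```

During class, a professor either starts a new recording inside a lesson or jumps straight to a previous lesson and everything it contains.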

Improving Input Controls for Quiz Settings

In the first prototype, parameters such as number of questions and response time were selected through dropdown menus. Both the design critique and usability test revealed that this approach was unintuitive and limited, forcing users to click through predefined options rather than quickly adjusting values.


To improve efficiency and flexibility, we replaced the dropdowns with sliders, which allow for more direct manipulation, clearer affordances, and a faster way to fine-tune settings. This change simplified the interaction while giving professors greater control.

Improving Visibility of the Transcript Panel

In the initial design, the transcript was accessible through a floating button that opened a sliding side panel. While this interaction pattern successfully kept the transcript non-intrusive and preserved focus on the quiz page, the usability test revealed a visibility issue: several professors overlooked the button and only discovered the feature when the team pointed it out. When asked about it, they said they actually found the transcript highly valuable.

To address this, we kept the side-panel interaction, since it's important to support multitasking without disrupting the main flow, but improved the button's discoverability by increasing its size and visual weight.

The (not so) final prototype.

next steps & ongoing improvements

The prototype is still a work in progress, as iterative improvement is a core part of the design process. After implementing the initial updates, we conducted another round of usability testing, which provided additional insights to refine the interface and user flows. To complement the qualitative feedback, we also employed the System Usability Scale (SUS), a Likert-scale questionnaire, to measure the platform's intuitiveness and simplicity and to gather quantitative data.


These were the results:

System Usability Scale (SUS)

Overall, the results indicate a high level of user satisfaction and confidence, demonstrating that the platform is largely intuitive and user-friendly. All teachers who participated in this round of usability tests said they could imagine incorporating the platform into their classrooms.
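For context, SUS converts ten 1-5 Likert responses into a single 0-100 score, with roughly 68 considered average. Here is a minimal sketch of the standard scoring formula; the sample responses are invented for illustration and are not our participants' data:

```ts
// Standard SUS scoring: ten items rated 1-5. Odd-numbered items are
// positively worded and contribute (rating - 1); even-numbered items are
// negatively worded and contribute (5 - rating). The 0-40 raw sum is
// scaled by 2.5 to give a 0-100 score.
function susScore(responses: number[]): number {
  if (responses.length !== 10) throw new Error("SUS expects 10 responses");
  const raw = responses.reduce(
    (sum, rating, i) => sum + (i % 2 === 0 ? rating - 1 : 5 - rating),
    0,
  );
  return raw * 2.5;
}

// Invented example (not our study data): a fairly positive response set.
console.log(susScore([5, 2, 4, 1, 5, 2, 4, 2, 4, 1])); // 85
```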

Following up.
  1. Quiz Icon Redefined
    During testing, we observed that the quiz icon lacked clarity and was not easily recognized by users. In heuristic terms, this falls under "match between system and the real world": the icon should better align with users' mental models. A next step will be to explore alternative iconography that communicates the concept of a quiz more intuitively.

  2. Transcript Search & Edit Functionality
    Professors highlighted the need for more control over transcripts, specifically the ability to search, edit, and potentially use search-and-replace within the text. This request relates to the heuristics "user control and freedom" and "flexibility and efficiency of use."


  3. Document Sharing & Collaboration (Future Release)
    Looking ahead, participants suggested adding capabilities to share transcripts, summaries, and quiz results with students or fellow professors. While it's not a main focus for the MVP, this feature could be a nice addition to a future version of the platform where collaboration would be key.


  4. Quiz Analytics (Future Release)
    Another forward-looking feature would be to integrate analytics on quiz results, giving users access to performance insights and trends. This would give professors more visibility into students' performance and learning curves.

PRODUCT DESIGNER
NIKI MERMELADA