
Ela: Delivering Value to Learners in Weeks
I designed the Learning Assistant to help QA give learners the extra support they need to hit their goals faster and stay engaged. With just a couple of months to get it ready, I focused on making it fit smoothly into the platform while laying the groundwork for future upgrades.
My role on the project
From the start, I focused on shaping the Learning Assistant to fit naturally into our platform, making sure it felt right within the overall experience and followed our design system. I wanted it to feel intuitive, helpful, and genuinely useful for learners. I worked closely with engineers, AI specialists, and stakeholders to strike the right balance between technical limitations and user needs, so everything felt smooth and seamless.
“AI should power the whole platform.” OK, where do we start?
In 2024, QA introduced AI-assisted labs but lacked AI features that spanned the entire learning experience. We inherited a simple yet functional prototype developed a few months earlier as part of the Lab Assistant, which had already undergone extensive testing.
While technically functional, the prototype’s capabilities were limited and its user experience was barebones: essentially a chat interface with no structured guidance. With a very tight deadline (around two months), rather than discarding this work we saw an opportunity to build on it, leveraging its technical foundations while significantly improving its usability and integration. The challenge was to adapt it to our platform and user needs, stay consistent with the Lab Assistant already released in our Labs environment, and make the most of the existing work without unnecessary redevelopment, all while ensuring a smooth user experience.
Don't have time for this?
Jump straight to the final solution
What makes a great learning assistant?
We did the research.
To ensure our Learning Assistant was positioned effectively, I analyzed how competitors were integrating similar assistants into their platforms.
I also spent some time on secondary research, looking into what worked and what didn't for products like Duolingo and Khan Academy, and into studies conducted at Harvard and Vilnius University.
This research provided valuable insights into industry trends and best practices. Key findings showed that the most successful learning assistants provided contextual support, reduced cognitive load, and encouraged active engagement through structured guidance rather than open-ended chat interactions.






Making the most of what was available: evaluating, improving, and integrating an existing prototype.
Given the time constraints, we relied on existing testing data from the prototype rather than conducting user research from scratch.
OUR PROCESS IN A NUTSHELL:
Evaluating Strengths and Weaknesses:
We conducted an in-depth analysis of the prototype’s technical capabilities and limitations, examined testing data from early users, and mapped out common frustrations.
Defining Integration Opportunities:
By analyzing how the assistant fit into our platform’s existing learning flow, we identified opportunities to improve its effectiveness. This included clearly defining the way users interacted with the assistant, ensuring responses aligned with platform-specific content, and refining user journeys to make the assistant feel like a natural extension of the learning experience.
This careful evaluation helped us approach the integration process with clarity, making sure that improvements were impactful and aligned with both technical feasibility and user expectations.
Anticipating user needs: how Ela’s smart actions can make learning easier
To ensure the Learning Assistant provided real value, we first evaluated common user scenarios and identified key moments where AI-driven support could enhance the experience. For example, when a user completes a lesson, they might want to test their knowledge to reinforce learning. By anticipating these needs, we designed Smart Actions: predefined prompts that provided clear starting points for interactions. These actions helped users navigate the assistant’s capabilities without requiring prior AI knowledge, ensuring a smoother and more engaging learning experience.
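For the technically curious, a Smart Action boils down to a labelled, predefined prompt plus a rule for when to surface it. Here's a minimal sketch of that idea; the names and shapes are illustrative, not our production code:

```ts
// Illustrative sketch only: a Smart Action pairs a visible label with a
// predefined prompt and a rule for when it should be offered.
interface SmartAction {
  label: string; // what the learner sees on the action chip
  buildPrompt: (lessonContent: string) => string;
  showWhen: (event: "lesson_opened" | "lesson_completed") => boolean;
}

const smartActions: SmartAction[] = [
  {
    label: "Summarize",
    buildPrompt: (lesson) => `Summarize the key takeaways of:\n${lesson}`,
    showWhen: () => true, // useful at any point in a lesson
  },
  {
    label: "Test my knowledge",
    buildPrompt: (lesson) => `Write multiple-choice questions about:\n${lesson}`,
    showWhen: (event) => event === "lesson_completed",
  },
];
```

Keeping actions declarative like this also makes it cheap to add new ones as we learn what users need.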



Summarizing content for better retention
Understanding and retaining large amounts of information can be challenging, so we introduced the Summarize action to help users quickly grasp key takeaways from a lesson or article. By distilling content into its most essential points while removing unnecessary details, this feature provides a structured way to reinforce learning efficiently. For example, if a learner takes a break and returns to their course after a few days, they can use Summarize to quickly refresh their memory and get back on track without having to re-read the entire lesson.
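Under the hood, Summarize is essentially a constrained prompt. A rough illustration of how such a prompt might be assembled; the function and wording here are mine, for illustration only:

```ts
// Hypothetical sketch of how the Summarize action could assemble its prompt,
// encoding the "essential points only" constraint described above.
function buildSummarizePrompt(lessonTitle: string, lessonBody: string): string {
  return [
    `Summarize the lesson "${lessonTitle}" for a learner returning after a break.`,
    "Keep only the essential points, as a short bulleted list.",
    "Leave out side notes and examples unless they are needed to understand a key concept.",
    "",
    lessonBody,
  ].join("\n");
}
```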



Helping learners test their knowledge along the way
One of the most impactful Smart Actions is the Test feature. When a user finishes a lesson, they have the option to immediately quiz themselves on the core concepts covered. This helps reinforce retention, allowing learners to assess their understanding and fill in any gaps before moving forward. The AI generates multiple-choice questions that provide instant feedback, turning passive learning into an interactive experience.
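A minimal sketch of the kind of data the Test action could exchange with the UI to deliver that instant feedback; the field names are assumptions, not the real contract:

```ts
// Assumed shape of an AI-generated multiple-choice question.
interface QuizQuestion {
  prompt: string;
  choices: string[];
  correctIndex: number; // index into `choices`
  explanation: string;  // shown as instant feedback after answering
}

// Instant feedback: compare the learner's pick against the generated answer.
function gradeAnswer(q: QuizQuestion, pickedIndex: number): string {
  return pickedIndex === q.correctIndex
    ? `Correct! ${q.explanation}`
    : `Not quite. ${q.explanation}`;
}
```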



Breaking down concepts with “Explain”
For users struggling with complex topics, the Explain action offers clear, structured guidance. Instead of getting distracted searching for answers on Google or relying on ChatGPT, learners can request a simplified breakdown of a specific concept. They also have the option to receive explanations through analogies or step-by-step logic, making even the most technical subjects more accessible and easier to understand.
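The choice between an analogy and step-by-step logic could be as simple as a parameter on the prompt. An illustrative sketch, with hypothetical names:

```ts
// Hypothetical: the Explain action's style option as a prompt parameter.
type ExplainStyle = "plain" | "analogy" | "step-by-step";

function buildExplainPrompt(concept: string, style: ExplainStyle): string {
  const instructions: Record<ExplainStyle, string> = {
    plain: "Explain it in simple language, avoiding jargon.",
    analogy: "Explain it through an everyday analogy.",
    "step-by-step": "Explain it as a numbered sequence of logical steps.",
  };
  return `A learner is stuck on "${concept}". ${instructions[style]}`;
}
```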
How might we make the Learning Assistant as easy to find and use as possible?
During the design phase, we explored multiple approaches for how users would discover and interact with the Learning Assistant, considering both technical limitations and the overall learning experience.



We selected the most flexible and future-proof option.
We ultimately went with the solution that was most aligned with the existing Lab Assistant and that we considered the most flexible for the future.
We also included an Ela “shortcut” so our users can select a word or concept anywhere on the page and get an explanation in seconds.
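Conceptually, the shortcut just listens for a text selection and offers Ela nearby. A browser-side sketch of the pattern; the showElaTooltip helper is hypothetical:

```ts
// Hypothetical helper that would render an "Explain with Ela" affordance
// near the current selection.
declare function showElaTooltip(selectedText: string): void;

document.addEventListener("mouseup", () => {
  const selection = window.getSelection()?.toString().trim();
  // Only trigger for short selections that look like a word or phrase.
  if (selection && selection.length > 1 && selection.length < 80) {
    showElaTooltip(selection);
  }
});
```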
Design system: I created a new colour palette
Our AI Lab Assistant was facing discoverability issues, likely in part because its CTA used the same colour as every other primary action on the platform. I experimented with different colours and developed a new palette, starting from a single shade in the new QA branding and extending it across both the Learning and Labs assistants for visual consistency. After evaluating several options, we chose a bold, highly recognisable yellow. This ensured maximum visibility without clashing with existing content types, making interactions with the assistant impossible to miss, in the best way possible.
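In code, a palette like this translates into a small set of design tokens shared by both assistants. A hypothetical example; the hex values and token names are placeholders, not the actual QA brand colours:

```ts
// Placeholder tokens for the assistant palette, covering both themes.
// The hex values here are illustrative, not the real brand colours.
export const elaTokens = {
  light: {
    ctaBackground: "#FFD400", // bold, recognisable yellow for the Ela CTA
    ctaText: "#1A1A1A",
    surface: "#FFFFFF",
  },
  dark: {
    ctaBackground: "#FFD400", // the accent stays constant across themes
    ctaText: "#1A1A1A",
    surface: "#1E1E1E",
  },
} as const;
```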






Ela’s design in detail
As part of the process, I created a new component for our Design System in Figma, in both dark and light modes. This built upon the version released in Labs a few months earlier, enhancing its UI, which had previously been quite basic.






Let's do a quick retro!
WENT WELL
Leveraging existing work is a powerful time-saver. By building on a well-tested prototype, we likely saved two months of research and testing, allowing us to focus on adaptation and refinement instead.
Designing for seamless integration is critical: ensuring consistency with our platform and previously released AI tools was essential to creating a unified experience. Our data showed that none of our users felt lost when using the assistant, and they rated the experience highly.
Collaboration is key: close alignment with the engineering, AI, and Labs teams helped us efficiently adapt the prototype without compromising the quality of the experience.
TO IMPROVE
Building a feature is only part of the challenge—successful integration and clear user communication are just as crucial. We faced discoverability issues, with adoption lower than expected after the first month. This may be due to unclear copy or lack of visual distinction on some pages. We’re actively investigating and exploring solutions. Without clear signposting, even valuable features can go unnoticed, reinforcing the need to ensure users can easily recognize, understand, and adopt new tools.
Some of the future enhancements we're exploring
Looking ahead, we are exploring ways to make Ela more integrated into the experience by adding proactive triggers to exam and lab result pages, providing targeted support based on user performance.
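One way such a trigger could work is to inspect the result and only offer help when it's likely wanted. A rough sketch; the thresholds and field names are assumptions:

```ts
// Assumed shape of a result summary available on the exam/lab result page.
interface ExamResult {
  score: number;          // 0-100
  weakestTopic?: string;  // topic with the lowest sub-score, if known
}

// Decide whether Ela should proactively offer support, and with what message.
function elaSuggestionFor(result: ExamResult): string | null {
  if (result.score >= 80) return null; // strong pass: stay out of the way
  return result.weakestTopic
    ? `Want a quick refresher on ${result.weakestTopic}?`
    : "Want to review the topics you found hardest?";
}
```

Staying quiet on strong passes would keep the assistant helpful rather than intrusive.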
Additionally, we aim to enhance engagement by introducing interactive elements such as flashcards, games, and lab challenges, transforming the assistant into a more dynamic and hands-on learning companion. Below are some mid-fidelity mockups illustrating a few of the ideas we're currently exploring.


