Overview
In this pivotal design project, I led a pilot initiative to introduce a distinctive new capability within the Byju's classes product. Taking on a strategic role, I navigated a multifaceted project, balancing the digital and physical sides of the user experience while addressing logistical challenges. A crucial aspect involved the ongoing debate around data collection: weighing its potential to improve the technology against the imperative of delivering an optimized user experience. The project demanded a holistic approach, integrating these elements into a comprehensive solution.
My Role
As the lead designer for the project, I spearheaded the UX strategy, leading a team of four designers in defining the user experience vision and creating UX flows/journeys. Collaborating closely with stakeholders, I balanced technical constraints with user needs for a successful outcome. Additionally, I provided guidance on visual design to align with Byju's design language and brand. Throughout the project, I mentored junior designers, fostering their skills. My advocacy for user-centered design consistently prioritized student needs, ensuring a seamless and intuitive user experience.
Requirement
Develop a robust system within the Byju's live classes product that enables the seamless administration of written tests online, mirroring the format of traditional quarterly exams in schools. The design should not only facilitate the smooth execution of these assessments but also include features for result analysis. The system must provide comprehensive support for evaluating and interpreting test outcomes, ensuring a valuable tool for both educators and learners in the continuous improvement of the learning experience.
Background
Byju's aimed to leverage the Assess-ed platform for faster exam evaluation through question-wise analysis. However, this required constrained answer sheets, creating a logistical hurdle for the vast student base; manually defining answer areas was the other option. To prioritize a smooth student experience, we debated whether students should mark their answers themselves or whether the system could identify them automatically. Ultimately, we opted for the seamless student experience, and the tutoring team agreed that forgoing question-wise evaluation was a worthwhile compromise. In this decision, I actively advocated for the user-friendly approach.
Design strategy
User Research
User interviews were conducted with a small group of students from classes 7-9, our target age group, to understand how they prepare for and approach traditional written exams.
Insights:
• Minimizing stress and anxiety was paramount for students transitioning to a new online exam format.
• Accommodating a grace period at the end of the exam emerged as a critical need to account for potential technical hiccups or last-minute submissions.
• Enabling intuitive navigation through the digital question paper was essential, considering many students were accustomed to physical question paper formats.
• Providing quick access to answer keys immediately after the exam would allow students to self-evaluate their performance and identify areas for improvement.
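The grace-period insight translates directly into the exam's timer logic. The sketch below is a minimal illustration of how such a submission window could work; all names (`examEndsAt`, `gracePeriodMs`, `submissionWindow`) are hypothetical and not taken from the actual product.

```typescript
// Minimal sketch of exam-timer logic with a grace period for
// technical hiccups or last-minute submissions. Illustrative only,
// not the actual Byju's implementation.

type SubmissionWindow = "open" | "grace" | "closed";

interface ExamTimer {
  examEndsAt: number;    // epoch ms when the official exam time ends
  gracePeriodMs: number; // extra time allowed for uploads after the end
}

// Classify a submission attempt relative to the exam window.
function submissionWindow(timer: ExamTimer, now: number): SubmissionWindow {
  if (now <= timer.examEndsAt) return "open";
  if (now <= timer.examEndsAt + timer.gracePeriodMs) return "grace";
  return "closed";
}

// Milliseconds remaining before submissions close entirely.
function msUntilClose(timer: ExamTimer, now: number): number {
  return Math.max(0, timer.examEndsAt + timer.gracePeriodMs - now);
}
```

During the "grace" state the UI could switch to a clearly labeled countdown, so students know they are past the official end time but can still upload.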
Design execution
After formulating the design strategy and aligning with cross-functional stakeholders, including the product management team, business team, and tutoring team, the execution phase commenced with the following steps:
Solution: Key features & design rationale
Pilot Testing & Refinements
Initial pilot testing with around 500 students revealed key areas for improvement. We enhanced the user experience by:
Prioritizing Upload Visibility: Implemented a sticky banner with the upload button to ensure it remains easily accessible.
Optimizing File Selection: Increased the clickable area for the "Select Files" button to improve ease of use.
Emphasizing Instructions: Redesigned the flow with clear diagrams to ensure students thoroughly read the initial test instructions.
Outcome & Impact
Pioneering Solution: Designed and launched Byju's inaugural scalable platform for subjective exams, paving the way for a new, more comprehensive form of digital assessment within their ecosystem.
Proven Student Engagement: Achieved an impressive 70% engagement rate among 150,000 users. This demonstrates the platform's user-friendliness and its success in replicating the structure and rigor of traditional exams in a digital environment.
User journey through key screens
Following are some of the key screens and interactions a student encounters, from accessing their exam to analyzing their results.
Refinements after pilot test:
Learnings from the Project
Prioritizing User-Centered Design: Witnessing the successful 70% engagement rate reinforced the importance of making user-centric decisions, even when faced with technical constraints or differing stakeholder perspectives.
Collaborative Cross-Functional Alignment: Achieving buy-in and alignment across teams like product management, business, and tutoring was crucial. Clear communication and finding common ground paved the way for a cohesive solution.
Iterative Refinement through Testing: Conducting usability testing and incorporating user feedback at multiple stages allowed us to continuously enhance the experience, leading to key improvements like the sticky upload banner and clear instructions.
Balancing Innovation and Familiarity: Introducing a pioneering digital platform while retaining the familiarity of traditional exams was a delicate balance. This project highlighted the value of leveraging user research to bridge the gap between new technologies and user expectations seamlessly.