1. Project Overview
Purpose:
This project aimed to optimize Test Drive, General Motors' internal user testing system for evaluating new products, apps, and technology services before company-wide deployment. The goals were to reduce manual work for Product Quality Leads (PQLs), increase tester participation, and streamline how Product Managers (PMs) and engineers manage the issues log. Additionally, the onboarding request form was integrated into the internal tool to improve workflow efficiency and reduce communication overhead.
Phases of Testing:
The Test Drive program uses a multi-phase approach to ensure comprehensive testing:
First Phase: Small group of IT testers.
Second Phase: Larger IT organization involvement.
Final Phase: Entire IT organization (~8,000 testers).
2. Role, Duration, and Tools
Role:
As the UX/UI Designer and Researcher, I led the design process, conducted user research, and collaborated closely with cross-functional teams to develop solutions that improved system efficiency and user experience. I also worked on creating an automated dashboard for real-time data and report consolidation.
Project Duration:
This project started in January and is ongoing, with continuous improvements and refinements based on user feedback.
Software & Tools Used:
Figma: For creating wireframes, prototypes, and high-fidelity designs.
Miro: For conducting collaborative design thinking workshops.
Google Forms: Utilized to collect and analyze user feedback.
3. Problem Statement
Core Problem:
1. Manual Process Inefficiencies: Manual processes bogged down PQLs, pushing test setup times to as long as five days.
2. Low Tester Participation: The complex workflows discouraged testers from participating.
3. Disjointed Processes: PMs and engineers had difficulty managing the issues log effectively.
4. Onboarding Form: The existing onboarding request form was separated from the internal tool, leading to inefficiencies and communication gaps.
4. Research and Discovery
Research Methods:
1. User Interviews & Surveys: Engaged with PQLs, testers, PMs, and engineers to understand the pain points.
2. Usability Testing: Observed interactions with the tool to identify inefficiencies.
Key Findings:
1. Manual Tasks: Significant manual work slowed test setup and made issue management inefficient.
2. Low Engagement: The complex workflows contributed to low tester participation.
3. Separate Onboarding Form: Keeping the onboarding form separate from the tool caused delays and extra email back-and-forth.
5. Design Process
Strategy:
I led design thinking workshops to ideate on improving the user experience and automating the Test Drive tool. The focus was on reducing manual tasks, consolidating tools, and integrating the onboarding form directly into the internal tool.
Prototyping & Testing:
Developed wireframes and high-fidelity prototypes in Figma and conducted usability tests to refine the design and improve workflows.
Key Enhancements:
Automation of Routine Tasks: Automated data collection, report generation, and task assignments.
Integrated Onboarding Form: Streamlined the process by integrating the onboarding form into the tool, reducing emails and manual steps.
Real-Time Dashboards: Created a dashboard for PMs and engineers to track issues, sort them, and categorize them by priority.
6. Design Improvements
Improved Visibility:
Heuristic: Visibility of System Status
Impact: By positioning active projects at the top and clearly marking phases, users can quickly identify their projects' status, reducing confusion and improving efficiency.
Error Prevention Enhancements:
Heuristic: Error Prevention
Impact: Implementing pop-up confirmations and preview options before sending communications reduced user errors and improved reliability.
Integrated Communication Tools:
Heuristics: Consistency and Standards; Recognition Rather Than Recall
Impact: Integrating communication tools into the admin page reduced context switching and allowed users to focus on tasks.
Organized Data Presentation:
Heuristic: Recognition Rather Than Recall
Impact: The tool improved navigation and reduced user cognitive load by using dropdowns and categorized data displays.
7. Anticipated Impact and Outcome
Projected Impact:
40% Reduction in Setup Time: Automation and tool consolidation are expected to cut setup time from five days to three.
20% Increase in Tester Participation: Simplified workflows and gamification features (such as leaderboards) will drive higher engagement.
Faster Issue Resolution: Improved issue management processes are projected to reduce the time to resolve issues by 25%.
Feedback:
Early user testing indicated positive responses to the streamlined workflows, with users expressing increased satisfaction.
8. Conclusion and Next Steps
Ongoing Improvements:
Continue refining the tool based on user feedback and pursue further automation opportunities for added efficiency.
Future Considerations:
Building on the current leaderboard, add more gamification features like rewards and challenges to boost tester engagement.
Lessons Learned:
Engaging stakeholders early in the design process and using iterative testing helped ensure alignment with user needs and business goals.