Innovating with a Deep Understanding of the Systems We Build On
Overview
As Capital One grew its design organization, many designers were assigned to native app work without deep platform knowledge. Android and iPad experiences were suffering, and customer-facing quality declined. I proposed and created WNKF, a weekly 30-minute training series to close knowledge gaps, build confidence, and standardize native design quality across iOS and Android. We designed the series using a “Goldilocks” approach, making complex native topics feel approachable, useful, and just right for a diverse audience of 700+ designers.
Problem
Capital One’s consumer servicing apps had lost ground in quality, particularly across Android and tablet experiences:
- Many designers lacked native app experience — especially Android.
- Features were breaking in production (e.g., landscape issues, missed accessibility targets).
- Teams were designing mobile apps like mobile web.
- Designers felt uneasy asking questions in reviews or critiques.
- Design reviews consistently flagged the same UX/UI errors across lines of business (LOBs).
Solution
I worked with VP-level leadership to carve out a new quality-focused role. From there:
- Created WNKF, a recurring training series focused on platform fundamentals and mobile design best practices.
- Collaborated with a content strategist and junior designer to co-author the curriculum.
- Built a design checklist aligned with Apple HIG and Material Design guidelines.
- Focused early sessions on accessibility, motion, grid structure, and platform behavior differences.
- Tracked attendance and feedback, and cross-referenced them with design QA results to target training gaps.
- Designed every class to be inclusive and “just right” in complexity using the Goldilocks Test (our narrative model for accessible teaching).
Collaboration
- Partnered across Experience Design, Product, and Tech to ensure relevance and adoption.
- Reported directly to the VP of Experience Quality.
- Worked with design reviewers and subject matter experts to reinforce training through QA processes.
- Used insights from design review trends to refine training content in real time.
Measurements
So, how do you measure the success or failure of something like this? Well, it wasn’t easy, and I had to look at several metrics.
Our attendees
- I took detailed attendance. Using Zoom’s reporting tools, I could track everyone who joined, how long they stayed, and when they dropped off.
- In a spreadsheet (see the sketch after this list), I tracked:
  - each person who attended a session,
  - where they aligned (Design, Product, or Engineering, and their line of business),
  - how long they stayed on the call,
  - who accessed the recording of the session, which was posted immediately after the call.
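As an illustration only, here’s a minimal sketch of how that spreadsheet could be assembled from a Zoom participant export. The column names (`name`, `join_time`, `leave_time`), the timestamp format, and the `alignment_lookup` mapping are assumptions for the example, not the exact fields or tools we used.

```python
import csv
from datetime import datetime

# Assumed timestamp format for the export; real Zoom reports may differ.
TIME_FMT = "%m/%d/%Y %H:%M:%S"

def load_attendance(report_path, alignment_lookup):
    """Aggregate minutes attended per person from a participant report CSV.

    alignment_lookup maps a name to their org alignment and line of business,
    e.g. {"Jane Doe": "Design / Card"} -- a hypothetical structure.
    """
    totals = {}
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            joined = datetime.strptime(row["join_time"], TIME_FMT)
            left = datetime.strptime(row["leave_time"], TIME_FMT)
            name = row["name"].strip()
            # People who drop and rejoin appear as multiple rows; sum their time.
            totals[name] = totals.get(name, 0) + (left - joined).total_seconds() / 60

    # Shape each attendee into the fields tracked in the spreadsheet:
    # who attended, where they align, and how long they stayed on the call.
    return [
        {
            "attendee": name,
            "alignment": alignment_lookup.get(name, "Unknown"),
            "minutes_attended": round(minutes),
        }
        for name, minutes in sorted(totals.items())
    ]
```

A similar pass over the posted recording’s viewer list covered the last column: who accessed the session after the fact.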
From weekly design reviews
Design was required to take any new flow through a review process. There was a well-documented checklist that product and design could use as a guide to ensure the work aligned with the design system and its respective platform.
- I meticulously tracked design review results and feedback, documenting the specific line items that didn’t pass review.
- For each item, I tracked the project, line of business, and designer.
- During training sessions, I would sometimes pivot and address problem areas that arose in reviews.
- Over time, I began to cross-reference and compare the collected data (a minimal sketch of that comparison follows this list).
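The cross-referencing step can be sketched as a simple join of the two datasets: attendance per line of business against review failures per line of business. The keys `lob`, `minutes_attended`, and `failed_items` are hypothetical names for illustration; the actual tracking lived in spreadsheets.

```python
from collections import defaultdict

def summarize_by_lob(attendance_rows, review_rows):
    """Compare average WNKF attendance with design review failures per line of business.

    attendance_rows: dicts with hypothetical keys "lob" and "minutes_attended"
    review_rows:     dicts with hypothetical keys "lob" and "failed_items"
    """
    minutes = defaultdict(list)
    for row in attendance_rows:
        minutes[row["lob"]].append(row["minutes_attended"])

    failures = defaultdict(int)
    for row in review_rows:
        failures[row["lob"]] += row["failed_items"]

    # One summary row per line of business, ready to compare side by side.
    summary = {}
    for lob in set(minutes) | set(failures):
        attended = minutes[lob]  # defaultdict: empty list if no attendees from this LOB
        summary[lob] = {
            "avg_minutes_attended": round(sum(attended) / len(attended), 1) if attended else 0,
            "review_failures": failures[lob],
        }
    return summary
```

A comparison along these lines is what sits behind the result below: review failures dropped in LOBs with high WNKF participation.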
Results
- 20+ sessions delivered, including guest speakers and deep dives on platform-specific patterns
- Attendance grew to 100–200 designers per session
- Peer feedback consistently praised clarity, tone, and usefulness
- Design review failures dropped in LOBs with high WNKF participation
- Designers began proactively reaching out for feedback before QA
- VPs began requiring attendance for designers in struggling areas
- Reduced downstream rework and engineering tech debt caused by design inconsistencies
- Content remains in use through an accessible internal video library
“You’ve helped me SO so much and all the wonderful knowledge aside, I just appreciate how genuine you are as a person and colleague.”
~ New XD team member, via Slack
Retrospective
WNKF began as a response to a design quality gap and became an embedded part of how Capital One upskills its product teams. It fostered a culture of curiosity, inclusion,
and shared accountability. By prioritizing accessibility in both design and education, we raised the bar for experience quality across the organization.