CIBC EDA Sharepoint

Migrating accessibility content & validating it with users

TYPE

Website Design

TEAM

2 Designers & 2 Accessibility Consultants

TIMELINE

Sept 2022 - Dec 2022

STAGE

Shipped

OVERVIEW

I worked on the Enterprise Digital Accessibility (EDA) SharePoint Hub, a centralized platform designed to help internal teams implement accessibility across design, development, QA, and content.

This project followed an initial redesign phase. After improvements were implemented based on prior findings, I led a new round of usability testing to evaluate how well the updated system performed in practice.

Due to NDA constraints, specific visuals and internal details are limited.

Role

  • Implemented design updates and led end-to-end usability testing

  • Designed research plan, moderated sessions, and structured tasks

  • Delivered actionable insights to inform next steps

The Challenge

The system had already been improved, but was it actually working?

Accessibility resources were centralized into one hub, yet uncertainty remained:

  • Could users navigate efficiently?

  • Did the structure match their expectations?

The Problem

A well-organized system doesn't guarantee usability

Even with improved content and structure:

  • Users could still get lost in navigation

  • Labels could be misinterpreted

  • Finding information could require trial and error

Opportunity

Instead of assuming the redesign was successful, we had a chance to validate it

Rather than assume the redesign worked, we set out to validate how the system actually performed in real use. By observing how users navigated, interpreted labels, and searched for information, we could uncover where the experience broke down and where it held up, giving us clear direction for refinement grounded in real behaviour, not assumptions.

Process

Migration Testing

We began by migrating the Accessibility Hub from Confluence to SharePoint. Content was restructured, key components (like sliders and accordions) were prioritized, and accessibility improvements were implemented across pages.

This phase focused on improving how information was organized and presented.

Once these changes were in place, I led usability testing with 16 users over Microsoft Teams to evaluate how the updated system performed in real use. Rather than running tests alongside the migration, this phase was intentionally conducted afterward to answer a critical question:

Did these improvements actually make the experience more intuitive?

What this looked like:

Stages

Migrate → Edit → Implement → Prepare & Research → Execute → Analyze

Research

Measuring through completion & behaviour

To evaluate the effectiveness of the updated Accessibility Hub, I designed a structured usability testing plan focused on both task success and user behaviour.

The study aimed to understand:

  • How easily users could navigate between pages

  • Whether page categorization and naming aligned with user expectations

  • How quickly users could scan and locate relevant information

Participants:

I recruited 16 participants across key user groups:

  • QA

  • Designers

  • Developers

  • Content

This ensured the system was evaluated from multiple perspectives, based on how each role interacts with accessibility resources.

Methodology:

I conducted 30-minute moderated usability sessions using Microsoft Teams.

Each session included:

  • A short pre-survey to understand user background and familiarity with accessibility

  • Task-based scenarios tailored to the participant's role

  • Screen and audio recording for observation

  • Post-test reflection questions to capture feedback and sentiment

I paid close attention to moments of hesitation, what users expected to find, and how they interpreted the system as they moved through it. Each session was treated as a conversation rather than a strict test, allowing deeper insight into their mental models.



Insights & Impact

"Once I kind of understood where to go…"

The study clarified where the experience was working and where it wasn't. Instead of broad assumptions, we now had clear friction points, validated strengths, and direction for future improvements.

Next Steps:

Based on the findings, the next steps would focus on refining the experience further:

  • Clarifying ambiguous terminology to better align with user mental models

  • Improving navigation predictability to reduce trial-and-error behaviour

  • Continuing iterative testing to validate future improvements

Reflection

It's more than just meeting standards

This project was my first time working closely with an accessibility team, and it was a really valuable learning experience. I had the opportunity to collaborate with accessibility consultants who supported me throughout the process, helping me deepen my understanding of how accessibility is applied in real-world systems.

Beyond usability testing, I was also able to contribute to other research and design activities, including persona creation, journey mapping, and additional testing initiatives. These experiences gave me a broader perspective on how accessibility fits into the larger product and user experience.

This project reinforced that accessibility isn't just about meeting standards; it's about making systems intuitive and usable for everyone.

Due to NDA constraints, not all aspects of the project and my contributions are shown here, but I'd be happy to speak more about my experience and process.
