ASHA Certification

Refining The Application Process

 
It’s the last impression. A bad dessert can ruin the meal.
— Anne McManus
 

Design Challenge

Certifications typically require not only full-time work experience, but also intensive education, a standardized exam, and, finally, the submission of the application itself. The process leading up to certification is rigidly systematic, yet the entire certification experience culminates in this final submission step. That step can shape applicants’ sentiment toward the profession as they enter their careers.

Client

The American Speech-Language-Hearing Association (ASHA) is the national professional, scientific, and credentialing association that aims to make effective communication accessible and achievable for all. Their mission is to promote the interests of and provide the highest quality services for professionals in audiology, speech-language pathology, and speech and hearing science, and to advocate for people with communication disabilities. 

Deliverables

  • Top findings from each recorded interview

  • Documentation of prioritized issues per page

User Need

The goal of this research study is to assess the usability of the existing Certification On-Line Application (COLA) in order to design a high-quality Assistant Certification Application based on the existing template. The Membership and Certification teams seek to understand what is working well and what is frustrating in the existing COLA application for the ASHA Certificate of Clinical Competence in Audiology (CCC-A) and Speech-Language Pathology (CCC-SLP).

Constraints/Role

  • Timeline: 5-week research sprint

  • Role: UX Researcher

Key Observations

  • Users found the tabular structure and the conventional traffic-light color system to be helpful and informative

  • The information architecture needs reorganizing to successfully convey the information required to complete the application

  • The submission flow needs restructuring so that it follows users’ path of least resistance

  • The Clinical Fellowship report form needs to be more inclusive of users whose work situations are not consistent


 

Research Plan

 

Methodology

Eight individuals were selected from a pool of 720 total survey responses (of which 470 were unique and complete) and invited to participate in a virtual interview session as they applied for or reviewed the ASHA certification application. Our research objectives were to answer three questions:

  • What parts of the application worked well and were well-defined?

  • What parts caused frustration or difficulty?

  • What are their next steps in the application flow?

Throughout each session, users were asked about their motivations for applying and their general feelings toward the process, and were encouraged to think aloud as they worked. Participants concluded the study by describing once more how they felt about the application experience after completing it.

Screener Survey

This was my first time using Ethnio to create and conduct a screener survey. Participants were asked to respond to the following:

  • Student status: not all applicants are current students; some might be applying from within the workforce

  • Certification type: at minimum, participants needed to be pursuing one or both of the ASHA certifications

  • Application progress: the ideal participant would be applying for certification from the very beginning in order to capture the full scope of usability

  • Willingness: participants had to be willing to walk through the application process with me for an hour of their time (incentivized)


♦️Selecting Participants

Participants were contacted, selected, and scheduled on a rolling basis. Given the ratio of applicants certifying in Speech-Language Pathology (SLP) versus Audiology, I selected 7 speech-language pathologists and 1 audiologist for the study. It’s worth noting that audiologists do not have a clinical fellowship requirement for their certification, so it was important to dedicate time to understanding how SLPs navigate that section of the application.

I also selected participants based on location (United States) in order to keep the cultural context consistent. In addition, due to time constraints, I decided to include individuals who had already started the application process or had submitted it within recent months.


 

Synthesizing Data

 

♦️Severity Rating

To organize the findings, I used the client-established rating system to categorize each usability issue or finding by its importance and its impact on the usability of the application site. Ratings are based both on the number of users who experienced a problem and on how severely the problem affected the user’s completion of the task at hand. Each documented user experience issue was assigned a priority level.

 
 

♦️Consolidation

Throughout the sessions, I asked users to complete the application to the best of their ability. I observed their movements to see which sections gave them the most ease or difficulty, and asked them to reflect on what went well, what needed improvement, and what their next steps were for each section.

I recorded each response, observation, and action in a spreadsheet organized by page and question. To identify the severity of findings, I created a frequency table for each part of the application mentioned across all user sessions, which helped me focus on the most problematic parts of the application process while also highlighting the successful aspects that guided users well.
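As a rough sketch of this consolidation step, the snippet below (Python) tallies how often each part of the application came up across sessions; the session notes and page names are invented for illustration and are not the actual study data.

    from collections import Counter

    # Hypothetical session notes: for each participant session, the parts of the
    # application the participant commented on (names are illustrative only).
    session_notes = [
        ["Background Information", "Clinical Fellowship Report", "Submission"],
        ["Clinical Fellowship Report", "Submission"],
        ["Background Information", "Clinical Fellowship Report"],
    ]

    # Tally how often each part of the application was mentioned across all sessions.
    mention_counts = Counter(part for notes in session_notes for part in notes)

    for part, count in mention_counts.most_common():
        print(f"{part}: mentioned {count} time(s) across {len(session_notes)} sessions")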


Users who strongly emphasized an issue were scored a 3, while users who did not encounter the issue were scored a 0. I used the final scores to calculate the number of users affected and each issue’s severity ranking. To do this, I took the sum of an issue’s points over the total number of points possible (8 × 3 = 24) to get a percentage that guided the severity ranking; a small sketch of this arithmetic follows the thresholds below.

  • Issues with a percent-affected rate of 30% or higher received a Severe ranking

  • Issues with a percentage between 25% and 29% received a Medium ranking

  • Issues with a percentage below 25% received a Low ranking
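A minimal sketch (Python) of the scoring arithmetic described above, assuming hypothetical 0–3 scores for each of the 8 participants per issue; the issue names and numbers are illustrative, not actual findings.

    MAX_SCORE_PER_USER = 3  # strong emphasis = 3, did not encounter the issue = 0

    # Hypothetical per-issue scores, one 0-3 value per participant (8 participants).
    issue_scores = {
        "Unclear submission instructions": [3, 3, 2, 0, 1, 0, 2, 0],
        "Clinical Fellowship report form": [2, 0, 1, 2, 1, 0, 1, 0],
        "Progress indicators": [0, 1, 0, 0, 0, 0, 0, 0],
    }

    def severity(scores):
        """Return (users affected, percent of possible points, ranking) for one issue."""
        total_possible = len(scores) * MAX_SCORE_PER_USER  # 8 * 3 = 24
        percent = 100 * sum(scores) / total_possible
        if percent >= 30:
            rank = "Severe"
        elif percent >= 25:
            rank = "Medium"
        else:
            rank = "Low"
        affected = sum(1 for s in scores if s > 0)
        return affected, percent, rank

    for issue, scores in issue_scores.items():
        affected, percent, rank = severity(scores)
        print(f"{issue}: {affected}/8 users affected, {percent:.0f}% -> {rank}")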

Using these rankings, I created visuals to help guide the Certification team as they moved forward with their future application designs.


 

Next Steps

 
  • Conduct more secondary research into application heuristics and best practices

  • Incorporate current findings into the Assistant Certification process

  • Implement changes to current certification application, and conduct A/B testing to determine differences in user experience

  • Expand pool of interviewees to include a wider range of participants


Things I Learned:

♦️Selecting Participants

Introduced to managing a rolling interview process and to balancing participant qualification against quantity during the selection process.

♦️Severity Ranking

Utilized a ranking system to identify and prioritize existing issues in order to effectively advocate for solutions moving forward.

♦️Consolidating Interview Data

Realized the effectiveness and efficiency of utilizing a frequency table (in a time crunch) to generalize micro-level observations and responses into macro-level statements.