
Tripling student participation for the UBC AMS Academic Experience Survey

The University of British Columbia Alma Mater Society (AMS) has run the Academic Experience Survey (AES), a long-running survey of student perspectives, since 2012. The survey is crucial to the AMS's mission of advocating for and improving the lives of UBC students. In recent years, however, the AMS has faced challenges such as declining student response rates and an overly broad questionnaire. The UBC AMS contracted Kai Analytics in 2023 to carry out the survey with a focus on improving the questionnaire and restoring the response rate to pre-pandemic levels.

What We Did

We collaborated with UBC AMS to revise the questionnaire, disseminate the survey to students, and analyze the results across many demographics.



300% increase in student responses (compared to previous year's survey, administered by a different vendor).


Shortened the questionnaire to reduce survey fatigue while preserving comparability with past years' results for trend analysis.


Revised survey questions to prioritize factors most crucial to the AMS' mission.


Each project has been marked by professionalism and has exceeded our expectations. Mr. Chang is always available for questions or issues that arise, and is a thoughtful partner in our work.


3 Key Insights


Focused Questionnaire Review

We worked with the AMS through an iterative process to finalize the survey questions with maximum impact, clarity, and intelligibility. We carefully reviewed past questionnaires to ensure that any changes would allow for trend analyses. In addition to reducing the number of questions, we refocused the survey by adjusting the ratio of questions related to the overall university experience and questions related specifically to the academic experience.

This included: 

  • Discussing actions the AMS took in response to previous survey results and sharing highlights with the student population leading up to the survey launch. This helped build awareness and let students know their feedback mattered.  

  • Reconsidering questions that may elicit very similar answers (e.g., questions on belonging and questions on well-being).

  • Reviewing questions to see if they were within AMS’s mandates and scope of service offering.

Ultimately, our discussions with the AMS team focused on the areas where the leadership team could take action, that were within their capacity, and that aligned most closely with their mandate and UBC’s Strategic Plan.

Randomized Survey Elements to Reduce Fatigue

After finalizing the survey design, we programmed and tested the survey in our online survey tool, which allowed AMS team members to provide valuable feedback by commenting directly in the system.


In addition to randomizing the order of answer choices within questions, we also randomized the order in which the middle sections of the survey appeared. Response quality tends to be strongest at the start of a survey and declines as fatigue sets in, so varying the section order spreads that effect more evenly across topics rather than concentrating it on whichever sections happen to come last.
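The approach above can be sketched in a few lines of Python. This is an illustrative example only, not the actual AES implementation (which was built in an online survey tool); the section names, question key, and answer choices below are hypothetical. The first and last sections stay fixed while the middle sections and each question's answer choices are shuffled independently per respondent.

```python
import random

# Hypothetical section list for illustration; the real AES sections differ.
SECTIONS = ["intro", "academics", "wellbeing", "affordability", "advocacy", "demographics"]


def randomize_survey(sections, choices_by_question, seed=None):
    """Return one respondent's survey layout.

    The first and last sections are kept fixed, the middle sections are
    shuffled, and each question's answer choices are shuffled independently.
    Passing a seed makes a respondent's layout reproducible.
    """
    rng = random.Random(seed)

    # Shuffle only the middle sections, keeping intro and demographics anchored.
    middle = sections[1:-1]
    rng.shuffle(middle)
    ordered_sections = [sections[0]] + middle + [sections[-1]]

    # Shuffle answer choices within each question (rng.sample returns a new list).
    shuffled_choices = {
        question: rng.sample(choices, k=len(choices))
        for question, choices in choices_by_question.items()
    }
    return ordered_sections, shuffled_choices


# Example: generate one respondent's randomized layout.
order, choices = randomize_survey(
    SECTIONS,
    {"q1_satisfaction": ["Very satisfied", "Satisfied", "Neutral", "Dissatisfied"]},
    seed=42,
)
```

In practice, scale questions with an inherent order (e.g., "Very satisfied" through "Dissatisfied") would typically be excluded from choice shuffling; only unordered categorical options benefit from it.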


Communications and Timing

It can be challenging to catch the attention of students, who often deal with busy schedules and a barrage of events and learning tasks. The AMS team identified February to April as the best time to conduct the survey, as afterwards students would be too focused on exams.


We met this time constraint despite unexpected challenges by proposing a timeline with plenty of leeway within the suggested window and then adjusting it to remain flexible to the AMS’s needs. During data collection, we kept the AMS team updated on response numbers and, when responses lagged, suggested additional communications to boost them.
