Schools everywhere have begun to rely heavily on collecting feedback through student satisfaction surveys and other questionnaires. Unfortunately, practices for acting on the data from these surveys have not kept pace with this rapid adoption. Australian researchers Ilana Finefter-Rosenbluh, Melissa Barnes, and Tracii Ryan studied the challenges of translating survey data into change at the level of individual instructors in the classroom. They present the highlights in this article.
For their study, Finefter-Rosenbluh et al. administered two Student Perception Surveys. The first collected student feedback on their instructors, the learning environment, and the improvements they wished to see; the second asked whether students had noticed any changes from their instructors in response to the first survey.
Alarmingly, the students did not perceive any changes based on their feedback. The interviews and focus groups that followed point to some key answers.
Data Everywhere, But No Interpretation
Instructors, the researchers found, feel that they are largely on their own when it comes to interpreting survey results. They are often given the raw qualitative data from students’ comments without any additional insights. Sometimes, the students’ answers are contradictory, such as one student requesting more assignments and another stating that there were too many assignments. Without some sort of interpretation, this kind of raw data is nearly impossible for instructors to act on.
Instructors also described personal biases, such as a tendency to assume that critical comments came from students with lower grades who had put in little effort, which made the instructors less likely to take those responses seriously.
Instructors reported facing more challenges when the subject of a comment was not something they could control, such as elements of the school curriculum. They felt it was unfair that complaints about course content were treated as a reflection of their teaching quality rather than of the effectiveness of school policies. This leads into another problem: some teachers felt the surveys were an affront to their ability to teach and to determine their own methods. Resistance to change is not an unusual reaction from employees, but institutions can reduce it by involving instructors in the change process and acknowledging their expertise, so that changes become part of a healthy working relationship rather than a source of friction. This in turn reduces negative reactions to the survey and its results.
On the other side of the study, students were pessimistic about their feedback's reception, citing incidents where they had offered opinions to their instructors and been ignored. Many students recognized that some instructors wanted to respond to the survey feedback but, lacking support, did not know where to start. The problem of handling qualitative feedback had come full circle to erode the students' trust.
Key Insights
The researchers acknowledged that further studies are needed to fully understand these results and how institutions can take them into account moving forward. This study was done in a very specific geographic location, with a particular age group; however, it still highlights some important ideas in the field of qualitative analysis:
o Study Design and Purpose: Why is the survey being taken? If it is to seek insight into an instructor's methods rather than to drive policy change, institutions need to account for the aspects of teaching that instructors cannot control. Institutions should also weigh legal precedent on how student comments may be used: a recent Canadian case ruled against Ryerson University's use of student course evaluations as a measure of faculty teaching effectiveness, especially for promotion and tenure decisions.
o Existing Attitudes and Assumptions: Consider existing negative attitudes teachers and students have about the survey beforehand and adjust the questions accordingly. This also includes presenting the survey in a manner that will reduce feelings of defensiveness and negativity, like being clear about the purpose and importance of the survey.
o Follow-Up: If a survey is going to change anything, instructors need support from their institution to enact those changes. Even if the survey focused on individual teaching styles, instructors still need resources to respond to student requests, and the survey may have wider implications that the institution should consider at a policy level.
o Effective Qualitative Data Analysis: Some assume that qualitative data, unlike the numbers of quantitative data, is self-explanatory and requires no special care. Nothing could be further from the truth. Even when answers to open-ended questions are succinct and detailed, we all read them through our own biases, and working through pages of responses is time-consuming, which leads to skimming and to being overwhelmed by the sheer volume of feedback. Natural Language Processing (NLP), in contrast, aggregates and categorizes written responses in a way that highlights key ideas, making the information more digestible and reducing misinterpretation.
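To make the idea of aggregating and categorizing responses concrete, here is a minimal sketch of keyword-based theme tagging in Python. The themes and keyword lists are illustrative assumptions, not taken from the study; production NLP pipelines would use more sophisticated techniques such as topic modeling or text classification, but the shape of the output is similar: a count of how often each theme appears across all comments.

```python
from collections import Counter

# Illustrative themes and trigger words (assumptions for this
# sketch only, not from the study or any specific tool).
THEMES = {
    "workload": ["assignments", "homework", "workload"],
    "pacing": ["fast", "slow", "pace"],
    "clarity": ["confusing", "unclear", "explain"],
}

def categorize(comment: str) -> list[str]:
    """Return every theme whose keywords appear in the comment."""
    text = comment.lower()
    return [theme for theme, words in THEMES.items()
            if any(word in text for word in words)]

def summarize(comments: list[str]) -> Counter:
    """Count how many comments touch each theme."""
    tally = Counter()
    for comment in comments:
        tally.update(categorize(comment))
    return tally

comments = [
    "Too many assignments each week",
    "I wish we had more assignments to practice",
    "Lectures move too fast and can be confusing",
]
print(summarize(comments))
# → Counter({'workload': 2, 'pacing': 1, 'clarity': 1})
```

Note how the aggregate view surfaces the contradiction mentioned earlier (two "workload" comments pulling in opposite directions) as a theme worth investigating, rather than leaving it buried in raw text.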
Important Implications for Your Institution
There are some broader implications to this study that all educational institutions should consider.
o In this study, students felt their feedback didn't matter because of the lack of response. In our experience, when an institution communicates that it is acting on or responding to open-ended responses, students realize their voices are heard and that the survey was worth their time. For example, after one institution sent out an email updating students on which concerns were being addressed and how, the response rate rose from 46% to 59%, and the overall student satisfaction rate reached 91% in the second survey.
o Collecting open-ended responses but then failing to act on them may create liabilities for your institution. Qualitative analysis techniques can detect reports from students who, unfortunately, face discrimination, hate crimes, or even potential sexual violence on campus. A thorough qualitative analysis is not only good research but a matter of safety.
We hope that this study and others like it will lead institutions to consider these key aspects of conducting surveys in the future, resulting in more efficient and constructive feedback and a better path forward for education. Although there may be weaknesses to conducting surveys, we believe that with care and attention they can reveal a great deal and be the truly useful tools they are intended to be.
If you found this article useful, you might enjoy our newsletter. It's a bi-monthly email that keeps you up to date on what we're doing and shares articles on topics we find interesting.
If you want to dive deeper, sign up for a free, 30-minute consultation to see what Kai Analytics can do for you.