Kingsley Foster takes a closer look at the problems with online surveys like the NSS, amid worries that the NSS will be used to assess teaching quality and to justify university fee increases.
Every day when I was working in London, I picked up the Metro to find yet another story claiming that 50% of workers feel this, 60% of people hate that, or 80% of people are voting for something else. Attaching survey results to a story creates the illusion that it is backed by legitimate research, and provides a neat sound bite. Marketing and PR companies have fully embraced the survey trend, and what better way to get lots of results fast than internet surveys?
The National Student Survey (NSS) is an online survey that all final-year undergraduates are invited to fill out, with the data used to rank universities. This year it has caused controversy, as the results may be linked to the price of university tuition fees in the future.
Related article: Lecturers in ‘gagging’ row over NSS boycott
The National Union of Students (NUS) has begun to publicise its campaign for final-year undergraduates to boycott the NSS. Worried that the survey will be used to justify increased tuition fees under the Teaching Excellence Framework (TEF), the NUS has called for a national boycott of the survey to damage the legitimacy of the TEF and of the survey itself.
Why we’re asking you not to complete the National Student Survey! Don’t let your data raise their fees. pic.twitter.com/fvwVDQtnZb
— Bristol SU (@Bristol_SU) January 17, 2017
For NSS results to be published, a minimum 50% response rate is required in each subject area (subject to a minimum of 23 responses). With such a low threshold, the boycott seems unlikely to prevent the statistics from being published, but it will allow the NUS to undermine the results after publication. Yet if you put the politics of the issue aside, there is another question of legitimacy here: that of the survey itself.
Compared with traditional methods, online surveys can be half the price and attract more than double the respondents in some cases, so they seem a no-brainer. In reality, however, internet surveys are a poor substitute for personalised respondent questioning. Research conducted by Responsive Management found that online surveys can produce inaccurate, unreliable and biased data, identifying four main reasons why surveys can produce improper results: sample validity, non-response bias, stakeholder bias, and unverified respondents.
Related article: Uni: your NSS boycott is futile
Sample validity and unverified respondents carry little weight in regard to the NSS, as the sample is limited to final-year undergraduate students, who can be verified by their ID numbers. Unless there was a fault in collecting the data, all respondents would be verifiable.
Stakeholder bias, meanwhile, applies more to marketing and PR companies, which deliberately word questions to elicit the answers they want. We would hope the NSS was designed in a way that would not sway respondents one way or the other.
According to HEFCE (the Higher Education Funding Council for England), over 72% of the 431,000 students invited responded to the survey. Yet what about students who never reached their final year? Non-response bias arises when those who complete the survey differ systematically from those who do not — for instance, when respondents are chiefly people keen to vote a particular way.
We can imagine that those who have reached their final year and are about to complete their course are more likely to view the university favourably than those who dropped out. Students who left their course early are more likely to have been let down by the university in some respect, which led to them leaving; the survey may therefore hide serious failings of the university.
Even putting this issue to one side, non-response bias is evident from the survey completion rates themselves. A 72% response rate leaves 28% of students unheard, and overall the response rate need only reach 50% for results to be published.
There are many reasons why one might not fill out the survey, but we can guess they come down to politics or ambivalence. Those unimpressed with course teaching or facilities may choose not to respond as a protest. Ironically, this could be a win for the university, whose results would then be skewed towards the positive.
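To see how this skew works in practice, here is a minimal sketch with invented figures (not real NSS data): if satisfied students respond at a higher rate than dissatisfied ones, the published average drifts above the true one.

```python
# Hypothetical illustration of non-response bias (all figures invented).
# A cohort of 1,000 students: 700 satisfied (score 4/5), 300 dissatisfied (score 2/5).
satisfied, dissatisfied = 700, 300
true_mean = (satisfied * 4 + dissatisfied * 2) / (satisfied + dissatisfied)

# Suppose 80% of satisfied students respond, but only 40% of dissatisfied ones.
resp_sat = satisfied * 0.80      # 560 responses scoring 4
resp_dis = dissatisfied * 0.40   # 120 responses scoring 2
observed_mean = (resp_sat * 4 + resp_dis * 2) / (resp_sat + resp_dis)

print(f"true mean:     {true_mean:.2f}")      # 3.40
print(f"observed mean: {observed_mean:.2f}")  # 3.65 — skewed upwards
```

Nothing about the students' actual opinions changed; only who bothered to answer. That gap between 3.40 and 3.65 is the non-response bias.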
The report concluded that phone surveys were more likely to give detailed and accurate results, with a better chance of verifying respondents. However, given the sheer number of students asked to respond to the survey, this would be impractical.
Online surveys, in the case of the NSS, are an efficient way of gathering lots of opinions, but the use of that data to support the TEF is questionable to say the least. Surveys are great at giving you a catchy headline, but don't expect much substance behind the flash!
Let us know what you think via Facebook, Twitter or in the comment section below!