Survey Protocol form
The Survey Protocol form applies to staff who want to conduct student surveys with a sample of 250 or more students across more than one department.
The form must be completed so the Student Engagement Team in the Centre for Learning and Teaching can approve your survey and add it to the repository of questions asked of students at the University of Bath. This means large student surveys conducted at the University can be monitored, datasets can be combined with other datasets for trend analysis and benchmarking, and existing data that you could use instead of running another survey can be identified.
The following information is a guide to the questions and explanations that the form contains.
Submit your survey proposal now
Principles for conducting student surveys
These principles apply to all surveys administered to more than 250 University of Bath students and students across more than one academic department. They apply to both internal surveys and those conducted by external stakeholders. The principles do not apply to surveys run by the University of Bath Students’ Union (SU).
Student surveys at the University of Bath will only be conducted if:
- the purpose of the research being conducted is clear
- similar and up-to-date data does not already exist and your question set cannot be accommodated in an existing University of Bath student survey
- surveying is the most appropriate way of gathering data, having considered alternative approaches such as:
  - secondary analysis of existing data
  - focus groups or world café methods
  - consultation with academic reps or University of Bath SU Student Officers before surveying students en masse
- the survey fits into the timeline of current student surveys and does not clash with other surveys
- the survey has been designed to be inclusive and to optimise participation
- there is a clear plan to promote the survey at Faculty/School or Institutional level effectively, with a timeline of promotional activity
- there is a clear plan in place to analyse and use the results
- individual respondents will not be identified in any reporting, protecting their anonymity
- there is a draft timeline in place as to when students’ comments will be addressed and a plan for the resultant actions to be shared with students
- the survey protocol is completed and the survey question set submitted so that the questions can be added to a repository of questions
- there is an agreement that the Student Engagement Team can publish a short summary of the survey and main themes covered on the Student Voice website and Student Data Archive Wiki pages to ensure the results can be used as effectively as possible
- where possible, template institutional questions are used to ensure consistency and enable cumulative analysis and triangulation of responses across surveys
Survey journey maps
When developing a student survey, it's important to consider the context of your participants and how that might impact both your response rate and data. By minimising overlaps and duplication in student voice activity, we reduce the potential for survey fatigue and maximise the opportunity for high engagement and quality data.
Areas to consider include:
- time pressures for students
- positive and more challenging aspects of their student experience which coincide with your survey
- timing of other student voice activities
An overview of this information has been created with the SU to identify the key milestones, events and emotions that students experience during their time at Bath, alongside the timing of surveys.
Visit the student journey maps
Survey journey maps have been developed for:
- undergraduates
- taught postgraduates (PGT)
- 100% Online PGTs
- part-time distance-learner PGTs
- degree apprenticeship learners
- doctoral (ProfDoc and PGR) students
Doctoral, 100% online and degree apprenticeship students follow a different study schedule from students on standard on-campus undergraduate and PGT courses. For more information, contact:
- 100% Online Distance Learners - lpo@bath.ac.uk
(Please note that the Learning Partnerships Office only oversees seven distance PGT courses. Several other distance courses are overseen by academic departments. For further information, please contact Academic Registry).
- Degree Apprenticeships - apprenticeships@bath.ac.uk
- Doctoral - Doctoral College Development and Student Experience team
There is a list of student surveys that outlines when different institution-wide surveys are run and which groups of students are targeted.
Designing an effective survey
Surveys are successful when:
- using a survey is the best method of capturing data
- the questions are well-designed
- the survey has a high response rate
- the results are communicated to the appropriate audiences
- participants are made aware of the results and any corresponding actions that will follow
Survey and questionnaire design training
At least one person involved in creating your survey must have recent training or experience in survey and questionnaire design.
Please contact the Student Engagement Team for support.
Decide on your target population
Your target population is the group of students your survey is aimed at. You should only target your survey at the most relevant students; for example, do not advertise a survey to all students when it is only aimed at graduate teaching assistants.
Decide on a survey platform
The University recommends several survey platforms:
- Jisc Online Surveys is free but you must request an account from DDaT
- Mentimeter
- Microsoft Forms
- QuestionPro
- REDCap is a survey tool popular in the Health sciences but you must request access from the Library
Do not use Google Forms or SurveyMonkey because they do not meet University requirements.
Ask the right questions in the right way
Make sure that questions are accurately worded and appropriately displayed to give you the information you need. The Survey Repository contains the suggested wording for certain topics, such as wellbeing or demographics.
If you are creating your own questions, follow guidance on cognitively testing questions before launching the survey.
Closed-response questions
Closed-response questions are quick for respondents to complete. Single-choice, multiple-choice, drop-down list and Likert scale questions are the most common types of closed-response question. Examples of closed-response answer options are:
- 18-21, 22-25, 26-30 (single-choice)
- cognitive or learning difficulty, long-term condition, mental health condition (multiple-choice)
- Agree/disagree (Likert scale)
There should always be a ‘Prefer not to say’ or ‘Not applicable’ option.
Open-response questions
Open-response free-text questions allow respondents to type answers in their own words. This can be more valuable than closed-response questions, where respondents are limited to the options provided.
Follow Jisc guidance on the best types of question to ask for different topics.
Keep the survey length to a minimum
Don’t ask questions for the sake of asking questions. If the survey is too long, respondents may not complete it; if it is too short, your research question(s) may not be answered. Check the Survey Repository to see if the data you want to collect already exists.
You should consult with stakeholders or peers to ensure questions cover the necessary topics.
If your survey is quite long, tell respondents in advance roughly how long it will take and include a progress bar.
Structure the survey into logical sections
On the first page, clearly describe the aim of the survey. The survey should be broken up into a logical order of pages with relevant questions grouped together. Having multiple pages can make the survey less daunting to respondents compared to one long page.
Consider adding sub-questions and routing to ask different questions to different respondents depending on their answers.
Demographics are generally kept until the end of the survey. Make sure you use the correct terminology for different demographic characteristics.
Top and tail the survey
Students take time out of their busy schedule to complete surveys, so being polite and courteous will show your gratitude for their responses.
The start of the survey should include:
- survey aim and information
- login instructions
- prize or incentive information
- GDPR statement
The end of the survey should include:
- a thank you!
- right to withdraw procedure
- survey contact details
Using images
Do not overload the survey with images. The University logo at the top of a survey can give it authority. Multiple-choice image questions, where appropriate, make the survey more interesting for respondents. If the survey is supported by the SU, request to use their logo alongside that of the institution. Make sure you resize large graphics to smaller, more easily downloadable sizes. Check if your survey provider allows you to resize images that you've uploaded, rather than having to adjust them in software like Photoshop.
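If your survey provider does not offer resizing, a small script can shrink an image before you upload it. The sketch below is one way to do this in Python using the Pillow imaging library (installed with `pip install Pillow`); the file names and maximum dimensions are placeholders, so substitute your own.

```python
# Minimal sketch: shrink a large image before uploading it to a survey
# platform. Requires the Pillow library (pip install Pillow); the file
# names and maximum dimensions are assumptions, so adjust as needed.
from PIL import Image

MAX_SIZE = (800, 800)  # maximum width and height in pixels

with Image.open("poster_original.png") as img:
    img.thumbnail(MAX_SIZE)  # resizes in place, keeping the aspect ratio
    img.save("poster_web.png", optimize=True)
```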
Testing the survey
Before launching your survey, carry out user testing to make sure the survey works as intended. Ask stakeholders to complete the survey and act on their suggestions. Pilot the survey with a few students to check that the questions make sense.
Delivering an effective survey
Open the survey at an appropriate time of year
There are several large institutional student surveys that run annually for different groups of students. The Survey Journey Maps list the opening and closing dates of these surveys; any survey targeting 250 or more students should clash with them as little as possible.
Encourage students to give constructive feedback
Open-response questions are valuable in helping students convey their opinions and students can provide richer insight by making their responses constructive. Constructive feedback is:
- honest
- specific
- respectful
- solution-focused
- balanced, covering both the good and the bad
- focused on doing better next time
Direct them to guidance about how to make their feedback constructive.
Employ appropriate methods of promotion
Using a combination of promotional methods will ensure you target different groups of students. Methods you could use include:
- word-of-mouth
- scheduling time in teaching sessions
- academic reps contacting students
- Microsoft Teams messages
- emails via student mailing lists, MailMerge or individual emails
- Student News in 10 newsletter
- Student Homepage
- Moodle items and pop-ups
- social media
- posters
- campus screens
Use Microsoft Word, PowerPoint or Canva to design posters, presentations, infographics and social media material.
The Department of Communications has guidance on communicating messages to students.
Keep key messages brief
Promotional materials should only include key information about the survey. Messages should be brief and straight to the point. Extra details can be written at the start of the survey. Key information includes:
- survey name
- purpose or aim of survey
- closing date (and time)
- prize or incentive
- call to action
If sending emails, the subject line should entice students to open and read the message. Remember that mobile phones have narrower displays than other screens, so keep email subject lines short.
Create a survey promotion schedule
Once you have decided which promotional methods to use, collate these into a survey communications plan. This should list the date the activity is being implemented and (if applicable) stopped, what the activity is, content details and who is responsible.
There is a student voice communications plan template available from the Department of Communications. Information can also be visualised in a Gantt chart.
List when and how you will communicate the results to students and respond to their feedback. Be prepared to receive responses and act on them within a short timescale.
Monitor response rates
Track daily or weekly response rates to understand which promotional methods have the largest positive impact on response numbers.
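If your survey platform lets you export response timestamps, a short script can turn them into a daily tally to set against your promotion schedule. The sketch below is a minimal example in Python using only the standard library; the file name (`responses.csv`), the `submitted_at` column and the target population figure are assumptions, so adjust them to match your own export.

```python
# Minimal sketch: count survey responses per day and compute a running
# response rate. Assumes a CSV export ("responses.csv") with a column
# named "submitted_at" holding ISO-format timestamps; adjust the file
# name, column name and TARGET_POPULATION to match your own platform.
import csv
from collections import Counter
from datetime import datetime

TARGET_POPULATION = 1200  # size of the cohort invited (assumption)

daily_counts = Counter()
with open("responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        day = datetime.fromisoformat(row["submitted_at"]).date()
        daily_counts[day] += 1

running_total = 0
for day in sorted(daily_counts):
    running_total += daily_counts[day]
    rate = 100 * running_total / TARGET_POPULATION
    print(f"{day}: {daily_counts[day]:>4} responses, {rate:5.1f}% cumulative")
```

Spikes in the daily counts that coincide with a particular activity, such as an email send or a Moodle pop-up, give a rough indication of which promotional methods are having the most impact.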
Use incentives
An incentive (such as a book token, an e-voucher or even cash) can improve the response rate of your survey. An incentive may encourage some people to take the survey who would not otherwise, and it is unlikely to deter anyone. Make the incentive appropriate to your audience, but don't make it too large, or your survey may be circulated widely to people outside your target population. Mention the incentive at the start and end of the survey and on promotional material.
Effectively analysing and reporting survey results
Closing the feedback loop
It is vitally important that you communicate with respondents (or the relevant student cohort) in a timely manner about the overall pattern of results from your survey and what you or others are going to do about it. If nothing can be done about some points, explain why. Students tell us repeatedly that they get a lot of opportunities to give feedback, but that they don’t know how it is used. By closing the feedback loop, we build a sense of two-way communication and community that encourages future engagement with surveys and feedback activities.
Read guidance on how to close the feedback loop.
Reporting and Action Planning
For each set of questions in your survey, you should have a plan from the outset for who will be responsible for acting on the outcomes of the survey, when they need the results by and in what format. For instance, if the results need to be seen by specific committees or working groups, you should know the deadlines for those meetings so the results can be considered in time for action to be taken.
Sharing your dataset
We encourage survey owners to share survey data widely to reduce the number of surveys we need to send students and to make most effective use of the data we collect.
Under GDPR, in most circumstances it is important that we maintain respondent anonymity when sharing data. Although respondents may not be sharing any personal identifying details explicitly, it may become possible to identify individuals when looking at cuts of the data. For instance, if there are several demographic questions included in a survey, such as age, disability, course and year of study, then it may be possible to identify which student responded even without their name.
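One quick check before sharing is to look for combinations of demographic answers that only a handful of respondents share, since those rows are the easiest to re-identify. The sketch below is a minimal example in Python with pandas; the file name, column names and the threshold of five respondents are illustrative assumptions rather than institutional policy.

```python
# Minimal sketch: flag demographic combinations with very few respondents
# before sharing a survey dataset. Uses pandas; the file name, column
# names and the threshold of 5 are assumptions, so adapt them to your
# data and to institutional data protection guidance.
import pandas as pd

THRESHOLD = 5
DEMOGRAPHICS = ["age_band", "disability", "course", "year_of_study"]

responses = pd.read_csv("survey_export.csv")

# Count respondents in every combination of the demographic columns.
cell_sizes = responses.groupby(DEMOGRAPHICS, dropna=False).size()
small_cells = cell_sizes[cell_sizes < THRESHOLD]

if small_cells.empty:
    print("No demographic combination has fewer than", THRESHOLD, "respondents.")
else:
    print("Combinations at risk of re-identification:")
    print(small_cells.to_string())
```

Any combination the check flags should be aggregated (for example, by merging age bands) or suppressed before the dataset is shared.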
Read guidance on how to securely share and anonymise data.
The Data Governance Team offers training and support related to data protection.