Learner surveys: the anatomy of a survey
A step-by-step guide to reviewing your learner survey as a whole.
Making your survey high-value for you and your participants
A well-designed learner survey doesn't just capture feedback on a course; it can also tell you a lot about your learners – but only if they fill it in. In this video, Adam Le Nevez talks about how a well-structured survey can prompt learners to think more deeply about their learning while helping you derive business insights about your audience.
0:01
Hi, I'm Adam Le Nevez from the APS Academy and I'm here to give you a little bit of advice on how to build your survey and the anatomy of a survey.
0:10
What are the different parts of a survey and what do they do?
0:13
And there are really three parts.
0:15
The first, it's the obvious one.
0:17
It's what we've been talking about in this set of resources, and it's the questions that you have about the learning experience.
0:24
What are the learners, or the respondents to your survey, telling you,
0:32
be that a face-to-face course, an eLearning module or something else?
0:33
And these are linked to your key evaluation areas.
0:35
There are other resources in this set of materials that will give you some more insight into that.
0:42
So those questions are important.
0:43
They need to be linked to the key evaluation areas that you have.
0:47
But there's a second, and I would argue just as important or maybe even more important set of questions, and these are demographic questions, or questions about the people responding to the survey.
0:58
Now, why are these important?
0:59
It's important because the first set of questions, your questions about the learning experience, will give you an insight into whether the course was effective or not.
1:08
But the demographic questions will help you understand for whom. Was it universally well-liked?
1:14
Were there segments of your audience for whom this was not an effective course, while for others it was?
1:21
So, if you have, for example, 10% or 20% of people who responded negatively to your questions around the value or efficiency of the course, this set of questions will enable you to drill down and figure out why.
1:35
Now, it might be that they're the wrong level for the course.
1:39
They know too much or too little and so are not the right audience.
1:43
It could be the motivation with which they've come to the course, and whether they were obliged to attend.
1:48
We did some work in the Academy last year and the year before to really look at the drivers of responses and declining responses in some of our courses.
1:59
And we realised that there was a correlation between the motivation or the reason why people were coming to the course and the experience that they had.
2:07
And in particular, people who were obliged to come or were required to be there scored their learning experience much lower than those who were there of their own volition or wanted to be there.
2:20
It's kind of obvious, right?
2:22
But you're not able to understand that and to drill down and get that data unless you ask those questions.
2:29
So questions about the learning and questions about the learners, these are super important.
2:33
So think very carefully about the learners.
2:36
The third part of your survey, well, this is the text that isn't questions, and this can be the introductory text.
2:42
So why is the learner here?
2:44
Why should they fill this survey in?
2:47
A high survey response rate means getting people to click on your link or QR code, but then also to go through, fill in that survey and give you valid answers.
2:59
So giving people assurance that you're going to treat their data with respect,
3:06
so having your privacy policy, explaining to them what you're going to do with their data, giving them confidence that you know what you're doing with their data,
3:16
this will give them the psychological safety they need to be able to complete the survey confidently and willingly.
3:24
The text that isn't questions and isn't about the learning experience or the demographics is
3:31
also a really useful nudge to put learners in a reflective frame of mind as they fill in the survey. And this is valuable for them.
3:43
Not only are you asking them to give you data, but you're asking them to think carefully and deeply about the learning experience that they've undertaken and how that might be of value to them.
3:53
This shouldn't be a leading question that biases the answer, but it gives you the chance to prompt them to be reflective about their experience.
4:09
And we know that people who reflect on their learning experience and go back over time, reflecting on their learning experience, have a much better chance and higher rate of implementing what they've learned and not forgetting what they've learned.
4:23
So all in all, there are these three parts.
4:25
You need all of them
4:26
and if you get the three parts right, then you'll have a nice, quick, succinct, meaningful survey that gives you the data you need and gives learners or respondents an opportunity to be really thoughtful about what they've just been through.
What makes a good learner survey?
A good learner survey is one that is:
- clear in its purpose so that you measure the right things to achieve your goals.
- as short as it can be while still answering your key questions so that you respect people's time, boost response rates, and simplify the data work.
- simple and easy to complete so that you gather more and higher-quality responses.
- sequenced logically to avoid issues that may arise from question order.
- thought-provoking to make it more engaging for, and valuable to, learners.
- intentional about the use of mandatory and voluntary questions to minimise survey drop-outs and low-quality data entries.
1. Clear in its purpose
A survey can be technically good and still fail to serve its intended purpose. Get clear on why you are administering a learner survey and how you will use the data you collect as early as you can. Check in throughout the process to ensure that you haven’t lost sight of this! Your purpose should guide your design decisions.
“Evaluation is the systematic and objective assessment of the design, implementation or results of a government program or activity for the purposes of continuous improvement, accountability and decision-making”
– Australian Centre for Evaluation, emphasis added.
In L&D, for example, you might set out to evaluate the effectiveness of a new leadership program so that developers or facilitators can improve it for the next delivery (continuous improvement) and/or so that leaders can decide whether to run it in the future (decision-making).
2. As short as it can be
The shorter your survey, the better your response rate is likely to be. Consider whether you would prefer in-depth information from 5% of learners or a few key pieces of information from 50% of learners.
Context matters when deciding survey length. A standalone, single course or e-learning might be a good candidate for a short survey. In contrast, a longer survey may be appropriate for a lengthy program of high strategic importance or investment.
It’s also important to only ask for information that you intend to use and can’t find elsewhere. This is respectful of the time and effort that people have taken to respond to your survey.
Generally speaking, it is a good idea to ensure your survey takes under 10 minutes to complete. If it’s a 5-minute survey, it’s also much easier to build it into the end of a learning experience and boost your response rate even further.
3. Simple and easy to complete
In addition to writing survey questions that are clear and easy to answer, consider the user experience and cognitive load (i.e. mental effort) of completing your survey as a whole.
Put yourself in a learner’s shoes. A survey with 5 free-text questions will take considerably more time and mental effort than if it had some multiple-choice questions. That said, a multiple-choice question with 15 different response options to choose from can be taxing.
Ranking 5 items in order, depending on what they are, can also take a lot of cognitive effort, despite appearing like a simple task. The best way to get a sense of the user experience is to test it with real people (if no-one else, some willing colleagues).
Testing prior to launch is also an opportunity to check how a survey displays on different devices (e.g. mobiles, laptops). It's important that your survey is accessible and easy to use. Much of this functionality is baked into survey software – but it's up to you to check it.
4. Sequenced logically
The order of survey questions can affect how learners respond to them. This can influence both data quality and the user experience. A logical sequence of questions is much easier to follow.
When ordering your questions it is best to:
- cluster questions by topic, potentially in distinct pages and ‘sections’
- sequence questions from general to specific OR specific to general
- start with a simple and easy question
- ask important questions early
- avoid asking sensitive or boring questions first
- ask demographic questions last (unless they inform what questions appear next, known as survey branching logic).
5. Thought-provoking
Surveys can be engaging and genuinely valuable to learners. During development, consider how the evaluation process can benefit those who participate.
A great learner survey might, for example, provide value by creating structured opportunities to reflect on an experience or think forward to how learnings will be applied at work. A survey delivered prior to training could be an opportunity to set individual learning goals that are used by the facilitator to shape the course delivery. These are just some examples of how you might integrate evaluation into the design and delivery of a course or program.
You can also make your survey more engaging by introducing a mix of question types and avoiding anything that could be annoying. Questions that are repetitive, poorly crafted, confusing, apparently disconnected, or mandatory to complete might cause some people to provide low-quality responses.
6. Intentional about the use of mandatory and voluntary questions
Mandatory questions force users to respond before they can proceed, whereas voluntary questions can be skipped. Some survey tools also offer a third option: if a respondent tries to skip a question they are prompted to answer it, but can skip it on a second attempt.
Mandatory questions are useful because more complete datasets are easier to analyse, interpret, and report on. This feature can also be helpful for questions that are used to inform survey branching logic. Branching logic is a function available in some survey tools where an answer provided to an earlier question changes the subsequent questions that an individual sees.
But it’s not always wise to make every question mandatory. These questions can lead to drop-outs and random or nonsense answers that allow people to continue the survey. So, consider making some survey questions voluntary, particularly those with open text response options.
Make sure to include these 3 elements in your survey
1. Introductory and concluding text
Ensure that your survey opens with text outlining its purpose and how data will be handled in accordance with privacy requirements. Beyond this, introductory text can also help motivate people to participate and prime them to answer questions in a particular way.
| APS Academy Example |
|---|
| "Thanks for participating in [COURSE NAME]. This survey should take 5-10 minutes to complete. As you complete this survey, we encourage you to think deeply about what you've learned and the experience you've had. Sharing your reflections helps ensure the course meets learners' expectations and our own quality standards. Thank you for helping us understand your experience. We will use your feedback to improve this course for future participants. Privacy statement: This information is collected under the requirements of the Privacy Act 1988. Information is collected for the Public Service Commissioner's functions under the Public Service Act 1999, which include coordinating and supporting APS-wide training and career development opportunities, fostering leadership and reporting on the State of the Service. The information you provide will only be used for evaluation purposes. No information which can identify you will be released or reported." |
For your concluding text, ensure that you confirm that a response was recorded and thank the learner for their time. It is also good practice to include a generic email address that people can use to get in touch if they have any questions or feedback on the survey itself.
Post-survey text can also be a good opportunity to include any messages about what individuals can do to apply their learnings back in the workplace. It might be your final word – so make it count!
2. Demographic questions
Demographic questions ask for personal background information such as learners’ APS classification level, team/branch, work location, prior training attendance, or reason for attending a course. Questions like this allow you to manipulate the data to answer specific questions, for example:
- For whom is the training most and least effective?
- Who did and didn’t find the training valuable?
- Who are the right people to enrol in this training program?
- What is the effect of mandatory attendance?
Questions like these might not be immediately of interest to you but can be useful for interpreting trends in the data.
| Imagine that you’re interested in the reason for a drop in the perceived value of a course. Demographic data might surface that recent learner cohorts all had much higher levels of prior experience, which could explain why they found the foundational course less valuable. This points to a possible issue with enrolling the right people rather than a problem with course design or delivery. |
As demographic questions lengthen surveys and can collect sensitive data, it’s worth being strategic about what is included. As always, do not include anything unnecessary and be transparent about why the data is being collected and how it will be used. These questions are best positioned right before the concluding text, unless there is a reason to include them earlier (e.g. branching logic).
3. Non-demographic survey questions
The majority of survey questions are not demographic, and instead collect data against your Key Evaluation Areas. Depending on what those are for your particular evaluation, you could include questions about learners’:
- Experience in the course/program
- Opinions about the value, relevance or quality of the course/program
- Feedback on the facilitator’s performance
- Expectations about if and how they will apply learnings
- Levels of motivation or intent to apply learnings
- Overall satisfaction.
Learn more about writing learner survey questions or explore an example learner survey from the APS Academy.
Learner surveys: collective suite
A checklist to help write and review learner survey questions. Plus, a summary of the question types available to you.
A step-by-step guide to reviewing learner surveys as a whole. What makes a good learner survey? What are its key components?
Tips for administering learner surveys successfully and reminders about ethics, privacy and security.