Learner surveys: writing questions
Are you ready to write or review your learner survey questions? Here is a great place to start! Follow the checklist below to help ensure your survey questions are effective.
Are you asking the right questions?
When Ange Stoddard started evaluating learning, the surveys she designed were full of interesting, thought-provoking questions. Then, she realised there was a better way. In this video, Ange shares her hard-earned insights into how to draft questions that will give you the data you really need.
Hi, I'm Ange Stoddard from the APS Academy, and a big part of my role is working with L&D folks on designing learning surveys.

I wanted to share a story. At the start of my career I wasn't working specifically in evaluation but, as many people in the L&D space do, I was doing a little bit of everything. With the first surveys I wrote for training I was developing and delivering, I think I was just following my curiosity. The survey was long, I wrote it pretty fast, and it covered everything I was interested in. What I found after implementing it was that it was too long, I was asking my staff to dedicate too much time to it, and ultimately I didn't end up using a lot of the information I collected. It was interesting, but I hadn't thought really carefully about how I was actually going to use all that information at the end.

So now when I'm looking at survey design, I try to take a far more critical lens and, importantly, slow down a little, look at each question one by one and ask: what do I really need to know? Is this a key evaluation area for me? Is this a key piece of information I need, and what am I going to do with it at the end? Thinking through that use case is a really valuable part of the process. Why is this question here, and what am I going to do with the data? If I can't think of a really good answer, then I won't include the question.

That can be hard sometimes, because as evaluators, and as L&D folks, we're intrinsically curious. We want to know what learners think and how they think we can improve. But one of the mistakes along the way is forgetting that learners aren't always the best people to ask. You might really want to know how much their skills have improved, but are they able to tell you that accurately? Is a survey the best way to get that information?

So that's part of building good learner surveys: thinking really carefully about what information you need, how you're going to use it, and whether or not you have that information already. And finally, are learners the right people to ask? Can they tell you that information, and do they have the right expertise? For example, there's a temptation to ask 'how was the learning design?' or 'was it a good mix of activities?'. Learners can tell you their views on that, but if they're not learning designers, then maybe they're not the right people to be asking when you're looking to review your training workshop.
Survey question checklist
1. Can you describe a compelling reason to ask this question?
Survey questions should collect data that cannot be found elsewhere and align to your key evaluation areas.
If not, you do not need them. It can be tempting to include questions because they sound interesting. But in evaluation, that is not enough. The data you gather must be useful. Being clear on exactly how you intend to use the data helps ensure you collect the right information.
Get into the habit of writing down your rationale for each question. It might sound excessive – but it can play a big part in crafting a robust, fit-for-purpose survey.
Deciding if there is a compelling reason to ask a particular question comes down to your specific context.
**Example:** ‘What prior learning have you done on this topic?’ (Drop-down menu of known available offerings and an ‘Other – please specify’ option)

**Compelling reason:** You are evaluating a new program and deciding whether to impose a prerequisite for enrolment. There is a clear use case: identifying the correct audience by exploring how prior learning affects outcomes.

**No compelling reason:** In contrast, imagine you are evaluating a number of ongoing courses. If you have no plans to use the findings for program improvement or business decisions, there is no compelling reason to ask this question. You are wasting the respondents’ time.
2. Will the learner have the information and expertise needed to answer this question?
Ask learners about their learning experience and aspects of the training that are clearly observable and don’t require specialist knowledge. This is especially important when evaluating the quality, design and delivery of the training.
**Example:** ‘The training had a good mix of activities and theory’ (5-point Likert scale)

**Potential issue:** Learners will have different views about what a ‘good mix’ is. Unless your audience are experts in effective learning design, their opinions may or may not align with good practice. What seems like effective learning and what is effective learning are different things.

**Better approach:** Have a learning designer conduct a review of the course design and materials using the Learning Quality Framework.
3. Will the question measure what you intend it to measure?
An important concept in survey design is measurement validity, which refers to whether your questions measure what you think they measure. Testing your survey in advance and asking others to review your survey can help to catch validity issues.
**Example:** ‘To what extent did the training increase your understanding of the concepts explored?’ (4-point Likert scale)

**Potential issue:** Asking learners how much they think they learned is rarely useful, because they cannot report this accurately.

**Better approach:** Assess learning itself through other means, such as tests. Ask learners instead about their experience and aspects of the training that are observable without specialist knowledge.
4. Is the question easy to answer?
Making your questions easy to understand and answer boosts response rates and improves data quality, validity and reliability.
Check whether your questions include jargon, are confusing or long-winded, or require excessive time and effort to answer. Addressing these issues can help reduce the cognitive load on participants (i.e. mental effort) and limit survey fatigue.
**Example - length:** ‘Reflecting back on your experience throughout the entire course, including the learning resources and activities, and thinking about the nature of your job role and typical work tasks, when will you begin to apply what you have learned on this course to your work? Select the ONE response that best applies.’

**Potential issue:** This question is confusing because it is long-winded, providing unnecessary extra information for the reader to think about.

**Better approach:** ‘When will you begin to apply what you have learned on this course to your work? Select the ONE response that best applies.’ (Same response categories)
**Example - level of detail:** ‘What percentage of the course was spent doing practical activities?’ (1 – 100%)

**Potential issue:** Learners are unlikely to remember the course in this level of detail, so answering accurately demands excessive time and effort.

**Better approach:** An alternative question measuring a similar aspect of the quality of course design: ‘In what ways has the course prepared you to apply what you have learned back in the workplace? Select any responses that apply.’
5. Is the question phrased neutrally?
Leading or loaded questions encourage respondents to answer in a certain way. Asking neutral questions provides more valid and reliable data.
**Example:** ‘How has this course prepared you to be a better manager?’ (Open text)

**Potential issue:** This question assumes the course prepared the learner to be a better manager. Learners may be inclined to give a positive response because it reflects better on them as employees (social desirability bias).

**Better approach:** ‘What did you personally get out of this course?’ (Open text)
6. Are you asking one question at a time (i.e. no double-barrelled questions)?
A double-barrelled question asks for one answer despite posing more than one question. These questions easily confuse respondents, leading to distorted data. Look for the words ‘and’, ‘but’ and ‘while’ to help spot double-barrelled questions.
**Example:** ‘The training was relevant to my job role and a good use of my time.’ (5-point Likert scale)

**Potential issue:** This is a double-barrelled question. A respondent who found the training relevant but not a good use of their time (or vice versa) cannot answer accurately.

**Better approach:** Split into 2 questions (or just choose one): ‘The training was relevant to my job role.’ (5-point Likert scale) and ‘The training was a good use of my time.’ (5-point Likert scale)
7. Are the answer categories exhaustive, mutually exclusive, and matched to the question type?
Exhaustive categories form a complete list so every respondent can find the one answer category that applies to them. After listing probable response options, consider whether you might still need an ‘Unsure’, ‘N/A’, or ‘Other – please specify’ option. Sometimes you will need more than one of these.
Mutually exclusive categories do not overlap, which is important if you are asking respondents to only choose ONE response that applies to them. This rule doesn’t apply if you allow multiple selections or if you are asking for the best/most applicable option.
Matching answer categories to the question means there is a logical flow from question to answer. Matching errors tend to show up after multiple edits and are best caught through proofing and testing.
**Example - not exhaustive:** ‘Did the course meet all of your accessibility needs?’ (Yes / No)

**Potential issue:** These answer categories do not form an exhaustive list. A respondent might identify as having no accessibility needs, meaning neither response is applicable.

**Better approach:** Add a response category: N/A
**Example - not mutually exclusive:** ‘What is your current APS level?’

**Potential issue:** These categories are not mutually exclusive because two options apply to APS 4 staff members.

**Better approach:** One possible fix is to change the answer category ‘APS 4 – 6’ to ‘APS 5 – 6’.
**Example - unmatched answer categories:** ‘How relevant was the training to your current job role?’ (Strongly agree, agree, neither agree nor disagree, disagree, strongly disagree)

**Potential issue:** These responses do not make sense as answers to the question asked. In other words, they do not ‘match’.

**Better approach:** ‘To what extent do you agree with the following statement: “The training was relevant to my current job role”?’ (5-point Likert scale)
8. Will the data be structured in a useful way?
When writing questions, consider your end goal. Why are you collecting this data? How will you use it? Will it be easy to visualise, analyse and report on? Minimising the use of free text questions, which collect unstructured data, will make the survey easier for you to analyse and for respondents to complete.
Learn more about different question types in the following section.
**Example:** ‘What motivated you to attend this course?’ (Free text)

**Potential issue:** Free text responses produce unstructured data that is time-consuming to analyse, visualise and report on.

**Better approach:** ‘What motivated you to attend this course? Select ALL that apply.’ (Defined list of response options)
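The advantage of structured responses can be made concrete with a short sketch. The Python snippet below is purely illustrative (the question comes from the example above, but the response options and data are invented): tallying multiple-choice answers takes a few lines of standard-library code, whereas free-text answers would each need to be read and coded by hand.

```python
from collections import Counter

# Hypothetical multiple-choice responses to
# 'What motivated you to attend this course? Select ALL that apply.'
responses = [
    ["Manager recommendation", "Career development"],
    ["Career development"],
    ["Mandatory training", "Manager recommendation"],
    ["Career development", "Mandatory training"],
]

# Structured data tallies in one line, ready to chart or report on.
counts = Counter(choice for response in responses for choice in response)
for choice, n in counts.most_common():
    print(f"{choice}: {n}")
```

A tally like this drops straight into a bar chart or a report table; the equivalent free-text data would first need manual thematic coding.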
Question types
Choosing the right question type will assist you in matching answer categories to the question. It’s useful to include a mixture of question types to keep the survey engaging and avoid ‘straight lining’ (where a respondent continually selects the same response for all questions). Equally, constantly changing the question type increases the cognitive load on respondents. Below are some common question types:
- Multiple choice: select one or multiple answers from a defined list of responses
- Likert scale: a style of multiple choice that lists different levels of agreement with a statement (e.g. strongly agree, agree, disagree, strongly disagree)
- Free text: type a response
- Matrix table: combines multiple questions that share the same response choices
- Rank order: rank a series of answers in order of preference
- Slider: respond using a sliding scale tool.
Use free text questions sparingly and only when you have the resources to analyse the data. You might be surprised by how often multiple choice can work just as well, if not better. Save free text questions for when you need the juicy details and nothing else will do!
Avoid rank order questions where possible because they tend to take a disproportionate level of effort from both participants and evaluators. Likewise, avoid matrix questions as they impose a higher cognitive load on respondents.
Learning Surveys: Collective suite
- A checklist to help write and review learner survey questions, plus a summary of the question types available to you.
- A step-by-step guide to reviewing learner surveys as a whole. What makes a good learner survey? What are its key components?
- Tips for administering learner surveys successfully, and reminders about ethics, privacy and security.