Learner surveys: writing questions

Are you ready to write or review your learner survey questions? Here is a great place to start! Follow the checklist below to help ensure your survey questions are effective.



Are you asking the right questions?

When Ange Stoddard started evaluating learning, the surveys she designed were full of interesting, thought-provoking questions. Then, she realised there was a better way. In this video, Ange shares her hard-earned insights into how to draft questions that will give you the data you really need. 

Transcript

Hi, I'm Ange Stoddard from the APS Academy, and a big part of my role is working with L&D folks on designing learning surveys.

I wanted to share a little bit of a story. At the start of my career I wasn't working specifically in evaluation, but, as many people who work in the L&D space do, I was doing a little bit of everything. The first surveys that I wrote for training that I was developing and delivering, I think I was just following my curiosity. It was long, I wrote it pretty fast, and it was everything that I was interested in. What I found after implementing that was that it was too long, I was asking my staff to dedicate too much time to it, and ultimately I didn't end up using a lot of the information that I collected. It was interesting, but I hadn't thought really carefully about how I was actually going to use all that information at the end.

And so now when I'm looking at survey design, I try to take a far more critical lens and, importantly, slow down a little bit, look at each question one by one and ask: what do I really need to know? Is this a key evaluation area for me? Is this a key piece of information that I need, and what am I going to do with it at the end? I think thinking through that use case is a really valuable part of the process. Why is this question here, and what am I going to do with the data at the end? If I can't think of a really good answer to that question, then I won't include it.

That can be hard sometimes, because I think as evaluators we're intrinsically curious. Or even as L&D folks, we want to know what learners think and how they think we can improve. And I think one of the mistakes along the way is that learners aren't always the best people to ask these questions of. You might really want to know how much their skills have improved, but are they able to tell you that accurately? Is that the best way to get that information?

So I think that's the heart of building good learner surveys: thinking really carefully about what information you need, how you're going to use it, and whether you have that information already. That's an important thing to consider. And finally, are learners the right people to ask? Can they tell you that information, or do they have the right expertise? An example of this is the temptation to ask them 'how was the learning design?' or 'was it a good mix of activities?'. They can tell you their views on that, but if they're not learning designers, then maybe they're not the right people to be asking if you're looking to review your training workshop.

Survey question checklist

Are you ready to write or review your learner survey questions? Follow the checklist below to help ensure your survey questions are effective.

1. Can you describe a compelling reason to ask this question?

Survey questions should collect data that cannot be found elsewhere and align to your key evaluation areas. 

If a question does not meet both of these tests, you do not need it. It can be tempting to include questions because they sound interesting, but in evaluation that is not enough: the data you gather must be useful. Being clear on exactly how you intend to use the data helps ensure you collect the right information.

Get into the habit of writing down your rationale for each question. It might sound excessive, but it can play a big part in crafting a robust, fit-for-purpose survey.
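One lightweight way to build that habit is to keep a short, structured record for each question. Here is a minimal sketch of what such a record could look like, using a Python dataclass; the field names and the example entry are illustrative only, not a prescribed template.

```python
# Illustrative only: a simple record of why each survey question exists.
from dataclasses import dataclass

@dataclass
class QuestionRationale:
    question: str            # the wording learners will see
    evaluation_area: str     # the key evaluation area it maps to
    intended_use: str        # the decision or report the data will feed
    already_available: bool  # is this information already held elsewhere?

rationales = [
    QuestionRationale(
        question="When will you begin to apply what you have learned on this course to your work?",
        evaluation_area="Transfer of learning",
        intended_use="Decide whether to add post-course application support",
        already_available=False,
    ),
]

# Questions with no stated use, or whose data already exists elsewhere, are candidates to cut.
for r in rationales:
    if r.already_available or not r.intended_use:
        print(f"Reconsider: {r.question}")
```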

Deciding if there is a compelling reason to ask a particular question comes down to your specific context. 

 

Example

‘What prior learning have you done on this topic?’ (Drop-down menu of known available offerings and an ‘Other – please specify’ option)

Potential issue

A compelling reason to ask this question might be that you are evaluating a new program and deciding whether to impose a prerequisite for enrolment. There is a clear use case here: identifying the correct audience by exploring how prior learning affects outcomes.

In contrast, imagine you are evaluating a number of ongoing courses. If you have no plans to use the findings of this question for program improvement or business decisions, there is no compelling reason to ask it. You are wasting the respondents' time.

2. Will the learner have the information and expertise needed to answer this question?

Ask learners about their learning experience and aspects of the training that are clearly observable and don’t require specialist knowledge. This is especially important when evaluating the quality, design and delivery of the training.

 

Example

‘The training had a good mix of activities and theory’ (5-point Likert scale)

Potential issue

Learners will have different views about what is a ‘good mix’. Unless your audience are experts in effective learning design, their opinion may or may not align with good practice. What seems like effective learning and what is effective learning are different things.

Better approach

 Have a learning designer conduct a review of the course design and materials using the Learning Quality Framework.

3. Will the question measure what you intend it to measure?

An important concept in survey design is measurement validity, which refers to whether your questions measure what you think they measure. Testing your survey in advance and asking others to review it can help catch validity issues.

 

Example

‘To what extent did the training increase your understanding of the concepts explored?’ (4-point Likert scale). 

Potential issue

  • We are surprisingly poor judges of our own learning. This means there is very little correlation between someone’s perception of what they learned and what they actually learned. So, if your goal is to measure learning, a question like this will have a validity issue.
  • Even if learners could accurately assess their understanding of concepts at the completion of training, this doesn’t guarantee they will remember it in a month’s time.
  • This question is therefore measuring the opinion of learners, not the extent of their learning.

Better approach

Asking learners about how much they think they learned is rarely useful. Learning itself is best assessed through other means, like tests. In contrast, learners can report accurately on their experience and aspects of the training that are observable without specialist knowledge.

4. Is the question easy to answer?

Making your questions easy to understand and answer boosts response rates and improves data quality, validity and reliability. 

Check whether your questions include jargon, are confusing or long-winded, or require excessive time and effort to answer. Addressing these issues can help reduce the cognitive load on participants (i.e. mental effort) and limit survey fatigue.

 

Example - length

Reflecting back on your experience throughout the entire course, including the learning resources and activities, and thinking about the nature of your job role and typical work tasks, when will you begin to apply what you have learned on this course to your work? Select the ONE response that best applies.

  • Immediately (within the next week)
  • Within the next month
  • Within the next 1 - 3 months
  • Longer term (beyond 3 months)
  • I am not sure
  • I do not intend to apply anything covered in this course
  • I will not have the opportunity to apply anything covered in this course

Potential issue

This question is confusing because it is long-winded, providing unnecessary extra information for the reader to think about.

Better approach

‘When will you begin to apply what you have learned on this course to your work? Select the ONE response that best applies.’

(Same response categories).

 

Example - level of detail

‘What percentage of the course was spent doing practical activities?’ (1 – 100%).

Potential issue

  • This is a tricky question! Even facilitators would need to take a moment to recall the session structure and content, perhaps doing some rough calculations before answering.

Better approach

An alternative question measuring a similar aspect of the quality of course design is provided below.

‘In what ways has the course prepared you to apply what you have learned back in the workplace? Select any responses that apply.’

  • During the course I had the opportunity to practise new skills
  • The course has prompted me to reflect on how I will do things differently
  • I have taken away useful tools or resources
  • I have increased my confidence in my ability to use what I have learned
  • I can put what I have learned into my own words
  • I have a plan of how I will apply what I have learned
  • I am still unsure how to apply what I have learned
  • Other – please specify

5. Is the question phrased neutrally?

Leading or loaded questions encourage respondents to answer in a certain way. Asking neutral questions provides more valid and reliable data.

 

Example

‘How has this course prepared you to be a better manager?’ (Open text)

Potential issue

This question assumes the course prepared the learner to be a better manager. Learners may be inclined to give a positive response because it reflects better on them as employees (social desirability bias). 

Better approach

‘What did you personally get out of this course?’ (Open text) 

6. Are you asking one question at a time (i.e. no double-barrelled questions)?

A double-barrelled question asks for one answer despite posing more than one question. These questions easily confuse respondents, leading to distorted data. Look for the words ‘and’, ‘but’ and ‘while’ when eliminating double-barrelled questions.

 

Example

‘The training was relevant to my job role and a good use of my time.’ (5-point Likert scale)

Potential issue

  • This question measures 2 distinct dimensions: whether the training was relevant AND whether it was a good use of time. A positive response may indicate a learner thinks the training was relevant to their job role AND a good use of their time. Equally, it may indicate the learner thought the training was ONLY relevant or that it was ONLY a good use of their time.

Better approach

Split into 2 questions (or just choose one):

‘The training was relevant to my job role.’ (5-point Likert scale) 

‘The training was a good use of my time.’ (5-point Likert scale)
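If your questions live in a document or spreadsheet, that 'and', 'but' and 'while' rule of thumb is easy to apply as a first-pass screen. Below is a minimal sketch of such a screen; it only flags candidates, and a human still needs to judge them, because these words also appear in perfectly good single-barrelled questions.

```python
# Illustrative only: flag survey statements that may be double-barrelled.
import re

FLAG_WORDS = ("and", "but", "while")

def flag_double_barrelled(questions):
    """Return (question, matched words) pairs for questions containing flag words."""
    flagged = []
    for q in questions:
        words = re.findall(r"[a-z']+", q.lower())
        hits = [w for w in FLAG_WORDS if w in words]
        if hits:
            flagged.append((q, hits))
    return flagged

questions = [
    "The training was relevant to my job role and a good use of my time.",
    "The training was relevant to my job role.",
]

print(flag_double_barrelled(questions))
# [('The training was relevant to my job role and a good use of my time.', ['and'])]
```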

7. Are the answer categories exhaustive, mutually exclusive, and matched to the question type?

Exhaustive categories form a complete list so every respondent can find the one answer category that applies to them. After listing probable response options, consider whether you still need an ‘Unsure’, ‘N/A’, or ‘Other – please specify’ option. Sometimes you will need more than one of these.

Mutually exclusive categories do not overlap, which is important if you are asking respondents to only choose ONE response that applies to them. This rule doesn’t apply if you allow multiple selections or if you are asking for the best/most applicable option.

Matching answer categories to the question means there is a logical flow from question to answer. Matching errors tend to show up after multiple edits and are best caught through proofing and testing.

 

Example - not exhaustive

‘Did the course meet all of your accessibility needs?’

  • Yes
  • No – please specify

Potential issue

These answer categories do not form an exhaustive list. A respondent might identify as having no accessibility needs, meaning neither response here is applicable.

Better approach

Add response category: N/A

 

Example - not mutually exclusive

'What is your current APS level?'

  • APS 1 – 4
  • APS 4 – 6
  • EL1 – 2

Potential issue

These categories are not mutually exclusive because two options apply to APS 4 staff members.

Better approach

One possible fix is to change the answer category of ‘APS 4 – 6’ to ‘APS 5 – 6’.
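When answer categories are numeric bands like these, the mutual-exclusivity check can also be automated. Here is a minimal sketch, assuming each category is written as an inclusive (low, high) range; the numeric mapping for the EL band is purely for illustration.

```python
# Illustrative only: detect overlapping numeric answer bands.
bands = {
    "APS 1 - 4": (1, 4),
    "APS 4 - 6": (4, 6),  # overlaps the previous band at APS 4
    "EL 1 - 2": (7, 8),   # EL levels mapped onto 7-8 purely for this check
}

def overlapping_pairs(bands):
    """Return pairs of categories whose ranges share at least one value."""
    items = list(bands.items())
    pairs = []
    for i, (name_a, (lo_a, hi_a)) in enumerate(items):
        for name_b, (lo_b, hi_b) in items[i + 1:]:
            if lo_a <= hi_b and lo_b <= hi_a:  # standard interval-overlap test
                pairs.append((name_a, name_b))
    return pairs

print(overlapping_pairs(bands))
# [('APS 1 - 4', 'APS 4 - 6')] -- changing the second band to APS 5 - 6 clears it
```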

 

Example - unmatched answer categories

‘How relevant was the training to your current job role?’ (Strongly agree, agree, neither agree nor disagree, disagree, strongly disagree)

Potential issue

These response options do not make sense as answers to the question asked. In other words, they do not ‘match’.

Better approach

‘To what extent do you agree with the following statement: “The training was relevant to my current job role”’ (5-point Likert scale).

8. Will the data be structured in a useful way?

When writing questions, consider your end goal. Why are you collecting this data? How will you use it? Will it be easy to visualise, analyse and report on? Minimising the use of free text questions, which collect unstructured data, will make the survey easier for you to analyse and for respondents to complete.

Learn more about different question types in the following section.

 

Example

‘What motivated you to attend this course?’ (Free text)

Potential issue

  • Open questions produce unstructured data and increase the time required to complete a survey. Where there is a clear list of probable responses, use multiple choice or drop-down options to add structure to your data.
  • If you are using an open text field for contact information, include data validation (for example, an email address must contain @) to limit typing errors.
  • Context matters: Open questions provide nuance and deeper insights, allowing respondents to explain the ‘why’ behind their responses.

Better approach

‘What motivated you to attend this course?’ Select ALL that apply.

  • My attendance was mandatory/required
  • My manager/supervisor encouraged me to attend
  • Recommendation by others who have attended
  • To develop my skills for my current job role
  • To develop my skills for a future job role
  • Other – please specify
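To see why structured options pay off at analysis time, here is a minimal sketch, assuming responses to the multi-select question above have been exported into a pandas DataFrame; the data and column name are made up.

```python
# Illustrative only: counting structured, multi-select responses takes one line of code.
import pandas as pd

# Each respondent can tick several options, so store their selections as a list.
responses = pd.DataFrame({
    "motivation": [
        ["To develop my skills for my current job role"],
        ["My manager/supervisor encouraged me to attend",
         "To develop my skills for a future job role"],
        ["My attendance was mandatory/required"],
    ]
})

# explode() gives one row per selection; value_counts() builds the frequency table.
print(responses["motivation"].explode().value_counts())

# A free-text version of the same question would need manual coding or text
# analysis before it could be summarised like this.
```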

Question types

Choosing the right question type will assist you in matching answer categories to the question. It’s useful to include a mixture of question types to keep the survey engaging and avoid ‘straight lining’ (where a respondent continually selects the same response for all questions). Equally, constantly changing the question type increases the cognitive load on respondents. Below are some common question types: 

  • Multiple choice: Select only one OR multiple answers from a defined list of responses
  • Likert scale: A style of multiple choice that lists different levels of agreement with a statement (e.g. strongly agree, agree, disagree, strongly disagree)
  • Free text: Type a response
  • Matrix table: Combines multiple questions with the same response choices
  • Rank order: Rank a series of answers in order of preference
  • Slider: Respond using a sliding scale tool

 

Use free text questions sparingly and only when you have the resources to analyse the data. You might be surprised by how often multiple choice can work just as well, if not better. Save free text questions for when you need the juicy details and nothing else will do!

 

Avoid rank order questions where possible because they tend to take a disproportionate level of effort from both participants and evaluators. Likewise, avoid matrix questions as they impose a higher cognitive load on respondents.
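As a small illustration of how the most common of these question types turns into analysable data, here is a minimal sketch of coding a 5-point agreement scale to numbers; the responses are made up.

```python
# Illustrative only: convert Likert labels to scores before reporting.
LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

answers = ["Agree", "Strongly agree", "Neither agree nor disagree", "Agree"]
scores = [LIKERT[a] for a in answers]

mean_score = sum(scores) / len(scores)
agree_share = sum(s >= 4 for s in scores) / len(scores)  # share who agreed or strongly agreed
print(f"Mean score: {mean_score:.1f}, agree or above: {agree_share:.0%}")
```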

Learning Surveys: Collective suite


Learner surveys: an introduction

Introducing our resource collection for L&D practitioners who are developing learner surveys. Find out what learner surveys are good for and when to use them.

Learner surveys: writing questions

A checklist to help write and review learner survey questions. Plus, a summary of the question types available to you.


Learner surveys: the anatomy of a survey

A step-by-step guide to reviewing learner surveys as a whole. What makes a good learner survey? What are its key components?


Learner surveys: implementation

Tips for administering learner surveys successfully and reminders about ethics, privacy and security.


