Learner surveys: an introduction

When to use learner surveys and what to think about before drafting questions.



Do you really need a survey?

Surveys are great for capturing data but they aren’t always the right tool for the job. In this video, Julia Hinsliff from PM&C’s BETA team shares her insights on how to decide if a survey is right for you and offers her top tips on making it simple and pragmatic. 

 

Okay, so you're probably thinking about whether you need to do a survey with your learners. Before you start designing that survey, I would ask you to think about whether you can find some of the data you need from another source. As learning and development practitioners, we have access to things like learning management systems, training needs analyses and staff surveys. Sometimes the information you need can be gathered from those sources and you might not need to run a survey at all.

If you do get to the point where you think, 'Oh yeah, I need to find some more data, we don't have enough here', you need to think about what sort of data you're looking for. You might be really interested in a topic and need to go quite deep to find out what's going on. In that case, you're probably going to need some sort of interviews or focus groups to get a real understanding of the topic you're exploring. On the flip side, if you're thinking about a broad topic, or you already know quite a bit about it but want to understand what's happening for a larger group of people, a survey will be a great way to get that information.

The length of your survey is really going to affect how good the data is. People get turned off if things are too long. We sometimes talk about respondent burden: if people are asked to do too many surveys, or the surveys are too long, they're not going to answer correctly, or they might not answer at all. So keep your survey short and just ask the questions that you need.

A good survey is all about good questions: questions that are easy to answer and as short as possible. A good survey is also fit for purpose, so make sure your survey is answering the question you have at hand, rather than asking too many questions or collecting data that you don't need in the end. If you're looking for information about whether someone has increased their knowledge in a particular area, and it doesn't matter whether they're male or female, or forty or twenty, then don't ask questions about gender or age. That information isn't necessary and it just takes up time in the survey.

Finally, don't let the great be the enemy of the good. Be pragmatic in the way you design your surveys. They don't need to be perfect, but it is important to test them. If you can get someone to read through your survey, send it out to a few friends, and get them to check that it's easy to understand and easy to answer, then you're more likely to have a really good survey set up before you start using it with your learners.

Julia Hinsliff (BETA team, PM&C) shares her advice on how to build a pragmatic survey.

What learner surveys work best for

Learner surveys, like all evaluation tools, are good at measuring some things and poor at measuring others. So, it’s important to start by clarifying what you want to measure. Your overarching key evaluation areas (KEAs) could be, for example: 

  • Effectiveness: Are your courses delivering their intended outcomes?
  • Relevance: Did the course match the learners’ needs and job context?
  • Quality: Did the course design and delivery provide a good learning experience?
  • Application: Are participants applying what they learned in the workplace?

Read more about key evaluation areas and explore a case study from the APS Academy outlining our current KEAs. It’s important to note that KEAs are not survey questions, even though we have framed them here as questions.

To decide whether a learner survey is the right tool for you, consider your KEAs alongside the characteristics of learner surveys listed below.

 

What learner surveys are good for (and not good for)

Good for

✅ Quick, efficient data collection

✅ Hearing from lots of people

✅ Gathering insights into learners’ beliefs, attitudes, experiences, perspectives, and intentions

✅ Identifying quality issues in courses

✅ Collecting mostly quantitative, structured data

✅ Collecting sensitive data (if it’s anonymous)

Not good for

❌ Measuring actual changes to knowledge or behaviour (rather than self-reported changes)

❌ Understanding learners’ experiences or perspectives with nuance and depth
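To make "quantitative, structured data" concrete, here is a minimal sketch, using invented numbers, of the kind of summary a learner survey supports: a response rate, an average score and an agreement rate for a single Likert-scale question. The question wording, scale and data are illustrative assumptions, not APS Academy figures.

```python
# Summarising Likert-scale responses for one survey question.
# All data below is invented for illustration.
from statistics import mean

# Responses to a hypothetical question, "The course was relevant to
# my role", on a 1-5 scale (1 = strongly disagree, 5 = strongly agree).
responses = [4, 5, 3, 4, 5, 2, 4, 4, 5, 3]
invited = 25  # participants who received the survey

response_rate = len(responses) / invited
avg_score = mean(responses)
# Share of respondents who agreed or strongly agreed (rated 4 or 5).
agreement = sum(1 for r in responses if r >= 4) / len(responses)

print(f"Response rate: {response_rate:.0%}")           # 40%
print(f"Average score: {avg_score:.1f}")               # 3.9
print(f"Agree or strongly agree: {agreement:.0%}")     # 70%
```

Even this small summary shows why structured scales are efficient: one line of arithmetic per metric, comparable across cohorts if the question stays the same.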

Consider your local context before getting started

There are many ways to design a learner survey. Throughout these learning resources, our goal is to help you develop a fit-for-purpose survey that suits your unique context. Here are some things to keep in mind:

  1. Evaluation frameworks, plans, and program logic models. Does your agency have an evaluation strategy or framework in place? Is there an existing evaluation plan or program logic model for the program/project which you can (or must) use?
  2. Evaluation approaches. Is there a particular evaluation approach you will follow (e.g. Kirkpatrick, Brinkerhoff, Thalheimer, Phillips, Capability, Opportunity and Motivation-Behaviour)? You don’t need to strictly follow one of these models, however they may provide you with a useful structure as you design and administer your survey. If your agency already evaluates other learning activities, consider using standardised survey questions so data can be compared.
  3. Existing data collection tools and data sources. Can any of your questions be answered by existing tools (e.g. APS Employee Census, other internal surveys) or data sources (e.g. enrolment data, capability reviews)?
  4. Agency-specific needs and priorities. What insights matter most in the context of your agency and key stakeholders? Are there any reporting requirements or performance metrics you need to consider?
  5. Resources available. What time, resources, and budget do you have for this work? Does that align with what you have planned?
  6. Local expertise. Can you seek advice or support from any internal evaluation or L&D colleagues?
  7. Tools available. What software is available to you (e.g. Qualtrics, MS Forms, Power BI, LMS functionality)?
  8. Final product. How easy will it be to visualise your survey data to tell a persuasive and accurate story?

Decide when to survey your learners

A learning survey can be administered at any time, from the moment a participant enrols in the course, to months or years after they complete it. So, when is the right time? That depends on what you need to know and what participants can tell you.

Before the training

A pre-program survey can collect data that captures a baseline for future comparison. Comparing a snapshot of 'before' and 'after' training can be valuable, but be mindful that participants can only give you an opinion of their level of skill or knowledge.

Before training, you might also want to collect participant information to help you tailor the learning experience for a particular cohort. For example, you might ask participants about their learning goals, prior experience or reason for enrolling.
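The baseline idea above can be sketched as a simple before/after comparison. The ratings below are invented, and, as the text notes, they summarise self-reported opinion of skill, not measured capability.

```python
# Comparing self-rated confidence (1-5) before and after a program,
# matched by participant. Invented data for illustration only.
from statistics import mean

pre = [2, 3, 2, 1, 3, 2]   # pre-program self-ratings
post = [4, 4, 3, 3, 5, 4]  # post-program self-ratings, same cohort

shift = mean(post) - mean(pre)
improved = sum(1 for before, after in zip(pre, post) if after > before)

print(f"Average shift in self-rating: +{shift:.1f}")
print(f"Participants reporting improvement: {improved}/{len(pre)}")
```

Matching pre and post responses per participant (rather than comparing cohort averages alone) lets you report how many individuals moved, not just how far the average moved.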

During the training

A midpoint survey can be useful if you are delivering a program over several days, weeks, or months. For example, you might check for quality issues or affirm that the focus has been relevant for the participants thus far. If the facilitators can make meaningful positive changes for that cohort, it is worth asking questions during the training. Online polling tools, sticky notes, or facilitated reflections can also achieve this purpose.

Immediately after the training

Post-course surveys collect feedback and information about learners' experiences while they're still fresh in their minds. By scheduling 5-10 minutes at the end of the training, you can maximise your response rate and scaffold the survey as an embedded part of the learning experience.

At this moment, learners can also tell you about their intentions to apply learning, the characteristics of the work environment they are going back to, and what they expect the impact of the training on their performance will be. But they can’t tell you about whether the learning has had a long-term impact on their performance.

Well after the training

Surveys administered a few weeks or months after the training can be useful for measuring if and how participants have applied their learning on the job. This is self-reported information rather than direct evidence of behaviour change, but it can still yield useful insights. Surveying a manager can provide another perspective on whether a course participant's performance has improved and, if not, why.

Re-engaging with learners can also serve as a ‘nudge’, a behavioural economics term referring to a design choice that seeks to influence behaviour. This might simply be a prompt to recall and reflect on key learnings and their application. But it can also nudge learners towards behaviours associated with successful learning transfer, including sharing with peers or taking on a relevant work project soon after training.

There is no clear consensus on the best time to administer a follow-up learner survey. The sooner you ask participants, the less time they will have had to apply their learning. However, the longer you wait, the less likely participants will respond to your survey and be able to accurately recall the focus of the training. Somewhere between three weeks and three months after training is likely appropriate, depending on your context.
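The three-weeks-to-three-months guidance above can be sketched as a small scheduling helper. The six-week default and the use of 13 weeks to approximate 'three months' are assumptions for illustration, not recommendations from this guide.

```python
# Scheduling a follow-up survey inside the suggested window of
# three weeks to roughly three months (taken here as 13 weeks).
from datetime import date, timedelta

def follow_up_window(course_end: date) -> tuple[date, date]:
    """Earliest and latest reasonable dates for a follow-up survey."""
    return course_end + timedelta(weeks=3), course_end + timedelta(weeks=13)

def suggested_send_date(course_end: date, weeks_after: int = 6) -> date:
    """A single send date, clamped to the window (default: six weeks out)."""
    earliest, latest = follow_up_window(course_end)
    send = course_end + timedelta(weeks=weeks_after)
    return min(max(send, earliest), latest)

end = date(2025, 3, 3)
print(follow_up_window(end))     # (2025-03-24, 2025-06-02)
print(suggested_send_date(end))  # 2025-04-14
```

Where the right point in the window falls for you depends on the context factors discussed above: how quickly learners can apply the training, and how fast recall and response rates decay.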

Learner surveys: the collection

Learner surveys: an introduction (you are here)

Introducing our resource collection for L&D practitioners who are developing learner surveys. Find out what learner surveys are good for and when to use them.

Learner surveys: writing questions

A checklist to help you write and review learner survey questions, plus a summary of the question types available to you.

Learner surveys: the anatomy of a survey

A step-by-step guide to reviewing learner surveys as a whole. What makes a good learner survey? What are its key components?

Learner surveys: implementation

Tips for administering learner surveys successfully, and reminders about ethics, privacy and security.

Last updated: 19 March 2026
