Learner surveys: implementation

Tips for administering surveys successfully and reminders about ethics, privacy and security.



The survey design is finished, and now it's time for implementation.

On this page, we share tips and advice for building your survey using an online tool, testing it, distributing it, and managing the data that starts flowing in. Alongside this are some reminders about data ethics, privacy and security for your survey.

Taking a human-centred approach to your data

Good design is first and foremost about people. In this video, Nancy O’Hare talks about the people involved in successfully administering a survey and the benefit of making the process smooth and simple for them. 

Hi, my name is Nancy O'Hare and I work in learning evaluation here at the APS Academy.

So you've designed your survey, now it's time to implement it. I'm going to give you a few quick tips on how to take a human-centred design approach to make sure your survey is implemented as smoothly as possible.

We're going to think about three key groups: our learners, our facilitators, and our survey data analysts.

Starting with our learners: these are the people doing the survey. How can we make their experience as smooth as possible? We want to offer a QR code and a link as options for completion. We also want to make sure the survey works on both mobile and desktop. And through your user testing, you want to make sure it meets the Web Content Accessibility Guidelines, so it's accessible for all learners.

Next, our facilitators. Facilitators are the people who distribute the survey, and they really hold the key to boosting our response rates. The most effective way to boost response rates is to distribute the survey within the last five minutes of your course. Doing so provides a captive audience: learners are more likely to fill the survey in, and high response rates give you higher quality data.

Our third group is our survey data analysts. These are the people who will break down your results and turn them into insights for executives, for your own course design and development, and for external audiences. A key consideration for survey data analysts is how you can recode your responses so the hard work is done in the back end and they can visualise the data as easily and quickly as possible. To do this, you might want to set up a workflow directly from your survey tool into a consolidated Excel spreadsheet. From there, it can be exported into visualisation software such as Power BI to create a live feed of data which is easy to manipulate and manage for different audiences.

So those are the three key groups to think about when implementing your survey. Good luck and I wish you all the best.

Building the survey

Building a survey starts with selecting a pragmatic survey tool. A pragmatic survey tool is secure, familiar to your team and easily exports responses. Consider whether your organisation has pre-approved survey software and whether it is accessible via a group licence (such as Microsoft Forms, Qualtrics or Converlens). Where possible, integrate your organisation’s branding to build authority with learners.

After building your survey, test it! Many survey tools have in-built testing assistants. Complement these insights by sending the survey to a diverse audience to test. Ask them to try to “break” the survey and explore different user journeys. The checklist below can help you get started with testing.

  • Does the survey display correctly on mobile and desktop?
  • Is the survey accessible via link and QR code?
  • How long does the survey take to complete?
  • Does the survey branching work correctly?
  • Are the right questions marked as optional or mandatory?
  • Are any questions consistently misinterpreted?
  • Do date fields produce dates in the expected format?
  • Can you recode any response options to make analysis easier? (e.g. coding Likert responses as 1–5)
  • Do any automatic response exports work as expected?
  • Can you easily visualise the test data?
  • Does the survey meet the latest Web Content Accessibility Guidelines (e.g. appropriate colour contrast, consistent navigation and compatible with assistive technologies)?

Delete or archive any test data before launching your survey.

Boosting response rates

A great survey is useless if no one fills it in. Even with a large number of responses, a low response rate makes it difficult to generalise findings to the broader group. Imagine if most survey respondents said a course wasn’t relevant to their role, but only 5% of learners completed the survey. With such a low response rate, you can’t be confident that most learners had this experience.

The single most effective way to boost response rates is to reserve time at the end of a session for survey completion. Maximise responses by taking a digital-first approach and providing learners with multiple ways to complete the survey, including a QR code to scan and a link they can open online. Incentivise completion by providing facilitators with talking points to communicate why feedback is valuable and how it will be used. Sometimes, it’s even possible to share how feedback from other participants has already been used to improve the learning experience.

At course completion, email the survey invitation to participants, along with any additional materials from the course.


Tips for writing and distributing survey invitations

  • Personalise invitations: include the course name, whether the survey is anonymous and how long it will take to complete
  • Tell learners why reflecting on their learning is valuable and how you will use their feedback
  • Distribute the survey as soon as possible: save time for survey completion at the end of a session and automate a post-course survey email and polite reminder
  • Make communications verifiably authentic: include your contact emails on any automated messages.
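The invitation tips above can be sketched as a small helper that assembles a personalised email. This is a sketch only: the sender address and survey link are hypothetical placeholders, and actually sending the message would go through your agency's approved mail system.

```python
from email.message import EmailMessage

def build_invitation(recipient: str, course_name: str, survey_link: str,
                     minutes: int = 5, anonymous: bool = True) -> EmailMessage:
    """Assemble a personalised survey invitation (built, not sent, here)."""
    msg = EmailMessage()
    msg["To"] = recipient
    # Hypothetical sender address: including a real contact mailbox makes
    # automated messages verifiably authentic.
    msg["From"] = "learning.team@example.gov.au"
    msg["Subject"] = f"Tell us about your experience in {course_name}"
    privacy = ("Your responses are anonymous."
               if anonymous else "Your responses are identified.")
    msg.set_content(
        f"Thanks for attending {course_name}.\n\n"
        f"{privacy} The survey takes about {minutes} minutes to complete:\n"
        f"{survey_link}\n\n"
        "Your feedback is used to improve this course for future learners.\n"
        "Questions? Reply to this address to reach the learning team.\n"
    )
    return msg
```

Note that the message covers the checklist items directly: course name, anonymity, time to complete, why feedback matters, and a contact address.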

Meeting your obligations

The Commonwealth Evaluation Policy provides a principles-based approach for the conduct of evaluations across the Commonwealth. It says evaluations need to be:

  1. fit for purpose
  2. useful
  3. robust, ethical and culturally appropriate
  4. credible
  5. transparent where appropriate.

In addition to these principles, we provide some prompts and references to support you in accounting for the ethical, privacy, and security considerations of learner surveys.

Ethics

Explore the Australian Evaluation Society’s Guidelines for the Ethical Conduct of Evaluations.

Privacy

Consult your agency’s privacy team for advice on privacy requirements. This could include a Privacy Collection Notice for participants and a privacy impact assessment. Refer to Chapter 5 of the Australian Privacy Principles for more information.

Ensure any collection and disclosure of data aligns with the Privacy Collection Notice. Are responses anonymous? Can aggregated responses be shared with other agencies? Where is the data stored?

Security

Follow key Australian government guidelines for the APS, including the Information Security Manual, the Australian Privacy Principles and your agency’s ICT Security policy.

Ensure the data is secure and only accessible by those with an operational need.

Analysing your data

Most survey software has basic data analysis and visualisation tools. Get more out of your data by creating a dataflow to automatically export results to an Excel spreadsheet in your organisation’s filing system (using your survey software or Power Automate). You can then go one step further by linking your LMS data, workforce data and survey data in a Power BI dashboard.
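The consolidation step described above can be sketched with the standard library. This is a minimal illustration, assuming each session's survey export is a CSV file with the same header row; in practice the merge might be done by your survey software or Power Automate rather than a script.

```python
import csv
import io

def consolidate_exports(exports: dict[str, str]) -> str:
    """Merge per-session CSV exports into one CSV, tagging each row
    with its session so the consolidated table can feed a single
    spreadsheet or Power BI data source."""
    out = io.StringIO()
    writer = None
    for session, text in exports.items():
        for row in csv.DictReader(io.StringIO(text)):
            row["session"] = session
            if writer is None:
                # Write the header once, taken from the first row's columns.
                writer = csv.DictWriter(out, fieldnames=list(row))
                writer.writeheader()
            writer.writerow(row)
    return out.getvalue()

# Illustrative exports from two course sessions (dates are made up):
merged = consolidate_exports({
    "2026-03-02": "rating,comment\n5,Great course\n",
    "2026-03-09": "rating,comment\n4,Useful content\n",
})
print(merged)  # header plus one row per response, tagged with its session
```

Tagging each row with its session is what lets a single dashboard filter and compare sessions without juggling separate files.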

Tips for data visualisation

  • Know your audience: are they senior leaders or peers? Are they interested in quick insights or deeper exploration?
  • Have a clear objective: are you reporting data, identifying trends, illustrating return on investment, undertaking course quality assurance?
  • Follow basic design principles: maintain white space, use colours and contrast to create emphasis

Learner surveys: the full suite


Learner surveys: an introduction

Introducing our resource collection for L&D practitioners who are developing learner surveys. Find out what learner surveys are good for and when to use them.

Learner surveys: writing questions

A checklist to help write and review learner survey questions. Plus, a summary of the question types available to you.


Learner surveys: the anatomy of a survey

A step-by-step guide to reviewing learner surveys as a whole. What makes a good learner survey? What are its key components?


Last updated
19 March 2026

Links & Downloads

Learner surveys: an introduction
Learner surveys: writing questions
Learner surveys: the anatomy of a survey
Learner surveys: implementation (You are here!)
Course: Introduction to Data in Government
