Australian Public Service Academy
Evaluating Learning & Development - getting started

How to get started evaluating L&D.



So, you want to evaluate a learning and development offering. 

Above all, we suggest you keep it simple and focussed. Whatever sort of evaluation you undertake, stick to what is feasible. It’s better to have a simple evaluation that measures the things that matter most than a complex and ambitious one that you cannot successfully deliver.

And keep it focussed on the data that will be most valuable. It’s not enough to gather interesting information; the data you collect needs to be useful to you, your stakeholders and your organisation by providing actionable insights. 

As with most things, good planning is key. An evaluation plan provides a structured and logical way of working out what you want to measure, how you are going to measure it, and what you will do with the data you collect. Start planning by establishing your Key Evaluation Areas (KEAs).

Key Evaluation Areas (KEAs)

The major themes or areas of focus that you intend to explore through evaluation are called KEAs. Although they can sometimes be framed as questions, they are not to be confused with the questions you might ask in a survey or interview. Instead, they are the overarching questions you want answered by conducting an evaluation.

The OECD has defined six criteria that serve as an excellent starting point for creating KEAs for an L&D evaluation.

  1. Effectiveness: Is it achieving its intended objectives/outcomes?
  2. Impact: What difference does it make in the long term for learners and the agency?
  3. Relevance: Is it doing the right things for learners and the agency?
  4. Coherence: How well does it fit in your broader L&D landscape and agency context?
  5. Efficiency: How well are resources being used (e.g. time, budget)?
  6. Sustainability: Will the benefits last?

Depending on the purpose of your evaluation and the resources you have, you might focus on some but not all of these KEAs, or you may identify other areas. Your agency might also have an evaluation strategy or pre-existing KEAs to which you will need to align.

Read how the APS Academy has adapted the OECD KEAs in its approach to evaluating learning.
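One way to keep an evaluation plan simple and focussed is to record, for each KEA, the overarching question and the data sources you intend to use. The sketch below is purely illustrative: the questions and data sources are hypothetical examples, not APS Academy prescriptions.

```python
# Illustrative sketch only: one way to record KEAs in a simple evaluation plan.
# The KEA names follow the OECD criteria above; the questions and data
# sources are hypothetical examples.

evaluation_plan = {
    "Effectiveness": {
        "question": "Is the course achieving its intended learning outcomes?",
        "data_sources": ["post-course assessment", "learner survey"],
    },
    "Efficiency": {
        "question": "How well are time and budget being used?",
        "data_sources": ["attendance data", "delivery cost records"],
    },
}

def planned_sources(plan):
    """Return the de-duplicated set of data sources the plan relies on."""
    return {src for kea in plan.values() for src in kea["data_sources"]}

print(sorted(planned_sources(evaluation_plan)))
```

A structure like this makes it easy to check, before any data collection starts, that every KEA has at least one feasible data source attached to it.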

Data sources and collection tools

Once you know what you want to measure, you can choose the evaluation tools and approach that will allow you to collect the data you need. Choose an approach that is feasible and proportionate to the importance of what you are evaluating. 

Some of the data sources and collection methods you can use are introduced below.

Existing data

Before collecting new data, consider what existing data is available that could help answer your evaluation questions. Datasets that might be useful include:

  • Program/course materials and documentation
  • Registration and attendance data
  • Corporate and business area plans
  • Capability reviews
  • Data relating to similar programs/courses for comparison
  • APS Employee Census data, or other existing surveys
If you are evaluating the impact of a major L&D intervention, think big! What existing performance indicators might help detect change (e.g. an APS Employee Census question)?

Surveys

Surveys are a great low-cost, time-efficient way to capture the views and experiences of many people, and they are best suited to collecting quantitative (numerical) data. They are less effective for qualitative data, since you might not have the context needed to make sense of responses, or the capacity to analyse a large volume of written comments.

Check out our resources on building better learning surveys to learn more about this important data collection tool for L&D evaluation.

It’s common to survey learners, but also consider surveying their supervisors or program facilitators.
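Quantitative survey responses are typically summarised with a few simple statistics. The sketch below, with made-up Likert-scale (1–5) responses to a hypothetical question, shows three common summaries: the mean score, the distribution of ratings, and the "top-2 box" share (respondents choosing 4 or 5).

```python
# Illustrative sketch only: summarising Likert-scale (1-5) survey responses.
# The question text and responses are made up for demonstration.
from collections import Counter

responses = [4, 5, 3, 4, 5, 2, 4, 5, 5, 3]  # e.g. "The course met my needs"

mean_score = sum(responses) / len(responses)
distribution = Counter(responses)           # how many respondents gave each rating
# "Top-2 box": share of respondents choosing 4 or 5, a common survey summary.
top2_share = sum(1 for r in responses if r >= 4) / len(responses)

print(f"mean={mean_score:.1f}, top-2 box={top2_share:.0%}")
```

Reporting the distribution alongside the mean guards against a middling average hiding a polarised set of responses.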

Interviews and focus groups

These methods are both useful if you want to explore and capture the in-depth stories or experiences of selected individuals. They also let you clarify questions and dive deeper if you come across an interesting insight. Interviews are typically a discussion with one individual whereas focus groups are facilitated group discussions.

Only go down this path if you have sufficient time and resources. Qualitative (descriptive) data is rich, but it can make analysing and reporting your findings particularly labour-intensive.

When selecting participants, random isn’t always better. The ‘Success Case Method’, which was developed for L&D by Robert Brinkerhoff, identifies the most and least successful participants to help surface enabling factors and barriers. As this method introduces selection bias, the findings cannot be generalised to the broader group. If you want to generalise, random is better.
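The Success Case Method's purposive sampling step can be sketched as sorting participants by an outcome measure and taking the extremes at each end. The names, scores and cutoff of three per group below are all hypothetical.

```python
# Illustrative sketch only: Brinkerhoff-style purposive selection of the most
# and least successful participants for follow-up interviews.
# Names, scores and the group size of three are hypothetical.

participants = {
    "Ana": 92, "Ben": 41, "Chen": 78, "Dev": 55,
    "Ella": 88, "Femi": 33, "Gita": 67, "Hugh": 95,
}

def success_case_sample(scores, n=3):
    """Return the n highest and n lowest scorers as interview candidates."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:n], ranked[-n:]

top, bottom = success_case_sample(participants)
print("most successful:", top)      # candidates for "what enabled success?"
print("least successful:", bottom)  # candidates for "what got in the way?"
```

Because the sample is deliberately drawn from the extremes, any findings describe enabling factors and barriers, not the typical participant experience.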

Assessments and tests

Assessments enable you to directly measure what individuals have learned. A quiz or test is an example of an assessment, but there are other approaches available. Assessments can also be embedded learning activities that incorporate opportunities for feedback, serving a purpose beyond evaluation.

If you are designing a quiz, consider scenario-based questions that require learners to apply what they learned rather than recall facts.

Observations

Observing training first-hand can tell you a lot about a learning experience. Whether you take a structured or unstructured approach to recording observations, it’s a good idea to think in advance about what you will look for and how you will capture the data in a usable way.

As observations may cause anxiety for facilitators and learners, take steps ahead of time and during the training to establish psychological safety. 

Choose an observer who has a relevant background (e.g. learning design, subject matter expertise), enabling them to make more informed observations.

Resources and services provided by the APS Academy and its partners

  • Building better learning surveys is a series of webpages designed for L&D practitioners working in survey design. The pages provide practical guidance on how to design and administer learning surveys for both novice and experienced practitioners.
  • Learning Evaluation Hub Services include 1-on-1 guidance, peer review and options for in-depth partnered projects, available to APS staff working on evaluating L&D.
  • The Learning Evaluation Resource Hub is a GovTEAMS library of practical guides and templates, resources and case studies showcasing APS agency practices.
  • Evaluate Connect is a community of L&D practitioners committed to building capability in the evaluation of L&D. The APS Academy coordinates meetings every 6-8 weeks, and the community is open to Commonwealth, State and Territory public servants.

For access or additional information about any of the above, please email APSLearningEvaluation@apsc.gov.au.

Last updated: 19 March 2026
