By Andrei Turenko, Health and Social Policy, Behavioural Economics Team of the Australian Government (BETA)
At the Behavioural Economics Team of the Australian Government (BETA), we generate and apply evidence from the behavioural and social sciences to find solutions to complex policy problems, supporting confident, efficient and effective policy making that puts people at its heart.
BETA works across all parts of the policy cycle, applying various approaches and techniques to support policy decision making and develop solutions. A crucial aspect of this is understanding whether solutions actually work. To do this, we use a range of quantitative and qualitative techniques, such as randomised controlled trials, online experiments, focus groups, interviews and surveys, to understand the impact of solutions. Evaluations can be done both before implementation, to determine whether a solution is appropriate to roll out, and after, to understand how it has actually worked in the real world. Ideally, a combination of both will lead to the best outcomes, mitigate potential risks and support the effective use of resources.
Deciding which methodology to use will often depend on the ultimate goals of the evaluation and the feasibility constraints we face. Where possible, we aim to adopt a multimethod approach that combines relevant quantitative and qualitative techniques.
A multimethod approach to evaluation
A recent example of a multimethod approach was our work with the Department of Employment and Workplace Relations (DEWR) on evaluation of the Fair Work Amendment (Paid Family and Domestic Violence Leave) Act 2022. This legislation entitles affected employees to take 10 days of paid family and domestic violence (FDV) leave when needed.
What was the impact of this new entitlement on employers and employees? To answer this question, we conducted 5 primary research activities: a survey of 594 victim-survivors, interviews with 22 victim-survivors, a survey of 1,437 employers, interviews with 15 employers and a survey (including a randomised experiment) of 3,008 Australian workers.
We found the leave entitlement was performing as intended, delivering significant benefits to those accessing it. It was also remarkably well supported by victim-survivors and employers alike, and has the potential to reduce workplace discrimination against victim-survivors. We also found that a lack of awareness may be constraining uptake of the FDV leave, and recommended increasing awareness, among other actions, as potential next steps to ensure FDV leave is accessed by those who need it.
The appropriate use of trials
At BETA, we are also strong supporters of rigorous causal evaluation and, where appropriate, will use randomised controlled trials (RCTs) to learn whether an intervention achieves its intended policy outcome. A randomised controlled trial is an evaluation technique in which people are randomly assigned to different groups, with one or more groups receiving a proposed solution or intervention while another group acts as the control and receives the business-as-usual approach.
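The core logic of an RCT — random assignment followed by a comparison of average outcomes — can be sketched in a few lines of Python. This is a minimal illustration only; the participant IDs, outcome values and function names below are hypothetical, not drawn from any BETA study.

```python
import random
import statistics

def assign_groups(participant_ids, seed=0):
    """Randomly split participants into treatment and control groups.

    Because assignment is random, any later difference in average
    outcomes can be attributed to the intervention rather than to
    pre-existing differences between the groups.
    """
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return ids[:half], ids[half:]  # (treatment, control)

def estimate_effect(outcomes, treatment, control):
    """Estimate the average treatment effect as a difference in group means."""
    return (statistics.mean(outcomes[i] for i in treatment)
            - statistics.mean(outcomes[i] for i in control))

# Hypothetical example: six participants, with an outcome measured
# for each after the trial has run.
treatment, control = assign_groups(range(6))
outcomes = {0: 3.0, 1: 4.0, 2: 5.0, 3: 4.0, 4: 6.0, 5: 5.0}
effect = estimate_effect(outcomes, treatment, control)
```

In practice, real trials add power calculations, stratification and statistical inference on top of this basic design, but the principle — randomise first, then compare group averages — is the same.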
For example, in a recent project with the Department of Climate Change, Energy, the Environment and Water (DCCEEW), we used several online RCTs to test interventions that empower homeowners to take action to upgrade their home’s energy performance. This is an important, but often difficult, decision for many homeowners. Most Australian homes were built before minimum energy efficiency requirements were introduced nationally in 2003, and identifying the most suitable and cost-effective home energy upgrade can be complex, with uncertainty leading to inaction.
We ran five online survey RCTs with almost 14,000 people to test the impact of showing people home energy ratings on real estate listings. Our findings indicated that mandatory disclosure works. Australians in our study were more likely to choose to inspect homes with high energy ratings when disclosure of ratings was mandatory. We also saw that people selling homes were more likely to choose upgrades that improved Home Energy Ratings when told it was mandatory to disclose the rating in a real estate listing. In terms of specific decision tools, we saw that a simple text-based prompt led 2.4% of Australians in this study to seek more information about home upgrades, which is a strong response for a prompt. However, an online home upgrade decision support tool did not increase respondents’ intention to make home upgrades, although it did increase their confidence in choosing and planning upgrades. The findings of this research have been used to inform the Home Energy Ratings Disclosure Framework – Version 2.
In the right circumstances, RCTs and online experiments can provide robust causal evidence for policy and decision-making, as well as complement other forms of research. However, RCTs are not without limits and are not a one-size-fits-all evaluation method. Researchers need to select the appropriate evaluation method on a case-by-case basis, weighing up each method’s strengths and weaknesses and its appropriateness for the issue at hand.
Supporting evidence-based policy making
These examples demonstrate BETA’s adaptable approach to evaluation. We strive for rigour and completeness (understanding whether and why something has worked), while adopting a pragmatic and resource-sensitive approach – all with the goal of providing timely and robust evidence to our colleagues across the APS and improving the lives of Australians.