Empowered: How timely evidence can drive policy and regulatory change

Andrei Turenko: Thank you, Secretary, and thank you, Susan, for those opening remarks. My name is Andrei. I'm a Senior Advisor here in BETA. I will be with you throughout the day, introducing a great lineup of speakers and asking them questions on your behalf. I would also encourage you to submit your questions via the Q&A function at the top of your screen. As Susan mentioned, today's program focuses on applied behavioural science in government. We have some great examples coming up from Australia and overseas. We are also excited to host a panel discussion on embedding evidence in the policy cycle later in the day. So without further ado, we will jump into our first presentation.

I would like to welcome my colleague, Laura Bennetts-Kneebone. Laura is a Senior Advisor here at BETA. She leads our Energy, Environment and Economics team. Prior to joining BETA, Laura spent 10 years at the Department of Social Services working on the Longitudinal Study of Indigenous Children and other Australian longitudinal studies. Today Laura will be taking us on a policy development tour covering 4 recent BETA projects and showcasing how behavioural insights are saving Australians money. Over to you, Laura.

Laura Bennetts-Kneebone: Thank you so much, Andrei. I'm so pleased to be here today, speaking to you all. Today I would like to tell you a story about a big project, or rather a series of projects, and about how we saw major changes to regulations occur by being able to deliver the right evidence at the right time.

When I talk about timely evidence in government, a major aspect of that timing is the policy cycle. Many of you may be unfamiliar with the policy cycle, so let me introduce it.

No, not that policy cycle, this cycle. The model is just that, a model. Reality is far more complex than this, but it gives us a way to think about the general process and also to identify where behavioural science can play a role. BETA works at many points of the policy cycle. Let me illustrate. Sometimes we work near the top, around one or two o'clock, where we might ask questions like: Why isn't this working? Do we see someone else doing this better? What does it say in the literature? Or, what are people actually experiencing when they interact with this government policy or program? I'll give some examples of projects that we've done that are heavily focused on this early stage. One was a national survey of mental health related stigma and discrimination.

Another was a survey of net zero attitudes and behaviours to understand what prompts people to install solar panels or to make other home upgrades, and as the Secretary just described, we've done recently a survey of vehicle purchase decisions to understand barriers to buying electric vehicles.

Quite often we work at around the four o'clock mark on this circle. We design and test a few different options. We want to know what will work before we start recommending anything, and to do that testing, we do research that might include a randomised controlled trial, which could be embedded in a survey, or it could be a field trial. We might do pilot tests, user testing or discrete choice experiments. There are lots of different methods for testing. Examples of projects that have focused heavily on this stage include our online wagering work, designing and testing gambling activity statements to see if they encourage people to wager less money online. We also tested the effect on behaviour of mandatory disclosure of home energy ratings and tested letters to GPs about prescribing antibiotics.

Sometimes we also work on the other side, between six and eleven o'clock. Imagine that a decision has been made, but questions remain about how to implement it. Or it might have already been implemented, but road testing in the real world has revealed some new issues. We may want to evaluate it and see whether it worked. We have had some projects that focused on this stage too, which included reviewing the cybercrime reporting tool and recommending ways to make it clearer and easier to navigate, testing nudges to encourage people to pay more off their credit card, and Form-a-palooza, a big workshop we ran to improve government forms. Some people prefer to work earlier in the cycle and others later. I'm agnostic about this. I think that offering the right type of assistance at the time that it's actually needed helps to drive change.

And I love to hear about projects in that early stage, in the middle stages or in the later stage of the policy cycle.

Our work on energy bills and market engagement intersects at various points with the policy cycle, and it demonstrates really clearly how the right evidence at the right time can make a huge difference.

So let me get on to the case study and tell you this story. In late 2020, my manager told me that a meeting was being arranged with the AER about a new partnership and I had to look up AER to see what it meant. Australian Energy Regulator. The Energy Minister at the time, Angus Taylor, had submitted a rule change request saying we need better bills.

He pointed to research saying that bills were too complicated and didn't support people to engage in the market and switch to better plans. The rule change request that I'm talking about concerns the National Energy Retail Rules, which specify how bills should be laid out. They make rules for these essential services so that people can get consistent bills from different retailers. And the responsibility, or the right, to change those rules sits with the AEMC, the Australian Energy Market Commission. At that time, consultations were underway and an early recommendation had indicated that the AER might be asked to work out what the new rules should be. They would need to make evidence-based decisions, and they guessed that they might get about a year to design a new guideline.

So they brought BETA on as partners in order to be ready. Talk about timely! It did happen, and we had only a few months to deliver a major piece of research, leaving them enough time to write the new guideline.

There were a few things that helped. BETA had already done a major piece of design research on energy bills a few years prior. This had actually fed into the rule change request, and as part of that work there had been focus groups and user testing where people had looked at different bills and different designs and helped to design bills. So we already had a tested, well-designed bill as our starting template.

Now the well-designed bill had nice big clear font and a logical layout, and best of all, we could add and remove features to test how well they worked in a survey experiment. We decided to test, well, quite a lot of things, but I'll just tell you about a few of the key ones. One was a message telling customers if their current retailer had a cheaper plan and, if so, how much they could save by calling them. We called this the better offer message. We also tested a link to the Energy Made Easy comparison site, which sat right beneath the better offer message on the bill, and a plan summary. Knowing what you're currently paying and how your own plan works helps a lot when deciding if another plan is going to save you money.

In our survey experiment, we found that the better offer message made people two to three times as likely to consider switching plans. Without the message, most people didn't consider switching as an obvious way to save on their bill. They thought about things like switching off the lights or running the washing machine in the middle of the day when the solar panels were working. They thought about setting the thermostat at a different temperature on their heating and cooling. But very few people thought about switching to a cheaper plan, and we found the message changed that and made it more top of mind. Within a few months, we delivered draft results to the AER and presented a summary of findings in a public forum, which was mainly attended by interested parties such as government, retailers and consumer advocates, and we published an interim report, followed soon after by a final report.

Not long after that, the AER finished drafting and published the better bills guideline for consultation, and it was later finalised.

The better bills guideline was heavily influenced by the research that BETA and the AER had conducted in partnership, and it introduced three key changes to energy bills. One, bills had to follow design principles to ensure that they used simple language and were presented in a way that was easy to understand. Two, the content of the bills had to be formatted in a tiered system, meaning that critical information had to be upfront on the first page. And three, a better offer message and information about Energy Made Easy had to be upfront on the first page, and the bill needed to contain a plan summary so people knew about their current plan.

Now retailers were going to need a certain amount of time to redesign bills to comply with the new mandatory guideline. So in that time period, we moved to the next project: the redesign of Energy Made Easy. Bringing people to the website is one thing, but would it be effective in helping them to switch?

A website like this works very, very differently to a nudge. Comparing lots of different plans is a complicated task, and complex decisions require support. People like to be walked through the important factors, but not overwhelmed with the low impact ones, and they need to compare the costs and benefits of different choices and look at them side by side. They also need confidence that the information is unbiased, honest and evidence-based. Government officials making decisions don't work all that differently to any other Australians, by the way. They are similarly time poor, appreciate clear communication and need evidence to help them understand the likely consequences of any decision.

Here we started from something that had already been implemented and had been running for a number of years. We conducted some quick evaluation and then went right back to the start of the policy cycle. We reviewed the complaints and comments data to see what the issues were with the website, and we used a wide range of methods, which you can see listed on this screen, to understand what was and wasn't working and what people wanted and needed, and we came up with ideas to support them. This included pop-up surveys on the website. We looked at the Google Analytics data. We did qualitative user testing of the website and of prototypes, and then we did a survey of switching preferences and use of comparison websites, with an embedded survey experiment. So a lot of different mixed methods and research methods to support different things. After that, we tested ideas to see what would work.
Now we found that as a decision support tool, the website concept was good and working relatively well, but it fell down in a couple of key places where we could see a need for improvement. First, people needed to answer a series of questions before they could see personalised results. The plans available to them can vary with their postcode and their particular combination of factors, whether that's solar panels, controlled load or a smart meter, which affects what kinds of plans they are eligible for and can be delivered to their home. Quite a lot of people were dropping out before they even got to the list of results. Secondly, many people who did get to the results list stopped right there. We wanted to encourage them to identify a plan from the ones appearing there, or shortlist a few plans, and then click on them to see the details, and we could see through the Google Analytics that many people weren't taking that next step. They were looking at the big list of plans and then not clicking through to look at individual ones. We tested a range of strategies and made many recommendations that the AER agreed to implement.

Now by chance, the better bills implementation deadline for the guideline was September 2023, and the Energy Made Easy website refresh occurred the following month. What I haven't yet mentioned is that throughout 2022 and early 2023, we had been working with the AER on a plan to evaluate the impact of the better bills guideline, and we'd also made plans to measure the impact of the website changes, which we wrote into the Energy Made Easy draft report. So as the implementation of both approached, we were poised to capture the impacts and to find out whether we had been effective in the task we'd been set.

I'm going to take one step backward here, just to point out that with the better bills guideline, people could get one of two better offer messages. One would tell them if they were already on their retailer's best offer, and the alternative was to tell them that the retailer had a better plan ready and waiting for them. We didn't know what the distribution of that was going to be, but we found out later that 81% of customers got a message saying there was a better plan ready and waiting, and all they needed to do was call the retailer and get switched on to the cheaper plan. Very frustratingly, however, we couldn't measure how many people contacted their retailer to get on the cheapest plan, because we don't have that data. But we could measure how many people took the more difficult, more complex step of going to Energy Made Easy to make a search; the link was underneath that better offer message. To measure it, we ran a pop-up survey on the Energy Made Easy website in the months before and after implementation.

Before the guideline, we found that only 5% of visitors to the website got there because they followed a link from the energy bill. Under the old retail rules, this was a teeny tiny link hidden on the back page, and the most common way for people to find the Energy Made Easy website was through a Google search.

That jumped to 20% of visitors and it stayed there for the rest of the year. So in real numbers, that means that of the roughly 1,000,000 annual users of the website, about 200,000 visited in that first year because they saw the better offer message on the bill, and presumably many, many more that we can't measure simply called their retailer asking for the cheaper plan that was available now.

As a reason for visiting, "Energy Made Easy was mentioned on my energy bill" went from the second lowest in June to the second highest by October, and remained steadily at number two.

Through the pop-up survey, we were also able to answer another question. We were able to see that, of the people who came to the website because they saw the message on their bills, around 63% had never switched retailers before, other than when they'd moved house; they'd never stayed in the same house and chosen to switch. So these were people who otherwise would have been really unlikely to find the website through other means. In terms of the Energy Made Easy website itself, what happened when they got there? We used Google Analytics to see the before and after results. To see the list of energy plans, users needed to answer a set of questions about their energy use and where they live. The wording of the questions, various roadblocks and information requirements meant that about 30% of users stopped before seeing the results page; 70% got through to see the full list of plans.

We made a range of recommendations to address these. One example was that people who were moving to a new house could bypass a series of questions that didn't relate to them, about their current meter identifier number or their current energy usage, because in a new house you're going to use a different amount of energy. Each house takes a different amount of energy to live in, has different appliances, and so on. So people who said they were moving house went straight through to the results, which just used some general statistics, for example the average energy usage of a five person house or a three person house, whereas people who had been living in the same place for more than a year were encouraged to enter their energy usage. And we found ways to speed up that process as well, to streamline the questions and make them easier to complete.

So what happened? Changes to the website meant that the number of people dropping out was effectively halved: only 16% dropped out before getting a result, and 84% saw the list of plans available to them. Now, I mentioned just then about entering your data. There were different ways you could do that, and asking consumers to use their national metering identifier (NMI) data was the quickest to do and produced the most accurate results. But a lot of people don't know what a national metering identifier is, and were put off by that question. So we recommended a quick explainer to say this is where you can find your NMI number on your bill, and you can just enter it and it will automatically load your data. We hoped that understanding those data sharing options would make people much more likely to use that option and get the most accurate results, rather than tediously entering all their usage for the last year. What we recommended was defaulting people to see that option, and then, if they weren't comfortable with it, offering the alternative. The use of defaults here preserved individual choice about whether to link to their energy consumption data, but encouraged people to use an option that was objectively better in multiple ways.

And we found that it was very effective: making the NMI the default increased the selection of that option by more than 25 percentage points.

Similarly, we found that people were getting overwhelmed by the information on the results page showing all the plans, and only 9% were clicking through to view individual plans. We suggested a range of interventions, like putting the most important information on the main results page: the main supply and usage charges, any key barriers to selecting that plan and, of course, a summary of the total cost. We also recommended removing information about fees that are very rarely charged and shouldn't be the focus of that decision, because they make up only a very small proportion of costs.

And we found that on the refreshed website, the proportion who decided to view an individual plan almost tripled. So by providing an easy basis for comparison, users were more likely to feel confident to view individual plans once they reached the results page.

Now I've shown how we wove in and out of the whole policy cycle, providing evidence to the Australian Energy Regulator at different stages, and this is all published in our reports.

So was it all over after that? Apparently not. Earlier this year we received a follow-up request from the AER to advise on how to deal with an implementation issue. They told us that some retailers had been reusing old plan names for new plans that had different prices, and this caused a lot of confusion with the better offer message, because it would say: you're on plan A; if you change to plan A, you'll save money. That didn't make sense to people. So in May the AER made a decision that retailers that reuse plan names will have to add further information after the better offer message. Basically, this message says: if this plan has the same name as your current plan, you are on an older version of the plan which has different rates, and you can still save money by switching to the newer version. Soon after that, the AER told us that the AEMC was consulting on the issue that people who don't read their bills are missing out on the better offer message. Some people pay by direct debit, or they just look for the total amount in the email accompanying the bill, or they might pay through the app.

And in this case too, we were able to draw on existing evidence to build on the evidence base available to decision makers. So in September this year the Australian Energy Market Commission made a final rule that requires retailers to alert each customer to opportunities to save in other communications that accompany a bill.

And the AEMC stated that they made a more preferable final rule after drawing on insights from the Behavioural Economics Team of the Australian Government and the Australian Energy Regulator to promote behavioural change. Their research, along with stakeholder feedback, suggests that a more visible better offer message will encourage more customers to switch, and once a customer has decided on switching, the process is usually quick and straightforward.

Perhaps now you'll see why I don't feel strongly about the best time to engage in the policy cycle. Early is good, but it's never too late. We find that the behavioural economics toolkit is flexible enough to offer insight and evidence at all stages throughout the cycle. It is deeply satisfying to see that the work of your team has contributed to making a difference. Sometimes a small difference for many people, and sometimes a large difference for a small cohort. We want to support Australians to interact with government and with essential services as painlessly as possible.

And we want to make it easier for them to make those complex decisions that will benefit their individual circumstances in the long run. Part of the way we accomplish this is by making a difference for government and government agencies, supporting them also to make complex decisions by delivering the best evidence exactly when they need it.

Thank you very much.

Andrei Turenko: Thank you, Laura. Incredibly interesting work. Thank you for sharing that.

Really keen to see where this goes from here. I think the concept of explaining complex information to people in a simple way and empowering them to make a decision is something we see across many domains, so I'm very keen to follow this along. We do have a couple of questions from the audience.

But if I may start with one of my own: the examples of the interventions that you've demonstrated all showed positive effects. Did you test anything that was not as effective? Was there anything surprising that you found that you'd like to share?

Laura Bennetts-Kneebone: Yes, absolutely. So we tested some things that worked and some things that didn't work. In a presentation like this I'm obviously going to talk about the most interesting ones, but all the results we had, we published in our report. One thing that surprised me, and maybe challenged some expectations, was that there was a perception at the start that making the bill simpler and shorter, making it a basic bill of just one page, or two pages at the most, would make it much easier for customers to understand. Intuitively this sounds right. So we tested bills of different lengths, and what we actually found challenged that a little bit. By taking too much information off the bill, customers didn't have the information they needed to understand what might be going wrong. For example, if the bill was much higher than they expected and they wanted to look at it to understand why, they couldn't necessarily find that information. So it was putting them at a disadvantage when they might want to challenge a bill or try to switch. The other thing we found was that having a shorter bill didn't make it quicker to pay. As long as all the key information you need to just pay and go on with your life is on the front page, people can cope with one page or three pages just as easily, and still have pages two and three there as backup if and when they need them, without necessarily looking at them every single time.

Andrei Turenko: Fantastic. We have a question from the audience. How does the percentage of customers on the best offer compare across different regions or demographics?

Laura Bennetts-Kneebone: Oh, it's a great question. I wish I knew the answer. So because of the methodology we used, we could only have a very short pop-up survey on the Energy Made Easy website and so it didn't include a lot of demographic information.

The question we squeezed in about whether you'd switched previously was in there, but I'm afraid I don't know how much it varies across other demographics. That would be an interesting question.

Andrei Turenko: Absolutely. We have another question coming through, some compliments on seeing the policy cycle displayed and applying the work we've done onto it. Can you explain the content of policy analysis and policy design?

Laura Bennetts-Kneebone: Yeah, I can, in the sense that it applies to BETA and how we interact with those particular areas, because I'm sure that interacting with the policy cycle probably looks a bit different if you work in a different part of our department or in other departments, with very different kinds of things you have to deliver. But for us, policy analysis and policy design means looking at the current settings and the proposed settings, doing background, sort of discovery or diagnostic research to understand what's going on, and then exploring what the behavioural insights or behavioural economics literature says about those issues. We also look at what's been done in other states or other countries to consider options. There'll be a process of prioritisation of the options available: which ones are actually practical and possible, and which ones we think are likely to deliver the best results. Then we mock those up or find a way to test them, so we might have several different interventions. If it's a message, that's probably the easiest to explain: you might test several different messages, and that testing process might be where we work in a survey experiment and do the design work to test different designs. Then, once one's been identified as the way forward, work will happen to finesse and polish that design, so it's more than just the minimum viable product, which is what we might test.

Andrei Turenko: Brilliant. We have a question specific to the AER case study. How did you determine the ultimate outcomes and what did you use to measure those?

Laura Bennetts-Kneebone: To determine the ultimate outcomes we were looking for, we leaned very heavily on what was required of the AER. When the AEMC tasked the AER with developing the guideline, there were several clear outcomes that they wanted it to achieve; it didn't just say make the bills better. It talked about making bills simpler and clearer, to support consumer understanding and to support engagement in the market.

And so when we were thinking about our outcomes, we were thinking about things that would show better comprehension of the bill and likelihood to engage in the market, or to think about engaging in the market. In the survey experiment context, we showed people a bill and had a series of questions asking if they could tell when it was due and how much it was, and if they could find various bits of information. We used that to determine whether it improved comprehension. Then we looked at the better offer message, and that's the example where we used a qualitative outcome measure: we asked people how William could save money on his bill. We showed them the bill and saw that people who saw the better offer message were much more likely to type in a suggestion about going to Energy Made Easy or calling the retailer to switch plans, so they were thinking about it. We didn't prompt that outcome. We left it as an unprompted question about how to save money and then counted how many people thought of switching.

Andrei Turenko: Brilliant. Maybe we can also take a step back. I'm really curious to find out, you know, why do we need to encourage people to compare energy plans? Can you talk us through the key behavioural insights in the decision making process and what are the common barriers people have in getting around to it?

Laura Bennetts-Kneebone: Yeah.

Oh, there's a lot. I don't think comparing your energy plans and switching is anyone's favourite activity to do on the weekend, but certain cohorts seem to be much less likely to do so. Why you need to do it is because multiple inquiries had found that you could save a huge amount of money, in the realm of hundreds of dollars each year, so a significant difference to many Australians' budgets, and that the people who could save the most money were those who had no engagement with the market. They'd signed up to an energy plan when they'd moved into their house, I don't know, 10 or 20 years before, and they'd just stayed with it. Over time the prices had crept up, and people have no real visibility of what other people are paying. They could find it, but it's not obvious in the way you might go into the supermarket and see how much other things cost. So they're paying for it, and they think, oh, it's expensive, and everyone says it's expensive, but they don't know that they're paying a lot more than, for example, their next door neighbours.

And I think older people, who'd spent most of their lives in a system where it was publicly owned infrastructure and there was no need to switch, and even in the early days after privatisation, when there were not many plans to switch to yet and not a lot of benefit, were not even aware that switching was something they could do, or hadn't really thought about doing it, or didn't have a perception that it would save them a lot of money. People who were really engaged and checking every year or eighteen months had a good perception and were Googling it and searching for other plans and pursuing it. But other people who were not engaged were being left behind. Other barriers, of course, are that it's just wildly complicated to compare the different plans. They vary on lots of different features, but all effectively price based. One might have a really high base rate but then a big discount, so it might seem like a really cheap plan because it's got a huge discount, or it might have a conditional discount that you have to do particular things to qualify for, like paying on time or referring friends, and people might be likely to overestimate their ability to get that discount. All these things contribute to people not remembering to do it, not thinking about it, being overwhelmed when they try, and being scared of picking a worse plan. That was something that certainly came up: people were not confident that they weren't going to pick something that would leave them objectively worse off.

Andrei Turenko: Fantastic. I think I have a really good follow-up question to this, also from the audience. On a more personal note, have the new bills helped you get a better deal yourself?

Laura Bennetts-Kneebone: The irony here is that I moved to WA just before the implementation of this policy, and in WA there's not a competitive energy market. There are two retailers and they operate in different regions, and our bills are all subsidised by the WA government, especially for remote areas. So no, not me, but I routinely have friends who know I worked on this sending me messages and screenshots saying, hey, I got this message and it's telling me how much I could save, and I went on Energy Made Easy. And I'm like, oh, how did you find it? What happened?

Andrei Turenko: Ha ha. I think I'll have to send you a message as well, Laura. This has inspired me to do a quick review of my bills. Perhaps we can do one or two more. This is an interesting one: was there any resistance from the retailers to adopting the changes, where it meant they potentially had to spend money to implement them? How did the project address these issues?

Laura Bennetts-Kneebone: Yeah. So I think I mentioned at the start that there was consultation, and the first thing I did was read through all the submissions to that consultation. Very many of them came from retailers and also peak bodies, so I got to see there was a bit of diversity of opinion. Some retailers were fine with it. Others objected to it, and we also heard objections from some people, I don't remember who, but I think it was some retailers, suggesting that people wouldn't see it, that it would be ineffective, so it would be a big lot of design for no purpose and it wouldn't solve the problem. People don't read their bills, they said, or they just pay in the app, or they pay with direct debit or bill smoothing. And some people said that customers would be confused about the message, or wouldn't trust it because it came from the retailer, so they put forward those kinds of arguments. I didn't see arguments saying, oh no, we want to be able to charge people higher rates, it wasn't quite like that. But others were OK with it. The policy is definitely not one BETA came up with. It was one the AER had been thinking of, and Victoria had implemented it a couple of years earlier, but it had never been evaluated because it was implemented at the same time as a raft of other changes to encourage market engagement. So nobody quite knew whether Victorians had engaged in the energy market because there was a subsidy for doing so, or because of all the advertising, or because of this message on bills. So most of the pushback was: it won't work anyway, it will cost us valuable real estate on the front page, and it will confuse people.

Andrei Turenko: Well, glad it all worked out and it all went well. Perhaps we can finish with this question, which is very future looking. Are there any other consumer issues which you're looking to address, maybe specific to your team, or do you have any broader thoughts on whether BI could be useful going forward?

Laura Bennetts-Kneebone: Oh, I think there are a lot, and the ACCC certainly uses behavioural insights in addressing various consumer issues. There's recently been another announcement from the AEMC around access to energy concessions on bills, with concern that people who are eligible for concessions may not be receiving their entitlements because they don't know they're eligible. So I think that's an obvious place. But these issues crop up a lot, and we've used behavioural economics to look at things in the past: we've published work on payday loans and on add-on insurance, where insurance is added on when you're purchasing a product and may not be the best deal. Those are all consumer areas where a little bit of a warning, identification of the risk, or slowing people down can really help, and I think there are lots of those.

Andrei Turenko: Fantastic. Thank you, Laura. Thank you for the presentation and best of luck in future research.

Laura Bennetts-Kneebone: Thanks Andrei.

In 2020, the Australian Government's Minister for Energy requested an amendment to the National Energy Retail Rules. New rules aimed to simplify power and gas bills for households and small business owners and support consumers to engage with the market to find better energy deals.

But what actually makes a bill better?

How easy can it be to switch to a better plan?

This presentation will take you on a policy development tour, covering 4 BETA projects and showcasing how behavioural insights are saving Australians money.

Presenters

Laura Bennetts-Kneebone

BETA, Department of the Prime Minister and Cabinet

Laura has worked at BETA since 2019. She is currently a Senior Adviser in BETA’s Energy, Environment and Economics team, designing behavioural solutions to policy problems and testing them with RCTs. Prior to joining BETA, Laura spent 10 years at the Department of Social Services working on the Longitudinal Study of Indigenous Children and other Australian longitudinal studies. She has qualifications in Social Research (ANU) and in Linguistics, Anthropology and Chinese (La Trobe).