New directions for the application of BI to complex policy problems
- Hi, everyone. Welcome to the last of three sessions in BETA's BI Connect 2022 virtual series exploring leading work across the behavioural insights community. Today's session focuses entirely on work within BETA, and, in particular, our recent shift towards projects further up the policy cycle, rather than the more typical service-design-focused BI projects. I want to begin today by acknowledging the traditional custodians of the land from which we're all joining today, including the Ngunnawal people here in Canberra, and pay my respects to their elders, past and present. I extend that respect to Aboriginal and Torres Strait Islander people joining us here today.
For those of you who weren't able to make the other BI Connect sessions, I'll quickly introduce BETA and myself. My name is Amelia Johnston. I'm BETA's acting Managing Director. I've been with the team since 2018 and I'm passionate about delivering better policy outcomes through more rigorous evidence about people's behaviour and choices. BETA is the Behavioural Economics Team of the Australian Government. We're a unit at the centre of the Australian Government applying behavioural insights to public policy. We improve the lives of Australians by generating and applying evidence from the behavioural and social sciences to find solutions to complex policy problems. Part of our mission is to build behavioural insights capability. This event is one of a range of initiatives we run as part of that work to share knowledge. If you're feeling inspired by today's discussion, please visit our website, where you'll find reports and tools that can help you learn more about applying behavioural insights in a project of your own.
It's time now for me to introduce the fantastic panel of BETA staff who are here with me. Firstly, Andrea Willis is a senior advisor in BETA, leading one of our project teams. Andrea is an experienced senior leader with deep behavioural science experience, having worked across the public and private sectors. Prior to joining BETA, Andrea worked for the Department of Communications and the Arts, Austrade, and various private sector organizations, including Yahoo. Within Andrea's team, we have Andrei Turenko. Over the last six years at BETA, Andrei has managed a wide spectrum of projects, including large-scale evaluations as well as qualitative and quantitative projects. Prior to joining BETA, Andrei wrangled data as a senior intelligence officer at the Australian Taxation Office. Shae Ffrench is an acting senior advisor, leading BETA's strategy and impact team. Shae has worked at BETA for three years, leading projects in areas including women's economic security, career decision-making, and protective health behaviours. She's passionate about the role of qualitative research in helping us develop a deep understanding of problems and design better solutions. Within Shae's team, we have Aurore Chow. At BETA, she has led projects in safety and diversity in organizations, planning for later life, and women's workforce participation. Prior to working at BETA, she was a lecturer at the Australian National University. So welcome to all the panellists.
In today's session, you'll hear from the four members of the BETA team I've just introduced, who will share reflections about two upstream BI projects they've worked on recently. But I'll start by offering some opening comments before we move to presentations on each project.
Then we'll move to a panel discussion on the lessons we can draw from these projects for others who are interested in applying BI earlier in the policy cycle. Please submit your questions for the panel via the Q&A tab. If there are quick clarification questions, we might address them on the way through, but we'll return to any more substantive questions in the panel discussion at the end. So to get us familiar with the Q&A function, and to get a sense of who's in the audience, we've put a couple of statements in the Q&A tab. If you can, navigate to that tab now and hit the Like button on the statements that describe you. I'll give you a minute to do that now. Fantastic. Thanks, everyone. So it looks like the vast majority of people joining us are in Australia. We have around 85 people so far saying they're joining us from Australia, and a handful joining us from somewhere else in the world. The majority of people dialling in work in the public sector. So there are 85 people in the public sector, around 15 joining us from the private sector, and a couple from the not-for-profit sector. And we have, it looks like, about an equal balance between members of the audience who are behavioural insights practitioners and people who are enthusiasts about behavioural insights. So welcome, everybody. Thank you so much for joining.
Before I throw to my colleagues for their presentations, I'd like to set the scene and offer some reflections on applying behavioural insights upstream in the policy development process. I think we all know that behavioural insights is bigger than nudges. I'd say that BI brings three key tools to policy. Firstly, a richer understanding of human behaviour and a focus on primary research about behaviour that is specific to the problem at hand. Secondly, behavioural design, including nudges. And finally, rigorous evaluation methods, including randomized controlled trials. As a field, I think behavioural insights mostly started out applying these tools to tweak existing policies and programs and make them more effective. I think we've all realized, over the life of the field's existence, that a huge proportion of government policies and programs aim to change people's behaviour. Whether it's something simple, like encouraging people to put the right things in the recycling bin, or something complex, like promoting genuine competition in the banking sector, a huge number of government policies have an element of behaviour change. And many of those are about supporting people to follow through on their intentions, which are the kinds of problems we are most interested in. And once you follow that idea to its conclusion and ask what policy interventions will be most effective in supporting behaviour change, you find behavioural insights naturally becomes relevant at quite an early stage of the policy development process. Certainly, in BETA, we're finding ourselves working increasingly in this space as time goes on. But this is not a new idea. I don't know for sure, but I wonder if the concept of present bias was part of the rationale for compulsory superannuation in Australia, which started in the 1990s. So I think it's something that has been a part of the field throughout its life, but it's becoming increasingly prominent as the field matures. Now I want to get a bit more specific about what we mean when we're talking about working upstream.
As I mentioned before, the BI field has mostly focused at the end of the policy cycle, at the implementation and evaluation stages. This is where BI has proven to be really helpful for things like improving compliance and take-up rates, making interfaces more user-friendly, reducing regulatory and administrative burden, and testing implementation options and evaluating impact. So we call this downstream in the policy process, when the major policy settings have already been determined and BI is helping with tweaks and implementation and evaluating improvements. This is hugely important and valuable work to make policies work better for people. We all know that the details of implementation really do matter for outcomes, and BI will always have a clear value proposition for these kinds of problems. But we see, and the other BI teams we talk to inside and outside of government are also seeing, opportunities to bring a behavioural lens earlier in the policy cycle.
So to go back to the very start of the policy cycle, we have the agenda-setting stage. This first stage involves choosing a problem on which to focus. Now we think BI can play a role here by applying different criteria to identify a problem in the first place. Orthodox economics usually focuses on market failures as a reason for government intervention, things like market power, externalities, or information asymmetries, but BI offers evidence and a framework suggesting that government intervention can improve outcomes in other kinds of situations. We tend to focus on consumer outcomes, for example, to identify a problem, and look at consumer decision-making processes to understand whether people understand their options and are satisfied with outcomes. The Australian Government has recently been introducing lots of regulation to make it easier for people to switch banks or super funds. And I think that's an example of a problem that wouldn't be identified through a traditional market-failure lens, but would be identified through an understanding of the impact of friction on consumer decision-making, which is a BI concept.
So the next stage is problem definition. At this stage of the policy cycle, behavioural insights can help to understand the drivers of status quo behaviour and barriers to the target behaviour. This behavioural lens is pretty rare in early-stage policy development, at least in my experience, even for lots of policies that relate to human behaviour. It's so valuable for three reasons. First, it takes the citizen perspective, which helps you see the problem differently, and, I'd argue, more effectively. Second, it's driven by real-world evidence and doesn't make assumptions about behaviour. And third, it tries to unpack the different cohorts affected by a particular policy. And this is the stage where BETA has primarily been operating through our upstream projects. And you'll hear more about this in the presentations today on some of BETA's recent project work. We tend to use qualitative methods or administrative data at this stage, and the goal is often to build a richer picture of human behaviour and how people are interacting with government systems. At BETA, we're also playing in the policy design stage. In this stage, we design, or support the design of, behaviourally informed interventions.
These could be traditional policy tools, like incentives, communications and regulations, which are informed by our research and our nuanced understanding of human behaviour, or they could be from the BI intervention toolkit, like nudges. Because this stage comes before the policy has been settled, and the program in question often doesn't exist yet, we can't always use real-world RCTs, although we try to wherever possible. There are some different techniques we can use here, like randomized survey experiments to test possible responses to policy options. The important distinction here is that we are coming in before the final policy design is set, which gives us more scope for influence and allows us to embed options for iteration and evaluation down the track. Of course, there are policies that are out of scope for BI, but many policies do involve individual behaviour. And I'd suggest behavioural insights is always relevant for those policy problems, even if the intervention we design or inform is ultimately a traditional policy solution rather than a nudge. I see this increasing focus on upstream work as an evolution of the field, and BETA is still learning how best to maximize the benefits of our research and advice in this space. So today isn't about sharing a shiny new methodology for upstream BI work, but rather just opening a conversation about the trends and opportunities we're seeing and the lessons we're learning along the way. So we'll now move to presentations on two projects, before the panel discussion and an opportunity for audience questions. Please feel free to add questions in the Q&A tab as we go, before you forget them. So I'll now hand over to Andrei to take you through the project on attracting high-achieving teaching candidates.
- Thank you.
- Actually, I think I-
- I'm sorry.
- Sorry.
- It's switched.
- I switched the order on you, Amelia, sorry.
- No, no, that's okay. Shae. I will hand over to Shae-
- Thank you.
- To talk about women's labour force participation.
- Thank you, Amelia. Hi, everyone. I'll be taking you through a current upstream project we've been working on with the Office for Women, which has been looking at understanding barriers to women's workforce participation. We'd normally take a whole hour to talk through this suite of research that we've been doing on this topic, but I've got a max of 15 minutes, so I'll be spinning through quite quickly. But like Amelia said, we do have the panel discussion afterwards, so there'll be opportunities to discuss the project further. So to start us off, I'd like to talk a little bit about the problem we're focusing on. This project is about lifting women's workforce participation. So by this, we mean the rate of women who are either in paid employment or currently looking for work. There have been big improvements in this over the last 40 years. But as of 2021, the ABS puts the female participation rate at around 76%, meaning about a quarter of working-age Australian women are not in the labour force. And this does lag behind several other OECD countries. Men's labour force participation is much higher. The gap in participation rates between genders is about nine percentage points. You'll see from the graph on the left that the participation rates start out quite similar for young men and women. And as you might expect, the gap starts to emerge at the ages when women take time out of work to care for children. A quarter of working-age women not in the workforce might sound really huge, and initially, I think, it did to us, but it's important to know that the majority of these women aren't actually able or available to work. So this could be that they're engaged in unpaid work, like caring for children or older family members, or that they aren't able to work due to disability or ongoing health issues. However, the ABS does estimate that there are around half a million women who want to work, are available to work, are able to work, but are not currently in the labour force. And this was, yeah, I think this was surprising for us to hear 'cause we're like, "Wait, hang on, what's the problem here?" Like, "We're in a tight labour market. There are jobs available. There are these women who want to work. What's stopping them from following through on this?" And I think part of the initial answer is that we don't actually know a lot about this group. Since they aren't working and they aren't unemployed, they're often quite invisible to government. They're not always receiving government benefits or payments. So obviously, as a starting point, if we want to know how to remove barriers and allow these women to follow through on their intentions to join the labour force, we need to start by knowing more about them. So returning to that policy cycle that Amelia was talking us through earlier, the agenda has been clearly set. The Women's Budget Statement is really clear on this. It calls increasing women's workforce participation an economic and social imperative. It's really well established that increasing female labour force participation improves outcomes both for the economy and for individuals. And for individuals, this isn't just about, you know, their lifetime earnings or retirement income. We also know that when people have access to paid work, we see improvements in wellbeing and social inclusion as well. The government's obviously already made some big moves on this, including, for example, the changes to make childcare more affordable. So where do we come in?
The Office for Women, which also sits within PM&C, approached BETA and asked us to conduct research to better understand the problem. What are the barriers that women face? What are the areas where government might be able to help? They were also interested in understanding the impact of the COVID pandemic on young women's careers, as this group was the hardest hit by job losses during lockdowns. So obviously, these are incredibly broad issues and really massive research questions. We're talking about a really wide range of cohorts, women from diverse backgrounds with complex lives and different intentions for what kind of work they would want to be undertaking. And you're probably thinking, "Hang on, what's the behaviour that we want to see here?" You know, when we're thinking about full-time workforce participation as a target, this is obviously a really zoomed-out behaviour. There might be hundreds of behaviours, over years or decades, that lead to someone either joining the workforce, staying in the workforce, or coming out of the workforce. So this project sits extremely early in the policy cycle, as we mentioned, in the problem definition stage. But even that is perhaps a little bit misleading because we're not just defining one problem. We're actually looking at a wide range of complex, interrelated, messy problems, we're identifying new problems, and we're looking at which problems represent the best opportunities for government to get involved in and address, and which might have the greatest impact on participation rates. So what have we done? We started with a review of the existing literature and available data on women's workforce participation in Australia. This helped us map the gaps and see what knowledge we might need to gain to build a better picture of what's going on in these women's lives. We then planned a program of mixed-methods research, starting with qualitative research with two more specific priority groups. We finished up that work in July of this year. So for the qualitative research, the first group we were interested in is women who have been out of the labour force for a really long time, so have either been out of the labour force for 10 years or more, or have actually never worked at all. Previous research found this is the group of women out of the labour force that faces the biggest barriers to re-entry and could definitely benefit from some kind of government support. So we spoke to 20 women, aged 35 to 64, from a range of backgrounds, and with varying prior experiences in the workforce. Since the topics are really broad, it was incredibly important to us to get our sample frame right. We thought a lot about the type of women we wanted to include in this study, where the gaps were, and who we might not need to talk to for this. So, for example, we agreed with the Office for Women that barriers to participation for women with preschool-aged children are really well understood. We know what the issues are there with availability and affordability of childcare. So we decided to recruit participants who were out of the labour force for reasons other than caring for young children under the age of six. These were really wide-ranging interviews. We talked to these women about their past work experiences, what was going on in their lives when they left and remained out of the workforce, and their attitudes and expectations towards paid work.
So for some women, we were asking them to really cast their minds back to decades prior and tell us about, you know, these periods in their life where they were working and then when they left the labour force. The second group we were interested in is young women who had their early careers disrupted due to the pandemic and associated lockdowns. We conducted nine focus groups with a total of 34 young women across the country who either lost work or had a really hard time finding work during the pandemic. Again, it was important for us to really think deeply about who we wanted to talk to and how we organized this. So participants in our study were aged 18 to 33. The majority were in their late teens and early 20s. Most were in periods of transition during the pandemic, either moving from secondary school into uni or TAFE, or completing education and moving into the workforce for the first time. We focused primarily on women living in Melbourne and Sydney who had experienced those really long, really disruptive lockdowns. And since we landed on small focus groups as the best methodology for these young women, it was really important that we grouped people who had similar experiences. So we organized peer groups based on location and experiences with job loss. We spoke to these women about what it was like losing work during the pandemic, looking for work during the pandemic, switching to online learning, challenges entering the workforce, and how these experiences have shaped their future career aspirations. Finally, we're also currently undertaking new analysis of the HILDA longitudinal survey data and the multi-agency MADIP dataset. This data analysis is focused on the not-in-the-labour-force cohort and has two main purposes. The first is to build a more detailed demographic profile of women who are not in the labour force: what's going on in their lives, their life experiences, their interactions with government, and how this compares to other groups. Second, we're also looking at what life events or triggers predict women leaving or re-entering the workforce. So this data analysis work is ongoing, and we're aiming to share results with the Office for Women early next year. (A rough sketch of the flavour of this kind of analysis appears after this presentation.) I might just pause here briefly before we go to the next slide to talk about why we chose the methodologies that we did. We found that for projects that are really early in the policy cycle like this, combining administrative data and qualitative research is a really powerful combo. You get that big-picture population view, and then the qualitative work means you get that rich, messy, complex picture of what's going on for the individuals in those datasets, so how that's playing out for actual human beings. I've always personally found that qualitative research is so helpful for idea generation. So when we're moving from this sort of early problem definition and then hopefully moving into policy design following that, qualitative research allows us to get to those unknown unknowns, and it prompts you to think really deeply about what kind of intervention is actually going to make a difference, like what's really going to move the dial for these people. So these presentations weren't supposed to focus too much on findings, but I couldn't help myself, so I'll quickly run through some key findings from our qualitative research with women who've been out of the labour force long-term.
Their reasons for initially leaving the workforce included caring responsibilities, both for children and older family members, their own illnesses or experiences of disability, and experiences of domestic and family violence. We were surprised and saddened to see how common those experiences of domestic violence were, even in the small sample that we looked at. When we were thinking about re-entry into the workforce, we looked at people's intent to return to the workforce, their capability to do so, and the opportunities that were out there for them to return to work. Generally, what we found is that these women really wanted to return to the independence and personal fulfilment of paid employment, but they might be hesitant to try. They might be wary of obstacles to finding work, based on their past experiences in the workforce. And when we were thinking about capability, we heard about barriers to material capability. Like, do I have transport? Do I have a computer? Do I have appropriate clothes for work? Do I have enough money? If I want to start a small business, do I have enough money to, like, register for an ABN? So those really practical barriers. We also heard about capability barriers in terms of skills. So people who have been out of the labour force for 10 years or more were, probably rightly, conscious that their skills might be out of date, and so there were questions about what training they might need. And they were also worried that they didn't have what it takes, or didn't know what it would take, to look for, apply for, or interview for a job successfully. And then we just heard concerns about generally falling out of the loop, like, "It's been 30 years since I had an office job. What are offices like these days?" You know, "What programs am I going to need to use?" So women tended to remain out of the workforce because of these barriers to re-entry. And what really struck us about these findings was that these stories really challenged, I think, some assumptions that we might have had about people who have been out of work for decades. This really largely wasn't about not wanting to work. And re-entry was really so much more complicated than just deciding one day, "Hey, I'm going to look for a job." There was so much more to it than that. And the experience of work, looking for work, and thinking about work was so interconnected with people's identities and really needed to fit around their complex lives. So for the work we did with young women who had their early careers disrupted by COVID, this research wasn't actually focused on participation, necessarily, as most of the women we spoke to had lost work but had since re-joined the workforce. Instead, it was focused on early indicators that they might be on a lower job trajectory long-term, associated with lowered confidence and aspirations for their career. We set out to talk to these young women about their careers, but really where the conversations ended up going was the mental health challenges that they'd experienced during the pandemic. In our, you know, small sample, mental health challenges during the lockdowns really were the norm. And many of these young women told us that they now felt really burnt out, that they hadn't really resolved these issues, and that this had some really clear implications for their career as well.
It's just really hard to confidently move forward in your career if you're struggling with anxiety or depression. We did talk to them about their job losses during the pandemic. Another finding was that, in most cases, these job losses were in casual jobs, so they didn't necessarily represent their real career path. For many of these women, actually, the most disruptive thing was the move to online learning. They told us about feeling quite unprepared to enter the workforce. They felt they had gaps in their practical skills. They felt they had gaps in their professional networks. Often they hadn't been able to undertake internships or work experience that would've really helped set direction for their career. So where to from here? We've seen, I suppose, a lot of benefits of conducting this kind of broad, early qualitative research to really bring into focus the complexities of people's lives, how work is interrelated with so many other factors, and, like I was saying before, what it takes for government interventions to really make a genuine difference in people's lives, and usually that is about thinking about how these things interact with each other. A surprising benefit we've found from conducting this sort of early, broad research is that we've essentially been able to use it as a bit of a bank of findings or insights that we've been able to draw on for quick-turnaround advisory work. I think a key challenge for evidence-based policy development is that these things often move at, like, an insane cracking pace and there's not time to do new research, there's not time to go out and do these fact-finding missions. So for this project, we've been able to use these findings to inform our advice across a range of policy areas, including encouraging more gender-equal parenting arrangements and more tailored employment services. So that's it from me, for now. And I'll pass back to Amelia.
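A side note for readers: the HILDA/MADIP analysis Shae mentioned earlier is about linking life events, or "triggers", to women leaving or re-entering the labour force. Here is a minimal sketch of that kind of analysis on simulated data. Everything in it is an assumption for illustration: the event variables, effect sizes, and model are invented, since BETA's actual variables and specification aren't described in this session.

```python
# Minimal sketch of linking "trigger" life events to leaving the labour force
# in person-year panel data. All variables and effect sizes are simulated
# assumptions; the real HILDA/MADIP variables and BETA's models are not shown.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 20_000  # person-year observations for women currently in the labour force

# Two example trigger events observed in the current wave.
df = pd.DataFrame({
    "birth": (rng.random(n) < 0.05).astype(float),
    "health_shock": (rng.random(n) < 0.03).astype(float),
})

# Simulate exits: a baseline chance of leaving, raised by each event.
exit_prob = 0.05 + 0.25 * df["birth"] + 0.15 * df["health_shock"]
df["left_lf"] = (rng.random(n) < exit_prob).astype(int)

# Logistic regression of exit on the trigger events. With real panel data,
# the outcome would be built by comparing labour force status across
# adjacent survey waves rather than simulated directly.
X = sm.add_constant(df[["birth", "health_shock"]])
result = sm.Logit(df["left_lf"], X).fit(disp=0)
print(result.params.round(3))  # positive coefficients flag events that predict exits
```

The same structure, run in reverse on women outside the labour force, would surface the events that predict re-entry, which is the second purpose Shae described.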
- Thank you, Shae, for that great presentation. Really interesting to open the hood on such a complex policy problem. Before I hand over to Andrei, I might just touch on a couple of specific methodological questions that have come up while you were talking, Shae, so I'll raise those now. The meatier questions I'll leave for the panel discussion later. But one person asked whether we used the quantitative research to inform the qualitative research in this project.
- Yes, we did. We did some desktop research early on where we were looking at ABS data, and that really informed, like, which were the groups that we should be focusing on, where the gaps were. So that quantitative analysis of datasets informed the design of our methodologies. Yeah, if that answers the question.
- Yeah, thank you. Yes, it did. And the other question is about what our thinking was in grouping peers together instead of having mixed groups. I think they're referring to the qualitative research.
- Yeah, definitely. So this is, I think, a fairly standard approach for qualitative research: if you're doing group discussions, you do want these to be with peer groups, where people feel like the people they're talking to are not going to be judging them and are not going to have, like, wildly different experiences that would make them feel unable to share what's going on for them. So on sensitive topics, you might think about, like, gender-matched peer groups. Or, in this case, we were like, "Oh, man, it would be pretty rough if we put young women in Melbourne in a group discussion with women in WA," and being like, "What was your experience of lockdowns?" and the women in Melbourne being like, "Oh my god, it was two years, it was horrible. I wasn't able to leave my house." And the people in WA like, "Oh, no, we were totally fine." So it's just about cultivating an environment in a focus group setting where people aren't going to feel shut down by what other people are saying.
- Thanks, Shae. Fantastic. Well, I might move now to Andrei to talk a little bit about initial teacher training.
- Thanks, Amelia. Alrighty. So this is a project we conducted throughout 2021 around attracting high-achieving candidates into initial teacher education, or ITE. Now this project commenced on the back of the announcement by the then Minister for Education, Alan Tudge, in about April 2021, to establish the Quality Initial Teacher Education Review. And the focus of that review, and, I guess, the broader suite of work, was to improve student outcomes across a range of different metrics, including reading, mathematics, and science, by 2030. So the Quality Initial Teacher Education Review, or QITE for brevity, had two sort of broad questions it was looking to answer. The first one was, how can we best attract and select high-quality candidates into initial teacher education? And the second one was, how can we best prepare them to become effective teachers? BETA was involved mostly with the... Sorry, could we skip back to the first question?
- Oh, sorry.
- So we were mostly involved with the first question of how do we best attract and select high-quality candidates. And the graph below is lifted straight out of the discussion paper for the QITE. Essentially it shows entrants into undergraduate ITE courses and their ATAR at the time of entry, from 2019. And what the graph shows is that only 39% of entrants had an ATAR of 80 or above. And so the view at the time was that we needed to increase the proportion of high-achieving young students, and, indeed, mid-career professionals, entering teaching from those more accomplished academic backgrounds. All right. So in terms of where we came in on the policy cycle, it was after the stage Shae's project was looking at. We came in at the policy design stage. Essentially, the problem definition was already established. What the QITE panel asked us to do was to look at the specific incentives and approaches we could use to attract these high-quality candidates into teaching. The panel was given about, I believe, nine months to come back to government with their response. So we had a fairly specific timeframe and a very specific question. And so in order to do that, I'll jump into the methodology, we ran a survey with about 1,900 Australians. That survey was made up of about 500 young high-achievers, students with an ATAR of 80 or above, and 1,300 mid-career professionals. The survey had a discrete choice experiment component. This is a methodology that was fairly new to us, but has been around in the marketing and research sphere for a while. For us, this was the first time we used it in the field. And the advantage of this approach is that it essentially lets you compare different incentives and the relative importance of different attributes compared to others. So to give you an idea, we were testing with participants across four different attributes: different study incentives, work incentives, starting pay, and top pay. The way that discrete choice experiments work is that participants are presented with a couple of different packages. One package has a specific set of different incentives. So, for instance, and I hope you guys can read it at home, this particular package had paid work in a school throughout the entire period of study, the work incentive was guaranteed ongoing employment as a teacher, the starting pay was $65,000, and the top pay was $115,000. Package two was slightly different and offered different incentives. And package three was neither of these packages: you could elect not to be a teacher at all. So we asked participants, which one, if you had the choice, would you pick as a career? We then asked them to repeat this process up to seven times. And at the end, we calculated which incentives were most commonly picked, which ones sort of dominated and made people pursue a career in teaching. So the results were quite interesting. On the right-hand side, to quickly explain what we're looking at here: the bars across the four different categories of incentives represent the change, in percentage points, in the probability of choosing teaching as a career. So if a particular incentive was offered to you, this is how much more likely you would be to pursue teaching, essentially. A couple of interesting findings here: perhaps unsurprisingly, as we increase starting and top pay, we do see more people pursue teaching.
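For readers curious about the mechanics, here is a minimal sketch of the conditional (McFadden) logit analysis that typically sits behind a discrete choice experiment like this. It is an illustration only: the attributes, levels, coefficients, and sample sizes below are invented stand-ins, not BETA's actual design, and a real analysis would also handle respondent-level variation and standard errors.

```python
# Minimal sketch of a conditional (McFadden) logit for a discrete choice
# experiment, using only numpy and scipy. Everything here is illustrative;
# the attributes, levels, and "true" coefficients are assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_tasks = 500 * 7  # e.g. 500 respondents x 7 choice tasks
n_alts = 3         # package 1, package 2, "neither" (opt out)

# Illustrative attributes per alternative: scholarship dummy, guaranteed-job
# dummy, starting pay in $000s. The opt-out alternative is all zeros.
X = np.zeros((n_tasks, n_alts, 3))
X[:, :2, 0] = rng.integers(0, 2, (n_tasks, 2))             # scholarship offered?
X[:, :2, 1] = rng.integers(0, 2, (n_tasks, 2))             # job guarantee?
X[:, :2, 2] = rng.choice([55, 65, 75], size=(n_tasks, 2))  # starting pay ($000s)

# Simulate choices under assumed "true" preferences plus Gumbel noise,
# which is exactly the data-generating process the logit model assumes.
true_beta = np.array([1.2, 0.5, 0.04])
utility = X @ true_beta + rng.gumbel(size=(n_tasks, n_alts))
choice = utility.argmax(axis=1)

def neg_log_lik(beta):
    """Negative log-likelihood of the conditional logit."""
    v = X @ beta                               # deterministic utility
    v = v - v.max(axis=1, keepdims=True)       # numerical stability
    log_p = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -log_p[np.arange(n_tasks), choice].sum()

beta_hat = minimize(neg_log_lik, np.zeros(3), method="BFGS").x
print("estimated coefficients:", beta_hat.round(3))

# Dividing an incentive's coefficient by the pay coefficient expresses its
# value in $000s of starting pay, which is how incentives get compared.
print(f"scholarship worth about ${beta_hat[0] / beta_hat[2]:.0f}k of starting pay")
```

Run as is, the estimated coefficients land close to the assumed `true_beta`, and the final line mirrors the kind of comparison in the findings below, where an incentive's appeal is expressed as an equivalent amount of starting pay.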
In the study incentives and work incentives part, we see that most study and work incentives get valued higher than a $15,000 increase in starting pay, which was quite an interesting finding. You see the $30,000 scholarship was the most popular incentive across both study and work options. And, perhaps not surprisingly, it was almost as effective as increasing top or starting pay by the same amount. For the mid-career professionals, we see a largely similar story. However, if you look specifically at study incentives, we see there were three strong contenders for what people found attractive. We saw that paid work throughout study, a $30,000 scholarship, and mortgage or rent assistance were the most popular, or most enticing, incentive options. Similar to young high-achievers, mentoring wasn't seen as a particularly strong incentive. In terms of work incentives, we saw that guaranteed ongoing employment in a nearby school was about two percentage points higher than just generally guaranteed ongoing employment, which equates to around a $5,000 increase in starting pay. So we were commissioned in April 2021 and had a fairly tight timeframe to fit into the delivery schedule of the QITE report, which was due to report back to the government by the end of that year. We ran our discrete choice experiment in July 2021, with our findings being fed back into the review by September 2021. In February of the following year, both we and the QITE review published our findings. The QITE review referenced our work a lot, specifically in the first part, where it was talking about how people find their way into teaching and what they find attractive about a career in teaching. And on the back of this, we also established a two-year partnership with the Department of Education, which we are currently progressing across a few different projects. So if you're interested in further discussion of some of our findings, you can find our report on our website. I think that's about it.
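For readers wondering how a percentage-point difference becomes a dollar figure, the conversion just scales one estimated effect against another. A rough sketch of the arithmetic, where the six-points-per-$15,000 pay sensitivity is an invented figure for illustration, not one taken from BETA's report:

```python
# Converting a probability-point effect into a starting-pay equivalent.
# ASSUMPTION: suppose a $15,000 rise in starting pay lifts the probability
# of choosing teaching by about 6 percentage points (invented for illustration).
pp_per_dollar = 6.0 / 15_000
nearby_school_premium_pp = 2.0  # "nearby school" premium over a generic job guarantee
pay_equivalent = nearby_school_premium_pp / pp_per_dollar
print(f"~${pay_equivalent:,.0f} of starting pay")  # -> ~$5,000
```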
- We're just going to...
- Oh, sorry. I beg your pardon. All right, so for the two-year partnership we now have with Education, we've split our forward work program across problem definition, where we're looking at a few different and new areas for working upstream in the policy cycle to help define the problem, and also, on the back of some of our earlier advice and some of the initiatives being implemented by the current government, we are looking to get involved in the evaluation of certain education programs.
- Fantastic. Thanks so much, Andrei. Now I think we're going to move to a quick comparison between the two projects. But before we do, Andrei, we've had one clarification question. So there's been a question about why we need more high-quality candidates in teaching. People are unclear on the problem we're trying to solve here.
- Sure, sure. I mean, it's a great question. And I don't want to speak out of turn for our education colleagues, but I think the problem of an insufficient number of teachers has been ongoing for a little while. And the efforts of the QITE review were to provide government with additional policy levers, or ideas, for how we can bolster the stock of teachers.
- Great, thank you very much. Now I think we are going to move to... Do we have a slide to compare the two?
- Wait, I think we decided we could probably do that through the questions.
- Okay, yes.
- Okay, fantastic.
- Okay, well, let's move then to the panel questions. But thank you very much to both teams for really fascinating presentations on the two projects. And it's great to see that there are some questions popping up in the Q&A field. So please continue to pop your questions in there. But in the meantime, we're going to start with a couple of questions that I would like to ask the panellists and then we'll move to audience questions. First of all, I want to ask you both, or all four of you, how have these projects been different to typical BETA projects, and how are they similar to other BETA projects?
- Actually, I can-
- Aurore, thank you.
- Yeah, I can jump in on that one. So I worked on the women's workforce participation project that Shae talked about. In our project, we found that the methods that we used were quite similar to other projects that we've done. So we did desktop research, data analysis, interviews, focus groups, and we'd used those methodologies before in previous projects. And we also maintained that behavioural lens in this project. So even though, like Shae said, the particular behaviour was quite zoomed-out, it was joining the labour force, which isn't a single behaviour, it's the product of many behaviours, we could still have that same theoretical underpinning as we were designing and carrying out the research. As far as differences, the breadth and the length of the project were different from many of our projects, at least our advisory projects in BETA. This project has been ongoing for at least a year, and it'll continue to be. And I don't know if that would maybe be a common feature of other upstream work, that it takes a bit longer. And then the breadth meant that there were a number of components. This really ended up being a whole suite of projects, and it required different skills in each of the projects. And so it ended up involving a lot of different team members over a really long time. Maybe another difference is how we shared the findings. So often, in BETA, we have a project partner, we produce a report, it goes to the project partner and sometimes it gets published, but it's mostly kind of that back and forth; whereas with this one, we started out presenting to the project partner, and then they suggested other people we could present to, and then that snowballed into other areas. And we ended up finding that there were a lot of areas in government that are all interested in women's workforce participation. So we talked to people in Education, and who else did we talk to?
- Employment and Workplace Relations.
- Yeah, yeah. So there ended up being kind of this broad interest in the research, which was exciting, but it also meant that the project didn't have this neat, tidy endpoint of, like, "So we've presented the findings, and all those people who were working on it can now work on a different project." It sort of had this long tail. And so I think that was probably a little bit difficult for, like, workforce planning within BETA. So that could be a relevant point for other teams.
- Fantastic. Thanks, Aurore. Andrea, did you have some perspectives on this one?
- Yeah, so it's funny, as you were talking, Aurore, I was like, "Mm, ours was a bit different." So for Andrei and me, and the team that worked on this piece of work, there was a lot that was super familiar in terms of a regular, if there is such a thing, BETA project. It was a really contained and really neat and very, very specific piece of work, which is often where we find ourselves. You know, it drew on this kind of survey and experimental methodology, and we had really clear insights driven from the data, from that experimental element. And as we've increasingly found ourselves over the last few years, it was in a hypothetical setting, which has become a pretty comfortable space for us to be working in. But unlike the Office for Women work, the methodology was completely new. So as Andrei said, new to us, but not new to other researchers. And although Andrei made it sound very simple in his summary, the methodology and the maths that sit behind the DCE are incredibly complex. So that required a lot of internal upskilling for us, which, given the timeframe, just added a layer of pressure to this piece of work. I guess a couple of other differences from our perspective: we were feeding in much earlier in the policy cycle than we typically would, and it was this kind of balance of doing something very, very specific, but not really having any control over what that specificity was focused on. So as Andrei said, the panel came to us with a very clear request to conduct this study, they wanted to test incentives, and the basis of the piece of work they wanted us to design was drawing from a study that the Grattan Institute had previously run. Similar to Shae and Aurore's project, another reflection is that the behaviour, or, like, the decision that we were really focusing on here was big, like it was really zoomed out. A lot of the projects, certainly in my team, have been much more focused on a very discrete behaviour, like completing a form, or buying a product, or how labelling influences someone's purchasing behaviour. But here we were looking at a career decision, which feels very, very far away from, you know, a basic consumer decision. So that was quite a different kind of space, I think, for the team to be in. And then, finally, while we are a behavioural economics team, I really feel like, over the last few years, we've been very much focused on the behavioural side; whereas in this piece of work, we weren't kind of breaking down behavioural biases or looking at models of behaviour change. This was much more on the economic side, where we were really interested in how people value incentives, the trade-offs that they make when they're faced with multiple incentives, and ultimately how they respond to those. So it definitely fitted really comfortably within our remit, but, at the same time, felt quite different.
- Thanks very much, Andrea. Now I want to move to... Like, I think one of the big arguments for BI being involved earlier in the policy cycle is that we bring this more sophisticated understanding of human behaviour. So I want to ask, for both projects, you know, what was that value add in terms of the understanding of human behaviour that we brought to the table, and were there any surprising findings that changed our perspective? Andrei, do you want to kick us off on this one?
- I'll kick off with the surprising findings. I don't think there was a strong prior going into this work. I think the panel came to us with a fairly open mind, asking us, "Look, we have all these policy ideas, and these other ideas have been proposed by other stakeholders in the area; you know, we are keen to find out what works and what doesn't work." That was really nice, actually, sort of having an open field to play with. We did find some what I think are interesting findings, and also, I guess, more sophisticated, more nuanced findings, right? So because we had quite a large sample, in particular of mid-career professionals, we managed to do a few different population cuts. So we could look at, you know, what do younger mid-career professionals value, should they pursue teaching, versus what do older mid-career professionals value? And the answer to that question is, older folks tend to value financial security more, so will be looking for payments during study, rent or mortgage assistance, those kinds of things. So while that potentially may not be surprising, and if you think through those findings, they may appear kind of obvious in hindsight, it was really nice to provide a bit of certainty and clarity around what it means and, I guess, feed that up early in the policy cycle.
- Yeah, absolutely. And without that primary research, we don't always think about different cohorts and how they might react. Absolutely. Thanks, Andrei. Aurore, did you have something else on this?
- Yeah, I guess Shae kind of hinted at one of these when she was talking. At BETA, we try to be working on those intention-action gaps, so thinking about where there are intentions, but then people are not following through. And I think our assumption with the women who were not in the labour force was, you know, as we know, there are plenty of jobs and not enough people to do the jobs, so, certainly, I think our assumption was, if people are not working, it must be because they don't want to be working. Maybe they are taking a break. Maybe they are perfectly financially secure. Like, maybe they just don't want to be in the labour force. And like Shae said, we found that that was not at all the case for the people that we spoke to. And what we found instead was that the key problem was barriers to opportunity. So jobs accommodating people who have caring responsibilities and letting them be flexible, accommodating people with a disability and allowing for flexibility there, being able to see the value in people who've had gaps in employment. And those are really big, messy challenges to overcome, but it's better to know than to not know. And, I guess, if we hadn't done that research, the intervention could potentially have been designed with the assumption that people didn't have the intention, and so there could have been maybe, like, a campaign of, "Go back to work, it's fun," or, you know, some motivational campaign, which would've totally missed the mark and misunderstood the audience.
- Yeah, I really agree. I think, you know, hearing stories directly from people that we're designing policy for really helps us check our assumptions and think deeply about, like, what's the best intervention design. And thinking of the research we're doing with the young women impacted by COVID, I think, particularly, it was just, again, reiterating the mental health impacts of the pandemic and how mental health and career progression are really super intertwined. So it made us think about things like what would an employment services program look like for someone who was struggling with social anxiety following, like, a long period of lockdown. So, yeah, I think there were some surprising findings.
- Yeah, I think another one that also seems to have had a little bit of traction, and maybe, sometimes in hindsight, findings are not surprising, like, oh, of course we should have known that. So in terms of caring, I think often policy can assume that caring is an issue when a person's child is between zero and six; and then at six they go to school, and then you're free to go back to work between the hours of 9:00 and 3:00 and everything's neat and tidy. And one of the surprising findings from this was, like, how untrue that was, and how many people... You know, we had screened out people with children younger than six, and still, like, there was a lot of caring to be done that no existing service outside the home could provide for. So I think that idea seems to have had an impact in a lot of the conversations we've had going forward when we've been sharing the results. That's kind of the one that goes, "Oh, yeah, I guess. Okay, we need to take that into account."
- Hm. I feel like this is a good moment to ask you one of the audience questions, which is, how would you measure success of the women's participation in the labour force project? Seeing as there isn't a specific behaviour, and noting that some women may not be engaging in the labour force by choice.
- Yeah, yeah. I guess success for us was kind of identifying some policy opportunities. And some things have come to light through doing the work and putting the work out there. We've then gotten opportunities to, like, make comment on paid parental leave and, maybe, make comment on employment services.
- Yeah, do you want to add to that, Shae?
- Yeah, I think, look, I think this is a general question for all upstream, for all projects that we do upstream in the policy cycle because, like, you usually don't have trial results. You don't have something specifically that you want to then roll out and scale up. This is about, like, making sure we're focusing on the right things in the first place. So you don't necessarily have some, like, hard data that will show, like, if this has been successful, like we went down this path instead of that path and this is where we've landed, because it is so early. And this is one of those things where you just need to, I suppose, be comfortable with the fact that, like, this is just about trying to... Like, better is good. Like we're just trying to sort of start from a point of, like, let's just try and design the best policy intervention we can. Yeah.
- Great. Thank you very much. I think that's a great segue to another question I've got, which is, what are the benefits and drawbacks of working upstream in the policy cycle versus downstream? Andrei, did you have some thoughts on this one?
- Yes, sorry. I was just replying to a question online. So question three?
- Four.
- Four, yes.
- Apologies.
- That's all right.
- Right, so what I really enjoyed about this project, and, I guess, the broader point here, is you kind of feel like you are embedding evidence-based policy early on in the cycle, which has been, like, a really rewarding experience, and perhaps not something that we get to do that often when we just evaluate at the backend, right? So for this specific project, we provided tangible and fairly compelling evidence early in the policy cycle to inform policy initiatives in the space. The drawbacks of this: as Andrea alluded to, this was a new methodology for us, so it was challenging to, I guess, bring everyone on board at the same time. You know, we had to upskill fairly quickly, and not just the project team but, indeed, our great data and evaluation team who would've done the QA process. So that was challenging, but certainly not insurmountable. The broader point I wanted to make here is that I think there is a question of opportunity cost, of where our value is best applied, right? There are only so many projects we're going to do in a year. So what's been an interesting exercise is asking whether there's greater benefit in getting in early and providing policy advice at the start of the policy cycle, or whether it's better towards the backend. I certainly don't think it's a dichotomy where we have to choose one or the other, but it's an interesting way to approach future work.
- Absolutely. Thanks, Andrei. Aurore, did you have thoughts on this one?
- Yeah, I agree with what Andrei said about that cost-benefit analysis maybe being a little bit different in upstream work than further downstream. I think the breadth of these projects kind of has its own cost-benefit analysis: there's this really great opportunity because the field is wide open, and you can make a steer early on which can have a big impact later down the track. But then, at the same time, it can be hard to focus and target and, like, decide on a cohort. And there can be scope creep 'cause, you know, you could kind of do anything. And I think there's that impact issue: at least in our project, there wasn't a clear policy home for it in the beginning, and so it was, like, high risk, high reward. It could have ended up having no home, or it could end up finding its home. And so that's, you know, an analysis that you have to do. And I guess, maybe, also being in those high-priority areas... So we had a research project with a timeline that was moving along, but because it was high priority, which is great because you have an opportunity to, like, make an impact, opportunities would come up, you know? The secretary wants to know about X. Can you provide input? And we're like, "Oh, but we haven't written it up, we haven't completely finalized all the results, we're not 100% sure, and we haven't crossed all the t's and dotted all the i's." And I think, as a researcher, that feels really threatening, and you kind of just have to be willing to do the best you can and provide what advice you can when it's asked for, which can be uncomfortable, I think, when it's new.
- Yeah.
- I think another point for us, if I can jump in-
- Of course.
- Is that the piece of work that we started off with in the teacher space, while it was really specific and very clear and compartmentalized, what we haven't talked about, because we are not really in a position to right now, is that it's grown into a much larger suite of work. And so being further up in the policy cycle has enabled that to happen. And that's a huge benefit, because I think the diversity of work that we have in our team is what keeps everyone so excited, but, at the same time, you constantly feel like you're trying to become an expert on a new policy topic. So being able to double down in that policy space and build on the work and the knowledge that we've got, and do more and more work, albeit sort of in slightly different areas and with different trajectories, has been really rewarding, building on that existing knowledge and opening those doors as other ideas have popped up. And that doesn't always happen when you're right at the pointy end of a policy implementation.
- Yeah, I agree.
- That is really satisfying. Like, I feel like, really, now, I have something to say about this issue; whereas, sometimes, if you work on something for just a couple of months, you don't always feel that way, yeah.
- Hm. Great. Thanks very much, guys. So my last question before we switch to the many audience questions there are there, thanks so much, guys, for being so engaged, is what advice would you give to a BI team who was looking to run a project earlier in the policy cycle? Andrea, do you want to start us off?
- Sure. So I've got a couple of thoughts here and they're a little bit jumbled, so you'll have to stick with me. The first is to be prepared to be okay with ambiguity. Sometimes we joke that ambiguity is a bit of a swearword in my team, because there are a few people who particularly don't like it. Even though, on face value, the piece of work Andrei and I have run through today was incredibly clear, our experience is that in a lot of these upstream projects you have to be okay with that ambiguity. Particularly as I reflect on the work Shae and Aurore have done: they had this huge, big problem and were trying to unpick exactly what it meant and where there might be a role for us, and that's completely filled with ambiguity. So going in with that expectation, and feeling comfortable with it, is super important. Of course, for the BI practitioners on the line today, this is nothing new, but it's not something that goes away when you go further upstream. My other big lesson, and this was certainly a huge takeaway for my team, was to be willing to take the opportunity when you get it. I'll admit the opportunity to work with the QITE expert panel on this discrete choice experiment came very quickly, out of the blue, and it landed on our desk as, "You guys have to do this, it's super important, it needs to be done now, there's no time to pause." So we just had to take the plunge and jump into it. Reflecting back to the April 2021 version of me, there's no way I would've predicted we'd end up where we are today. So be prepared to jump in when you see an opportunity. Maybe it will stay a contained initial project, but maybe it'll grow into something a lot bigger. Being willing to put your foot in the door, or jam your foot through the door, and see what happens is really important. One thing we didn't need to do in this project, but which I think is still really important in the space more broadly, is to be prepared to have some really early and honest conversations about feasibility, if you can see problems looming. For us, this piece of work was very contained; we certainly added our two cents' worth on which incentives we should test. Say, for example, the panel had wanted to test an incentive where there was a lot of evidence to suggest it just wouldn't work: we would've put that evidence forward, with the reasons why. So even though we've promoted in this presentation the value of being further up the cycle to explore options more broadly, if the behavioural science literature and evidence suggest something's not a good idea, be prepared to throw that on the table. That demonstrates the value and the rigor that our research, our opinions, and our approaches have. And the final point, which is related to the last one, is to be really bold about putting forward the value that our work, our methodologies, and our approach can have in this space.
Certainly, we've tested ourselves, and I mean everyone in the team, including those who aren't in the room with us today. What this has really shown us is that our approaches, our methodology, the way we think about problems and break them down, have real value. And I think we should be proud to promote those services and that value perhaps a little more than we do.
- Thanks, Andrea. Aurore, did you have some views on this one?
- Well, I agree with Andrea about seizing opportunities. For people who are already in a BI team, potential partners would know to come and approach them at the implementation stage. But if a BI team is interested in getting involved early on, that probably requires some government literacy: going out and finding the documents that tell you what the government's current highest priorities are, understanding them, knowing where that work happens, and maybe even approaching partners instead of waiting to be approached. Another thing I was thinking as Andrea was talking, going back to an idea we've discussed before, is thinking about what the policy landing pad is. In upstream work, it might not be clear early on; if the work is exploratory, asking what the problems and challenges are, there might not be a specific place for you to input when it comes to policy. So keep that in mind, keep your eyes open, and as the process goes on, try to find that focus point, yeah.
- And being ready to pounce when new opportunities come up, and saying, yeah, maybe our research didn't start out to answer this question, but maybe we've got something to say on it and we can use these findings to inform other pieces of advice.
- Yeah, yeah.
- Yeah, okay.
- Yeah, I think that's a really important point, the kind of proactive nature of the work. And that's one of the benefits of working in a BI team within government instead of being, you know, outside government.
- Yeah.
- So a related question, which maybe you've partly answered just now. One of the questions online is: how did it come about that there was enough time for this kind of research to inform the policy problem? It seems like urgent deadlines are often a huge barrier to upstream BI.
- Yeah, so, look, maybe the easy answer is that nothing was waiting on us to finish this research. Government has moved forward really boldly and strongly on women's workforce participation. I mentioned earlier making childcare more affordable, the moves towards expanding paid parental leave, and encouraging a more gender-equal split of parenting responsibilities. All of those things are moving forward; not everything is waiting for the outcomes of this research, so it's happening in parallel to other work that's going on. And, like we said, the research is sitting there ready for us to pounce on new opportunities as they come up. Because you're right, we can't say, "Oh, there's a new policy proposal. Well, hang on, let us take six months to run some qualitative research to inform that." That's just not possible. But having this bank of research to draw on is really helpful.
- I think when the idea first came to BETA, the intent was for BETA to feed in some input before a budget.
- Yeah, it was.
- It was before a budget cycle, and that was six weeks away. Maybe someone could have put together some thoughts as best they could, but there was a really difficult decision to say, "We can't do this in six weeks; we need to push it out." And then it ended up being a lot later. If it wasn't going to be this budget, then maybe the budget a year from now.
- Absolutely. And some problems are obviously enduring problems that aren't going to be solved within a six-month timeframe, so investing a bit more in really understanding the problem is hugely beneficial. Andrea and Andrei, I wondered if you had anything to add on that question about finding space?
- It's a really interesting question. Partly, and this will be no surprise to the audience, we chose these two projects for a reason: they're so starkly different in their design and problem definition. In part I'm probably just reflecting what I observed as someone in BETA watching you undertake all of this work, but one thing that stands out to me is that the problem wasn't clearly defined when you first went in, so that gave you the scope to attack it quite broadly, and those two cohorts came out of that initial research. Because you didn't go in with really strong priors, you were quite open to what you heard, and I think that approach worked really well. In terms of the timeframe it takes and the trade-off with delivering results, because the problem wasn't defined, maybe you didn't have those drivers; whereas for us it was the complete opposite: we had to feed in at this point, come hell or high water, and that was the only choice we had. So I probably haven't added a lot to the conversation other than to say I think this is a great example of time and place, and of us saying, "Yes, we're happy to dive in, take a risk, and see what happens." And the findings we've obtained have been of interest, and you've been able to, as you say, dip back into that primary research to answer other questions and provide further insight. It's had that lovely flow-on effect that, again, I don't think we could have predicted, but which has been an enduring feature of that piece of work. Do you have something a bit more articulate to say?
- No, I'm not articulate. My sense is that, especially in our case, when it was quite time-sensitive, it comes down to a bit of scope-setting or goal-setting right at the start with policymakers. Because we know a bit more about how long this type of research can take, it's up to us to communicate what they can expect at the end of our research, and to be clear upfront: "This is what we can deliver. If that's still of value and of interest to you folks, we can proceed." I guess it's about being realistic about deadlines.
- Yeah. No, absolutely. And those are two really contrasting approaches. On the one hand, a clearly defined scope, doing quick, agile research in a short timeframe to answer a very clear question; on the other hand, trying to add to the definition of a problem that was not clearly specified when it came to us, where we had confidence that it was an enduring problem and that any foundational research we did would be of benefit. And the other thing that happened is there was a change of government while that project was continuing, and it continued to be a priority: it was a priority of the former government and of the new government. You could certainly imagine scenarios where that wouldn't be the case, but this was a good policy problem to choose. There's a lot of interest in specific questions about teacher training, and I see Andrei's answered that online. Andrei, did you have anything you wanted to add to your response?
- Look, no, I think it's a great question a few people have raised: how do we define high-quality teachers? I don't think we can, and certainly I'm glad it wasn't part of this research. We had a pretty clear remit: if we want this cohort of people to enter the profession, what can we do to shift the dial? And we did our best to answer that. We were very mindful not to enter the conversation of what makes a good teacher.
- Absolutely.
- I think it's an interesting point, though. I agree we shouldn't comment on it, but if you look back, a copy of our report is available on the website with the graphs Andrei showed earlier, one of the incentives we measured was whether initial teacher education courses had a minimum ATAR of 80 or not, and whether that would incentivize potential students to take up the course, on the belief that it would attract students with a similarly high ATAR to their own. That was one of the least effective incentives, so I think that sheds some perspective on the discussion as well, at least insofar as high quality is quantified by students' ATARs.
- Fantastic. Thanks, guys. So the next item online is more of a comment than a question. Somebody said the presentations really show how important it is, particularly as public servants, to actually engage with the public you serve, even if it's messy. We would wholeheartedly agree with that sentiment, and it's a good plug for qualitative research, as well as for some of the more tightly-defined quantitative instruments, like the surveys we conduct. Another question about qualitative research: the qualitative work was described as time-consuming, not easily contained, and not neat. How do you sell that to your clients?
- Well, we were very lucky to have an extremely supportive project partner in the Office for Women, who were similarly keen: "Let's go talk to real people. We can't design policy for people without engaging with them." So it wasn't a hard sell; in fact, they came to us with the request to conduct this type of research. And, I think we've got some Office for Women people on the line, but what was really great is that, because we conducted the research ourselves, and I think there was a question in there as well about whether we did the qualitative work ourselves or outsourced it, we ran the sessions ourselves and invited team members from the Office for Women to sit in, observe, and take notes. That really drove home these individual stories and how policy changes can actually play out for individuals. It was great to then go back and chat with our partners at the Office for Women, and they were like, "Oh my gosh, that one woman who had those experiences." I think it made the policy design process really real for them. So we didn't have to sell it, but I can see how it might be a hard sell for someone working to a very, very tight deadline.
- Absolutely. Now I want to return to one of the questions we had early on, which is, how can policy professionals in the APS adopt these BI components in the policy life cycle? 'Cause I guess this is a question about our advice for policy advisors who may not have access to a BI team in their department to conduct this work for them. Shae, do you have thoughts on that?
- Yeah, sure. It might sound a bit obvious, but the main advice I'd give a policy team, and this is maybe a counterpoint, because we were just talking about our research that didn't have these very specific behaviours, is to start by thinking deeply about the individual, specific behaviours you want to see happening. Often in policy land we're faced with massive, intimidating problems or goals, and it can be really hard to see how you get from here to there. So a really helpful exercise is to ask yourself questions like: what behaviours are we seeing now in this space? Why are these behaviours a problem? What behaviours do we want to see instead? Why are people not doing those behaviours already? What that reflective questioning process does is help you map what evidence you already have. You might find you can answer a question because you have existing evidence about what's currently going on; or you might find you can't, because you don't actually know what behaviours are occurring. That reflective process helps you figure out what you already know, what you need to go out and find, and how to narrow your focus in on how this might play out for individuals.
- Yeah, I'd absolutely agree. That behavioural lens is really crucial to apply to policy development processes where human behaviour is a key part of the outcomes. As Shae said, a big part of it is asking the right questions and interrogating the evidence, and, I'd also say, not being shy about going out and collecting more evidence yourself, or commissioning a researcher to collect it, to fill in some of the gaps. It doesn't always have to take a year; sometimes it can happen quite quickly. So that real empirical focus: the behavioural lens, asking the questions, and then, of course, we're big advocates for testing and evaluating solutions during the implementation phase as well. So the other advice I'd give policymakers is to create some space during the policy design phase for testing, iteration, and evaluation during implementation.
- Yeah, 'cause it's possible there's not time to do all the research before the policy needs to be designed, but you can do the best you can in the time you have before the initial design, and then plan for further research afterwards to see how it's working: who it's working for and who it's not working for, trying to identify your target population and who you're missing, and whether there's a way to catch those people too. And I'll just add a plug: on the BETA website, there's a behavioural discovery tool. I find it helpful when I'm doing my job, so it might be helpful for others. It talks you through the questions Shae was describing. What is the problem? Is there a behaviour? Who's doing the behaviour? Are they motivated to do it? Does it benefit them or not? It helps you step through those questions to identify, if you were going to do research, what type of research you might want to do.
- Awesome.
- And if it's a cracking problem, then let us know, yeah?
- Some of our greatest projects have started off with an email from a policy team saying, "Hey, we're doing this thing and we wondered whether you would be interested." So even if we don't have capacity to take it on, we might be able to point you in the right direction or give you some quick advice. Or it could turn into a BETA project that we'll be presenting at BI Connect one day.
- Absolutely. Good plug.
- So there's another question here about whether, in the reporting phase, we overlaid behavioural theories to explain the why behind the insights that came from the research. Andrea and Andrei, I wonder if you have any reflections on that?
- Not for this project, I don't recall.
- Yeah, we didn't for the DCE project, but it's a really clever question, and normally it is something we would do. Even if a project comes to us very clearly, where the intervention has almost already been conceptualized and all we're doing is operationalizing it and BI-ing it, we would-
- Sorry, BI as a verb just made me laugh.
- Yeah, we would always sort of go back and then really carefully break down what the relevant behavioural biases are and where we see them playing out in that policy space. But Andrei's right, in the incentives work, we didn't do that.
- Thanks, Andrea.
- We did-
- Sorry, yes?
- Yeah, so in our work, I wouldn't say we applied a model from the beginning; it was more that the results emerged and mapped onto a model, the COM-B model, and so it became a useful way of talking about the results. So capability, opportunity, and motivation lead to the behaviour, and that's how we ended up talking about our research with the project partner and with stakeholders, yeah.
- Mm, and I think that was a really useful organizing framework for a really kind of complex set of qualitative findings. So that, yeah, overlaying the model onto the findings, I think, was really helpful.
- So there's another question about qualitative research. How did you decide how many people to include in the qualitative research? How do you choose between individual interviews and focus groups?
- That's great.
- Those are two really good questions that I could probably talk about for an hour, but let me take the second question first. The reason we did interviews with the women who'd been long-term out of the labour force is that we realized we wanted to talk to them about a huge span of time and really get into depth about the details of their experiences, and we anticipated that could get quite personal and sensitive. Individual interviews are the best methodology for that type of in-depth mapping of an individual's experiences and life. For the young women impacted by COVID, honestly, our initial instinct was to do interviews there as well, because again we wanted to understand their individual experiences. But we realized, as we were going through our ethics processes, that there was potentially a power imbalance: if we're talking to an 18-year-old about losing their job in a cafe, and we're sitting in the Department of the Prime Minister and Cabinet, that could be quite intimidating, even though we don't think we're personally intimidating people, and it could colour their experience of the research. Also, the recency of the experiences, and, as we said earlier about peer groups, the fact that the women we wanted to talk to all had quite similar experiences, they'd all gone through these lockdowns at around the same time, meant that sharing and relating to each other's experiences lent itself to a focus group methodology. On the sampling question, which we do get a lot around qualitative research: the number of people you talk to, and your sample frame in terms of who you talk to and how you organize it, is a little more of an art than a science, to be perfectly honest. There's nothing wrong with quite a small qualitative sample; you don't need to hit a magic number to be able to present qualitative findings. What we do is start by thinking about the logical ways to divide up a qualitative sample, and make sure we've got enough people from each type of group that we're not relying on individuals to speak for a whole group. That's particularly true when you're thinking about people from diverse backgrounds. You don't want to say, "Oh, we've included one person from a particular cultural background," and have them speak for their entire cultural group. That's really tricky and we don't try to do that. So sorry not to give you a specific answer on the sample size question, but it can also come down to resourcing: qualitative research is time-intensive to conduct, so you can't do hundreds of interviews.
- And sometimes, if you're doing it yourself this is easy, but even if you're outsourcing it, you can agree from the beginning: say we set out to talk to 20 people. If, at the end of those 20, we still feel like every single person has a totally unique answer to the question, we'll do another 10 and then reassess. So sometimes you can take it in a staged way. Yeah.
- Absolutely. Thanks, guys. And one final question on qualitative research. How did you make sure that the behaviours you've captured in qualitative research represented the true and unbiased behaviours of the cohort you're studying?
- Ooh, okay.
- Yeah, that's a good question. Sorry, I'm just trying to think of what I would say. I think what we were careful not to do is ask people to talk hypothetically about how they're going to behave, because we know that's incredibly fraught and not very valid to do qualitatively. It's much more valid to have people talk about past behaviours and reflect on previous experiences. So we had a few techniques, like prompts in the way we approached the questions, to get people to think back to previous experiences and make their past lives more real, so we could then draw comparisons between where they started and where they were now.
- Sorry, so-
- I know, actually-
- So something people are not great at is explaining, like, why they did something.
- Yeah.
- So you try to avoid asking somebody why. Instead of saying, "Why did you leave the labour force?" you can ask, "What was happening in your life when you left the labour force?" That gives you a more objective picture of what was happening, rather than the story they've created about why they made a decision.
- Yeah, exactly, to avoid that post-rationalization: "Yeah, I made a really conscious choice." Often we know it's much more complicated and messy than that.
- Fantastic. Thank you so much, guys. Well, I think we're very close to time, and we've had a really rich conversation. Thank you so much to all of the panellists for bringing your fantastic project experience. I might just invite you to make any final comments, if anybody has anything they want to add.
- Nothing from me.
- Nothing else.
- No? Okay.
- Thank you.
- Well, thank you to all the panellists, and thank you very much to everyone joining us. It's fantastic to see the level of engagement, both through attendance and through the enormous number of questions in the chat, and it looks like many of those questions have been answered there. If you're still curious or want further answers, please feel free to drop us a line at beta@pmc.gov.au, our generic mailbox, and we'll get back to you. Otherwise, I'd encourage everybody to have a look at our website. We have a publish-by-default policy: we publish all of our final reports there, and there are lots of learning materials for people interested in learning more about behavioural insights. Thanks very much, everybody. Have a fantastic festive season. See you later.