The ThinkND Podcast

Evidence Matters, Part 10: After the Evaluation

Think ND


Episode Topic: After the Evaluation (https://go.nd.edu/e5f25d)

Can a bus pass prevent a jail sentence? Listen in to discover the “myth-busting” power of co-discovery. Join King County, Notre Dame’s LEO, and Stanford’s RegLab as they explore how high-stakes partnerships transform rigorous econometrics into equitable, real-world policy change.

Featured Speakers:

  • May Lynn Tan, California Association of Food Banks
  • Daniel E. Ho, Stanford University
  • Becky Elisa, MBA, Sound Transit
  • Dr. Eyob Mazengia, Public Health Seattle and King County
  • David Phillips, University of Notre Dame
  • Matthew Freedman, University of California, Irvine
  • Maria Jimenez-Zepeda, King County Metro
  • Kimberly Dodds, King County, Washington

Read this episode's recap over on the University of Notre Dame's open online learning community platform, ThinkND: https://go.nd.edu/19c498.

This podcast is a part of the ThinkND Series titled Evidence Matters. (https://go.nd.edu/28b651)

Thanks for listening! The ThinkND Podcast is brought to you by ThinkND, the University of Notre Dame's online learning community. We connect you with videos, podcasts, articles, courses, and other resources to inspire minds and spark conversations on topics that matter to you — everything from faith and politics, to science, technology, and your career.

  • Learn more about ThinkND and register for upcoming live events at think.nd.edu.
  • Join our LinkedIn community for updates, episode clips, and more.

Welcome And Series Goals

Speaker 1

All right, well, let's get going. On behalf of King County and, uh, the Lab for Economic Opportunities at Notre Dame and Stanford's Regulation, Evaluation, and Governance Lab, RegLab, I wanna give a big welcome to all of you. This is the third event in our series called Evidence Matters, and it's great to see all of you here joining us for this discussion. I'm Carrie Hawk. I'm King County's Evidence and Impact Officer. Uh, this is my third day in a new role, where I'm focused countywide, and I'm very passionate about advancing racial equity through bringing together the expertise of our staff, our research partners, and our community. I wanna give a shout-out to Jenny Garris, who is behind the scenes here, but has been very much front and center in helping us plan and pull this event off. So thank you, Jenny. And our intent with this Evidence Matters series is threefold. We're working to build skills and knowledge that support a focus on equitable impact and community outcomes. We are celebrating our successes in our evidence-based practice, and you'll hear a lot of that today, and sharing our challenges as well, and I hope you will hear some of that too. And we're building a community of practice so we can learn from each other. And I'm really excited about this session in particular, which is focusing on how we build great researcher-practitioner partnerships. Back in October, um, at the end of October, we covered how you build partnerships in the very early phases of an evaluation. Today we're following up on that discussion, and we're here to talk about how we sustain partnerships between researchers and practitioners after evaluation results are in and over the long term. So we're really supporting a cycle of continuous learning. King County is really fortunate to have forged relationships with the researchers you're gonna hear from today, with the Lab for Economic Opportunities, or LEO as it's known, at Notre Dame, and, uh, Stanford's RegLab.
And we've done that across a really broad range of projects, dating back several years now. And a big marker of our success is not that we've completed an evaluation and we've got the results. That's all really great. But the more important marker of our success, and how we're building evidence practice, is that we're sustaining those partnerships to support that continuous learning. We learn from our evaluation work, we generate, you know, a different way of doing work based on those results, and then we generate a next set of questions for deeper learning and improvement. And I think you'll really hear that from all of our participants here today. So on our agenda today, we're gonna start with a moderated panel conversation highlighting four projects where we've got the results of the evaluation, and we're gonna use those projects to really draw out the lessons learned and try to bust some myths about what it means for researchers and practitioners to partner together. And we're going to look at how important it is that we work together, not just in the process of the evaluation itself and collecting the data and all of that, but in really analyzing and making sense of the results, and particularly surprising ones. I'm really excited, we're gonna talk about a null result today, because I feel like we don't give those enough attention; how we work together in disseminating what we've learned so that the knowledge we've built influences decision making; and then how we engage together in that sustained learning practice. So that's what our panel's gonna focus on, um, drawing those lessons out from the four projects you'll hear about. And your participation, um, here is really important to us. We are planning to reserve plenty of time for your questions and your observations. So as you're listening to the moderated panel, um, please drop your questions or observations into the chat.
And we'll, um, Jenny and I will be picking those up and queuing them up for questions in the Q&A session. Uh, we are recording this, because we've, you know, often had a lot of interest in our past events and people wanting to, uh, listen after the event occurred. So just note that we are recording. So let me now introduce you to our moderator, May Lynn Tan. May Lynn is Assistant Deputy Director at Evidence for Action, and Evidence for Action is a national program of the Robert Wood Johnson Foundation. King County is a proud recipient of a grant from E4A, and I'm really excited to have May Lynn here as our moderator because at E4A she leads, uh, technical assistance for applicants, uh, to the grant program. And so she has really developed a wealth of experience on how researchers and practitioners work together and has been incredibly helpful to us in King County in making our work a lot better. And as well, May Lynn and I ourselves have been in a sustained partnership for several years now, and, um, I've learned so much from her. So thank you, May Lynn. Thank you for moderating this discussion today, and I'll turn it over to you.

Court Appearances Experiment

Speaker 2

Thank you so much, Carrie and Jenny, for organizing this series, um, and also for inviting me to moderate today. Yes, as research funders, we always encourage partnerships for research because we see them as more likely to generate evidence that's actionable and that can address root causes of inequities. But it's so rare to have the opportunity to hear firsthand what these partnerships look like in the real world. And not only, as Carrie said, what success looks like, but also, um, what the challenges have been. And today we'll get to hear experiences from not just one, but several different teams. And what's even more unique about this session is that we get to talk about what happens after the evaluation, or more accurately, probably after the initial stages of the evaluation, which is a stage of the partnership that rarely gets a lot of attention, but it really does serve as a springboard for future actions. So I really commend Carrie, Jenny, and your entire teams on highlighting this important phase of the work for today's session, and I'm really excited to be part of it. So with that, let me kick things off by introducing our first panel. We are gonna have a presentation about the court appearances experiment. Matt Freedman is a professor of economics at UC Irvine, and he serves as the equity advisor for the UC Irvine School of Social Sciences. Dan Ho is a professor of law at Stanford Law School and is also a professor of political science, and they are two of the investigators on this experiment. And Carrie is gonna represent the practitioner side of this partnership. So Dan, to kick off, could you give us a quick overview of the intervention and the experiment and the problem that you were trying to solve?

Speaker 3

Sure. Well, thanks so much, May Lynn and Carrie, for organizing, uh, this event. Over the course of many years, I've learned a tremendous amount from the county, and I think it's so critical to build champions for evidence within government. So, to give you a brief overview of this particular, uh, project we worked on together: when criminal defendants are scheduled for court hearings and they subsequently fail to appear, that can result in pretty dire consequences, um, including subsequent arrests. FTAs, uh, failures to appear, for low-grade crimes can be high in some jurisdictions, around 30%. And there's lots of work that has considered how we might be able to bring down these FTA rates. And our specific hypothesis in this experiment was really to think about some of the very material transportation barriers that might exist. And so what this intervention entailed, and we'll talk a little bit about the exploratory process and all the other potential dimensions that we, uh, considered, uh, was really to provide a transportation subsidy that ranged, at the beginning, from $15 and, at the end, up to three months of free public transportation, to make it easier for defendants to show up, uh, to a subsequent court hearing. This was, uh, randomly assigned, and the outcome we were studying in the court system was whether that transportation subsidy reduced failures to appear in subsequent hearings, typically occurring within six weeks. I think Matt will talk a little bit more about the results, but essentially, uh, two of the main kind of findings are that, one, there was extremely high usage of the transit subsidy. And we probably have thought about this more as a kind of pilot study, because COVID-19 upended the entire court system in King County. But nonetheless, we were able to provide some interesting bounds and information that seemed to suggest, uh, that transportation subsidies were probably, uh, not a very big mover in terms of FTA rates.

Speaker 2

Thanks. Carrie, before we hear more about the findings, could you talk a little bit about why it was important to Metro, which is a transit agency, to spearhead this particular intervention?

Speaker 1

Yeah, absolutely. So, you know, at Metro we recognize that though we're providing transportation services for people, transportation is really just a means to an end. And so as a transportation provider, we're really trying to stretch our thinking and hold ourselves accountable for how we contribute to the outcomes that people really care about. And particularly, we're really trying to focus on people who are most dependent on us for their mobility. And for sure, you know, a lot of the population that is involved in criminal court proceedings does rely pretty heavily on public transportation to get around. And so we really wanted to think with our partners in the criminal justice system about how we could work together to make sure we're getting people the mobility that they need so they don't become further involved. And that's the outcome that I think we really, um, need to be thinking about.

Speaker 2

Great. So on to the experiment itself. Contrary to expectations, it sounds like you did not find much evidence of an effect of transportation subsidies on people's failure to appear for court. Matt, could you please walk us through what you learned from this?

Speaker 4

Absolutely. Yeah. So as Dan alluded, our study was interrupted by COVID, and for that reason we sort of think of it as a pilot study. But as you and Dan mentioned as well, contrary to expectations, the results of that pilot really pointed to little, if any, impact of transportation subsidies alone on defendants' likelihood of appearing for, uh, scheduled court hearings. But this null effect is really interesting in its own right and raises a number of other, uh, important questions, because while it doesn't rule out that transportation itself is a barrier, it does suggest there may be other, potentially more important, barriers to people appearing in court. And so, you know, in many ways we think of this as kind of paving the way for future research that, for instance, combines transit subsidies with other outreach or support services like, uh, text message reminders or other mechanisms. One other lesson that we learned, which kind of touches on points raised by Dan and Carrie both, is that while the defendants in our study didn't necessarily use the transportation subsidies that we were providing to get to court appointments, they did travel quite heavily on their subsidized passes. So that, I think, highlighted both for us and for our partners how transit-reliant this population is and how generally they might benefit from lower- or no-cost transit options.

Speaker 2

Great. I also understand that the research team and King County worked together to make real-time adjustments to the experiment as needed. Could you describe how the partnership supported that?

Challenges Data And COVID

Speaker 4

Yeah, absolutely. So one thing we were able to do with Metro's assistance was actually monitor the use of the transit cards we were handing out as part of the subsidy early on in the intervention. During that time, we actually observed that people who were receiving these subsidies, defendants who had future scheduled court hearings, were rapidly exhausting the credit that we provided, sometimes well before the scheduled court hearing itself. And so with Metro's help, and with DPD's assistance as well, the Department of Public Defense, we were able to act very quickly, you know, using, uh, our practitioner partners' expertise in this, to adjust the subsidy amount that we were providing to make for a much more impactful, uh, intervention than we would've otherwise had.

Speaker 2

Thank you. What are some other challenges that you ran into? Dan, would you like to describe any, and what your team learned from that?

Speaker 3

Sure. Uh, well, I should preface this by saying that I think one of the things that worked really well about this collaboration was that it involved a fair number of stakeholders, a surprising number of stakeholders, actually: the Public Defender's office, King County Metro, the jail staff, to figure out the mechanism of how to actually deliver these transportation benefits. We talked with a huge range of parties, really, to identify all the sort of logistical challenges. And there was a lot of really active problem solving going on to enable this study to really be able to move forward in the best way. But May Lynn, there were definitely, uh, some challenges we faced along the way. Most important of all, you know, we had finally iterated, after some piloting, to the scaled-up transportation subsidy, and then COVID-19 meant in-person hearings were no longer happening. And we had actually explored, um, and I think this is the really interesting part about the process of these partnerships, a range of other ways of branching off the research project. For instance, one of the biggest concerns with COVID-19 in the criminal justice system has been what happens to the rights of criminal defendants as all of these hearings move to virtual hearings. Those of us who are teaching at universities know that teaching on Zoom is completely different than teaching in the classroom. And there was a prior study coming out of Chicago that showed that moving things to closed-circuit television would radically change the outcomes in these kinds of criminal hearings. Uh, unfortunately we weren't quite able to make that pivot. But I'd say, you know, the way COVID-19 interrupted the court hearing process was by far the biggest challenge, because we were prepared to basically run this trial out for a significantly longer period to gain more information.

The other thing I maybe would highlight is that, although it was wonderful to work with that range of stakeholders, it also presented its own set of challenges. So, you know, the public defenders very much, uh, rightly saw their role as really representing their clients. And that meant there were certain questions we as researchers would've loved to examine and see how they interacted, like the adoption of risk assessment scores in particular courts, or a factorial trial that actually crossed, uh, text messages; that seemed to be, uh, harder to move forward. Uh, on the Metro side, literally one of the questions was, uh, how much should we share the data about individual transportation rides, when there is a possibility that a subpoena could land in the researchers' or Metro's inbox if one of the defendants was suspected of having engaged in some other criminal activity. And then I'd say the other one I would point out is on the video hearing side. You know, uh, King County's Office of Performance, Strategy and Budget had actually basically subsidized the technology to enable virtual hearings. We would've loved to have been able to study that in real time, but there were particular court administrators that just didn't really want to release that data. Turns out there is actually a Washington law that mandates that these recordings be made available, but it is so outdated that the primary vehicle for getting access to them is to pay $10 per cassette tape, uh, for these recordings, which made it, you know, obviously a non-starter to scale. So I think those are some of the challenges. But most of all, what I think the process illustrates is the role that someone like Carrie can really play, to serve as a barrier buster, to figure out how to chart a path forward.

Speaker 2

Wonderful. Before we move on to the next panel, there's a question from the audience, um, about how you arrived at the hypothesis that lack of transportation was a causal factor for failure to appear. And I guess I would add onto that: did the shift to online appearances due to COVID challenge that hypothesis for your team?

Speaker 3

And Matt, you should feel free to jump in here, but basically we had done a kind of literature review, and there were a number of pieces that had pointed to transportation barriers as being potentially a significant cause, but no one had really studied it. And so that's where this kind of unique partnership between Metro and the public defenders came into play. Matt, do you wanna add anything?

Speaker 4

Yeah, I think actually in talking to the public defenders themselves in Seattle as well, that was something that they highlighted that they believed was in fact a major barrier. And historically they had been passing out, you know, single bus passes that allowed defendants to get to court, precisely because they viewed transportation as an important barrier. And so I think that, you know, we had both a prior literature that was more general, right, about this topic, but then on top of that, some local knowledge, right, that pointed to this being a potentially important obstacle for many defendants as well.

Speaker 1

Yeah. And a lot of it came out of, as well, that at Metro we've really been trying to move from these very administratively onerous and not very helpful, like, paper single tickets to really getting people on our ORCA programs that give them access to discounted fares. And we found that, you know, a lot of people who were involved in court proceedings weren't enrolled in the programs that they could be beneficiaries of. And so we really wanted to explore that. And as we started thinking about that and reaching out to our partners, like the public defenders, you know, that just kind of told us more about, you know, they were constantly seeing their clients as having transportation challenges. So that's really kind of where a lot of this started.

Free Transit Pilot Overview

Speaker 2

Great. Another example of how having strong relationships with practitioners can really help ground-truth some of these hypotheses. So moving on, we're gonna move on to our next panel. I'd like to introduce the investigators of another transit study. Um, David Phillips is an associate research professor with the Lab for Economic Opportunities, and Maria Jimenez is with King County Metro, where she's the program manager of the Work Reduced Care Project. David, could you provide us with an overview of the team and this pilot evaluation?

Speaker 5

Sure. So, thanks for the opportunity to be able to talk today. Yeah, so there's a big, wonderful team working on this project. It includes a big team from Metro that includes Maria, and there is a team from, uh, LEO, including me. Uh, Matt Freedman at UC Irvine, who you heard from a moment ago, was also part of this project. Um, at various points in time, folks at Washington State DSHS, which runs public benefits programs, have been very helpful, because they interact with Metro a lot around sort of low-income fare programs. And of course we've gotten a lot of support externally for this research as well: in earlier stages from the Institute for Research on Poverty at the University of Wisconsin, and now, as we're doing some ambitious second steps, from May Lynn's group. And so that's the team. And the evaluation has really had two parts. There was an original pilot period where we learned a lot about how people's travel behavior changes in response to getting free transportation. So we worked with DSHS clients, so people who were receiving, for instance, food benefits in the state of Washington, and worked to provide some of them with free transportation on King County Metro for a few months. And we did a randomized control trial where people either had access to that or the status quo, which would be an ORCA LIFT pass, which gives a reduced fare. Um, and what we found was that people who got the free pass used transit about two to three times as much as folks with the reduced fare. And most of that wasn't just sort of simple mechanical switching. So it wasn't like, okay, people are gonna use the card now instead of cash, but it's the same transit, or they're just gonna shift a car trip to a bus trip. Uh, most of that really was increases in travel, people taking trips that they wouldn't have taken before, apparently because of cost being a barrier.

And so that was the first stage of this project, documenting that. And now the project has moved into a second phase, um, where we're really focused on expanding that in a couple of dimensions. And so, uh, this second stage, the fare-free transit study, is now looking at, uh, not just outcomes related to transportation mobility, but also health and wellbeing, uh, employment, and so on. Um, looking at how, uh, eliminating the cost of transit affects people in all of these different ways. And particularly, uh, focusing on the fact that now Metro has a low-income, uh, pass that is fully free for a large number of people within King County, rather than just sort of this part-of-a-research-study pilot. And so what we're doing now is building a study around that actual program and asking some of these questions about how, um, access to fare-free transit impacts not just transit use and mobility, but employment, health, wellbeing, and so on. That second stage of the evaluation, um, is now designed, and we're getting very close to launching, uh, the baseline survey that will start that evaluation, which we're really excited about.

Speaker 2

Thank you. So Maria, this next question is for you. The initial study, the pilot, seemed to show some really impressive results. Um, why was it important for Metro to continue the partnership with the same researchers on this new evaluation that David just described?

Speaker 6

Yeah, so, uh, David gave a really helpful, incredible overview of the project. And one of the things that we learned from the pilot was that we were heading in the right direction: that for customers for whom affordability may be a barrier, if we provided them with an opportunity to travel at no cost to them, they would take advantage of that opportunity and ride transit more. But while we learned a lot, there were enough differences between the pilot and our new program to warrant additional evaluation, and also other additional unanswered questions. So first, I wanna highlight that our original pilot was for individuals at or below 200% of the federal poverty level and enrolled in, um, a benefit program through DSHS. For the subsidized annual pass, we're really looking at individuals that are enrolled in state benefit programs but whose incomes are at or below 80% of the federal poverty level. And we can't necessarily infer that those two are the same. They might both be low income, but one is very low income, and the challenges that they face, their life experiences, may be completely different in a way that shows up differently in transit use. And so, secondly, we're also operating the subsidized annual pass on a larger scale, um, and actually offering the transit benefit in a different way. So the backend systems of how we're operating those transit benefits are very different. Um, and we're also operating it and making it available through additional partners. And as David mentioned, really most importantly, we're using the subsidized annual pass evaluation program to evaluate not just if, um, providing a transit benefit increases access to transit, but also if it increases mobility in such a way that it increases people's health and wellbeing.

And lastly, it was also just important to continue doing this research because, as anyone who's done evaluation knows, sometimes you have questions that you think are gonna be a hundred percent answered the first time around, you think they're gonna be straightforward. And then when you're analyzing those questions and answers, you're like, oh, actually that's interesting, I would not have expected that result. Or, I'm wondering if people misunderstood this question, or, you know, I'd really like to learn more. And so with this evaluation, we have that additional opportunity to dig a little bit deeper and really understand more what that actual experience is for customers and, um, how this transit benefit is impacting their lives.

Speaker 2

Thank you. Yes, the pilot study was definitely seen as seminal work in the field, and I wanted to talk a little bit about the importance of disseminating the results from that study. Can you describe how you, uh, worked together with the researchers on communicating your findings, and, um, particularly some of the challenges that you had to work through?

Speaker 6

Yeah. One of the examples that I can start off with is, um, we worked on a case study for Results for America that disseminated learnings not just about the pilot, but also about our existing low-income fare program as well as the subsidized annual pass. And one of the striking challenges to me was how we actually then showcase what providing that transit benefit looks like. And so from, like, an academic perspective, it makes sense to focus less on the logistics of providing that transit benefit so that you can focus more on the learnings of the research project. But from a practitioner perspective, our tendency was first to really spell out the full details of, like, here are all the steps that need to happen in order to make sure that this person now has this pass in their hands, which on our end was maybe a little bit of overkill. But it was helpful to work with practitioners, not just on this project but also across agencies, and to work with the researchers to really get at a middle ground that first and foremost highlights the learnings of being able to provide, uh, subsidized transit to customers, but that also doesn't neglect those pieces that are important to decision makers: understanding that operating this type of program requires infrastructure and resources that, um, require investments, and that it's a labor-intensive program. And I think I saw Carrie put the link for that case study in the chat. And I think we landed there on a solid middle ground of being able to highlight the learnings, but then also, like, in a separate clear section, those challenges, in a way that doesn't overwhelm and, you know, drown out the learnings.

Speaker 2

David, would you like to add anything?

Speaker 5

Sure, yeah, I'd echo that. I think, similar to what Maria said, right, a lot of the challenge is that these different audiences want different things, right? And Maria highlighted some of them. The other thing I'd highlight is that they want different things in terms of timeline too, right? In some ways the academic world wants, uh, a perfect answer on an infinite timeline, right? It's okay with waiting for years to get just the right answer, and that's important for learning and for that audience. But it's also like, sometimes King County Metro might, for instance, be, you know, running a program and need an answer right now. And so, like, I think having a good answer on the right timeline is sometimes more important than having a perfect answer on the wrong timeline. And I think one thing that helps us judge those things, right, deal with those tensions of, like, okay, the academic and the policy worlds will have different, complementary goals, is the partnership of, right, having multiple projects where we're working together. It's like, Maria and I are working together on this project, we're working together on other things as well. Having that collaboration over multiple years and all of that, like, provides the context in which we have that sort of mutual trust, mutual working relationship, to be able to go back and forth and make sure that we're serving both of those audiences in a really good way.

Homelessness Prevention Initiative

Speaker 2

Thank you. I hope we get to talk a little bit more about dissemination during the Q&A period, but it's time to move on to our next panel. I'd like to introduce the panelists for the Youth and Family Homelessness Prevention Initiative. Again, we have David Phillips representing LEO, and Kim Dodds is the Prevention and Diversion Manager at the King County Department of Community and Human Services. Welcome. Uh, could you give us a quick overview of the intervention and what it was designed to accomplish?

Speaker 7

Sure. So I think historically there is this tension in homelessness prevention of how can you tell that the intervention you're using would actually prevent an episode of homelessness for the household that you're working. And so historically there are a couple different interventions. There's, there's this, you know, infusion of case management, supportive services along with, potentially some flexible funding that is used across the region, across the United States to help people prevent an episode of homelessness. And then now we've got this wildly used rental assistance option where people just get an infusion of some financial assistance or, and our rental assistance as we now, refer to it because it's so, through the COVID and the COVID emergency relief funding. We really see this. And so we really wanted to see, you know, how, how, how, uh, how successful is each intervention? And because there's such a cost variance for both when you're looking at case management services, it's pretty expensive intervention to have a case manager and provide that flexible, flexible financial assistance. And it is way more economical to just provide an infusion of, financial resources. So for us it was really how much value is there behind that whole case management component value, not only in, are they equally successful in sustaining their housing, but what is the additional values given through case management services, say for, you know, economic impacts overall for health and welfare impacts. So it was really looking at the two different. Interventions and saying, what can we see in both interventions that will help us to better understand, you know, what is the most successful intervention and when you're quantifying success, and or is it worth that extra, financial resource to have the case management in these other areas such as criminal justice and, and uh, education, things like that. So we really looked at the two. 
If we randomize a group of people who look the same on a risk assessment score, and some just get rental assistance while some get case management plus rental assistance, how does each group do? And then how does each group do when we look at some other factors as well? So that was the intervention.
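The randomization Kim describes, splitting households that look the same on a risk assessment score evenly between the two arms, can be sketched roughly as follows. This is a minimal illustration, not the study's actual procedure: the field names (`id`, `risk_score`), the risk banding, and the arm labels are all hypothetical.

```python
import random

def assign_arms(households, seed=42):
    """Randomly split households into two study arms within strata of
    similar risk scores, so the arms 'look the same' on risk going in."""
    rng = random.Random(seed)
    # Group households into coarse risk bands (hypothetical banding).
    strata = {}
    for hh in households:
        band = hh["risk_score"] // 10  # scores 0-9 -> band 0, 10-19 -> band 1, ...
        strata.setdefault(band, []).append(hh["id"])
    # Shuffle within each band, then alternate arms for balance.
    assignments = {}
    for ids in strata.values():
        rng.shuffle(ids)
        for i, hh_id in enumerate(ids):
            assignments[hh_id] = (
                "financial_assistance_only" if i % 2 == 0
                else "case_management_plus_assistance"
            )
    return assignments
```

Stratifying before randomizing is what keeps the two groups comparable on the risk score, which is the property the comparison in the study relies on.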

Speaker 2

Thank you. And David, could you describe the evaluation part of it? In particular, could you also talk about the aspects of the project setup or design that contributed to being able to interpret the findings with confidence?

Speaker 5

Sure. Kim did a good job of describing some of the comparison we were making, but just to emphasize: this program really is a combination of case management and financial assistance. We did a randomized control trial, but not comparing that package to nothing; we compared that package to financial assistance only. The relevant question here wasn't "should we do something to help prevent homelessness for this group?" The question was "what's the best way to go about doing that, and in what situations should we make this investment in case management?" So the comparison was case management plus financial assistance versus financial assistance alone, which leads to a research question about how adding individualized assistance affects housing stability. We ran that randomized control trial with about 600 people, split into these two different groups, followed them, and looked at outcomes related to housing stability, but also downstream outcomes for other aspects of people's lives. And to your second question, about how the setup was important for building confidence in the result: one thing I really like about this particular project is the process we went through to set things up. As Kim mentioned, there are several different agencies, a couple dozen nonprofit organizations, that operate this program with funding from King County. At the beginning, when we had our first, academics' draft of the research design, we presented it to a large group of those case managers and executive directors, and they gave us a lot of really good feedback about what the right research question was.
How to think about measuring housing stability, how to think about the ethics of random assignment in a context where people are in an emergency situation. They gave us a lot of really strong feedback about that design, and that led to a great collaboration over several months of interacting with those folks back and forth, iterating on the research design until we had something that really was asking the right research question, was measuring things in a way that was more convincing, and was doing things in a way that was ethically responsible toward the folks involved. I think then, when we came back a couple of years later and said, okay, we've gone through this long process and now we have some results, there was more trust there, both in us as partners and in the design as being convincing, because people had seen it, been able to speak into it, and we'd been able to benefit from the feedback people were able to give to make for a better research design.

Speaker 6

Fantastic.

Speaker 2

And Kim, could you talk now about how your team has been able to use the findings in planning ahead for the next iteration of the initiative?

Case Management Costs

Speaker 7

Yeah, it was really interesting, and it reminds me of Matt's earlier comments about their first results: our results were surprising too. We had imagined that case management was absolutely the most important element of this program in ensuring that people maintain housing stability. What we realized is there was a subset of people who really did quite well without the intensive case management. So we have learned that it's a both-and for us in future iterations. We're absolutely using the randomized control trial evaluation results to teach us that we need both. We don't need to have the super expensive intervention for everybody, but there is a subset of people who absolutely need those layers of support in order to see reduced eviction rates. That was one area where we saw that case management definitely provided additional support: in the actual instances of evictions. And we saw that people who had case management were more likely to use resources, for example, resources to ensure they accessed healthcare; people who had case management support accessed some of those other resources very well. So I think that looking at the data and then implementing it has created a big shift in our program moving forward for the next six years of the levy. The other thing is that partnership David talked about, around the data. Sometimes the data tells you something, and then the data gives you bigger questions. You look at the data and say, okay, we can see a couple of things pretty clearly, but, just like in that first panel with the criminal justice system: is it the fact that people don't have access to transportation?
That increases the failures to appear? For us it was similar: one of the things we've also seen in this program is pretty high case manager turnover. What could that be leading to? Being in an evaluation mindset puts you in a frame where you start asking questions along the way more often. So not only did we have some factual learnings the data really drove, okay, yes, we have a need for both interventions, it also continuously begged us to ask, yeah, but what about this other thing, and how intensely do we need to pay attention to it? We did see that case management turnover may have had an impact. We didn't design a study to prove that, but we could see that it probably did. So we're also going to work harder in this next iteration of youth and family homelessness prevention to retain our case managers, and to look at what it costs us and what it takes to bring along a case manager we can keep. Where we should have had 25 case managers over the course of this levy, we had 82. So again, one learning gives you another learning that gives you another question. The data we have has definitely taught us some things that are creating changes in this next generation, but it has also taught us to ask more critical questions.

Speaker 2

Great. And there's a question from Carrie: "I would be interested in better understanding how you disaggregated the data, and on what dimensions, to draw out the conclusions about the differential benefits of case management."

Speaker 5

Yeah, that's a great question. Maybe I can take a first pass. I'd say there were two things that were really helpful for us. Once you have a randomized control trial of case management versus less case management, okay, here's the difference; then the next question is, well, what about case management? Why does it matter? There were two ways we tried to dig into that. One is, I'm a quantitative researcher, but there were also a couple of qualitative researchers on the broader team, and they were able to observe client interactions with case managers, interview case managers, and so on. One thing we really learned from that was that what people meant by case management had some common elements across agencies, but also a lot of variety. So then we can take that insight to the quantitative data and say, okay, there are a couple dozen agencies all running this: do we see different outcomes of this quote-unquote case management treatment? Do we see it doing different things in different contexts with different agencies? Administratively it's harder to run a study with two dozen agencies, but one nice thing after the fact is that we can see some of the variety in what case management means in different contexts and see how outcomes vary with that. So that's one way we tried to see whether agencies that do more intensive case management get better outcomes, and whether agencies that pay out assistance in one way or another see different outcomes. That's a little bit beyond the randomized control trial, but it helps dig into that second question, which is all about program design.

Speaker 2

Great. A great case to be made for mixed methods design and evaluation. Wonderful. So it's time to move on to our fourth and last panel, but there will be time for more questions after this next panel. I'd like to introduce the King County Food Safety Quality Improvement Project. Dan Ho is also on this project, and in addition he's joined by Becky Elias, who at the time of the study was with the Food Protection Program for Public Health Seattle and King County, and Dr. Eyob Mazengia, the assistant division director of Environmental Health for Seattle and King County. Welcome. So Becky, I'm going to have you start off. You're no longer with King County, but at the time you started this work with Dan, and I'd like you to talk about how the partnership came about and what you focused on together.

Speaker 8

Sure, thanks so much. Happy to talk about it, and it's a pleasure to be here with everybody. Much of what the other panelists are saying really resonates with our experience and the lessons we learned as we went through the process. For us, our program at that point in time was at a place where we knew the public wanted us to deliver a food safety rating system, some sort of public disclosure system, and we wanted to provide it. But as a program we had a number of questions about the efficacy of those programs, both for consumer behavior and for the implications for inspection consistency and inspection accuracy. At the time, my predecessor in the role gave me a research article he had come across, an article called "Fudging the Nudge," an in-depth look at restaurant inspections happening in other jurisdictions throughout the United States. As I was reading this article, which happened to be authored by Daniel Ho, I was seeing a number of questions that the members of our food safety team had been thinking about and asking as we considered creating a food safety rating program. I'm somewhat of an extroverted person, so I just reached out to Dan and said, hey, we came across your article, you're touching on things our team has had a lot of questions and curiosities about, and your research has quantified what you've seen in other jurisdictions when they've implemented a similar type of program. The conversation ended up being the start of, one, a really adorable moment where Dan said, oh my gosh, you read my article, that's great.
And two, it sparked a number of questions back and forth between us. Very quickly I was asking him questions about things he had learned from the research and data analysis he had done in other counties, and he was peppering us with questions about our plans for implementing such a program and whether we had any evaluation plans in mind. So it very quickly turned into a creative and collaborative partnership that evolved into what you see here, where this slide describes the intended outcomes. And the three steps of how we went about it: we didn't just partner on the creation of the food safety rating system itself. Almost as pre-work, we began our evaluation process by understanding the consistency and accuracy of the inspections happening across our team, and started there to build the evidence and the foundation for an evidence-based food rating system. So it really set us up on a path for evaluation in our policies.

Speaker 2

That's a fantastic origin story, thank you for sharing. Eyob, could you talk now about what the benefits have been to King County from this research-practice partnership over the years?

Research Breakthroughs

20

Of course. Thank you, May Lynn. Becky, thank you so much; I could hear your excitement in how it all began, and that enthusiasm is greatly appreciated. For any successful partnership, I think there has to be genuine interest on both parts, and Professor Ho has this infectious enthusiasm for doing the work so well, and a willingness to collaborate, that made it much easier for us to sustain this collaboration. And of course, how it starts makes a big difference. Like many programs, ours has collected hundreds of thousands of data points over the years with very minimal analysis done to substantially inform our work. That is sad to admit, but that was the case; we had done very superficial analysis. So when we collaborated with Professor Dan Ho, he and his team showed us what's possible, what's possible to learn from our data. It almost made us feel like we had just opened this big box of data that we'd been holding onto for so long with little interaction. Just that realization is one of the benefits of the collaboration. What I've sensed since the beginning of the collaboration is a shift in our work culture toward relying on evidence-based decision making. People actually start to ask, okay, so what does our data say? Can we do this? Can we do that? That is a shift that doesn't happen overnight; it happens because of sustained partnerships and great collaborators like Professor Ho. And of course, it helps us identify gaps. What we've learned is how little internal capacity we have to do what Professor Ho and his team were doing, so we could then advocate for that internal capacity to be made available.
The other benefit I'll add is that I do see a transfer of that work culture to other sections as well. Whenever we provide an update on what we're doing as a section, other sections start asking, oh, so you're doing a partnership, you're collaborating, how is that working? And then they actually start thinking about the possibility of doing the same thing. More importantly, at the end of the day, the goal is really to improve our efficiency, ensure that we have actually had the impact we're going after, and then be able to assess it and make changes. One takeaway for me that continues to resonate is that there's no shame in knowing we've been doing it wrong. What would be unfortunate is to continue doing it that way without actually evaluating our work. It has required us to be a bit vulnerable and willing to engage in the work. I'm extremely grateful for the collaboration. Thank you.

Speaker 2

Great, thank you so much for sharing those examples. And Dan, what were the most interesting aspects for you about working with King County over the years?

Speaker 3

Well, I've learned so much from this collaboration, and in many ways a lot of the work we're currently carrying forward with the Reg Lab came out of a series of these long discussions and this kind of new model that I think we really forged. I agree with everything Becky and Eyob said. The only thing I'll take offense to is that after many years of working together, Eyob, you're still referring to me as Professor Ho. That's the only thing I'll take offense to. Anyway, this has been a great collaboration, in huge part because of the leadership ethos, starting with those initial long calls with Becky and how Eyob came in. What really characterized it was an openness to exploring a pretty wide range of the space and a willingness to invest some time to make that happen. I did not walk into this partnership with an interest in food safety. I'd written an article about information disclosure that happened to touch on what for many has been a kind of exemplar of how to get information disclosure right. Many people have pointed to restaurant grading as the kind of disclosure that doesn't inundate the consumer with information, that is conveyed in succinct fashion at the point of decision, and that is therefore likely to be effective. The earlier work showed, well, actually, that's really challenging, because you can have a staff of dozens if not hundreds of inspectors, and that creates all sorts of unintended effects. In at least some jurisdictions, the resulting grades were close to random, because inspectors were randomly assigned to conduct inspections.
So the two most interesting research elements that came out of this, to me: number one is less about food safety per se and more about this core governance question of how we actually empower frontline officials to accurately and consistently administer the law, which turns out to be a problem not just in food safety. It's pervasive across the administrative state; we see the same exact pattern in patent examination, in Social Security disability adjudication, in all sorts of frontline administration of the law. What was really exciting to me is that we did a randomized control trial of a peer review program, where we randomly enrolled half of the inspectors to spend one day out of the week jointly visiting facilities, independently observing conditions and citing health code violations, and then comparing notes. It turned out that a shockingly high proportion of the time, they disagreed on a major health code violation. The peer review group ended up being better at spotting violations after they learned from all of that, and inter-inspector variability decreased by way of the intervention. That, it turns out, is now something a lot of folks are talking about. I have been involved with the Administrative Conference of the United States, and we just published a recommendation in the Federal Register that actually borrows from this work and advocates for increased reliance on these kinds of peer review programs in agencies at the federal level, for instance the Patent and Trademark Office and the Merit Systems Protection Board, who've done this kind of thing. The second element, on the methodological side: one of the most interesting things from the researcher perspective is that we were trying to figure out how to lighten the load, to make a randomized evaluation of the food safety rating system possible.
Essentially, the way our team came in was that we figured out how to do a stepped-wedge randomized control trial by developing an algorithm to create synthetic sub-regions within King County that were similar in all pretreatment ways. That allowed us to randomize the rollout and the order in which the grading system was implemented, which also happened to coincide with the fact that inspectors are typically assigned by zip code, so you could stagger the training to onboard people into the system. That's what was really exciting to me about this multi-year collaboration, and in many ways it has influenced my work with a wide range of other agencies at the Reg Lab.
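The stepped-wedge idea Dan describes, where every region eventually gets the grading system but the order of rollout is randomized so later waves serve as controls for earlier ones, can be sketched minimally as follows. This is an illustration only: the region names are invented, and the actual algorithm also constructed sub-regions matched on pretreatment characteristics, which this sketch omits.

```python
import random

def stepped_wedge_schedule(regions, n_waves, seed=7):
    """Randomly assign each region to a rollout wave. Regions in
    later waves act as untreated controls for earlier waves."""
    rng = random.Random(seed)
    order = list(regions)
    rng.shuffle(order)
    # Deal the shuffled regions into waves of near-equal size.
    waves = [order[i::n_waves] for i in range(n_waves)]
    return {region: wave_num
            for wave_num, wave in enumerate(waves, start=1)
            for region in wave}
```

Randomizing the rollout order, rather than withholding the program from anyone permanently, is what makes this design attractive when a policy will eventually cover the whole jurisdiction anyway.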

Speaker 2

Great, I so appreciate those examples, particularly because I work with a lot of researchers, and we're always looking for examples of how working with practitioners can improve the research design. But your work really highlights how it works both ways: working with researchers can really help strengthen the way practices happen in agencies, including some of your institutional practices. So I think that's a really great example.

Speaker 3

I think that's exactly right, May Lynn. And the term I really like for this is co-discovery. The best partnerships are really a process of co-discovery. These partnerships rarely work when a researcher comes in with a hammer, looking for a nail. It's much more effective if you have an open line of communication to explore the range of different projects you might be able to tackle together, and it changes the nature of the questions you're asking. I remember, in those initial conversations with Becky, she was trying to think through all the operational challenges in getting to the point of implementing a food safety rating system, and on my end it was, well, how do we line that up with what we already know from the academic literature? What are the interesting questions, the unanswered questions? How do we go through that list and figure out which questions we really should tackle? It's that exploratory process I've really enjoyed as part of this partnership.

Speaker 2

Wonderful. Becky Elias, do you have anything to add?

Speaker 8

Just a couple of pieces. One thing that resonated earlier, I think it was Professor Phillips, was when you were talking about the complementary nature of qualitative and quantitative evaluation at the same time. I would say, one, I completely agree, and two, building on something Eyob said, the start of evaluation grows more capacity, openness, and capability for further evaluation. That also happened for us with this project. We started with one concept of the quantitative research, going into the peer review process with our section team and the actual design of the algorithm behind the food safety rating system. But we also had with us a qualitative researcher who was helping us design the food safety rating system from a human experience standpoint: how was it read, perceived, and understood? The complementary nature of those two types of research expertise, and our capacity to leverage that expertise, meant we started to grow the skill set with Dan and then expanded it when we worked with Dr. Esposa and her expertise. And to Eyob's point, it has grown that capability of being data-driven and evaluation-minded for the program in an ongoing way.

Speaker 2

Eyob, would you like to add anything?

Speaker 9

I

Speaker 5

and Dan,

Speaker 2

Well, we're going to transition to a Q&A session right now. But right before we do that, as people are queuing up, I'd like to invite the panelists to share any statements or things you might have left out in your earlier time. Please feel free to chime in right now.

Advice For Partnerships

20

Hey May Lynn, I'll just add that one thing that's really come through for me in listening to these panels is that each of these projects has had more than one researcher working on it. While the focus of our conversation here has been researcher-practitioner partnerships, I think what's equally contributed to the success of this work is researcher partnerships: researchers working together. All of these projects have had so much richness brought to them because we've had different research perspectives at the table, and that's really valuable for us as practitioners too. So big thanks to all the researchers who are collaborating well, not just with us but with each other.

Speaker 2

Thanks, Carrie. So, to the audience, I do want to invite you to use the raise hand feature if you'd like to ask a question, and you can feel free to take yourselves off mute, or you can type your question into the chat. We have two questions queued up already, so I'll go ahead and start with one of them. This one is for the agency people: collaborating with researchers outside your agency might seem daunting at first. What advice would you give to others who work in similar roles, perhaps in a different county or state, on how to go about doing it? Maria, I'm hopefully not putting you on the spot too much, but I understand you have a really great perspective on this.

Speaker 9

Mm-hmm.

Speaker 6

Yeah. So I was attending a webinar where we were talking about research, and one of the questions that was interesting to me was: why are you interested in learning more about evaluation? It was because I was thinking of myself as working with these researchers I thought of as capital-R Researchers, while I thought of myself as a little-r researcher, really trying to learn how to swim with these bigger fish and how to do it better. In learning more and working more with these researchers, I'm realizing that the distinction I had made in my head between capital R and little r was not true and not accurate. I think, as you all have hopefully taken away today, these partnerships are crucial, and we're all bringing our own expertise, from each researcher and each practitioner, all contributing to this bigger learning. So the thing I want to leave people with is: don't be afraid of that challenge. It's a learning opportunity for everyone, and don't doubt your own internal expertise.

Speaker 2

Thank you. Yeah, I would definitely second that. I think that all of us, in our work, might be really good at what we're doing, but we always have blind spots, and particularly when it comes to doing social science research, it really helps to have the practitioner expertise, which is just as valuable as methodological or other technical skills. So I definitely echo what you're saying. Kim, did you want to add anything?

Speaker 7

I want to echo that: they're completely complementary. When we set out to evaluate our program and I found out we were going to meet with evaluators and researchers from such a prestigious campus as Notre Dame, I originally thought, oh my gosh, I'm going to be the dumbest one in the room. And I love data, I'm a data nerd, so I was super excited to have an opportunity to dig into data, but I really didn't understand the role I would play in the research. And I played a really important and critical role, I think, because I was on the ground with the agencies and the people doing the work every single day and building those relationships. So I do think they were completely complementary, and I never once felt invalidated by David and his team; completely the opposite. So I just want to encourage people about that: the premise of the question, that it's daunting, is all in your head. You need both pieces in order to produce the best results. My fears of not being the smartest person in the room just didn't matter; I contributed something, and that all went away. They are smart, I'm not going to lie; David is one of the most brilliant people I've ever worked with. But never did I have a moment where I felt I wasn't fully contributing, and I could see where both sides were needed. So I think it's super important that people understand that.

Speaker 2

Thank you. Great. And there's a follow up question from Christine, which I think is related also to the practitioners. Um, how do you make the case internally within your team or department for initiating and sustaining these long-term partnerships with researchers?

Speaker 8

I'm happy to start on that one, because I think our project was interesting in that it needed to involve our 65-person team from the food safety program. It wasn't just a project that I or the supervisors could lead with Dan; we had to bring our whole team along. In terms of making the case, it was, one, taking the time to understand what our case was, what we were trying to get to, and then bringing people in through a pretty thorough change management process, with inclusion throughout. So that looked like: first, Dan and I would have a lot of conversations. Then I would have them with our leadership team in the program, to say, hey team, what do we think about this? How do we think our team will react to it? Then we brought our whole leadership team, along with the technical leads on our team, into conversations with Dan to develop the research questions together and then the methods together. Then it was: how would we talk about this with our full team, and how would we bring them along? Dan became part of that change process with us; he actually came to Seattle and did presentations with our team. It included really managing that change and building buy-in from a workforce standpoint. We gave presentations to our electeds during that time so they understood why there was a delay in our timeline for delivering the food safety rating system: because we wanted to build the evidence base into the design. Dan became a partner with us on that change management and on building the case for why we needed the time we did to do the work. I mention those pieces because they happened organically; it wasn't as formal as building a case and giving a presentation just to pitch it.
It was building the coalition along with the case, and then having the buy-in from the various stakeholders we needed, even the restaurant industry. We met with the restaurant association throughout the process and described our process to them, so that they were bought into the credibility of the program.

Speaker 2

Thank you. Another question, also probably for practitioners but maybe for both sides: any examples of how the logistics of project resourcing play out? I think this question is asking how these partnerships are sustained: through county investments, research grants, a mix? Any examples would be helpful.

20

I can jump in with one point here, May Lynn. Something that has really helped us at King County is that the researchers we work with are all part of organizations within their academic institutions that have created a bit of runway for that co-discovery phase Dan talked about, where we don't have to come in with the funding or the grant right up front. We've been able to spend time in the partnership thinking carefully about what our hypotheses are, what we as practitioners most need answers to for our decision making, and what's possible to do, both practically and from a research perspective. That has positioned us really well to apply for and go after grants. One thing that has maybe distinguished us in some of our grant seeking is that we come in with an already strong partnership, able to show that the work from the research is going to be put to use, because the practitioners are right there at the table helping to lead it. But I'd love to hear from other folks here, and from the academic researchers as well, about their thoughts on project resourcing.

Speaker 3

I do think this is one of the most interesting questions about the long-term infrastructure for this kind of R&D at the intersection between academics and the public sector, because so much of the funding stream is for very specific projects. But what you're hearing throughout these presentations, I think, is that so much of what is valuable is that initial exploration, which takes time and trust building, and may not take you down a very pre-specified funding stream. And the time horizons matter too: our group has done a ton of work with Santa Clara County Public Health around COVID response, and there's a complete mismatch between the funding cycle and the need to pivot on a dime as things changed in terms of pandemic response. So I do think that's a really interesting challenge going forward, one that is going to require some reform in how universities actually support these kinds of projects.

I thought I'd mention maybe two other things related to this question. One is that, depending on the partnership, the other logistical thing that's really important for the academics is data access. That can take time, and depending on the sensitivity of the data, it can be a protracted process. I do think that as a community we need to be thinking about more streamlined mechanisms. Our county public health officer and I have a little piece coming up that makes the argument that we should have a state and local analog to the federal Intergovernmental Personnel Act, one that streamlines the ability of academics to be designated and doesn't lead you six months down the road of general counsel negotiations around certain forms of data access.

Maybe the last thing I would say is that the other part that is really important to think about, and that definitely characterized some of the work with the food safety program, is how to ensure that whatever you build out can be maintained by the agency if researchers move on to other projects. I think that can still very much be a pain point. Some agencies have their own internal analytics staff, and it's easy to hand over code and whatnot, but that's not always the case for all of our partners.

Speaker 5

Maybe one thing I'd build on from what Dan was saying: I think one common theme that runs through all of these evaluations is the capacity of King County's internal staff to support evaluation, having both strong skills and a strong organizational commitment to it. Carrie highlighted the academic side, having some infrastructure on that side to support us, and I think the same is true on the public sector side. For Public Health, HS, and Metro to have folks internally who have time to dedicate to this, and the organizational commitment to it, allows these long-term projects to build into a partnership, as opposed to being a one-off thing that's really dependent on one individual, one moment in time, or one particular program.

Closing Reflections

Speaker 2

Thanks. That's very true. Well, I'm sure there's a lot more we could learn from all of you. It's been an amazing conversation. Thank you all so much. But we're at time, so I'm just going to turn it back to Carrie.

20

Great. Well, thank you so much, May Lynn, for expertly moderating our panel. It was really great to have you here with us, and thank you to all of our panelists. I learned so much from you. I just wanted to pull forward a couple of things that I think are going to last with me. So Dan, for sure, "co-discovery" is the new word of my year. Thank you so much for bringing that forward, and also, I think one of the benefits I hadn't really thought about before of these kinds of partnerships is how your perspective as a researcher takes what we do and shows how it's a benefit in so many different issue areas. If we were doing this work alone, that kind of spreading of knowledge would never occur. So thank you. David, you really pointed to the timelines. I love what you said about the academic being in pursuit of the perfect answer on an indefinite timeline, and all of your partnership with us in making sure that we get to imperfect answers for decisions that have to be made is really appreciated. Kim, you talked about how once you're in an evaluation mindset, you're in an evaluation mindset, and how the process itself is really contributing to learning. That's great. That's exactly what the little calligraphy on the back of my wall says. And, oh my gosh: there's no shame in knowing we've been doing it wrong. What would be a shame is to keep doing it. I think that's a lot of what this is all about. I also want to give a shout out to one of our really amazing partners who I hope is still on the call, from the Department of Human Services at the state, who has been a partner with us on almost all these projects, at every step along the way. And that has been absolutely invaluable.
So thank you, Strong, and we're going to enjoy continuing to work with you. Thank you again; I learned so much from all of you. To all of you who attended, I hope you'll join us for future events. We'll have another event in a couple of months. We don't have a date yet, but we look forward to your participation there. So thanks a lot, everybody.