In 2016, Information Age Publishing (IAP) launched a seminal and timely series, Current Issues in Out-of-School Time (OST). The series promotes and disseminates original theoretical and empirical research, promising practices, and policy perspectives from practitioners to further grow the OST field.
The fifth book in the series, Measure, Use, Improve! Data Use in Out-of-School Time, shares the experience and wisdom of a broad cross-section of out-of-school time professionals, ranging from internal evaluators to funders, researchers, and policy advocates. Key themes of the volume include building support for learning and evaluation within out-of-school time programs, creating and sustaining continuous quality improvement efforts, authentically engaging young people and caregivers in evaluation, and securing funder support for learning and evaluation. It features a chapter, Beyond Youth Outcomes: Thinking Outside the Logic Model, co-authored by Bryan Hall, Director of Research and Evaluation for the Sperling Center for Research and Innovation (SCRI), and Brenda McLaughlin, Managing Partner for SCRI and Chief Impact Officer for BellXcel. SCRI spoke with the book's editors, Christina A. Russell of Policy Studies Associates and Corey Newhouse of Public Profit.
Corey is the Founder and Principal of Public Profit, a consultancy that helps mission-driven organizations use data to make better decisions and improve the quality of their work. She's a former youth worker who has dedicated her career to helping others learn to love data as much as she does.
Christina is a Managing Director at Policy Studies Associates, where she leads external evaluations and helps organizations build their capacity to effectively collect and use data. She aims to ask the right questions at the right time to help organizations effectively use data to learn, improve, scale, and sustain programs.
Q: Why do you think a book about Data Use in Out-of-School Time is important to the field right now?
Christina: Both of us have worked as program evaluators in the OST field for a long time; we are committed to helping programs use data in practical ways. We know there can be a tension between using data for improvement and using data for compliance or to demonstrate outcomes and impact. Both of us want to see data demystified so it can be used in ongoing ways for continuous improvement. We wanted to edit a volume that would help program leaders and decision-makers use data, and use data well. We wanted to help readers understand how to make evaluation useful in their daily work.
Corey: There is increasing recognition among funders and decision-makers that continuous improvement is the path toward shifting conditions for learning and youth development, but we aren't there yet in terms of really understanding what that means. Lots of program leaders and their teams know there is a different way to approach evaluation beyond collecting data to give to a funder, but they don't know where to start. We believe there is momentum right now behind taking a continuous improvement approach, and we wanted to edit a volume that would offer guidance on how to do that.
There is increasing recognition among funders and decision-makers that continuous improvement is the path toward shifting conditions for learning and youth development, but we aren’t there yet in terms of really understanding what that means.
Corey Newhouse
Q: What stands out to you in BellXcel’s chapter on Thinking Outside the Logic Model and its overall approach to evaluation?
Christina: BellXcel is at that exciting point in its evaluation capacity where it can step outside the box and really try to understand all the data it is collecting, whether those data are directly tied to the logic model or not, so that it can get the full picture of its impact, not just on the scholars but on its staff as well. And it is able to do this because of the deep investments of time and resources it has made to develop the tools and processes that bring rigor to its evaluation work.
Corey: “Yes” to everything Christina said. BellXcel does really reflect what it means to be a learning organization. I have been reflecting a lot about how, in evaluation of youth-focused programs, we often unintentionally repeat very harmful narratives about youth in underinvested communities. And I say this as someone who has spent much of my professional life doing exactly that. How many more times do we need to learn that if you do not invest in communities, people tend not to thrive?
BellXcel’s example of being open to other signals, open to other learning that is more on a systemic level, is part of that really critical mindset shift our field needs to take. We spend too little time actually looking at adult practice and holding adults accountable for what they’re doing; instead we try to measure in ever more minute ways how students are not reaching expectations. BellXcel is a great example of a learning organization that is able to look around and say, “Oh, wow, there are benefits to lots of people engaging in our programs, including the staff.”
Christina: But it isn’t just a matter of focusing on teachers’ development of certain competencies or their demonstration of a certain skill or knowledge. Instead, the question BellXcel is asking is “how do you take the learning and the experience of teaching in summer and more broadly think about how that affects your practice and the implementation of instruction?” It shifts the focus from outcomes to practice changes that lead to outcomes, and then opens up discussion about how to improve practice.
I have been reflecting a lot about how, in evaluation of youth-focused programs, we often unintentionally repeat very harmful narratives about youth in underinvested communities. And I say this as someone who has spent much of my professional life doing exactly that. How many more times do we need to learn that if you do not invest in communities, people tend not to thrive?
Corey Newhouse
Q: One of the main points the chapter makes is to think outside your theory of change to consider unintended outcomes and impacts. Does this finding resonate with your own evaluation experiences?
Christina: It certainly resonates with me. BellXcel was already investing a lot of resources in training for teachers and staff in its summer program and so the adults were implicitly always part of their theory of change. What BellXcel did was take a holistic approach to examining its theory of change to make these investments a more visible part of it. BellXcel was curious about the data and willing to look for things that they didn’t necessarily expect in the data. They didn’t just examine scholar outcomes, which are an explicit element in their theory of change, but asked what else they were learning through the data that they were collecting. It’s almost like a pop-out theory of change because they expanded that box around the reason for investing in teachers during the summer.
One caution I have is there is a danger in saying “Well, we need to think outside of the theory of change, so let’s just measure everything.” The way BellXcel unpacked its theory of change, to me, suggested they looked inside it to further understand what was happening in their programs, and also looked outside to ask new evaluation questions they hadn’t considered before. They kept their logic model as their north star, while using data they were already collecting to explore new questions.
What BellXcel did was take a holistic approach to examining its theory of change to make these investments a more visible part of it. BellXcel was curious about the data and willing to look for things that they didn’t necessarily expect in the data. They didn’t just examine scholar outcomes, which are an explicit element in their theory of change, but asked what else they were learning through the data that they were collecting.
Christina A. Russell
Q: Do you have examples of other organizations or initiatives that have experienced this same finding?
Corey: A few examples come to mind of the unintended, and often unmeasured, outcomes of OST programs. First, as schools are increasingly called upon to address whole child development, to think about having a better climate, and to focus more on social and emotional learning, we see teachers who work in OST programs rooted in positive youth development taking some of the good developmental practices they are learning in those OST settings and implementing them during the school day. A second very practical example is that with the COVID-19 pandemic, OST programs are supporting youth and their families in new ways, demonstrating that they bring a set of previously unnamed assets to their communities: distributing food, doing “drive-by” home visits, helping connect families to resources. I am hopeful that elevating the full range of assets that OST programs provide will help schools and OST settings see each other as more co-equal resources, more co-equal allies in supporting youth.
Q: What advice do you have for OST programs trying to do evaluation right now as they pivot to respond to COVID-19?
Corey: We have been encouraging programs to understand that as their program is shifting, their evaluation needs are shifting, their data collection needs are shifting, and that's all okay. We really encourage our OST programs to consider how they are demonstrating intentionality and how they are showing the principles that are guiding their decisions. Not surprisingly, we are hearing that physical and emotional safety and a sense of pro-social connections are top of mind. If youth don't learn how to multiply this year, that's okay. The goal is to make youth feel safe and supported and connected. If those are the main goals, there are programmatic choices that flow from them and cascade into evaluation efforts.
Christina: Yes, and that points to the idea of a theory of change as a constantly refined living document. There has been so much adaptation happening really quickly in the OST field as programs take on new roles and rethink how they deliver services in this time of COVID, branching out to provide other kinds of really essential support services, like the food distribution and device distribution that happened a lot in the spring. And then their question is, “How do we measure that? How do we evaluate this new role of OST in the time of COVID?” The answer is that simple documentation is key. Programs should be documenting what they are doing, what services they are providing, and who they are partnering with, to demonstrate that they are continuing to serve youth and families, just not in the ways they were before the pandemic. Another approach to evaluation during COVID-19 is to make better use of the data you already have. There is a chapter about that in our volume. Now, more than ever, is the time to maximize available program data, not create new tools and data collection procedures.
Not surprisingly, we are hearing that physical and emotional safety and a sense of pro-social connections are top of mind. If youth don't learn how to multiply this year, that's okay. The goal is to make youth feel safe and supported and connected. If those are the main goals, there are programmatic choices that flow from them and cascade into evaluation efforts.
Corey Newhouse
Q: Based on the chapter, what do you hope BellXcel explores in future evaluation efforts?
Christina: I am interested in the question of how professional development and staff supports for summer teachers spread to other staff in their schools. BellXcel's chapter talks about how one of the unexpected findings from the survey was hearing that BellXcel summer teachers went back to their schools really excited about what their summer experience had been, what they had learned, and what they had implemented, and talked about that with their colleagues. Did those conversations “rub off” on other colleagues?
From an evaluation perspective I would want to ask, “How does that spread happen?” Is there a tipping point where a core number of teachers from within a school can really change the culture and accelerate the adoption of new practices? A future study could examine the diffusion of teacher practices within a school and how the BellXcel summer experience can change not just one classroom but become integrated into school-wide approaches to teaching and learning.
Corey: We did a local evaluation of a summer program that was STEM-focused, and it employed a lot of school-year teachers as summer instructors. We saw really similar things where teachers would say, “I finally get to teach science the way that it should be taught. I finally get to do inquiry-based science and not just grind my way through the textbook.” But whether they are able to take these practices back into the school day depends on factors we don't know enough about. Too often we treat the ability to change practice as a static variable: either the principal and the leadership are supportive or not. I would be curious to learn from case study analysis about the mindset shift in the school building that results in teachers being able to integrate OST practices into their classrooms.
Christina: I think it's the school policies, Corey. What policies need to be in place to make sure that this change in school-day practice isn't happenstance, or isn't just the result of a super-motivated teacher who comes in? What levers need to be in place to allow practice change to happen for a teacher, in an individual classroom, and then enable that change to spread to the whole school? One of those levers is school policy.
Q: Are there any final reflections you want to share about BellXcel’s approach to evaluation?
Corey: The overall arc of BellXcel’s evaluation and research agenda demonstrates that it is possible to shift the focus away from individual student grades and test scores toward a recognition that there are a range of ways OST and summer programs benefit lots of people involved, including the adults. The more examples we can have of that within our larger ecosystem the better. It is exciting to see BellXcel ahead of the curve in terms of thinking differently about what to focus on, which can continue to help set the narrative for the field more broadly in terms of how we approach evaluation in summer and out-of-school time.
Christina: And just to build on that, one of the core audiences for the volume is program leaders and practitioners. BellXcel's chapter shows how they have learned from measurement and evaluation over time and used that knowledge to influence adult practice. As such, it is a good example to show program leaders how and why investing in data collection, and in building the capacity of staff to use, interpret, and be curious about data, can help grow the organization and deepen impact. BellXcel's chapter illustrates why, even when you are just starting out and there are so many competing priorities as you deliver programs and provide direct services, it is important to engage staff in thinking about the data being collected, not just using data for reporting.
The overall arc of BellXcel's evaluation and research agenda demonstrates that it is possible to shift the focus away from individual student grades and test scores toward a recognition that there are a range of ways OST and summer programs benefit lots of people involved, including the adults.
Corey Newhouse