Libraries, Education and Neighborhoods Committee 8/8/2024


View the City of Seattle's commenting policy: seattle.gov/online-comment-policy

Agenda: Call to Order; Approval of the Agenda; Public Comment; Department of Education and Early Learning (DEEL) Families, Education, Preschool, and Promise Levy (FEPP) Outcomes; Adjournment.

0:00 Call to Order
1:48 Department of Education and Early Learning (DEEL) Families, Education, Preschool, and Promise Levy (FEPP) Outcomes


SPEAKER_06

All right, good morning, everyone.

The August 8th, 2024 meeting of the Libraries, Neighborhoods, excuse me, Libraries, Education and Neighborhoods Committee will come to order.

It's 9:30 a.m.

I'm Maritza Rivera, chair of the committee.

I will note that Vice Chair Wu is excused today.

Will the clerk please call the roll?

SPEAKER_02

Council Member Moore?

Council Member Hollingsworth?

Present.

Council Member Morales?

SPEAKER_08

Here.

SPEAKER_02

Chair Rivera?

Present.

Three council members are present.

SPEAKER_06

If there's no objection, the agenda will be adopted.

Hearing no objection, the agenda is adopted.

There's one item of business on today's agenda.

We have a presentation from the Department of Education and Early Learning on the Families Education and Preschool Promise Levy Outcomes.

I'd like to thank Director Chappelle and his team from the Department of Education and Early Learning for coming to the council chambers today to present.

We will now open the hybrid public comment period.

Public comment should relate to items on the agenda or within the purview of this committee.

Clerk, how many speakers are signed up today?

SPEAKER_02

Currently, we have zero in-person speakers and zero remote speakers signed up.

SPEAKER_06

Okay, thank you.

I will now close public comment period then.

Clerk, please read today's first item into the record.

SPEAKER_02

Agenda item one, Department of Education and Early Learning, Families, Education, Preschool, and Promise Levy Outcomes.

SPEAKER_06

Thank you.

This item has been read into the record.

As I said earlier, we're joined today by the Department of Education and Early Learning (DEEL) director, Dwane Chappelle, who will share information about the FEPP levy's outcomes.

Thank you for being here.

Director Chappelle, will you introduce yourself and have your staff introduce themselves, and then you can begin your presentation.

Yes, we can hear you, Director Chappelle, and I also note that Council Member Hollingsworth is online.

So she's here as well.

Thank you, Izzy.

Great, thank you.

And can I remind you though to get a little closer to your mics?

Sometimes it's hard to hear.

Thank you so much.

Go ahead, Director Chappelle.

SPEAKER_00

Well, again, it's an honor to be here.

So today we'll update you on the FEPP levy evaluation activities.

We'll also discuss insights from our monitoring and performance management efforts, as well as our process and outcomes evaluation.

Just want to just say that your input is valuable.

So thank you again for being here.

SPEAKER_06

Director Chappelle, I also want to just tell my colleagues, if you do have questions, just please raise your hands.

It's a rather long presentation, and I don't want you to have to wait till the end to ask questions.

please do feel free to raise your hands throughout the presentation.

Thank you, Director Chappelle.

SPEAKER_00

Go ahead.

Thank you.

So the purpose of today's meeting is to update City Council, this committee, on our evaluation activities for the FEPP levy.

We'll share insights from our ongoing process and outcomes evaluations to date.

These evaluations help us ensure that our strategies are effective, or should I say effectively, improving educational outcomes from preschool through post-secondary education.

We'll also highlight areas for improvement to guide further course corrections and decisions.

We'll go to the next slide.

Thank you.

So...

The FEPP levy is focused on supporting educational outcomes from preschool through post-secondary attainment.

As you can see, our first year of the levy occurred in 2018, excuse me, 2019-20.

And shortly after that, COVID-19 occurred in our community, impacting our day-to-day lives and pushing us into isolation and remote spaces.

In education, this continued into our third year of the levy.

Most of our students returned to the classroom in April of '21. At our March Libraries, Education and Neighborhoods Committee meeting, my team and I reported our FEPP year four annual results for the school year 2022-23, indicated by the teal color, which you see on here.

This provided us the trends on how many were served and our progress towards our outcomes.

And just a brief reminder, our goals for this levy are for Seattle students to be kindergarten ready, to graduate high school college- and career-ready, and to attain some type of post-secondary degree or certificate.

This past June, you can see that in orange, we concluded our year five implementation of our levy.

So next year, we look forward to reviewing the FEPP year five results with this committee.

And just know that this FEPP levy currently has two years remaining.

Year six will start in September, and this levy concludes in 2025-26. Thank you.

So our evaluation approach includes two key components.

The first, the annual report, which provides a snapshot of our monitoring and performance data, highlighting trends and outcomes related to the FEPP levy investments.

Reviewing data often allows us to track progress over time and identify areas for improvement.

The second component is the process and outcomes evaluation.

And the process and outcome evaluation help us understand the impact of our strategies, ensuring transparency and accountability as we work towards both short-term and long-term goals.

As you can see, we conduct various evaluations and reports, and reports are available on our website.

The process evaluation uses qualitative and quantitative data to understand whether a program is being implemented as intended.

It shares preliminary outcomes and identifies areas with room for improvement so we can act quickly to course correct.

The outcome, or impact, evaluation primarily uses quantitative and statistical methods to illustrate a program's effectiveness, whether outcomes can be attributed to the program.

So today, we are going to cover two process evaluations and two outcome evaluations at a high level, but I encourage those that are interested to read the full reports because they will offer a deeper analysis of the effectiveness of our program.

So at DEEL, we are committed to comprehensive evaluation across all of our FEPP investment areas.

Every program participates in ongoing monitoring and performance management as part of our continuous quality improvement process.

A select subset of programs undergoes process or outcome evaluations, selected based on criteria like stakeholder engagement, data quality, potential to see impact, and having the resources or capacity to carry them out.

This strategic selection process helps us generate new evidence and fill knowledge gaps, guiding our decisions with best available data.

So today we'll review four evaluations across the FEPP levy areas, including early learning, K-12, and post-secondary.

And we'll preview an upcoming evaluation for our Promise program.

Department-wide, we'll first learn about our FEPP levy implementation process evaluation that was carried out by Mathematica.

For early learning, we'll look at the Seattle Preschool Program Impact Evaluation, which was completed by Education Northwest and American Institutes for Research.

In K-12, we'll cover DEEL's internal evaluation of the K-12 school-based investments, which was conducted by our DEEL performance and evaluation team.

And for post-secondary, we'll discuss the Seattle Promise process evaluation and preview Westat Insight and the Washington Student Achievement Council's external impact report, excuse me, impact evaluation.

SPEAKER_06

Director, I just want to note that Council Member Moore has joined us.

Good morning, Council Member Moore.

SPEAKER_00

Good to see you.

SPEAKER_06

And also a flag for colleagues and folks watching that the links to each of these reports are on this slide.

SPEAKER_00

Thank you.

So just understanding our evaluations, it does require some clarity on some key terms.

Qualitative data, which is information gathered through observations, interviews, and focus groups.

Quantitative data is numerical data that can be analyzed statistically for insights.

Then you have descriptive analysis, which are evaluations that describe strategies and trends, but don't imply causation.

And then we have causal inquiry, and that's evaluations determining the extent to which an intervention produces its intended outcomes.

With causal inquiry, we list a few key terms we will often use throughout today's presentation.

And then we have the quasi-experimental design, which aims to isolate the relationship between an intervention and the outcomes of interest.

So today, many of our studies will isolate our investments' effects on our levy outcomes, for example, kindergarten readiness, passing standardized tests, high school graduation, or post-secondary completion and retention.

And just to ensure that we're comparing similar groups of students, propensity score matching is a technique used to have similar groups for comparison.

And then the last is statistical significance.

This is the degree to which the relationship between variables, for example, the difference in average outcomes between two groups, such as Seattle Preschool Program compared to non-Seattle Preschool Program children on third grade reading, differs from relationships predicted by random chance.

Statistical significance is carried out by conducting inferential statistics such as a chi-square test or regression analysis, just to name a few.
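As a hedged aside for readers unfamiliar with the inferential statistics mentioned here, a chi-square test of independence on a 2x2 table can be computed by hand. The counts below are invented purely for illustration; they are not DEEL's data.

```python
# Hypothetical illustration: chi-square test of independence on a 2x2 table
# [[a, b], [c, d]], e.g. (ready, not ready) x (program, comparison).

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, r, col in ((a, row1, col1), (b, row1, col2),
                        (c, row2, col1), (d, row2, col2)):
        expected = r * col / n           # expected count under independence
        stat += (obs - expected) ** 2 / expected
    return stat

# Invented counts: 180 of 250 program children "ready" vs. 150 of 250 comparison.
stat = chi_square_2x2(180, 70, 150, 100)
CRITICAL_05 = 3.841  # chi-square critical value, df=1, alpha=0.05
significant = stat > CRITICAL_05
```

If the statistic exceeds the critical value, the difference between groups is unlikely to be explained by random chance at the 5% level.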

So I just wanted to name these definitions to frame our analysis and help interpret our findings accurately.

Okay.

So the first evaluation, as I mentioned a few moments ago, we'll be sharing is our FEPP Levy Process Evaluation, which was conducted by Mathematica.

Mathematica was selected through a competitive bidding process.

Between 2023 and 2026, Mathematica is conducting two evaluations, examining levy implementation and impacts with special attention to the K-12 investments.

The evaluation includes engagement with advisory committees from various stakeholders, including the K-12 researchers and practitioners that impact levy-funded programs.

The committee advises on design, data collection, and findings, ensuring that the evaluation aligns with our principles and goals. The advisory committee members were identified by DEEL's executive leadership based on Mathematica's recommended framework for the types of stakeholder perspectives to include.

And the priority was engaging representatives from as broad a range of levy partners as possible,

in addition to quantitative research subject matter experts.

So just a few examples of advisory committee members include Levy Coordinator from Cleveland High School, Beacon Hill Elementary School Principal, Seattle Public Schools Director of Research and Evaluation, the Seattle Preschool Program Director, and just to name a few, we had program staff from REWA, we had the Executive Director of ACE, and we had the Director of Housing and Economic Opportunity, Neighborhood House, so just naming a few before we get started.

So the evaluation, again, that we'll discuss today is a process evaluation that focuses on the extent to which the levy was implemented as intended.

Mathematica developed a conceptual framework for the FEPP levy evaluation that identified key implementation principles from the FEPP levy implementation and evaluation plan: program strategies and practices, target outcomes, and system conditions that are necessary to support effective implementation of the FEPP levy according to the intent that was laid out in the legislation. Implementation principles included prioritizing investments that ensure educational equity for groups furthest from educational justice, ensuring that there's authentic student, family, and community engagement, and implementing everything from competitive funding processes to performance-based contracts.

Mathematica, they focused on the following system conditions.

And those were FEP levy initiatives and investments that are aligned and coordinated across the pre-K to post-secondary continuum.

They focused on strong infrastructure: is there a strong infrastructure in place to support implementation principles, whether that's leadership, staffing, or technology?

And then the community-based organizations as partners, do they have the capacity to implement the program with fidelity?

So what I'm gonna do now is I'm gonna pass this over to Dr. Fajardo to talk to us a little bit about the methods and so forth.

SPEAKER_04

Thank you, Director Chappelle.

So to answer the questions Director Chappelle laid out, Mathematica used various data sources.

They had 10 interviews with school administrators and levy-funded partner organizations.

Six of those were school administrators, four of those were partner organizations.

They also carried out a survey of leaders of levy-funded partner organizations, of which 59% of the 91 funded partner organizations filled out the survey.

They also carried out six focus groups with staff, families, and high school students at the levy-funded schools.

Mathematica also randomly reviewed over 20 documents, including contracts, our investment strategy documents, and our funding process documents.

They also looked at SPS administrative data on students and teachers.

They looked at their academic records and looked at school climate data.

With the focus groups, there were seven schools represented across the focus groups and interviews.

Four of them were elementary or K-8.

One was a middle school and two were high school.

The consultants selected schools prioritizing maximum variation in perspectives and contexts, including considerations of grade level, level of funding, and the range of interventions implemented at each school.

SPEAKER_06

Dr. Fajardo, can I just stop you for a second and ask you a question about the survey?

59% of the 91 funded partners responded.

Is there, I mean, you have contracts with these funded partners; as part of their contract, can you make sure they fill out these surveys? Because we don't know how well things are going if we don't get feedback from all the partners.

So is that something you all are doing or thinking about doing?

I just don't know why the response rate on the survey is so low.

SPEAKER_04

Yeah, that's an excellent question, Council Member Rivera.

So there is a difference between contracts that we have with our levy-funded partners and external evaluation and research; oftentimes those are a little bit different.

Typically for evaluations and researchers, there is an opportunity for levy-funded partners or recipients to opt out.

So they give an opportunity for folks that don't want to do it an opportunity to opt out.

We can encourage and find ways for Mathematica or other evaluators to increase this sample size.

In fact, we actually did send emails to principals.

We sent ways to encourage increased survey responses.

But I hear you: could we add a clause to our contracts to ensure that they fill out a survey?

So that's something that we'll look into and we'll do better.

Thank you.

SPEAKER_06

I think it's really important, too, because it's, again, hard to evaluate if folks aren't, you know, they're the ones that have the information that we need to see how well things are going or to see where we need to make changes.

So I would encourage that.

Thank you.

SPEAKER_04

Thank you.

So based on the data sources that Mathematica used, they used the qualitative, whether it be focus groups or interviews, to generate themes from the data.

They analyzed the surveys to create a descriptive analysis for the study.

There are limitations, as Council Member Rivera just noted: not all funded organizations are included in this sample, so that is definitely a limitation of this study.

In addition, there was a focus on K through 12 investments, mainly because we have other external evaluations that we will present today that were capturing similar information.

And we did not, as a collective, the advisory did not recommend doing two evaluations at one time for programs like the Seattle Preschool Program or the Promise.

So Mathematica is triangulating the data to inform their study.

Now shifting to the findings, what Mathematica found was that a majority of students served by levy-funded programs were students furthest from educational justice, and that culturally responsive programs and practices were enhanced under our levy.

So in the survey that I mentioned earlier, those that filled it out responded as follows: 76% of partners reported primarily serving Black students, 53% Latinx students, 29% Asian students, 31% other students of color, and 33% immigrant and refugee populations.

90% of our partners agree that their agency has strengthened its capacity to provide culturally responsive services. Strategies such as the levy-funded family support workers and instructional assistants who speak Spanish increased their capacity to provide linguistically responsive, targeted interventions.

SPEAKER_06

So just to underscore: since only 59% responded, what would it have looked like if we had had the other 41% who didn't respond?

So I just want to underscore the importance of really requiring that as you're doing these contracts with the partners so we have robust information by which to make the analysis.

SPEAKER_04

Yeah.

Thank you.

SPEAKER_06

Thank you.

SPEAKER_04

So as Director Chappelle opened up in Mathematica's evaluation questions, we allocate funding through competitive request-for-investment processes, where we incorporate community impact and application reviews.

This ensures we meet community needs and priorities.

We also have a data-informed decision-making and continuous quality improvement.

Levy-funded organizations report incorporating data into their decision-making processes and engaging in performance-based contracts at DEEL.

So again, we use performance-based, meaning that we set goals for each of these levy-funded schools, and we expect and give additional dollars if they meet those performance-based contract agreements.

We also have our systems, conditions, and capacity.

This is where DEEL supports our funded partners through strategic advising, technical assistance, and professional development opportunities.

However, partners reported mixed capacity to implement programming as intended.

So we at DEEL look at the findings, figure out what happened, and see ways that we can ensure our partners can be successful.

Overall, these efforts help build system conditions and capacity, enabling effective and impactful programs.

Looking ahead, in 2025-2026, a Mathematica-led impact evaluation will investigate the impact of FEPP investments toward advancing educational equity and closing opportunity gaps across the pre-K to post-secondary continuum.

Mathematica's analysis will focus on K-12 outcomes using quasi-experimental design to find causal impact.

These designs will examine long-term trajectories of population-level academic outcomes, a method that requires multiple years of data to ensure our trends are solid and our programs impactful.

SPEAKER_07

Chair, can I ask?

Yes, of course.

Can you give an example of what that means?

SPEAKER_04

Yeah, so today we are going to share our internal DEEL K through 12 school-based investment evaluation.

We, in one of those studies, only use one cohort.

So what Mathematica will do is pull together multiple cohorts to ensure that the result is reliable.

Oftentimes, if you only look at one cohort and you find positive results, if you look at the next cohort, they may have negative results.

So if you have multiple cohorts, it strengthens our decision-making process that what we're finding is, in fact, accurate versus just one cohort.

So many of our studies that we'll show today is this is an iterative process, and we will continue to add cohorts so we can build to our knowledge of what we're finding over time.
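The intuition behind pooling cohorts can be shown with a back-of-the-envelope calculation: under standard assumptions, the standard error of a difference in group means shrinks with the square root of the sample size, so pooling three equal cohorts cuts uncertainty by a factor of √3. The numbers below are hypothetical.

```python
# Illustrative sketch: why pooling cohorts narrows uncertainty. With a fixed
# outcome standard deviation, the standard error of a mean difference shrinks
# as 1/sqrt(n), so pooled cohorts yield a tighter estimate than one cohort.
import math

def se_mean_diff(sd, n_per_group):
    """Standard error of the difference in means between two equal-size groups."""
    return sd * math.sqrt(2.0 / n_per_group)

sd = 10.0                               # hypothetical score standard deviation
one_cohort = se_mean_diff(sd, 250)      # a single cohort of 250 per group
three_cohorts = se_mean_diff(sd, 750)   # three pooled cohorts of 250 each
```

The pooled estimate is √3 (about 1.7) times more precise, which is why a positive result across several cohorts is far more trustworthy than one from a single cohort.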

SPEAKER_08

Okay, thank you.

SPEAKER_04

But thank you, Council Member Morales.

Great question.

SPEAKER_00

All right.

Thank you, Dr. Fajardo.

So thank you for sharing the Mathematica evaluation study findings.

So now what we're going to do is shift to our Seattle Preschool Program impact evaluation.

So we conducted a competitive bidding process to identify an external evaluator for the Seattle Preschool Program.

We selected a team of researchers from Education Northwest and American Institutes for Research.

These organizations are both experts in research and evaluation, and they have a strong track record for evaluating preschool programs.

The focus for this evaluation is to assess our SPP student, program, and system outcomes using mixed methods.

We take a sequential approach to evaluation.

So conducting three evaluations over the course of the levy supports our continuous quality improvement, and it also informs any necessary adjustments to the data and quality systems.

The evaluation receives guidance from an advisory committee representing key stakeholders in early childhood education and the Seattle Preschool Program provider community.

Examples of advisory committee members include instructors at the North Seattle College Early Childhood Education Department. We had a teacher at Daybreak Star.

We had one of our family child care hub coordinators from Child Care Resources.

We had program director from Voices of Tomorrow, which is a preschool agency.

And then we had our partners at Seattle Public Schools, early learning manager.

And so, our evaluation of the Seattle Preschool Program focuses on two key questions.

The first is: what is the impact of Seattle Preschool Program participation on kindergarten readiness, as assessed by the WaKIDS state assessment, among Seattle Public Schools kindergarteners, excuse me, kindergarten students, over time?

That's the first question.

And the second question is, what is the impact of SPP participation on grade three reading and math assessment scores and kindergarten attendance?

So Dr. Fajardo will now walk us through the high level findings.

And just as a quick reminder, all of these evaluations that we're presenting today are available on our website as council member mentioned earlier.

SPEAKER_04

Thank you, Dr. Chappell.

So on this slide here, we're looking at a descriptive sample of our Seattle preschool program.

The table indicates the program's growth over time, from 2015-16 through 2022-23. What we're seeing in the numbers is the number of children served.

As you can see, in 2015-16 the program started with 269 students.

It increased to 612 and we increased it to 970 students.

We are now at over 2,000 students in 2022-23. For this study, we looked at our 2017-18 cohort for our longitudinal comparisons of third grade outcomes.

So what that means is: did you participate in our Seattle Preschool Program, typically as a four-year-old? We will track you into kindergarten, and then to first, second, and third grade.

So for today, we're looking at kindergarten.

Did that SPP student meet kindergarten readiness standards?

Did that same student meet third grade standards in the future?

So tracking students as a cohort over time.

Part of the reason why we're showing the additional years is because, to Council Member Morales' point, we need additional cohorts.

And as the sample size increases, it strengthens our reliability that the findings that we're getting are in fact true.

So in future years, we will continue to conduct similar studies.

In fact, for this cohort that we're gonna share today, in three years we can see if SPP had an impact on their sixth grade outcomes, right?

So that is almost like a seven-year gap to do that level of a longitudinal analysis.

So just putting that here is we will continue to conduct additional studies to understand additional cohorts.

Today's focus is really gonna be on the 2017-18 cohort of 970 students, to compare kindergarten and third grade outcomes.

The map on the right, you can see who was included in the longitudinal study, and that is in orange.

As I mentioned, the blue indicates additional sites that joined SPP after 2017-18.

So that's just to show you that we have additional sites that we can include in future studies.

SPEAKER_06

So, Izzy, just to confirm: in this study, they started preschool in the 2017-18 school year and you're evaluating them in 2021-22, when they're in third grade.

SPEAKER_04

Yeah, so we're evaluating them preschool and then assessing them again in kindergarten and then assessing that same group when they reach their third grade.

Okay, thanks for the clarification.

Yep.

Okay, so now to answer the questions, Education Northwest, they looked at administrative data on children and teachers.

They looked at the Teaching Strategies GOLD assessment and the WaKIDS assessment data.

To do this comparison, we at DEEL work with Education Northwest and the state's Education Research and Data Center to collect additional data that will allow us to compare the students. This study also conducted surveys of SPP families, teachers, and administrators.

They also collected qualitative data through focus groups and interviews.

They interviewed teachers, administrators, families, and DEEL's coaching staff.

Today, we are not going to be sharing the qualitative data.

We're mainly gonna be focusing on the quantitative data.

But again, I encourage you, for those that are interested, to please see the report.

Now, with the data sources that Education Northwest collected, they used a mixed-methods outcome analysis.

Again, they used qualitative coding for themes.

They provided descriptive statistics for survey and focus groups.

But to answer the questions that Director Chappelle mentioned, they used a multivariate regression analysis.

The quasi-experimental impact analysis, this is done to ensure that we are comparing similar students to similar students.

So can we match similar students participating in the Seattle Preschool Program?

Can we find a similar group based on characteristics of their background that did not participate in the Seattle Preschool Program?

That will allow us to estimate a causal impact of our program.

SPEAKER_06

And just for folks who may be watching and are not familiar with the terms, the science terms, quasi-experimental is because you're not actually identifying.

You can't identify actual children.

So it's a quasi-experiment.

It's a...

not a full experiment because you're not actually dealing with the kids because of ID purposes, correct?

So I just want to like set.

SPEAKER_04

No, thank you for that.

Yeah, no, that is true.

In education, oftentimes the gold standard is to do a randomized controlled trial.

That is, you find a group of students: half of them will be part of the intervention, and half will not be part of the intervention, and then you track them over time.

The way that Seattle Preschool Program was developed did not allow for that true randomization to occur.

The next best thing is a quasi-experimental design, which is a common causal inference technique to isolate our intervention's effect on the outcome.

So again, it's not...

the randomized controlled trial, which is the gold standard, but this is the closest thing we can get given the data that we have and the program that we rolled out.

So thank you, Council Member Rivera.

That is an excellent point to point out.

SPEAKER_06

Well, it's an important one and also important to explain why you can't do the full experiment is because you can't ID the kids.

There's a, you know.

SPEAKER_04

Correct.

Thank you.

There were some limitations even with the study.

We were not able to get child-level individualized education program data, and we were missing comparison group assessment data in some cohorts from the data that we received from Education Research and Data Center.

So it definitely limited our cohort analysis.

So this study that we will show today is for one group, and we should celebrate the findings but also interpret them with caution.

Okay, so building off the quasi-experimental design, what Education Northwest did was look at the characteristics of children enrolled in SPP and in state-funded preschool in 2017-18.

Now, on the very left, you can see the characteristics that Education Northwest looked at, from gender to race and a few background characteristics of the students, such as whether they received English learner services and whether they were eligible for free or reduced-price lunch.

You can see that in the original sample, the state preschool and the Seattle Preschool Program had noticeable differences,

in particular with eligibility for free and reduced-price lunch, where the state preschool had that at 97%.

Compare that to the Seattle Preschool Program, which had that at 29%.

So you could see that there are noticeable differences and it's best not to compare these groups as is.

So what happens is you create an analytic sample using some of the variables that are on the table and you match them.

So Education Northwest did that through a technique, propensity score matching, that got us closer to comparing similar students to similar students.

So what you can see is that after creating the analytic sample, the state preschool has 694 students compared to SPP's 256. You'll notice that now all of the students that were part of the cohort analysis were eligible for free or reduced-price lunch, thus creating a better sample for us to compare these groups of students.
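For readers curious how an analytic sample is formed mechanically, here is a minimal, purely illustrative sketch of 1:1 nearest-neighbor matching on propensity scores. The scores and groups below are invented; a real analysis would estimate each child's score with a logistic regression on their background characteristics.

```python
# Hypothetical sketch of 1:1 nearest-neighbor propensity score matching.

def match_nearest(treated, comparison, caliper=0.1):
    """Pair each treated score with its closest unused comparison score
    within `caliper`; returns a list of (treated, comparison) pairs."""
    pool = sorted(comparison)
    pairs = []
    for t in sorted(treated):
        best = min(pool, key=lambda c: abs(c - t), default=None)
        if best is not None and abs(best - t) <= caliper:
            pairs.append((t, best))
            pool.remove(best)  # each comparison child is matched at most once
    return pairs

spp_scores = [0.42, 0.55, 0.61, 0.78]          # invented SPP propensity scores
state_scores = [0.30, 0.44, 0.58, 0.60, 0.90]  # invented comparison pool
matched = match_nearest(spp_scores, state_scores)
```

Children without a sufficiently close match are dropped, which is why the analytic sample is smaller than the original sample but far better balanced.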

SPEAKER_06

You're creating a sample pool that are as closely matched as possible in order to make the comparison.

SPEAKER_04

Yep.

Yeah, no, thank you for that.

Yeah, so you want to get it as close as possible so it can strengthen our results.

Because if you continue to use the original sample, the results may not be accurate.

So thank you for that, Councilmember.

Okay, so now to answer the question of how SPP students did in kindergarten readiness compared to similar students.

What Education Northwest found was that students that participated had higher SPS kindergarten attendance rates and higher WaKIDS scores in all domains compared to state-funded preschool children.

That alone is the impact that we're seeing.

It was statistically significant, and the chart will show you how much of an impact SPP had in each of the domains.

So if a student would have entered at the 50th percentile on the math domain had they not participated in SPP, we can estimate an impact of an additional 29 percentile points on that assessment.

That was the largest gain from all of the domains.

WaKIDS assesses social, emotional, physical, language, literacy, cognitive, and math.

So we saw some large effects from SPP, and we are encouraged by these findings.
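To give a feel for what a percentile-point gain means, here is a hypothetical back-of-the-envelope conversion: if scores are roughly normally distributed, a treatment effect of about 0.8 standard deviations would move a median (50th percentile) student up roughly 29 percentile points. The 0.8 figure is an illustration chosen to match the magnitude discussed, not DEEL's reported estimate.

```python
# Illustrative only: converting a standardized effect size into an expected
# percentile gain for a median student, assuming normally distributed scores.
import math

def percentile_gain_at_median(effect_size_sd):
    """Percentile points gained by a 50th-percentile student under a
    treatment effect of `effect_size_sd` standard deviations (normal CDF)."""
    new_percentile = 0.5 * (1 + math.erf(effect_size_sd / math.sqrt(2)))
    return 100 * (new_percentile - 0.5)

gain = percentile_gain_at_median(0.8)  # roughly 29 percentile points
```

This is why percentile gains near the middle of the distribution look large: the normal curve is densest around the median, so a modest shift in scores moves a student past many peers.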

SPEAKER_06

Mostly in the math area or subject.

SPEAKER_04

Yeah, it had the largest in math.

Okay, so that was our WaKIDS.

So again, it's encouraging that our SPP students that participated in our program are more likely to be kindergarten ready in all of the domains compared to similar students that did not participate in our program.

So that is encouraging and exciting news.

Now shifting into our third grade assessment: students that were in the same sample, how did they do in third grade math and ELA compared to state-funded preschool children?

What we found, statistically, was that SPP participants had higher third grade math and ELA scores compared to the comparison group.

So these results indicate that participating in SPP may be expected to improve elementary school outcomes.

We're encouraged by this finding, and we hope to evaluate future cohorts on this to strengthen our result here that we're seeing, and then also continue to go down into further years in their academic journey.

Okay, looking ahead, Education Northwest will conduct additional analysis to understand the impact of SPP in more recent years and with more students.

DEEL will leverage the set of current recommendations outlined by Education Northwest.

They recommend developing a system to share information about children in SPP with kindergarten teachers and families to support the kindergarten transition.

This is more of a warm handoff.

So our SPP schools and teachers have spent the last year with these students and now how do they hand them off to kindergarten so they can be successful in a new environment?

Offer more training opportunities for both directors and teachers to support the needs of multi-language learners and children with special needs.

Consider supports to help teachers access both planning and release time.

So this is part of the CQI that Dr. Chappelle mentioned that we always look at to see, to make sure that we can improve our program to meet our outcomes.

SPEAKER_06

Basically, this just shows that our SPP program is tracking the national trend that shows if kids have access to preschool programs, they're going to be kindergarten ready and have better outcomes in elementary school.

SPEAKER_04

Yep.

Thank you.

SPEAKER_00

Thank you, Dr. Fajardo.

So in this next part, you'll see what DEEL did: we conducted an internal evaluation of our school-based investments, focused on student interventions and school-level impact.

I want you to know that our analysis explores trends in interventions to identify promising practices and evaluates our school-based investments' impact on the levy priority outcomes using data from school year 2022-23.

SPEAKER_06

And these are for K through 12.

SPEAKER_00

Yes, this is K through 12. Thank you, council member.

So our performance and evaluation team conducted an internal analysis.

The evaluation was focused on two areas, student interventions and school level impact.

We explored intervention-level outcome trends to identify promising practices and opportunities for future analysis.

Then we evaluated the longitudinal impact of the school-based investment on levy priority outcomes using quasi-experimental methods with available data.

The evaluation was conducted in 2024, and we leveraged implementation data from school year 2022-23. So as I mentioned a few moments ago, here are some evaluation questions.

Next, here are two questions, or should I say two areas of inquiry or evaluation addressed.

And what they did, they addressed these following questions right here.

What interventions does DEEL fund at the SBI schools?

And are these investments linked to student achievement?

Which student interventions are most effective?

And are students at levy funded schools more likely to improve their academic proficiency compared to non-levy students?

So these questions really helped us assess the impact of our investments and to identify areas for improvement.

SPEAKER_06

How many schools do you have?

SPEAKER_00

There's a total of 30 schools, 20 elementary, five middle, and five high schools.

SPEAKER_05

Thank you.

SPEAKER_00

Say it again, Council Member.

SPEAKER_05

Thank you.

SPEAKER_00

Okay.

So what I'll do now is pass this off to Dr. Fajardo to talk a little bit more about the overall impact.

SPEAKER_04

Thank you, Dr. Chappelle.

So to answer the first question, we looked at the student intervention outcomes.

We used a correlation analysis where we looked at the descriptives and performed a chi-square test.

Oftentimes this is a pre-post study period, comparing participant trends to non-participant trends.

This study here looked at the first year, or year one, of intervention data for school year 2022-23. The second study, to answer the second question, looked at the school-level impact.

This is a longitudinal cohort analysis looking to find causal impact.

So if you were in kindergarten and did not meet kindergarten readiness standards, we assessed whether by third grade you were meeting that standard.

If you were in sixth grade not meeting an academic standard, did you meet that academic standard in eighth grade?

Same thing for ninth grade.

We looked at ninth grade over time for the freshman students that were participating in our intervention.

By the time they got to our 12th grade, did we see improved academic outcomes?

And I'll provide more details into the data sources and our cohorts in future slides.

SPEAKER_06

And just as a reminder, that chi-square test measures statistical significance.

SPEAKER_04

Yes, correct.

It's just an additional test.
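For readers, the chi-square test mentioned here can be sketched on a 2x2 table of participation versus improvement. The counts below are invented for illustration; only the mechanics of the test are shown, not DEEL's actual data.

```python
# Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]],
# e.g. rows = participated / did not, columns = improved / did not.

def chi_square_2x2(a, b, c, d):
    n = a + b + c + d
    # Expected counts come from the row and column totals.
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Invented counts: 60 of 100 participants improved vs. 45 of 100 non-participants.
stat = chi_square_2x2(60, 40, 45, 55)
print(stat > 3.841)  # above the 5% critical value for 1 df -> True
```

With one degree of freedom, a statistic above 3.841 corresponds to significance at the 5% level, which is the threshold the presenters reference.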

So, as Councilmember asked a few minutes ago how many schools we support, we support 30 SPS schools, 20 elementary, five middle schools, and five high schools.

This is a total population of about 16,000 students indicated by the blue box.

I do wanna also note that SBI complements other levy investments.

We also have school-based health centers, wraparound supports that we have and culturally specific and responsive programming that is in addition to our school-based investment.

So it's also important to keep that in mind as we're looking at the information.

SPEAKER_06

And you have school-based health centers at 29 of the 30 schools.

Correct.

SPEAKER_04

So school-based investments has two main investment strategies.

There's a student level and this is more direct intervention where this is provided by school and community-based organizations to support attendance, academic performance and college and career readiness.

Oftentimes this can be in small groups or it can be in one-on-one.

This is indicated by the orange box, which is 8,000 students participated in targeted interventions.

The school-level strategies include capacity building and continuous improvement efforts, such as high-quality instructional practices, effective leadership, school climate, and family engagement.

What that means is that all schools participate in these school-wide strategies.

In the next slide, we are gonna be focused in the 8,000 students that are served by direct interventions.

So these direct interventions are categorized by three buckets.

There are academic interventions, which target competencies in core academic subjects such as math, English language arts, and literacy.

59% of our students received one of these academic interventions.

Some of these examples include John Muir Elementary, which works with tutors and a math coach to provide one-on-one and small group tutoring.

Aki Kurose Middle School assigns students to math empowerment courses to shore up mathematical reasoning and strengthen students' math identities.

And in high school, multiple schools provide after-school tutoring and Saturday school focused on these subjects.

46% of our students participate in enrichment activities.

This is focused on 21st century skills, such as leadership, teamwork, critical thinking, and social emotional learning, or college and career readiness.

So this is in addition to the academic pieces and enrichment.

Some enrichment examples: several schools partner with Space Between, a CBO that provides mindfulness support for staff and students, while others provide time during the day for students to explore interests and topics that they might explore in college or career.

Within middle school, Denny, Aki, Washington, Mercer partner with Parks and Rec's Community Learning Centers to provide afterschool enrichment and identity development programming along with initiatives like My Brother's or My Sister's Keeper.

Then there are integrated supports, which are also oftentimes called wraparound.

23% of our students participated in this intervention.

Now, these interventions support students and their families facing barriers to attendance and engaging with services, such as student case management and referral programs to connect families to basic needs.

Academic interventions may be closely tied to academic outcomes.

Enrichment and integrated supports may be more closely tied towards attendance, right?

We want to get students excited about school.

We want them to get engaged with college and career.

Those are oftentimes longer-term strategies to see impact on academic outcomes.

So I just wanted to frame it there.

And also, this does not equal 100% because many of our students participate in multiple interventions, right?

So Dr. Chappelle could participate in an academic intervention, an enrichment intervention, and an integrated support intervention, right?

So the majority of our students participate in many of these.

Okay.

SPEAKER_06

Yes, Council Member Moore.

SPEAKER_09

Thank you.

If we could go back to slide 29. I don't know if you can answer this, but under the integrated supports, the wraparound, how does that work in terms of how families are connected, the referral for families to connect them to basic needs?

The reason I'm asking this question is that I'm, in other contexts, really interested in rental assistance and how we proactively identify families at risk of homelessness or housing instability and insecurity, as well as food scarcity, and wondering how we can be more targeted and proactive in that.

So I'm just curious kind of how that works.

SPEAKER_01

Yeah, and I can try to answer that.

Council Member Moore, my name's Colin Pierce.

I'm a senior advisor to K-12 school programs, and I work with 10 of the 30 schools receiving school-based investment dollars.

And schools have different strategies for addressing and pairing students with resources.

So some schools will have a single sort of family connector who is at the school.

This is something that's funded both through our wraparound services and then also through the school-based investments.

Other schools, and this is more common at the high school and middle school level, there will be case managers, like student success coordinators, who will be connecting students to the resources that they need based off of what those case managers are seeing on the ground.

And other schools, Interagency Academy, which serves students who are credit deficient mostly, they have an entire room for various resources on campus in order to get that right away to students.

So there are folks at every school who are connecting students to those resources.

And some, Schools like John Muir Elementary, for example, has a relationship with Mercy Housing, which is right down the street.

And so there are folks within Mercy Housing who are working directly with families and connecting to the school.

SPEAKER_06

And are those SPS resources?

I mean, because we do our resources as the city and through the levy, we provide supports.

And then also Seattle Public Schools provides this support also in some of their schools, or I think most of their schools.

Do you know?

SPEAKER_01

Yeah, so some of the resources are Seattle Public Schools resources.

There are McKinney-Vento supports for students, unhoused students.

And those are funneled through the district.

But there are other supplemental supports that we provide.

And the folks who are connecting students and families with these wraparound supports, you know, don't necessarily always distinguish between which are district and which are city and which, you know, which are coming from which places.

They're really focused on like, what are the needs right now?

And how can we get students connected with those services?

SPEAKER_06

And I only raise that because multiple entities are working on this investment, which is great.

I mean, it's what we should be doing, but yeah.

And there's a lot of collaboration, I understand.

But correct me if I'm wrong.

SPEAKER_01

Yes, there's a lot of collaboration.

Oftentimes, teams within schools are meeting regularly in order to not only raise concern about students or sort of report back on successes, but also to improve their own systems for connecting students to resources.

SPEAKER_00

It speaks to the power of our partnership.

SPEAKER_06

Yeah, and just as a Seattle Public Schools parent at Ingram High School, I have seen this working at Ingram High School and how the team of resources folks who help kids at the schools come together and meet in order to problem solve.

And it's not just the students that need, for example, what Seattle Public Schools calls a 504 plan.

It's in general, if a student needs support, folks who work in this space, both SPS staff, as well as the supports that we provide will work together.

SPEAKER_09

Thank you.

SPEAKER_04

Thank you.

Okay, so now we, so that was a descriptive sample of what the students were participating in, and now we're getting into the correlation or the chi-squared test to answer the first question.

Now, what we're seeing here is our outcome is focused on attendance.

As a reminder, regular attendance for Seattle Public Schools is 90% or more.

Here, I want to acknowledge, before we get into the data, that since students returned to class after the pandemic, attendance has been a problem at the district, district-wide.

So not just, you know, our school-based investments, but district-wide, there has been a decline since the pandemic.

That is no different for our students.

So what we're looking at in the chart is students that were in school year 2021-22, and we basically asked who was not attending school 90% or more of the time, right?

And then we said our interventions are targeted to improve attendance.

Right.

So then we say in the next year, did we see attendance improve as a percentage?

OK.

And what we saw, though, is that the green is students that participated in our integrated supports.

The blue is our enrichment intervention.

And then the gray is students that did not participate in any of our interventions.

And what you'll see is attendance in general was an issue for all of these students.

However, within integrated supports, we found that there was a smaller decline in attendance compared to non-participants.

It was statistically significant at a 5% difference, and this suggests that our interventions are effective in mitigating attendance issues particularly for high school students.

So again, ideally, we would want this to be a zero or positive.

However, keeping it in context, we can see that our integrated supports does make a difference in mitigating the attendance crisis that we're seeing at the district.

So definitely something to highlight.

And I wanted to make sure that we understood that.

Although negative, our investments are still supporting students so the decline is not the 14% that you see in the gray column.

Okay, I'm going to shift gears into academic progress.

So what we're seeing here is students that met our Smarter Balanced assessment standard in 2022. What that means is students that were participating met that standard at 18%, in blue, and I'm looking at the pre-post SBA math chart.

So students that were participating in our math intervention, how many of the students were meeting the SBA standard assessment in 2022?

What we found was that 18% of those students were meeting that standard.

Part of our levy charge is to focus on students that need the help in academics.

So that is what we expect.

Now, the students that received an academic intervention, how did they do at the next assessment back in 2023?

And what we see is there was a 6% increase.

So that means those students, as a collective, increased 6%.

Now, how does that compare to students that did not participate in a math or in an academic intervention?

And what you see is students that did not participate, they were meeting the standard at 49%.

A year from there in 2023, there was no change, right?

So we see and feel confident that our academic interventions demonstrated higher gains in SBA assessment results after one year compared to non-participants, and that was statistically significant.
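The pre-post comparison described here reduces to computing percentage-point changes in the share of students meeting standard. A sketch with assumed sample sizes (the percentages match the presentation; the underlying counts are illustrative, not DEEL's data):

```python
# Percentage-point gain in the share of students meeting standard,
# comparing participants to non-participants. Counts are illustrative.

def gain(pre_met, post_met, n):
    """Change, in percentage points, in the share meeting standard."""
    return 100.0 * (post_met - pre_met) / n

participants_gain = gain(pre_met=90, post_met=120, n=500)        # 18% -> 24%
non_participants_gain = gain(pre_met=490, post_met=490, n=1000)  # 49% flat
print(participants_gain, non_participants_gain)  # 6.0 0.0
```

Whether a 6-point versus 0-point difference is statistically significant depends on the sample sizes, which is why the presenters pair this comparison with a significance test.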

Definitely a positive highlight for our elementary and middle school academic interventions.

The same trend was true for our ELA.

And then now we're looking at the bottom chart where we see a 5% difference within our students that received an intervention.

And you'll see that non-participants had a 1% gain.

So definitely encouraging that our investments are targeting the students that need the most support.

And we're seeing a bump in the next following assessment.

So something to celebrate.

I'm gonna move us into the second question that Dr. Chappelle mentioned, and now we're looking at the longitudinal analysis.

SPEAKER_06

Sorry, I'm working this out in my head, but you're saying we're definitely targeting the students that need the most support. What I see is that the students that are getting support are being helped. How can you make the assertion that the students that are getting the help are the ones that need it most?

I mean, it seems like it's two different things, right?

People that are participating are doing better, but I don't know your sample size.

You see what I'm saying?

SPEAKER_04

Yeah.

SPEAKER_06

It's not necessarily that we're capturing everyone who needed the most support.

SPEAKER_04

Yeah.

SPEAKER_06

Those that needed the most support.

Just the ones that had this support did better.

SPEAKER_04

Exactly.

Yeah.

So the ones that were struggling academically and received their support were seeing an increase in both math and ELA.

Yep.

Thank you for the clarification.

Okay.

Moving us along to our second question, which is more of a longitudinal question, right?

We are looking at elementary, we're following a cohort, and in this particular analysis, we're looking at SPS students who entered at 20 elementary schools, our school-based investment schools, in the school year 2019-2020. And we're looking for students that were below kindergarten readiness standards.

So like if you entered kindergarten and you were assessed and you were below kindergarten readiness, how did you do in math and reading at third grade at that school, okay?

And what we're seeing here is for those students that were not kindergarten ready, we found a statistically significant finding that students at SBI elementary schools are twice as likely to achieve third grade math proficiency compared to similar students that did not receive levy dollars or levy-funded interventions.

So that is, if Dr. Chappelle was not kindergarten ready and received services from our levy, he would be twice as likely to be third grade math proficient compared to a student that did not receive levy interventions.

A very positive finding, statistically significant and was twice as likely.
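The "twice as likely" finding is an odds comparison from a 2x2 table of levy exposure versus proficiency. A minimal sketch with invented counts chosen so the ratio comes out to 2 (these are not the study's actual numbers):

```python
# Odds ratio for a 2x2 table. Counts are invented for illustration.

def odds_ratio(a, b, c, d):
    """a = levy & proficient, b = levy & not proficient,
    c = comparison & proficient, d = comparison & not proficient."""
    return (a / b) / (c / d)

# 40 of 100 levy students proficient vs. 25 of 100 comparison students.
print(odds_ratio(40, 60, 25, 75))  # 2.0
```

An odds ratio of 2 means the odds of reaching proficiency, not the raw probability, are doubled; for small baseline rates the two readings are close, but they are not the same quantity.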

However, we did not find that statistically significant odds for reading.

They were just as likely.

What that means is if Dr. Chappelle was not kindergarten ready and participated in an intervention, and Colin Pierce on this side was not part of a levy-funded school, they were just as likely to have the same result.

So there was no impact for the ELA. One group was just as likely as the other, right? But we found it for third grade math proficiency.

SPEAKER_09

Can I ask a question?

No, no problem.

Any thoughts on why that is, that the reading proficiency was better but not statistically significant as opposed to math?

SPEAKER_01

I can speak to some conjectures on it.

I think one may be that students were farther behind in math to begin with.

And so we're seeing increased likelihood as a result of that.

But also just as a reminder, this is a single cohort for the first year after students returned in person.

And so what we're seeing is really just from over the course of that one year.

And it may be that reading gains are achieved over a longer period of time as well.

So as we get more information from upcoming year's assessments, we should be able to see whether that may be the case.

SPEAKER_09

Thank you.

SPEAKER_04

Council Member Moore, we will take that comment and we'll see if we can find an answer and get back to you.

SPEAKER_06

Great, thank you.

Any other questions, Council Member Moore before Council Member Morales?

SPEAKER_08

Okay, so these were kids who were in kindergarten just before COVID started.

I guess what I'm trying to understand is do you know, like when we're talking about the interventions or the investments that they participated in, does that mean necessarily that they were involved in something all three years?

Do we know and does that impact how they perform?

SPEAKER_04

Yeah, no, that's an excellent question.

And that's definitely an inquiry that we're interested in looking at.

So the question, when I hear that question, I think of like dosage over time.

So, right.

So if you were a kindergarten who was not, you were below standard, you received an intervention that year, did you continue to receive interventions in the following years?

I would think yes, but I would need to confirm.

So definitely we'll get back to you on how many interventions and if they were included in that cohort process.

SPEAKER_08

Well, and for this cohort in particular, I'm thinking, I remember visiting a school last year.

And at that time, these were second graders.

And the second grade reading teacher saying, like, these kids are still catching up, right?

Because...

they didn't have kindergarten.

They didn't sit down in circle time.

They didn't have an adult reading to them.

And so they come into her class and they have no idea what that means because they didn't experience it.

So there's a lot of other, I guess, maybe that's culture of the classroom learning to do, or I don't know, social emotional learning that they missed out on.

And so it's just making me wonder about what other services, in addition to the academic stuff, services that they need, what this particular cohort needs in order to catch up from having missed such important learning, like cultural learning about what it means to be in school.

SPEAKER_01

One thing, and I just want to make sure I'm clear on this.

We're looking at the school level impact here.

So this is not just students receiving intervention.

This is by virtue of being at a school receiving levy funds.

This is the impact.

So it's not just the students receiving direct intervention on these subjects.

And so that I think we can tease apart, and obviously we've got anecdotal reports, schools telling us what combinations of things are effective, but as we dig into the data more and as we have more years of this data, we'll be able to look for, is it a combination of academic and wraparound services or enrichment that is helping to achieve some of this?

Or is it just the school level work they're doing on improving the quality of instruction?

SPEAKER_04

Yeah, no, thank you for that, Colin.

Yeah, so I should have been clear when I started this second question.

This is now focused in the blue circle, not the orange circle.

SPEAKER_06

Okay.

I mean, it seems like there's a lot you need to dig into here because math was impacted positively.

So yes, these students went through COVID and there were impacts, And yet math, they were twice as likely at the SBI schools to be at proficiency, but not the ELA.

So it seems like there's a lot at play here and to dig into as to why, because you would have expected that the impacts of COVID and not being in the classroom, et cetera, would have had impacts on both at the same rate, but they're starkly different.

SPEAKER_04

Yeah, no, for sure.

And I think the digging deeper is perhaps taking a more narrow focus on the ELA.

And that's where you could do a mixed method study that looks at qualitative and quantitative data to get into some of the weeds that we're talking about to understand maybe it was a particular curriculum, maybe it was a particular program.

Oftentimes it can be variation in what schools select.

So maybe it was a couple of these, right?

So there's a lot that we could dig into, and I will make note of that and hopefully follow back up with this committee.

SPEAKER_06

Thank you.

Because I think we don't have to wait until, I appreciate what you're saying, Colin.

But I think we can also do a sub-analysis of this data for this particular cohort, still working with the SBI schools and those principals, to find out what is going on here starting now, versus having to wait till sixth grade when they probably do these assessments again, because a lot of time will be lost.

And I think that these quasi experiments, that's part of what makes this challenging as well, doesn't give you all the information, but you can do another quasi experiment to really dig into this particular question.

So I hope that that's something you all are considering doing so that we can get to students sooner rather than later.

SPEAKER_04

Yeah, no, thank you, Council Member Rivera.

Yeah, that is true.

Oftentimes when data is presented, you get additional data inquiry, which is why I am proud to work for the department because we have continuous quality improvement, which regularly looks at data.

We assess and try to understand how we can do better.

So that is definitely an inquiry that we'll look into.

SPEAKER_06

Thank you.

Any other questions, colleagues?

Okay, sorry.

Thank you, Izzy.

Go ahead.

SPEAKER_04

No, it's okay.

Okay, so we did elementary.

Now we're shifting into another cohort.

This is middle school.

So for this analysis, we looked at students who entered sixth grade below grade level proficiency in school year 2018-2019.

And what we're basically seeing is can students that participate in our levy schools, can they be proficient by eighth grade in math, in reading, and ELA compared to non-levy schools, okay?

And what we found is we saw math again, BE FOUR TIMES AS LIKELY TO BE PROFICIENT COMPARED TO NON-LEVY SCHOOLS.

HOWEVER, IT WAS NOT STATISTICALLY SIGNIFICANT.

SO THERE'S SOMETHING ELSE HAPPENING FOR THESE STUDENTS THAT WE COULD NOT FIGURE OUT WHY IT'S NOT STATISTICALLY SIGNIFICANT.

SO SOMETHING IS HAPPENING IN OUR COMPARISON GROUPS.

FOR ELA AND READING, THEY WERE JUST AS LIKELY.

something we'll have to look into in the middle school, in particular school-based investments to see what is happening.

What these findings show is that, yeah, there's positive findings for math, however, not significant.

So that is the finding.

Moving on to high school

SPEAKER_06

And again, sorry, Izzy, just to underscore that then you all will be digging into the why, because that helps you shift the investment to, you know, to determine how to best, how to make it statistically significant.

SPEAKER_04

Yeah, correct.

So internally, we'll look at this, and possibly Mathematica will also be digging into this.

SPEAKER_01

If I can add something on that, just that while we're working with this data in aggregate, our schools are working on a regular basis examining their own data and trying to track why there may be growth or not.

We don't have access to the same data because of privacy and FERPA reasons.

We get this data in aggregate, and we can work with schools who are doing regular data analysis to get what they're seeing, even if we don't have access to the exact same data.

SPEAKER_06

And do you do that?

Do you do the...

They can't share the actual data, but they can share the result, the ending result, like we have here.

SPEAKER_04

Yeah, so we get de-identified data, meaning that for Director Chappelle, we know that he was in 12th grade, taking these classes, got this GPA, attended school this many days, and did or did not graduate, right?

Like we get that level of data and we also analyze it in aggregate like we're presenting today.

What Colin is saying is Seattle Public Schools has additional data sources.

That's what I'm talking about.

Yeah, like that can pinpoint Dwayne Chappelle and then can provide direct interventions.

So there's just a little bit of data that we, we get a lot of data but not the personal like identifiable data.

Yeah.

SPEAKER_06

But I guess my question is, does SPS share the results of their data, not Duane's actual numbers, but the results of what they're seeing so you can compare their results with your results?

SPEAKER_01

Yes, we can.

And I think the point I was trying to make is more that while we are looking at how we can identify what's happening in aggregate, the schools are already digging into that on their own, and we're in constant communication with them.

So both their observations and the aggregate data will be able to help inform us on what might be happening.

SPEAKER_06

We'd love to see that.

If you do a comparison between the SPS and what you're seeing, that would be really helpful as well.

SPEAKER_04

Thank you for that.

Thank you.

Okay, so I'm gonna move us into the high school cohort study.

Again, this is for our five school-based high schools compared to non-levy schools.

And what we're seeing for this study is we assessed ninth graders who were not meeting or passing core courses in 2018-2019, or the class of 2022. We looked at two outcomes.

One, did they pass courses in 12th grade?

And two, did they graduate on time?

What we found was students at school-based investment schools entering high school below standard credit accumulation had higher odds of on-time graduation than their non-SPI peers.

Again, they were three times more likely to graduate; however, it was not statistically significant.

For passing core courses, they were just as likely.

So levy high schools compared to similar students in non-levy schools, just as likely.

SPEAKER_00

So what I'll do now is transition us to the Seattle Promise process evaluation.

As I mentioned before, our internal evaluation of the Seattle Promise focused on the 22-23 student experience and operational improvements that we made in response to COVID.

So using both quantitative data and qualitative surveys and a focus group, we assess trends and student perspectives on their college persistence and completion.

You can go to the next slide.

And so during the 22-23 school year, DEEL's performance and evaluation team conducted a process evaluation focused on student experiences and preliminary results of the operational improvements in response to COVID.

And the data included, as I mentioned a moment ago, perspectives shared by students in surveys and a focus group, as well as quantitative data on the implementation and our outcomes.

And so what I'll do is, again, share the evaluation questions. Our Promise evaluation focused on these two.

How did persistence and completion rates change for the students enrolled in the 21-22 school year?

And what factors did students and staff attribute to their persistence and completion?

So these two questions really helped us understand the trends and outcomes associated with our programs and identify additional areas for further improvement.

And so Izzy, if you can, like a moment ago, walk us through the findings.

Thank you, Dr. Chappell.

SPEAKER_04

All right, so shifting into post-secondary, and to answer the questions we just shared, here we have a table, and I'm going to walk us through the table and then go through the findings.

So in the table here, we're looking at cohorts 2018, 2019, 2020, and 2021. You'll notice that they are color-coded.

2018 is the cohort where Promise first began.

This cohort had six high schools and was before the pandemic hit.

So all of these results that we're looking at are for retention.

So one-year retention refers to students enrolled from their first fall to spring.

And second-year retention refers to scholars enrolled from their second fall to spring.

And then we're looking at their completion data within three years.

So did they get a degree within three years?
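The retention and completion definitions just described can be made concrete with a small sketch. The scholar records below are invented for illustration; they are not DEEL's schema or data:

```python
# Hypothetical enrollment records: scholar -> terms enrolled, plus the
# year (if any) a degree was earned. Names and values are illustrative.
scholars = {
    "A": {"terms": {"fall1", "spring1", "fall2", "spring2"}, "degree_year": 3},
    "B": {"terms": {"fall1"}, "degree_year": None},
    "C": {"terms": {"fall1", "spring1"}, "degree_year": None},
    "D": {"terms": {"fall1", "spring1", "fall2", "spring2"}, "degree_year": 2},
}

def rate(pred):
    """Percent of scholars satisfying a predicate."""
    return 100 * sum(pred(s) for s in scholars.values()) / len(scholars)

# One-year retention: enrolled in both the first fall and first spring.
y1 = rate(lambda s: {"fall1", "spring1"} <= s["terms"])
# Two-year retention: enrolled in both the second fall and second spring.
y2 = rate(lambda s: {"fall2", "spring2"} <= s["terms"])
# Three-year completion: degree earned within three years.
done = rate(lambda s: s["degree_year"] is not None and s["degree_year"] <= 3)

print(y1, y2, done)  # 75.0 50.0 50.0
```

Each cohort row in the table reads the same way: the denominator is everyone who started in that fall, and each rate is a different predicate over their later enrollment and degree records.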

And what you'll see for the first cohort was their year one retention was 57%.

In year two, that was 42%.

And for students completing within three years, it was at 38%.

This data is focused only on Promise Scholars, but if folks follow community college trends, these are definitely higher than national and Washington State community college three-year completion rates.

For the 2019 cohort, we expanded support into the third year, which meant that we allowed for re-entry.

It was in the middle of COVID, or COVID had just started.

So there was a little shift that needed to happen.

This year also had the full 17 high schools that were able to participate.

And you'll see that one-year retention went to 51%, two-year retention to 36%, and 31% of students completed within three years.

Cohort 2020, this is a cohort that had supports in the second and third year, and we'll talk a little bit about those supports here in a second.

But their one-year retention increased to 54%, 42% in the second year, and 32% within three years.

Cohort 2021 had a one-year retention of 50%, a two-year retention of 46%, and we do not have completion data for three years yet because we need three years to happen.

And right now we don't have those three years.

So we look forward to coming back to committee when we have the data available and likely when we have our FYP year five report.

Okay, now that's the table.

Now let's dig through some of the findings.

What we found was that our 2021 cohort, who received the most supports (three years), had a higher two-year retention rate at 46%.

This is the highest two-year retention rate of our Promise Scholars to date, and that includes our 2018 cohort.

That said, we do not know what their three-year completion rate will be just yet.

However, if we look at our cohort 2020, we are encouraged that the 32% is at least an increase from the previous year, although small.

These completion rates have not returned to our 2018 completion rate of 38%.

Again, this was a smaller group of students, fewer high schools, and it was our first year.

So those are our findings for the Promise completion rates.

Okay, so some of the early operational improvement results.

I talked about it a little bit in some of the changes that we made.

We had a reentry pathway introduced.

And what that means is if Director Chappelle at some point left the Promise Program, we were doing our best to try to bring Director Chappelle back into the Seattle Promise Program.

And what we found was 70% of students utilizing the reentry pathway in the fall identified as BIPOC.

70% of the scholars surveyed felt the reentry process was clear.

One quote from a student: "Reentry was a very easy process, and I felt supported after a hard time."

We also introduced a Path to UW transfer pathway as part of our operational improvement strategies.

We have a current partnership with the University of Washington and Seattle Colleges, and for those that are interested in continuing their educational journey to the University of Washington, we provide supports for those students.

And what we've been seeing with that UW Transfer Pathway is that students are admitted and enrolling at a higher rate than community colleges overall, 86% to 71%.

So definitely encouraging to see students complete their two-year degree and move on to a four-year institution like the University of Washington Seattle.

So definitely encouraging.

We have a quote here.

"Pat, the UW staff, she really helped me figure out the transfer process for UW nursing, and I learned a lot about requirements I need when I start the application process next school year."

So definitely encouraging supports that we're seeing with the Promise Program.

Okay, the other piece, as part of this evaluation, is we asked Promise Scholars what made them continue to stay in the Seattle Promise Program.

Scholars identified tuition support, at 95%, personal or career goals, at 85%, and family, friends, or community, at 66%, as reasons to stay or continue with the Seattle Promise program.

In the survey that we carried out, we also asked scholars if they had supportive adults in their lives who encouraged them to continue their education.

And 85% said they had such an adult.

For first-generation students it was 85%; for continuing-generation students, 89%.

For BIPOC students, 85%, and for white students, 85%. So overall, students felt that they were supported in their process.

Another data point that I want to highlight here was over 33% of multilingual scholars felt knowing more than one language helped them understand academic concepts.

So just a nod to knowing multiple languages.

SPEAKER_06

Before we move on, I do have a couple of questions. One is on this survey: did you say how many students were surveyed? I missed it, sorry if I did.

SPEAKER_04

No, it's okay. 272 students were surveyed. That's about 22 percent of the Promise student population. Although that's low, as a higher education professional I would say that institutional-level surveys of this magnitude typically fall anywhere between 20 to 28%, which is normal.

But yes, I know the question behind the question, and we will increase our sample size to the extent possible to ensure we capture as many student voices as we can, so we can be sure of the data points we display here today.

SPEAKER_06

Thank you, Izzy. It shows we've worked together for many years now, and so you know where I'm going with my questions.

Thank you for that response.

And then, if you go to slide 38, with the retention and completion rates: the retention rate decreased in 2019 and increased again in 2020, but the BIPOC student completion rates have steadily decreased since 2018, even though retention in 2020 increased.

So do you have information about that?

And I understand from this chart that there is no indication of the BIPOC students' retention rate; I think this is the overall retention rate.

But do you have a comparison?

SPEAKER_04

Not today, but we have it and can share it.

SPEAKER_06

That would be great, because then you can really see: is retention up, but completion rates are not? I mean, it's hard to tell.

If you're going to separate the populations, it's helpful to see the retention for each of those populations you've separated out, so you can really compare.

SPEAKER_00

Thank you for that, Council Member.

SPEAKER_06

Okay.

SPEAKER_00

Okay, I think we're in the home stretch.

All right, so now we'll slide over to the Promise Impact Evaluation Preview.

So at DEEL, we partnered with Westat and the Washington Student Achievement Council to evaluate the impact of the Seattle Promise.

Ongoing evaluation is set to conclude in December of 2025, and it will provide valuable insights into the program's effectiveness in advancing post-secondary access and degree completion.

Westat is a leader in rigorous research, data collection and analysis, technical assistance, evaluation, and communications.

WSAC is a Washington State agency working towards raising educational attainment through strategic engagement, program management, and partnerships.

So the evaluation will be focused on a few things.

So phase one, it will be a detailed analysis of cohort outcomes, like enrollment, persistence, and completion.

And then phase two will be estimating program impact on student outcomes.

And as of right now, like I mentioned, the review spans from January of 2023 and will conclude in December of 2025, and the final report will be ready then as well.

So here are, as I mentioned earlier, a few of the evaluation questions.

And so for this one, the key questions in this evaluation were: Who does Seattle Promise serve?

How do applicants fare in college enrollment and completion?

What characteristics are associated with persistence?

And does the program help to close race-based opportunity gaps?

So these questions aim to assess the program's impact and, as you know, identify areas for improvement.

And as before, what I'll do is ask Dr. Fajardo to walk us through some of the methods.

SPEAKER_04

Okay, so thank you, Dr. Chappell.

So we are now entering a causal space.

So we are going to be looking at correlational and descriptive statistics, chi-squares, and logistic regression, and we'll do qualitative focus groups.

The impact study will have a quasi-experimental design.

It'll use multi-level ordinary least squares regression, and we'll have propensity score matching.
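Propensity score matching, mentioned as part of the quasi-experimental design, pairs each program participant with the most similar non-participant so that outcome comparisons are not driven by who chose to apply. Below is a minimal sketch of only the matching step; the IDs and scores are invented, and in Westat's study the scores would come from a logistic regression of participation on student covariates:

```python
# Hypothetical (id, propensity_score) pairs; real scores would be
# estimated from covariates such as GPA and family income.
treated = [("T1", 0.62), ("T2", 0.35), ("T3", 0.80)]
controls = [("C1", 0.30), ("C2", 0.61), ("C3", 0.78), ("C4", 0.55)]

def match(treated, controls, caliper=0.1):
    """Greedy 1:1 nearest-neighbor matching without replacement.
    A treated unit is left unmatched if no control is within the caliper."""
    pairs, pool = [], list(controls)
    for t_id, t_score in treated:
        best = min(pool, key=lambda c: abs(c[1] - t_score), default=None)
        if best is not None and abs(best[1] - t_score) <= caliper:
            pairs.append((t_id, best[0]))
            pool.remove(best)  # each control is used at most once
    return pairs

print(match(treated, controls))  # [('T1', 'C2'), ('T2', 'C1'), ('T3', 'C3')]
```

Outcomes such as retention and completion are then compared across the matched pairs; greedy nearest-neighbor matching is one of several standard variants.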

Again, this will be conducted by Westat, a leading organization in evaluating Promise programs specifically.

We have been working in partnership to gain all of the necessary data to conduct this study.

Some of the data that has been a challenge to collect has been understanding students' financial situations and how that impacts their application, enrollment, retention, and completion.

We at DEEL currently do not have that data and therefore could not do this type of analysis.

So a third party, Westat, through their research and their human subjects review board, has received approval to gain access to that data to complete this level of study.

So for projected insights, we will assess the extent to which Seattle Promise Scholars differ from high school graduates overall.

We will estimate the extent to which program components are associated with Seattle Promise applicants' college enrollment, progress, retention, and completion outcomes.

This will have longitudinal outcomes similar to the studies that we shared today; they will estimate a causal relationship between Seattle Promise and key outcomes compared to similar groups of students.

Now that concludes all of the data that we'll walk through.

So what I'm going to do is bring us back to a summary level, just to signal to the audience and to you all, council members: what does this mean to DEEL?

One, I would say that the findings are highly encouraging to DEEL, despite cohort data limitations due to COVID disruptions in data collection.

Today, we shared that we could do cohort analysis with the limited data that we had.

Some of those findings are promising, some of the data that we shared warrants deeper inquiry, and the third piece is that it can point us in the direction that we want to head.

The second finding: SPP participants continue to be more kindergarten-ready than their peers who attended other preschool programs.

And there is exciting new evidence for SPP's impact on third-grade proficiency.

The third takeaway is promising statistical results for K-12 on elementary academic interventions for math, while both middle school and high school interventions showed improvements in school outcomes.

However, these were not statistically significant.

The fourth is Seattle Promise data shows progress towards closing racial opportunity gaps.

We have implemented programs to bolster BIPOC students in our reentry pathway as well as our UW admissions.

I will acknowledge that we will have deeper inquiry into year-one and year-two retention and follow the BIPOC students to see if there are ways to support them along those steps.

That will conclude the summary, and we will open up for any remaining questions that have not been answered along the way.

SPEAKER_06

Thank you, DEEL.

Colleagues, any questions?

SPEAKER_08

I just have one more question.

Going back to the point that the chair made about BIPOC students showing a steady decline, I know I think maybe it was two summers ago now during the annual meeting, all the data that you put up, the charts showed particularly that Hispanic students were not achieving in the way that we had hoped or would have hoped.

So can you just talk a little bit about what's being done about that?

And is that information, if we dig around in here and the links that you've provided, is that information in there?

SPEAKER_04

Sonia, do you want to take the Seattle Preschool Program piece, on what we're doing in that space?

SPEAKER_03

So in the Seattle Preschool Program, in that space, we have a dual language initiative.

And so we are focusing, leaning in on supporting teachers to support children's language and literacy development.

And I think we are starting to see some promising trends among those students in terms of looking at their Teaching Strategies Gold assessment results and their WaKIDS results.

So thanks to our amazing performance and evaluation team, we're able to really look at this data and make mid-course corrections along the way, doing a lot around training those teachers and providing coaching.

So it's very robust and it's promising in terms of that initiative and focusing on our dual language learners.

SPEAKER_08

So thank you.

I understand why there would be more focus at preschool to get kids ready for reading.

But I guess what I'm trying to understand is through maybe particularly through elementary school, if there are different kinds of interventions that are needed, if you know by the third grade level that Hispanic students are still not performing well, what do you do in that case to support them better?

SPEAKER_01

Yeah, and I can speak to some of the work that we're doing.

And also just recognizing that Hispanic students and Spanish speakers are overlapping but distinct populations.

So if we're talking about multilingual learners specifically, at the elementary level, we're working with our professional development providers to give schools more tools for how you support multilingual learners in the classroom and also in interventions.

And so that is one of the ways that we're helping to improve.

There's also ways that schools are implementing some of their literacy interventions that are helping to make them more accessible for multilingual learners.

And in terms of Hispanic students overall, we do have identity-based groups and mentorship that is helping to support Hispanic students.

So those are just some of the ways that our elementaries and middle and high schools are addressing that concern.

SPEAKER_06

Thank you.

Council Member Morales, I really appreciate you bringing up Hispanic students because in the late 1990s, I'm dating myself, but when I worked at the White House, Hispanic students had the highest dropout rate of any population in the United States.

And at the time, President Clinton put together a package, an initiative called the Hispanic Education Action Plan to try to address that dropout rate.

And here it is in 2024, and I feel like we're having the same conversation.

That is many years.

And so this is more of a statement than a question, obviously, but it is of concern how little progress we seem to be making in all those years, in the almost 30 years that have transpired since, not quite 30 but almost 30 years. I would love to see, and I don't know whether DEEL in particular or Seattle Public Schools is tracking Hispanic students in particular, and whether it's still the case that they have, in Seattle in particular, the highest dropout rate or non-completion rate, but this is something I think we need to continue to look at. And then I will say that I very much appreciate the presentation and the work that's gone into the evaluation, and I feel encouraged by what I'm seeing in terms of the preschool program and the Promise program.

I have to say I'm less encouraged by the K through 12 interventions.

I understand that on the math side it looks a little better than the ELA, but there's a lot of no statistical significance, and that gives me pause.

So that is to say, overall, I don't feel that in the K through 12 space we are in a place where we can, well, I don't feel encouraged, based on this lack of statistical significance.

And to my colleague Council Member Moore's point earlier, I know that interventions are happening at Seattle Public Schools, which we are making investments in, as well as the district.

But what I don't know, and I think maybe this is something you were getting at earlier, Council Member Morales, is how do we know how many students need interventions, and whether all of the students that need intervention are getting captured?

Because I'm not clear what those data points are.

And I know there's a lot of overlap; the district and the state are tracking differently from the city.

So I just don't know how we can get to that, but I think it's important, particularly in light of the fact that, like I said, the data... And I should say, actually, that at an earlier committee meeting we had had an assessment on the preschool program and on Promise, but we hadn't had one on K-12, because DEEL was waiting on the analysis that we just saw today.

And the analysis that we saw today is light on the details.

And so obviously there's more analysis that needs to be done in that space.

But it does, to me, create a sense of urgency about digging into more details of the K through 12 analysis, so we can understand where we need to be making these investments to have significant positive outcomes, to say with statistical significance that we've made some progress in that space.

So I would love to see DEEL digging more into analysis on that, because, as you said earlier, Izzy, it's going to help us inform what changes we may make to those investments moving forward.

And just, we do have limited data, but it's really important to figure out where the gaps are and what we need to be doing in the K-12 space.

And I will mention that we had an impressive number of K through 12 students here yesterday, actually, and I was very happy to see them engaged.

I always love students being engaged, civically engaged, but it goes to some of the things that they were saying, some of the supports that they need.

And obviously there's a lot of overlap in the K through 12 space between education, so DEEL, and HSD, the Human Services Department.

So, you know, I'd love to see more collaboration or maybe you're having it and we just don't know about it.

So I want to be fair there, but I would love to hear more about it, because I do feel like what we're seeing right now, particularly in middle and high schools in this city, with a lot of school violence, et cetera, all of this is related.

And so how do we analyze the data on the school-based investments that we're making?

Because some of those are wraparound to make sure that we're putting the investment where it needs to be put so we can make an impact positively on these students in the city, in all of these areas.

And I will note that in the K through 12 space, it's in the middle and high school intervention spaces that we are in the not statistically significant space.

So I would love to see DEEL digging more into that.

I think it's really critical.

It's really necessary to help inform where these investments need to be going in the future, and not just for the next levy, but starting now, if possible.

But thank you, DEEL, for always being a great partner and for coming today to present on these outcomes. I'm really looking forward to seeing more by way of data and analysis, and recommendations on changes that we can be making to really make a significant impact, in a positive way, on students, to borrow a statistical term.

SPEAKER_00

Well, thank you council member for your insights.

It's always valuable.

We are committed to continuous quality improvement and to making sure that we're partnering with our partners here at the city, as well as our Seattle preschool providers, Seattle Public Schools, our college partners, and other community-based organizations, to continue to make sure our kids have the best of what they need.

So look forward to your partnership and our continued dialogue around everything that you just shared.

But before I go, I also want to thank our amazing team for the work that you all did today to help make this happen.

And some of you came from out of town to be here today.

So I just want to thank you all for your presence and support.

And with that being said, we look forward to, like I said, the continued partnership.

SPEAKER_06

Thank you, director.

And thank you, DEEL staff, Sonia and Colin, for being here, and Izzy, for all the hard work that you do every day.

So thank you for that.

Appreciate the partnership and appreciate the hard work.

Great.

Colleagues, if there are no other questions, we'll let DEEL go today.

Thank you so much.

All right, seeing no further questions, this concludes the August 8th, 2024 meeting of the Educations, sorry, Libraries, Education and Neighborhoods Committee.

Our next committee meeting is scheduled for Thursday, September 12th at 9.30 a.m.

That's after council recess.

If there's no further business, this meeting will be adjourned.

All right, hearing no further business, it's 11.17 a.m.

and this meeting is adjourned.

Thank you.
