You are viewing the EdNews Blog archives.
These archives contain blog posts from before June 7, 2011

Archive for the ‘Achievement gaps’ Category

Fixing the education pipeline

Wednesday, April 27th, 2011

This post was submitted by Dr. Nancy L. Zimpher, chancellor of the State University of New York and the co-creator of Strive.

I’ll say at the outset that as an outsider to Colorado politics, I am not an expert on the candidates who are running to be the next mayor of Denver. But as a lifelong educator who has studied urban school issues for decades and helped create and implement successful reforms in Cincinnati and other cities, I would offer this advice to start: Elect the candidate who can bring this community together on education reform.

This is not simply a matter of opinion. Rather, it is a growing national consensus and the thrust behind a data-driven, evidence-based movement that’s been gathering steam among educators in recent years: In order to have educated, successful adults, we need to construct a solid education pipeline that runs straight from cradle to career. I readily accepted the invitation to participate in this week’s Great Teachers for Our City Schools National Summit in Denver because I see it as an excellent opportunity to share inspiring data on what’s happening in a few cities around the country.

The first five years of a child’s life are crucial in building a strong foundation for lifelong learning: critical thinking, language development, problem-solving, and social skills. This naturally leads to the idea that children need to be guided into education very early in life, and programmatically supported in and out of the school setting all the way along the pipeline, to ensure that they are prepared to succeed every step of the way until they begin their careers.

What we are finding is that there is an answer to this dauntingly tall order, and it lies in adopting a collaborative approach to building and strengthening the pipeline. In short, there is no single fix, no Superman solution and no silver bullet when it comes to education reform. It takes time, and lots of hard work from invested and interested community stakeholders, to effect positive change.

Enter the Strive framework for education reform, a collective-impact approach that I helped create in 2006. Since then, Strive’s “cradle-to-career” networks have made remarkable advances in public school districts in greater Cincinnati and northern Kentucky. Measurable improvements include increases in the number of preschool children prepared for kindergarten, improved fourth-grade reading and math scores and higher rates of high school graduation. Even college enrollment among graduates from public high schools has gone up by 10 percent. And at Northern Kentucky University and the University of Cincinnati, graduation rates for students from the local urban area high schools have increased by 10 and 7 percent, respectively.

The success of the Strive approach is based on the commitment of its influential, motivated participants from different sectors—local government, business, school districts, universities and colleges, and non-profit and advocacy groups—who have collaborated to solve a specific social problem: rethinking, reorganizing, and redirecting existing resources to promulgate systemic changes and new approaches to problem solving that work. The framework is not meant to be a cookie cutter; rather, it is meant to be adapted to local needs. This is the key to Strive’s success, as we’ve begun to see in Houston; Oakland; Portland, Ore.; and parts of New York state, where the Strive approach is being used.

We can have all the valuable, brilliant resources in the world in place to make sure that pipeline is continuous and secure—but none of that will matter if we don’t have effectively trained teachers in our classrooms, successfully guiding and supporting students every step of the way.

It’s clear then that an essential aspect of education reform must be concentrating our efforts on teacher education and preparation, making absolutely certain that every teacher who enters the classroom is clinically prepared, both pedagogically and in subject matter, with the same kind of readiness we’d expect of a pilot in a cockpit.

Last year, I co-chaired the Blue Ribbon Panel convened by the National Council for Accreditation of Teacher Education (NCATE) that, in itself, represented a largely unprecedented consensus. State officials, P-12 and higher education leaders, teachers, teacher educators, union representatives and critics of teacher education were all represented on the panel, which uniformly called for system-wide changes in how the U.S. prepares and supports its 4 million teachers.

A major recommendation of the panel was to move teacher preparation to a clinically based model. This will involve a major structural change, shifting responsibility and accountability for teacher preparation from solely that of higher education to a shared P-12/higher education model.

It makes sense. Teachers who serve districts rife with economic and social challenges that inevitably manifest themselves in struggling public schools not only require, but deserve, the most sophisticated, best quality clinical practice preparation if they’re going to be effective in the classroom. And teacher support can’t end with the awarding of a degree: higher education should be a constant resource for training and best practices for P-12 educators for the length of their careers.

It is a myth that one person or group can cure our education ills by themselves, no matter how visionary or passionate. Only by working together, by engaging public and private institutions of higher education, public schools, civic leaders and elected officials, will we see real, measurable, and sustainable results. Success in Denver—and in every U.S. school district—will rise or fall on collaboration, on how successfully we rally all stakeholders around a common effort to achieve our goals and implement meaningful reform.


NCLB tutoring expensive, ineffective

Thursday, January 20th, 2011

Holly Yettick is a doctoral student in the Educational Foundations, Policy and Practice program at the School of Education at the University of Colorado in Boulder.

How much achievement does $6 million worth of tutoring buy? Not as much as you might think.

A report submitted to the Colorado Department of Education in June received little attention at the time but is now relevant as Obama pushes for the reauthorization of No Child Left Behind. In the report, contract evaluators for the Colorado Department of Education concluded:  “Across all analyses, few significant differences were found” between the achievement of Colorado students who received “supplemental educational services” (tutoring mandated by No Child Left Behind) and a comparable group of students who did not.

Other highlights of the report:

  • Depending on which group provided the tutoring, costs varied widely, from $20 to $89 per hour. (The average cost was $42 per hour, or $1,123 per child, in federal Title I funds.)
  • Success records also varied widely, but the majority of tutors were basically placebos in that their students performed almost identically to a comparison group of peers who were not tutored.
  • Tutoring disproportionately benefitted those with fewer challenges. Native English speakers made greater gains in math than a comparison group of similar students who did not receive tutoring. English learners did not. Students who were not in special education made greater reading gains than a comparison group that received no tutoring. Special needs students did not.
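The cost figures above imply roughly how much tutoring each child actually received. A quick back-of-envelope sketch (the variable names are mine; every number comes from the report as quoted, and the statewide total is an approximation):

```python
# Back-of-envelope check of the tutoring cost figures quoted above.
# All figures come from the report as cited in this post; nothing here is new data.

avg_cost_per_hour = 42      # average cost per tutoring hour (federal Title I funds)
avg_cost_per_child = 1123   # average cost per child
students_served = 4858      # Colorado students served in 2008-09
total_cost = 6_000_000      # approximate statewide cost ("about $6 million")

# Implied hours of tutoring per child at the average hourly rate
hours_per_child = avg_cost_per_child / avg_cost_per_hour
print(f"Implied tutoring hours per child: {hours_per_child:.1f}")  # ~26.7

# Cross-check: total spending divided by students served lands in the same
# ballpark as the quoted per-child average (the $6 million is only approximate).
print(f"Total cost / students served: ${total_cost / students_served:,.0f}")
```

In other words, the average child received under 30 hours of tutoring for more than $1,100, which helps put the "placebo" results above in perspective.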

“Supplemental educational services” are available to low-income students attending schools that miss making “Adequate Yearly Progress” for three years in a row. These optional sessions take place outside of school hours. Although the students who received the tutoring were enrolled in 15 different Colorado districts, more than three quarters came from Denver Public Schools.  These services cost about $6 million in Colorado. Nationwide, the program serves half a million students and costs about $2 billion per year.

Nationwide, supplemental educational services are provided by a range of groups including for-profit companies, non-profits and school districts themselves. One thing that surprised me given the cost of the tutoring is that tutors are apparently not required to have four-year college degrees.

In Colorado, more than half of the 4,858 students served in 2008-09 (the most recent year for which data is available) got tutored by three for-profit companies: Tutor Train (25%), Club Z! (20%) and Learn it Systems (10%). Yet a 2010 research synthesis conducted by The Center for Educational Partnerships at Old Dominion University found that tutoring provided by school districts cost less than commercial tutoring and produced better results.

While certainly discouraging, Colorado’s tutoring results are not unique. The Old Dominion research synthesis found that supplemental educational services had very small effects on reading and math achievement. By contrast, the Comprehensive School Reform program that has mainly been eliminated by recent reauthorizations of the Elementary and Secondary Education Act (NCLB’s official name) produced much stronger results at a much lower cost.

Considering NCLB’s emphasis on accountability for schools, it is also disturbing that the Old Dominion researchers concluded that:

Despite mounting evidence that SES is far less effective than previous Title I policies, we are not aware of a single instance in which a provider has been removed from an approved state list on the basis of failing to demonstrate positive effects on student achievement.

Under Obama’s blueprint for reauthorizing NCLB, supplemental educational services would be optional. Based on the research, I think that this is a good idea. If a specific tutoring provider is getting results, by all means, keep it on. But mandated tutoring was a good idea that just did not demonstrate good results.


Quality Counts subtext: Time to implement reforms

Thursday, January 13th, 2011

Paul Teske is dean of the School of Public Affairs at the University of Colorado at Denver.

Every year Education Week gathers data on all 50 states and compares them along various dimensions in the publication’s Quality Counts report.

As always, this year’s data are interesting and worth more than a glance (due to data lags, most of the data actually come from 2008).

While I usually use this data to show how badly funded Colorado is compared to other states (we appear to be 39th in cost-of-living adjusted per pupil spending), this year I want to discuss it, in aggregate, in a somewhat different way.

Quality Counts provides four graded categories for states, with scores incorporating multiple data comparisons. Colorado’s ranks among the 50 states (and letter grades) are as follows: school finance 44th (D+), K-12 achievement 21st (D+), chance for success 11th (B), and transitions and alignment 29th (C). Overall, Ed Week ranks Colorado 39th, with a grade of C.

While one can quibble with the indicators used and how they are aggregated, it is important to note that, unlike state level interest groups here in Colorado, Ed Week has absolutely no stake whatsoever in where Colorado is ranked.

When we lost the Race to the Top last year, there was a widespread local feeling that we were robbed, given our active reform agenda. And maybe we were. But R2T included lots of emphasis on proven results, not just on future promises, and no matter how you slice it, aggregate Colorado performance to date is simply not impressive. Indeed, R2T winning states were ranked #1, #2, #3, #5, #8, #11, #19, #20, #22, #23, and #31 by Ed Week – all the winning R2T states (except for DC, a special political case I think) were ranked substantially ahead of Colorado (at #39).

As one engaged in the processes of ed reform in Colorado, I do feel the state has done a great deal of legislating, and now needs to focus on implementation, which if done well can have a good payoff.

Policy scholars like to paraphrase some version of Yogi Berra’s line about pitching when discussing implementation, noting that it is 90 percent of half the game. That is, the legislative fight over SB 191 was exciting, controversial, and well-engaged on all sides, but now that the bill has passed, the group that is, even today, meeting to implement the details of the legislation, in a context of considerable statutory contradiction and ambiguity, may be more important in terms of getting it done.

In my view, we have greatly underdone successful implementation in Colorado, partly because it takes resources we don’t have. We typically have to beg one of the four or five friendly local foundations for the couple of hundred thousand dollars needed to do state-level studies of new reforms. Then we expect the districts to implement the reforms with no new resources, indeed at a time when their resources are being reduced significantly. I think that if Republicans and conservatives didn’t support most of these reforms, they would tend to call them “unfunded mandates.”

So, “reform with resources,” or “resources and reform” seem like good rallying cries for the near future in Colorado.

I’ll end with a couple of specific Ed Week data points, which I think help make my point:

Colorado is ranked:

  • 46th in eligible children enrolled in kindergarten programs (2009)
  • 49th in the change in the scale score reading-gap for 4th grade NAEP (change from 2003-2009)
  • 50th in the actual reading gap for 4th grade NAEP scale score (2009)


Crashing into a low bar

Wednesday, December 22nd, 2010

Let me be clear from the outset: I do not believe many, if any, education advocates look at our public education systems and see the status quo as acceptable. It so clearly isn’t that people who toss around that accusation are just throwing bombs.

There are, however, plenty of people who attempt to explain away reported deficiencies in student achievement and post-secondary readiness by questioning the validity of assessments, saying that there is more excellent teaching and high-level learning going on in the nation’s schools than the most strident reformers want us to believe. And, of course, there are people who point to very real societal inequities as the main culprit in sub-par student achievement. Some also say disengaged parents hurt the achievement of some kids.

All that may well be true. But no matter what explanations one might care to devise, there is no explaining away this new report by the Education Trust. The Trust examined results of the Armed Services Vocational Aptitude Battery, a series of baseline aptitude tests used to qualify people for admission to the military, and found that shockingly high percentages of high school graduates, especially students of color, couldn’t clear this low bar.

The Trust examined the scores of 350,000 high school graduates between the ages of 17 and 20 who took the tests between 2004 and 2009. Here is what researchers found:

About 23 percent of the test-takers in our sample failed to achieve a 31—the qualifying score—on the (tests). Among white test-takers, 16 percent scored below the minimum score required by the Army. For Hispanic candidates, the rate of ineligibility was 29 percent. And for African-American youth, it was 39 percent. These dismally high ineligible rates for minority youth in our subsample of data are similar to the ineligible rates of all minority Army applicants as recorded over the last ten years.

As Trust President Kati Haycock wrote in her preface to the report, these results should serve as a wake-up call to high school educators…

…because this shatters the comfortable myth that academically underprepared students will find in the military a second-chance pathway to success. For too long, we educators have dismissed worries about the low academic achievement of “those students” with the thought that “if they’re not prepared for college or career, a stint in the service will do ‘em some good.”

Actually, “those students” will not have the military as a choice. Just as they have not been prepared to enter college or find a good job in the civilian world, they have not been prepared to qualify for the military.

Young people of color who pass the tests generally do so with lower scores than white test-takers achieve. And this has serious real-world consequences:

Since these scores determine eligibility for training opportunities, financial rewards, and scholarships, this means that young people of color have more limited opportunities in the Army once they get in than do their white peers.

If there’s any good news here, it’s that Colorado as a whole does better than the national average on the military aptitude tests. Some 17.6 percent of Colorado test-takers scored too low to be eligible for military service. But the number rose to 33 percent ineligibility for African Americans (compared to 39 percent nationally) and 28 percent for Latinos (29 percent nationally). Still, those are not numbers that should cause jubilation.

Not to stuff coal in anyone’s stocking, but this report provides some food for thought over the holidays. How have we gotten to this point, and how the heck do we extricate ourselves from the mess? If you believe in the validity of the recently released 2009 international PISA exams, our top students are getting their clocks cleaned by top students in other countries. Meanwhile, this new report shows that students farther down the ladder lack the skills to do, well, much of anything beyond manual labor.

And with that, I wish you happy holidays.


The performance of Denver’s charter schools

Tuesday, December 14th, 2010

The movie Waiting for Superman, and the recent signing of a district-charter compact, have energized an intense debate about the quality of charter schools compared to their traditional school peers. Local opponents of charters have focused much of their criticism on a national statistic quoted in Superman, which is based on a patchwork, multi-state CREDO study that concluded just one in five charter schools outperforms traditional schools.

The question of charter school performance is vital. However, this line of critique is largely irrelevant. The overwhelming majority of education policy and practice is not national but local — charter results in Dayton and Detroit have little to do with school decisions in Denver. And in Denver, the very same CREDO study explicitly stated — and further analysis of more recent performance data confirms — that charter schools are doing far better than their traditional school peers.

Indeed, school districts across Colorado would be well advised to look at Denver’s model with an eye to replicating its success.

It’s helpful to quickly revisit the essentials. A central premise of charter schools is simple: encourage innovation and a variety of school models. Measure outcomes. Expand the good schools, and change or close the bad ones. This basic combination of innovation, evaluation, and adjustment should lead — particularly over time — to more high-quality schools and better outcomes for students.

Denver is a vibrant example of this theory.  Over the past several years a consistent (if fragile) coalition on the Board of Education has established a solid process for encouraging and approving innovative charter proposals. Denver Public Schools (DPS) created a comprehensive annual evaluation system to measure school quality. And both forged the collective political will to close charters that do poorly.

Denver’s charters now display numerous models, including Expeditionary Learning, dual-language immersion, and entrepreneurship. These innovations have indeed produced a wide variation in quality: on the 2010 School Performance Framework, three of the top five schools were charters – and so were two of the bottom five.

However the best charter schools are expanding to serve more students, while the worst are being reconfigured and face closure. And in the aggregate, charter schools in Denver are now doing far better than their traditional peers on both quantitative academic criteria and qualitative metrics.

In Denver, we have a rare and perhaps unique ability to make comparisons based on two local frameworks for measuring school quality: the Colorado Growth Model (which measures the academic growth of individual students from year to year) and DPS’s School Performance Framework (which derives academic data from the growth model but also includes non-academic measures such as student engagement and parent satisfaction). Using these frameworks should provide considerable insight into the performance of Denver’s charter schools. And what they show us is a significant difference in school quality.

On the growth model, adjusted based on the number of tested students in each school, Denver’s charter schools outperformed their traditional peers in academic growth by 15 percent, with an aggregate median growth percentile of 61.3 versus 53.4 (median growth across Colorado is 50). Charter schools scored higher at every school level, and in all subjects, with a single exception. The gains were stronger in the secondary grades (particularly in math and writing) with differences of up to 30 percent.

Denver’s School Performance Framework (SPF), similarly adjusted for school size, showed a similar 15 percent gap in academic growth between charter and traditional schools – and both groups served equal percentages of students in poverty. However, charter schools excelled even further on the SPF’s non-academic metrics, with astounding differences in measures of student engagement (43 percent higher), parent satisfaction (27 percent), and re-enrollment (16 percent) (see data section below). These gains extended across all grade levels.

The final statistic — a school’s re-enrollment percentage — is particularly interesting.  Among the many unproven claims against charter schools is that they filter out low-performing students.  It turns out that charters have a far better track record of retaining kids.

Here is a full summary of the charter and district data from both frameworks.

Even their strongest proponents agree that charter schools are not a panacea for all of public education, and there are many external and societal factors impacting schools that also badly need our attention. More research should continue to look at the similarities and differences — but the direction here is clear. Across a growing body of evidence and several years of data, the performance of Denver’s charter schools surpasses their traditional peers. Superman is not coming to Denver, but the charter schools here are doing very, very well.

Extra Credit: The Data

If one wants to dig into the details, here is a larger discussion of the data:

CREDO Study: The 16-state CREDO study, which looked at five years of data ending in 2007-08, and to which charter detractors regularly refer, has received its share of criticism over its methodology. I don’t know if it is useful to revisit that debate, but what is inexcusable is that the same people who cite this study as evidence against Denver’s expansion of charters completely ignore its local conclusions. The CREDO study, which used only charter schools in Denver for its statewide comparison, explicitly found and concluded that these schools performed “significantly better” than their peers (see their own press release). To cite the whole of the study while not acknowledging its most relevant part is patently absurd, even for partisan political hacks.

Colorado Growth Model: Using this 2010 data, I did a weighted average based on the number of students in each school who took the CSAP. The growth model splits grade levels neatly into K-5, 6-8, and 9-12 – so if a school is a 6-12 program, the growth model counts it as two different schools (a 6-8 middle and a 9-12 high school). A similar division happens with K-8 schools. This allows for a more precise comparison by grade. Under this formula, Denver has 135 district schools and 26 charters (which enroll 12% of all students taking the CSAP).
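The weighted-average method described above can be sketched in a few lines. The schools and numbers below are invented for illustration (only the formula, weighting each school’s median growth percentile by its count of tested students, follows the post); by coincidence of the made-up inputs, the comparison lands near the aggregate figures reported above.

```python
# Sketch of the weighted-average method described above: each school's
# median growth percentile (MGP) is weighted by its number of CSAP takers.
# The schools below are invented examples; only the formula mirrors the post.

def weighted_mgp(schools):
    """Aggregate MGP across schools, weighted by tested-student counts."""
    total_tested = sum(s["tested"] for s in schools)
    return sum(s["mgp"] * s["tested"] for s in schools) / total_tested

charters = [  # hypothetical charter schools
    {"name": "Charter A", "mgp": 64.0, "tested": 300},
    {"name": "Charter B", "mgp": 58.0, "tested": 200},
]
district = [  # hypothetical district-run schools
    {"name": "District A", "mgp": 55.0, "tested": 900},
    {"name": "District B", "mgp": 51.0, "tested": 600},
]

charter_mgp = weighted_mgp(charters)
district_mgp = weighted_mgp(district)
# Relative difference between the two aggregates, expressed as a percentage
pct_diff = (charter_mgp - district_mgp) / district_mgp * 100
print(f"{charter_mgp:.1f} vs {district_mgp:.1f} (+{pct_diff:.0f}%)")
```

Weighting by tested students, rather than averaging school-level MGPs directly, keeps a tiny school from counting as much as a large one.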

Remarkably, charter schools did better on academic growth in every subject and school level, with the single exception of elementary school math. Aggregated across all 161 schools, charters received higher median growth percentile (MGP) scores in reading (+4.8), in writing (+9.4) and in math (+9.3), for an average difference of +7.8 points, or almost 15% higher.

Breaking it down by grade levels: in the elementary school grades (charter students composed 9.3% of tested students), charters did slightly better on average (+1.4); better in reading (+1.1) and writing (+5.7), and worse in math (-2.6).

In the middle school grades (14.8% charter students), the differences in median growth percentiles were stark: reading (+7.9), writing (+12.1), and math (+15.5), or a double-digit average of 11.8 points better. This is a percentage improvement of between 15% and 30%. The district’s lowest scores were in the middle school grades, suggesting that the renaissance in Denver’s middle school years is primarily driven by charter schools.

High school scores (8.7% charter students) were also positive: reading (+1.0), writing (+7.2) and math (+10.1), or an average of +6.1 points or 11% improvement.

School Performance Framework: Note first that this is based on 2010 SPF data (which covers the 2009-2010 school year), which is different from the recent 2010 count day data listed at EdNews. The 2010 SPF lists 18 charter schools and 114 district schools. Using this data, I again adjusted scores based on enrollment (so that each school provides a weighted average in its category). I did not include alternative schools in either group.

Across the entire city, district schools enrolled 67,203 students in 2009-2010, while charters had 6,105 (or 8% of the total). The percentage of students in poverty is very close: 73% to 72% FRL. However, in the aggregate on the SPF, charter schools did considerably better on growth (+8 points), status (+13), reenrollment (+13), student engagement (+17), and parent satisfaction (+12).

But to look even closer, Denver has 68 traditional (K-5) elementary schools and just one charter elementary school, which makes any comparison for K-5 meaningless. What happens if we subtract all 69 of these elementary schools and look again at the aggregate SPF metrics? You get this:

Without elementary schools, the relative performance of charters improves even further.  The percentage of students enrolled in charter schools rises to 14%, and the percentage of FRL students is the same (71%). However charter schools receive higher marks across the board — on quantitative academic criteria (+13 growth, +16 status), on re-enrollment (+15), as well as on the qualitative aspects of student engagement and parent satisfaction (+21 each).  These are remarkable and meaningful differences, and as close to a viable district-wide comparison as I think we can get.

What happens if we continue to drill down into specific school grades, comparing K-8 schools, 6-8 middle schools, 6-12 schools, and 9-12 high schools? Well, of the eight academic criteria, charter schools outperform district schools in seven. Charters also do better in every measured level in both student engagement and parent satisfaction. I won’t cover all levels (you can see the full results in the link above); however let’s look at one particular segment: middle schools.

There are 12 district middle schools, and 5 charters (I included KIPP SP, which is listed as K-8 but only offers grades 5-8). The 14% of middle school students in charters provide a reasonable volume for comparison.  What are the results?

The academic differences are remarkable: +23 points on growth and +18 on status — and charters have 17% more students in poverty. Charter middle schools also do far better on student engagement. Two of the five charter schools did not have re-enrollment data, and two also did not have parent satisfaction data, so I did not do a comparison for either. But the schools that did report re-enrollment and parent satisfaction scored higher than the district school mean. The academic data here is so strong, it makes me question how much of DPS’s recent middle school success is due to the impact of charters. I suspect it is considerable.

Charter 6-12 schools also had double-digit gains: growth (+25), status (+19), student engagement (+19), and parent satisfaction (+39). Reenrollment was lower (-13), but this was based primarily on just one school with a particularly low score (and which is being reconfigured). FRL was comparable, with charters at 58% and district schools at 60%.

K-8 schools also showed higher scores across the board for charters: growth (+8), status (+13), reenrollment (+24), student engagement (+13), and parent satisfaction (+17); however, they did so with 11% fewer FRL students.

Where did charter schools fail to outperform district schools? Only in 6-12 high schools — which had the fewest charter students of any level, composing just 4% of the total — and in one category. Charter schools lagged in growth (-10), but were higher in status (+5) and student engagement (+33), and had an FRL population 15 percentage points higher.

That’s the data, which included results from 2002 to 2010. Of course, the arguments about charter school performance in Denver are, at many levels, based on political calculations and interest groups with priorities other than educational outcomes for students. Those people and groups will continue their protests regardless (in fact, it would not surprise me if they called for a repeal of the metrics themselves). Which, of course, does not change the data: Denver’s charters are doing very, very well.


Ghost alumni

Wednesday, November 10th, 2010

Perhaps it is the proximity to Halloween, but what I find most troubling about the wrenching and difficult decision to close or transform schools are the ghosts: All of the kids who went through the school, received an education wholly inadequate to the demands of modern life, and are no longer in view. Traces of them linger, but they have largely vanished.

I see this with the controversy over Montbello. Many of the people attending public forums to comment on the plan are current students, their parents, and teachers. If the reform plan goes through, teachers will lose their jobs, and they are fighting intently for their positions and livelihood. That’s their right, and should surprise no one.

Students and parents are fighting for the devil they know. I continue to think that the opinions of current students and parents are critical, valid, and almost hopelessly biased (in much the same way that all parents believe their children are beautiful – to them, they are). Several years ago a survey of parents showed that 72 percent of them gave DPS overall a grade of “D” or “F”; however, only 27 percent gave their own child’s school a grade of “D” or “F.” As a parent, I implicitly understand this — how could any parent admit to themselves that they are sending their child off to a failing school each and every day?

How could a student get up each day with the knowledge that their school will not prepare them for full lives as adults? Students and parents in a school will always believe that it is better than it is, or that it is about to become much better. This hope is essential, but it is not a strategy.

No one questions Montbello’s numbers: Of the freshmen who start at Montbello, six percent graduate and go on to college without needing remedial work. Ninety-four percent do not. The overwhelming majority of that 94 percent are the ghosts. Over a 10-year period when Montbello’s performance has been an ongoing issue, the school has had roughly 3,750 students pass through its halls. Probably about 3,300 of them do not receive an advanced degree of any kind. Maybe 10% of those can overcome their lack of academic preparation and succeed. Who is left? One decade, 3,000 alumni ghosts.

These ghosts should be a haunting presence over the current proceedings. What is missing from the meetings where school closings are debated are alumni lining the walls to defend these schools. Where are the alumni who can point to the significant role that Montbello played in their later success? Who can describe how the school nurtured and prepared them for the challenges they face as adults, or how it fostered an interest or kindled a passion that they were able to follow to become a leader in their company, industry, organization, or community? Not the odd alumnus who succeeded despite the school, but legions upon legions who left its halls and prospered.

So when the voices are raised and rage, look to the quiet. When teachers fight for their jobs, students fight to stay with their classmates, and parents fight to walk their kids to school (all of them absolutely right to champion what they want), remember the absence of all the students who were once there, and consider where they might be today.

In the cacophony over school closings, remember the echoing, haunting silence of the alumni ghosts.


How to evolve the School Performance Framework

Monday, October 11th, 2010

Ooms is a member of the West Denver Preparatory Charter School board and several other boards involved in education reform.

The recent release of results from Denver's School Performance Framework (SPF) was fairly minor news. That's encouraging, because it means that evaluating schools, with a premium on student academic growth, is more and more part of the lexicon. No one will, or should, claim that the SPF is the only metric that matters, but it is pretty hard to argue that the data are not useful (although I'll offer even money that someone in the comments will take up this challenge).

At the same time, after spending considerable time with the SPF, I also think it needs to evolve. Now I come to praise the SPF, not to bury it; in my opinion, the Colorado Growth Model (the engine of the SPF) is one of the most important developments in recent memory. However, let's take the SPF seriously enough to acknowledge its limitations and look for ways to improve it.

There are three main ways I think the SPF could evolve to include and sort data that would provide a fuller view of school achievement. It has been true for too long that some board members actively resist comparative data, which allows them to support pet projects and political agendas even when a hard look shows their programs to be underperforming. Moving to data-informed opinions is critical to making any significant changes in the way we educate our children. The data I would add: a confidence interval, the inclusion of selective admissions, and a comparison by FRL (free and reduced lunch). These are all highly important variables in school evaluation. Let me explain each.

First, SPF academic data is based on the CSAP, which is administered only in grades 3-10, so the percentage of students whose scores count toward a school's ranking varies considerably. For example, elementary schools serve six grades (K-5), but academic growth data is available only for 4th and 5th graders. This means that, assuming every grade has an equal number of students, only two of six grades (just 33% of students) are counted in the growth score, which is the single largest component of the SPF. There is a similar problem in high schools, where academic data is available for only roughly 50% of the student body (9th and 10th grades).

Assuming even distribution across grades, the percentage of students whose scores are included in the growth data varies considerably by school type: elementary schools (33%); high schools (50%); K-8 (56%); 6-12 (71%); and middle schools (100%). Particularly for smaller schools, which are most often elementary schools, this means that a pretty small cohort of kids can determine the academic growth score for the whole school.
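
These percentages fall out of simple arithmetic, given the post's assumption of equal enrollment per grade: CSAP is administered in grades 3-10, but a growth score requires a prior-year score, so growth data exists only for grades 4-10. A minimal sketch (the variable names are mine, and the grade rules are my reading of the post, not official SPF methodology):

```python
# Fraction of a school's students contributing growth data, assuming
# equal enrollment per grade. CSAP covers grades 3-10, and a growth
# score needs a prior-year score, so growth exists for grades 4-10.

GROWTH_GRADES = set(range(4, 11))  # grades 4 through 10 inclusive

configs = {
    "elementary (K-5)": range(0, 6),   # 0 = kindergarten
    "high (9-12)":      range(9, 13),
    "K-8":              range(0, 9),
    "6-12":             range(6, 13),
    "middle (6-8)":     range(6, 9),
}

for name, grades in configs.items():
    frac = len(GROWTH_GRADES & set(grades)) / len(grades)
    print(f"{name:18s} {frac:.0%}")
# prints 33%, 50%, 56%, 71%, 100% respectively
```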

What I'd like to see the SPF do is two-fold. First, there needs to be a confidence interval for each school. As Paul Teske has pointed out, data is often based on sampling, and that alone does not invalidate the results. At a minimum, however, one should be wary of comparisons between schools where 100% of the students contributed academic data and schools where only 33% did. The required math here is not that hard (here is an online calculator): for a school of 300 students, to get 95% confidence that the growth score is within +/- 5 percentage points, you need a sample of about 170 students. I don't believe there is an elementary school in DPS that comes anywhere close to that standard, and my guess is that most have a possible swing on academic growth data of +/- 8 percentage points (so a mean growth score of 50% could be anywhere from 42% to 58%, a range that spans three SPF categories). That's significant.
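
Both numbers follow from the standard formulas for a proportion's margin of error with a finite-population correction. Here is a minimal sketch, assuming a worst-case proportion of 0.5 and a 95% z-value of 1.96 (the function names are mine; this is an illustration, not the SPF's actual methodology):

```python
import math

def required_sample(N, moe, z=1.96, p=0.5):
    """Sample size for a proportion, with finite-population correction."""
    n0 = z**2 * p * (1 - p) / moe**2           # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))  # shrink for population N

def margin_of_error(n, N, z=1.96, p=0.5):
    """Half-width of the 95% CI when n of N students are actually tested."""
    return z * math.sqrt(p * (1 - p) / n * (N - n) / (N - 1))

# A 300-student school, +/- 5 points at 95% confidence:
print(required_sample(300, 0.05))                 # -> 169 (about 170)

# An elementary school testing ~33% of 300 students:
print(round(100 * margin_of_error(100, 300), 1))  # -> 8.0 points
```

The second result matches the +/- 8 point swing guessed at above: testing only a third of a 300-student school leaves roughly that much statistical slack in the growth score.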

So in recognition of these very different confidence intervals, schools should be compared primarily with others serving the same grades (apples, meet apples). Compare K-8 programs first among themselves and against the median score of their group, and only then among all schools. Maintain the overall ranking, but acknowledge the significant difference between the data sets of different grade configurations by setting them apart (example to follow).

Second, I'd like to see the percentage of students in each school who are selective admissions: students awarded places based on academic ability or skill. This would include both entire magnet schools and selective-admissions programs within larger student bodies. Simply put, it is deeply unfair to compare schools that can hand-pick students with those that cannot. With few exceptions, the percentage of selective-enrollment seats within many DPS programs is lost in the statistical bureaucratic muck, and badly needs some transparency. I've written about this previously, and I remain at a complete loss over a system in which schools with these different enrollment policies are ranked as if they are equal when they are clearly not.

Third, the SPF should more explicitly consider the percentage of students in poverty (or FRL). The correlation between subpar academic achievement and poverty remains high, and particularly if we are serious about addressing the achievement gap, we need to look more closely at schools whose FRL is higher than the district average (about 65%), and less at schools whose demographics only resemble those of our city when inverted.

What might this new SPF look like? Here is some of the data for DPS high schools (chosen because sample size is large enough to be interesting and small enough to be manageable):

Now I don't have a confidence interval here (it would be most useful in comparing schools that serve different grades), but given that all of these schools are relying on academic data from roughly 50% of their students, I'd sure like one. Selective admissions reveals one school: GW, whose 28% selective enrollment comes from their web site and may be slightly dated, but I'd bet it's pretty close.

Note that the four lowest-scoring schools (which fall in the two "danger" categories) all have FRL above 85%, while of the top four (in the second-highest category), only one does. Which leads us to the second part: a graph comparing the SPF score with the percentage of students who are FRL (red is the regression line):

What is telling here is the easily discernible pattern through the lens of FRL and achievement. The median point score, for the high school category alone, is 45%. Three schools scored significantly above that median: CEC, East and GW. East has open enrollment and an FRL of 35% (note that the latter is neither a pejorative nor discredits their high SPF score); GW has 52% FRL and a selective admissions policy covering over a quarter of its students, which makes a considerable difference (my guess is that without these students GW would drop a category). The most impressive high school is CEC, with high SPF points, FRL of 81%, and an open-enrollment policy,* as befits its isolated position in the top right.

Somewhat appealing are Lincoln and Manual: both received SPF scores just over the high school median, but did so with large numbers of FRL students. TJ had a somewhat higher score, but its relatively small FRL population puts it far below the trendline. Kennedy looks average at best; South, West and North are all disappointing; and laggards Montbello and DVS are already (and rightly) undergoing programmatic changes.

Now this view is largely lost in the overall SPF, which gave CEC an overall ranking of 24th and placed them in the second-highest category of “Meets Expectations.” But if you are a parent searching for a good high school program, you care a lot less about the comparison to elementary, K-8 and middle schools.  And you should take a hard look at the impressive results at CEC.

So while I believe it remains important to show the relative performance of all schools, this is how I would like to see the SPF evolve.  For the combination so evident in CEC is, to me, the rare trifecta that narrows the achievement gap: academic growth (hopefully with a strong confidence interval); open-enrollment policies;* and serving a large FRL population.

This trifecta is also really hard to achieve. Last year I wrote a depressing post on the SPF that was more specific about the truly lousy prospects for students in high-poverty, open-enrollment schools. The results this year were just not that different: the ranks of the worst schools have narrowed somewhat, but there is still a long way to go at the top, particularly in grades 6-12.

However, we should acknowledge the achievements that are being made: among high schools that means East and especially CEC, which deserve recognition not easily apparent in the overall SPF. My guess is there are similar schools in each of the different grade structures. It would benefit all of us to have a clearer picture of who they are. Hopefully the SPF will take some tender steps toward this evolution.


*Update: I regretfully spoke too soon about CEC's enrollment policy. The school does not have geographic enrollment; instead it accepts students based on an application process that requests transcripts and grades, awards received, attendance data, and three recommendations. This clearly places CEC (as the school itself somewhat acknowledges) in the category of a magnet school with 100% selective admission. To operate as a magnet with 81% FRL is commendable, but this is not an open-enrollment school, and its achievements should carry this qualification.


Summer doldrums are no excuse

Friday, July 16th, 2010

Summer is a wonderful season, and a great chance to relax, on many dimensions. But as I watch my somewhat bored children squabble daily, I wonder about the wisdom of the long summer break, for parents as well as for kids.

And I remember the very solid research on the summer achievement gap, by Karl Alexander and his colleagues. This shows that as much as two-thirds of the K-12 achievement gap can be related to larger, accumulated summer learning losses for low-income students.

It is a little hard to get overly worked up about anything in 90-degree summer heat, but I always think that this is one of our real scandals in education policy.

We know, for sure – combining common sense, good brain theory and solid empirical evidence – that a 10-week summer break is bad for students' learning, and it is particularly bad for low-income students, who don't get the summer reading programs, museum visits, and education-oriented camps and vacations that many middle-income families enjoy.

Politically, it is also pretty clear why we don't reduce or eliminate the long summer break: many parents don't like the idea when it has been tried in some districts (though surely some parents would welcome fewer hassles figuring out what to do with kids for 10 weeks of no school); the long break is traditional; the recreational and barbecue industries lobby to preserve it (they really do, just as they have a stake in daylight saving time); we don't want to pay for more teaching time; many school buildings are not air-conditioned, and cooling them would cost more money; and so on.

But this is a pretty stark case where we know, with absolute certainty, that our current policies are bad for all students and are especially bad for low-income students. Yet we allow these other political preferences to outweigh the possibility of actually utilizing the known silver bullet of summer learning time. There is a whole organization devoted to this issue.

True, a smattering of good summer intervention programs are targeted at low-income kids, such as this one described recently in EdNews. These efforts are worthy and important but, like voluntary charity generally, there aren’t nearly enough resources to come near solving the whole problem.

A promising recent study suggests that just giving low-income students books might be a cost-effective way to reduce some summer reading loss.

Still, it is frustrating that we don’t seem to want to summon the energy to take this on, full-bore.


Kansas City’s second act

Thursday, March 11th, 2010

The budget rumbles begin, but the story in KC has a lot more context.  To start, from the Wall Street Journal (or try the NYT):

The Kansas City Missouri School Board voted Wednesday night to shutter nearly half of its schools in an effort to avoid going broke. The action closes 28 of 58 campuses and eliminates about 700 of the district’s 3,300 jobs, including 285 teachers. [...] The Kansas City School District, which serves 18,000 students, was twice as large a decade ago. That decrease has led to cuts in state funding. The district now runs a $12 million monthly deficit and expects to run out of money by 2011. [...] Less than one third of elementary school students are reading at or above grade level. In nearly three quarters of the schools only one quarter of the students are characterized as “proficient,” according to the district.

If you are closing 48% of the schools while eliminating only 21% of the jobs, you are either closing schools that are close to empty, or your plan is probably doomed, or more likely both. The origin of this fiscal collapse is the long steady decline in the academic quality of KC public schools, where two-thirds of all elementary school students are already behind at least one grade level.  How did we get here?

Kansas City is best known among educators as the location of one of the great failures in education reform in the history of recorded time, where over $2 billion (and $2B bought a lot more 25 years ago) failed to make a dent in an underperforming system.

In 1985 a federal district judge took partial control over the troubled Kansas City, Missouri, School District (KCMSD) on the grounds that it was an unconstitutionally segregated district with dilapidated facilities and students who performed poorly. In an effort to bring the district into compliance with his liberal interpretation of federal law, the judge ordered the state and district to spend nearly $2 billion over the next 12 years to build new schools, integrate classrooms, and bring student test scores up to national norms.

It didn’t work. When the judge, in March 1997, finally agreed to let the state stop making desegregation payments to the district after 1999, there was little to show for all the money spent. Although the students enjoyed perhaps the best school facilities in the country, the percentage of black students in the largely black district had continued to increase, black students’ achievement hadn’t improved at all, and the black-white achievement gap was unchanged. (article here)

Money can’t buy you love, and it can’t buy a district high-quality schools.  As the impact of declining budgets starts to creep across Colorado, while dissident voices cry out for educational systems and regulations where quality is a secondary concern, it’s worth noting that high-quality schools do more to protect a district’s financial position than anything else. And if quality goes, it takes a whole lot down with it.

When budgets falter, most interest groups usually try to protect their pet projects or personal interests.  This often hastens the decline.  Quality first.


On serving low-income kids in NW Denver

Wednesday, December 16th, 2009

Editor’s note: Anne Button is a member of Northwest Neighborhood Middle Schools Now, a group working with the Skinner Middle School principal to attract a more diverse student body.

In his Dec. 15 blog post (“Maybe it’s time for East Denver Prep”), Alan Gottlieb cites the expressed desires of parents in Northwest Denver for “a strong IB program at Lake” and “a strong program… at Skinner Middle School” and asks: “Has anyone done a detailed analysis of whether these programs would serve the area’s low-income kids well?”

I’m not aware of any such study (other than the unpublished DPS report on IB that Gottlieb later posted), but data show that Skinner is making incremental gains and could make more gains for all students, including low-income ones, as a more socioeconomically diverse school.

While faced with an increasingly low-income population (its free and reduced lunch rate has increased from 80 to 90 percent in the past five years), Skinner has made gains in proficiency, going from 14 percent proficient and advanced in math schoolwide in 2004 to 33 percent in 2009. In reading and writing the gains are not as steep, but they are gains nevertheless. Of course there's more work to be done, but it is a promising trend, especially given the growing poverty level of its students, and it speaks to the abilities of the Skinner staff.

Gottlieb asserts that low-income students would probably be better served in a West Denver Prep than in a strong Lake IB or a supported Skinner, as evidenced in his statement that parents advocating for a strong Lake IB and Skinner are “driving out a program [WDP] that would benefit low-income kids, in favor of a couple of programs that may or may not serve low-income kids so well.” This, he writes, is part of an “unfortunate dynamic, too little discussed.”

Unfortunately, Gottlieb doesn’t offer proof to back his belief that West Denver Prep “would benefit” low-income kids, while the Skinner and the Lake IB “may or may not serve low-income kids so well.”

West Denver Prep has been effective with the kids it has served. But so far there are data on only 57 kids who have made it through all three grades at the first WDP, and no data yet on the second WDP, which just opened this year.

Even with this limited amount of data—and in the face of repeated criticism of WDP's attrition, having lost 44 percent of its 2009 eighth graders since they started in sixth grade without replacing them with new students, as happens in most public schools—I believe WDP deserves the opportunity to demonstrate the potential replicability and scalability of its model by serving more kids.

The district in June approved the addition of two more West Denver Preps in the Northwest quadrant, a promise to which it is likely contractually bound. Many parents, including those in the group Northwest Neighborhood Middle Schools Now, are simply asking the district to be strategic about where it places those two schools, keeping in mind the gamut of learners coming out of our elementary schools and the oversupply of middle school seats already in the quadrant.

Kids from all income levels, including low-income kids, should have a choice among strong options. As Gottlieb noted, WDP’s structure is not for everyone. And certainly Skinner has a way to go to get more of its kids to proficiency. But it has begun the process of improvement and is making strides, if admittedly incremental ones, toward becoming a strong option for low-income kids in the neighborhood.

NW Middle Schools Now is teaming with Skinner’s excellent principal, Nicole Veltze, to build on her notable successes and bring more socioeconomic diversity to the school. The goal is not to take resources from the 90 percent of kids already at Skinner who are already receiving free and reduced-rate lunches, but to add resources to the school and to improve its overall achievement rates.

According to a study released last month, “a growing number of studies have linked a school’s socioeconomic status with student achievement.” A 2005 study, for instance, found that a school’s socioeconomic status had as much impact on achievement growth of students as a student’s individual economic status.

Gottlieb himself, once a champion of socioeconomic diversity in schools, wrote in 2002: “The promise of better student achievement is the strongest argument for fostering more economically integrated schools, as Piton’s study illustrated. The study was conducted by Dianne Lefly, a Ph.D. statistician and the research manager in the Denver Public Schools Assessment & Testing Department. Among the findings: Low-income students (as measured by eligibility for federally-funded free or reduced-cost school lunch) perform significantly better in schools with fewer poor students than in schools where over half the students are poor.”

Gottlieb deserves credit for raising the question of whether research shows that strong, socioeconomically integrated programs at Lake and Skinner will serve the area's low-income kids as well as WDP. To answer his question: yes, the research is there, found in the study he commissioned seven years ago and in numerous studies conducted since—research that is deeper and based on far more historical and nationwide data than the very encouraging, yet still emergent, data on the replicability and scalability of WDP.

The data that Gottlieb seeks doesn’t support putting all our eggs, low-income or otherwise, into one WDP basket.

In the Northwest part of the city where, according to DPS estimates, more than half the kids leave the neighborhood for middle school and there are far more empty seats than in any other quadrant, isn’t it time to be strategic about providing options to meet an array of needs?

