Wednesday, November 7

Progress report grades useful; Insideschools more useful

Eduwonkette is spending the week trying to crunch the numbers on the progress reports, and so far she's found some interesting information about the racial breakdown of schools and scores, and about characteristics -- such as the proportion of experienced teachers and the percentage of students receiving part-time special education services -- that don't appear to relate to schools' grades.

I'll be paying attention, as always, to Eduwonkette's analysis and what parents have to say, but here's my take on the progress reports: It looks like most crummy schools got mediocre or bad grades, and lots of great schools got good grades. The many outliers suggest that you'd be foolish to treat these grades as anything more than they are -- one more piece of information, among many, to use when looking for a school, and an opportunity to take a look at how your school is helping all kids, or not.

The grades could encourage families who might seek out schools and programs outside their zone to consider their neighborhood schools, which would be good for those schools. But with real choice an illusion except in a very few scenarios, I'm wondering what impact the grades will have on parents. Most schools' reputations are pretty entrenched, although I can imagine that the grades might affect how fast up-and-coming schools up and come. Maybe, as Ms. Frizzle says, the grades are best left as a tool for schools.

I think the grades do say something — about how hard schools are pushing their kids — that no other available data say. Schools that are moving their kids forward despite starting at a disadvantage should be recognized for doing so. And schools that are "coasting" because their kids are middle-class and school-ready should know that that's not enough.

But with 85 percent of the grades based on test scores, it's only improvement on test scores that will count. It's clear that pretty much the only way for schools with B's to get A's next year is to improve kids' test scores, often only marginally. NYC Public School Parents has noted a few schools, such as IS 318 in Brooklyn and IS 289 in Manhattan, that have already said their lower-than-desired grades won't make them add more test prep. But many other schools may make another decision, to their students' possible detriment. And I can imagine that at top-rated schools, glee will soon give way to anxiety as administrators realize that to preserve their grades, they'll have to improve upon already excellent performance.

A comprehensive school grading system that looks at student improvement is itself an improvement over looking at straight percentages of students scoring at various levels. But issuing a single grade based on improvement and test scores is reductive and demeaning to teachers — that's a position I've heard from many people. And it drives to the sidelines discussion of other important features of school performance: Has the school worked to increase parent involvement? Has the principal instituted a more democratic structure that's keeping teachers in the school? Is a new arts program engaging students who feel alienated by too much testing?

I think the new grades make Insideschools' qualitative reviews and parent comments even more vital, and I hope parents aren't so distracted by their school's grade that they lose sight of what schools should do to get kids excited about learning.


Leonie Haimson said...

You're wrong about this -- lots of good schools got lousy grades, and lots of struggling, troubled schools got good grades. See today's Daily News story, "How a good school fails."

It looks at PS 35 on Staten Island: 49% of its students live in poverty, yet 86% passed last year's state reading test and 98% passed the math test. Citywide, this school was in the 91st percentile in reading and the 97th in math. But it got an "F."

How can this be so? Because last year, only 35% of its students improved their scores over the year before in reading, and only 23% in math.

This sort of rise or fall in a single year's test scores is meaningless, as anyone who knows anything about statistics will tell you.

While many excellent schools like PS 35 received “Fs” yesterday, a lot of struggling schools also received “As.”

I went through the state list of failing schools for Districts 7, 8, and 10-16. Of the 141 schools on the state failing list that received grades yesterday, 27 got As, 51 got Bs, 44 got Cs, 13 got Ds, and 6 got Fs.

So in these districts, if a school was on the state failing list, it was far more likely to get an A than a D or F.

John Elfrank-Dana, a teacher at Bergtraum HS, analyzed the grades in terms of the quality reviews that every school received last year -- a more holistic approach.

Fully 80% of the schools that got F or D grades were rated as “proficient” or better in their quality reviews, while 37% of those that were rated unsatisfactory in their quality reviews got As and Bs on their report cards.

For more on why the school grading system fails our kids, see my op-ed in the Daily News.

Ann M said...

Where did the methodology for the study come from? Why was such a flawed system approved, costing...
"$80M school-grading system reversed rankings - sources
The city will assign letter grades to all public schools as soon as Monday - and some schools long considered among the best will receive C's, D's or even F's, sources said Thursday."
Ann M

Philissa said...

I agree that the system is flawed -- I don't think I said it isn't. And certainly PS 35 on Staten Island is an example of a school doing a really good job under tough circumstances that didn't deserve the low grade it got. However, I think it's safe to say that most F schools are very troubled and not doing well at all.

I would be careful about suggesting that the quality reviews are a better measure of anything than the progress reports. First, the quality reviews look at little of interest to students or parents -- and when they do, it's couched in such jargon-filled language that it's nearly meaningless. Plus, the quality reviews are factored into the progress reports. Finally, plenty of cruddy schools got good marks on the quality review as well. I think there are good comparisons and analyses to be made, but I don't know that Elfrank-Dana's is one of them.

Or maybe all of these measures with disparate results just go to show that different people are looking for different features in their schools -- and there can be no "ranking" of all schools in the city.

Leonie Haimson said...

I wouldn't assume that most F schools are bad schools at all. There is simply no way to tell, given how flawed the formula is. One would have to look at each individually.

Lots of parents whose judgment I trust say that their schools that got Ds and Fs are good schools. The way the grades were determined is probably based more on chance than anything else.

And the quality reviews are not factored into the progress reports.

Philissa said...

You're right about the quality reviews. It's hard to keep all this straight. Do you remember the original presentations that James Liebman gave about his accountability initiative? I think this was in the spring of 2006. I could have sworn quality reviews were originally part of the final-grade plan.

I'm definitely not just assuming that most F schools are troubled -- I know about them. I agree that it's dangerous and not at all informative to make assumptions based on the grades. You definitely do have to look at each school on a case-by-case basis. And if you're doing that anyway, using information like Insideschools profiles, school visits, the learning environment surveys (absolutely the best information to come out of the DOE, as far as I can tell), and yes, test scores, then I don't know why you would need a reductive grade. But I don't think that necessarily makes the grades wrong. At least they should encourage all schools to ask hard questions about student progress.

I would be more concerned about the progress reports if I thought that good schools with accidentally low scores were really in danger of closing or losing their principals; I don't, but I may be naive.

Leonie Haimson said...

If you think the learning environment surveys are the best source of information about schools, then you should also know that there's no apparent relationship between the survey results, at least as DOE has calculated them, and the progress quotient -- which makes up the largest factor in the school grades.

Here's a comment from a statistician:

"What is troubling in the city's data is the correlation -- or lack thereof -- between ENVIRON and PROGRESS: r = .11, no relationship. How could that be? None of the input variables (attendance, academic expectations, engagement, etc.) have any effect on PROGRESS? Surely this is not true.

Oddly, Progress does not seem to be related to any measure. Look at the relationship between the Quality Review Letter Grades (U, P, W) based on site visits and the three measures. These are the means:

ENVIRON : U = .34, P = .47, W = .58
PERFORM : U = .42, P = .52, W = .62
PROGRESS: U = .46, P = .51, W = .50

ENVIRON and PERFORM means seem to be associated with Quality Review letter grades, but the PROGRESS measure is not.

So, here is the question: If the Progress measure is inconsistent with other quantitative and qualitative measures, and it does not seem to be influenced by any of the school factors common sense tells you should influence progress, what is it?

What is it?

Statisticians have a term for a statistical aggregation that does not correlate with any meaningful variable in the model. New Yorkers have a word for it, too."

And it's not enough to say that "most" of the F schools are no good -- not if you're going to threaten to fire a principal or close a school based on that presumption. To the contrary, if any good schools received Fs, that puts the entire grading system in doubt.

See today's NY1 report for another very good school, Muscota, that received an F.

Beyond all this, even if there are some F schools that are not good schools, how does getting this grade help them improve?
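[An editor's aside on the statistician's point above: the r = .11 figure is a Pearson correlation coefficient, which measures how closely two sets of scores move together, from -1 to 1, with values near zero meaning no linear relationship. Here's a minimal sketch of how it's computed. The school scores below are hypothetical stand-ins, not the actual DOE data.]

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance numerator and the two standard-deviation terms
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical environment-survey and progress scores for six schools
environ = [0.34, 0.47, 0.58, 0.42, 0.52, 0.45]
progress = [0.46, 0.51, 0.50, 0.49, 0.47, 0.52]

r = pearson_r(environ, progress)
```

If the progress quotient really were driven by attendance, expectations, and engagement, you'd expect r well above zero; a value like .11 across hundreds of schools is the "no relationship" the statistician describes.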

Pamela Wheaton said...

Time Out From Testing has a fact sheet and a petition calling for an end to the grading system. The organization is encouraging parents to sign petitions at elementary school parent-teacher conferences next week.

TrudiRose said...

As a parent, I think that either a) There should be two grades: one based on test scores, the other on "progress" or "improvement" or whatever they're calling it; or

b) Schools that routinely have 80% or higher passing rates on tests should be exempt from the "progress" element.

I do think that schools that raise low-performing students from a 1 to a 2 deserve credit, but it's completely ridiculous to penalize a school that already has primarily high-performing students for not "improving" enough.

Also, as a parent, I'd personally rather have my kid in a class with a bunch of high-performing kids than low ones. So in that way, these grades are really deceptive. Schools with mostly high-need and low-performing kids that have the most "room for improvement" have the best chance for getting an A, but that's probably not a school I want my child to go to.