Showing posts with label progress reports.

Tuesday, July 1

And the survey says...


Mayor Bloomberg announced the results of the 2008 Learning Environment Survey this morning; not surprisingly, there's good news and bad news.

This second year of the survey generated a significantly larger response, especially at schools that scored poorly last year (targets of DOE response-generating efforts). Overall, parents report high levels of satisfaction with their children's education and teachers; teachers who responded say they're more satisfied, too, but some areas, like professional development, still fall short.

Of great interest to us is the student survey, which shows a kid-typical mix of answers. (Middle and high-schoolers were invited to participate; between 11% and 15% actually did.)

Learning environment, for kids, means the life of the hallway and the schoolyard--what's said too loud in the cafeteria and who bumps who in gym. Bullying, fighting, and adults who yell continue to be problems, kids say. About half feel they can't turn to adults at school for help; more than half say that students don't "help and care about each other" or "treat each other with respect."

Four in ten students report that their schools don't have enough variety, in classes and activities, to keep them engaged. And it's still really hard to be smart and cool: Almost half of the students the DOE heard from say that kids who earn high grades at their school don't get other students' respect.

Bottom line: The grown-ups seem happier than they did last year. The kids -- well, they're still struggling. They want more challenge, and they need more support.

The DOE plans to post citywide survey results and reports for individual schools this afternoon; we'll update this post with a link when they do. (Learning Environment Surveys and attendance account for 15% of each school's annual progress report.)

Wednesday, March 12

Student Thought: On the weighted Regents pass rate and everything it stands for


Plainly stated, the Weighted Regents Pass Rate sucks. For those of you who don't know, the Weighted Regents Pass Rate is an assessment of a school's performance based on students' Regents test scores, and it's one component that makes up a high school's progress report grade.

As you can probably guess, the Regents pass rate part stands for: What percentage of students pass their Regents exams? I guess that one's okay. If you're being taught well in a course, you would likely be able to pass that Regents test (except for Math B: I know many kids who've scored in the top 5 percent on the SAT and have had to take Math B two or even three times).

But the "weighted" part gets tricky.

See, because of that little "weighted" part, schools are given extra points for getting kids to take their Regents early or to achieve "mastery" by scoring an 85 percent or above. This little, tiny, eensy-weensy "weighted" part now puts the whole test prep culture that is so darn prevalent in our schools on STEROIDS. It has now become the SUPER DUPER AWESOME PUMPED UP EXCELLENT-TASTIC TEST PREP CULTURE.
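A rough sketch of how such a weighting might work. To be clear, the bonus values and the function below are hypothetical illustrations, not the DOE's actual formula; only the 65-point passing cut and the 85-point "mastery" threshold come from the description above.

```python
# Hypothetical sketch of a "weighted" Regents pass rate.
# The 0.5 bonuses are made-up values for illustration only.

def weighted_pass_rate(students):
    """Each student is a dict with keys:
    score (0-100) and early (took the exam before the usual grade)."""
    total = 0.0
    for s in students:
        if s["score"] < 65:      # 65 is the passing cut
            continue             # no credit for a failing score
        credit = 1.0             # base credit for a plain pass
        if s["score"] >= 85:     # the "mastery" threshold
            credit += 0.5        # hypothetical mastery bonus
        if s["early"]:
            credit += 0.5        # hypothetical early-exam bonus
        total += credit
    return total / len(students)

cohort = [
    {"score": 90, "early": True},   # mastery + early: 2.0 credits
    {"score": 70, "early": False},  # plain pass: 1.0 credit
    {"score": 60, "early": False},  # fail: 0 credits
]
print(weighted_pass_rate(cohort))  # 1.0, vs. a simple pass rate of 2/3
```

The point of the sketch: one early "mastery" pass cancels out a failing student entirely, which is exactly the incentive to push kids into exams early.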

And because of that SUPER DUPER AWESOME PUMPED UP EXCELLENT-TASTIC TEST PREP CULTURE a lot of students' lives get kind of messed up.

I have a friend who passed her Math B Regents exam in 8th grade based on the rock solid, well-oiled test prep curriculum at her middle school. She then came to high school, got dumped into precalculus, didn't know any of the material, struggled and even failed her first two years of math. She eventually had to be put in classes that were prerequisites for a test she'd already passed. This made her look kind of bad on her college applications and messed with her self-esteem.

So, while the school got points for having a student take the Math B so early, the student suffered.

In my discussions with the DOE about the NYC Student Union's positions on the progress reports, I have consistently argued that the Weighted Regents Pass Rate needs to be cut down or removed. The DOE's reply has been that it is the only measure of "longitudinal growth."

Regents aren't supposed to measure any "longitudinal growth." The growth DOE officials speak of has more to do with the day's weather, test-taking skills, and student anxiety than with the quality of teaching and learning that goes on in the school.

Regents are there to make sure that teachers are teaching their students and students are attempting to learn the subject matter at hand, to hold standards. That's it. When it comes to measuring a school's success, a simple Regents Pass Rate will do.

Cross-posted at NYC Students Blog

Monday, March 3

Responding to criticism, DOE tweaking progress reports and formulas


Could it be? The DOE appears to be responding to its critics!

The DOE informed principals last week that it will be altering the controversial progress reports before new grades are released next year, and many of the changes reflect suggestions made by parents, school leaders, and even City Council members who thought the single grades were reductive, counterproductive, and often wrong. As Elizabeth Green notes in the Sun, the grades aren't going anywhere, and they'll still be based on test scores, but they could be gentler and easier to understand.

In the future, the DOE has proposed, schools will not be penalized if their top-scoring students receive the highest score on state tests two years in a row; schools whose special education students take standardized tests will get credit, no matter those students' scores; and the "peer groups" against which schools are measured will be reformed according to test scores, not demographic data. And schools might get separate grades for environment, student achievement, and student progress, instead of just the one grade they received last year (the composite grade will continue to be issued as well). Read about the full set of changes proposed at eduwonkette, who posted the full memo principals received.

I'll believe all the changes when I see them, but it sounds like the DOE is on the right track. As I said last fall, there's useful information in the progress reports, and I think structuring the reports in a way that allows schools and parents to access that information will pay off for the DOE and for kids. (Removing the high stakes attached to the grades would also be good for schools and kids.) Just think about what could have happened last year if the DOE had listened to community input before releasing the problematic progress reports!

Thursday, January 10

UFT to develop yet another school grading system


Lots of people have complained about the progress reports, saying their dependence on test scores gives short shrift to other important features of schools, including safety, class size, and the arts. UFT President Randi Weingarten plans to do something about it.

According to the Sun, Weingarten is developing a school grading system to rival the DOE's. In an attempt to predict what that grading system would look like, the Sun gives a rundown of Weingarten's opinions on the progress report grades:

She has praised the education department's emphasis on progress over absolute achievement — but denounced its reliance on just two years of test scores. She has praised the letters A, B, C, D, and F, saying "ratings help us make decisions" — but she also indicated support for giving more than one grade to each school. "Moving forward," she wrote in the same recent column, "the progress reports should give more weight to conditions like class size and safety, access to advanced courses and the availability of enrichment activities."

One would also think that a UFT-designed report card would give significant weight to teaching conditions at the school. Currently, how teachers feel about the support and professional development they get is condensed into just a few questions on the teacher surveys, which make up just 5 percent of the total progress report grade.

Monday, January 7

Queens lawmaker loudly opposes DOE reforms


Last week, when Chancellor Klein held a press conference to promote the performance bonuses going to schools with top progress report grades, he got a surprise when Assemblyman Mark Weprin, who represents Eastern Queens, where the conference was held, delivered a diatribe against the DOE's school grading system.

“Our schools have turned — I know the chancellor is standing here, but — to Stanley Kaplan courses in a lot of ways,” Weprin said, the Times reported. His impression was not dashed by the PS 46 student who said his favorite thing about his school is that "they help us get ready for the state ELA test.”

I'm surprised that Chancellor Klein was willing to turn over the microphone to Weprin, who has made his objections to recent reforms known for a while. In fact, Weprin's withering testimony last month at the City Council hearing about the progress reports, which you can read in full over at NYC Public School Parents, contained the exact same line the Times quoted. Perhaps if DOE officials had stuck around after their testimony was over, they could have better anticipated what Weprin would have to say.

Thursday, January 3

134 schools to get good-grades lagniappes


With great fanfare today, Chancellor Klein announced that the 134 schools that earned both an "A" on their progress reports and a "well developed" on their quality reviews would get the cash prizes promised to high performers. (The DOE's press release calls those the "top-performing schools," but we know that isn't quite accurate -- they're really the schools that improved most from 2006 to 2007.) The schools will get $30 per student to use at their discretion, as long as they also share the secrets of their success with schools that didn't get such high marks.

Three times as many elementary schools as high schools are getting the funds, as are more than twice as many schools in Queens as in the Bronx. According to the list of schools the DOE released, schools are taking home chunks of change ranging from $4,458 (East New York Family Academy) to $122,837 (Franklin Roosevelt High School). I wonder why the amounts being disbursed are not all multiples of $30 -- perhaps it's a result of the DOE's class size reduction plan that diminished classes by an average of just a fraction of a kid each?
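One guess, and it is only a guess: if the DOE multiplied the $30 rate by a school's *average* register, which can come out fractional after register averaging, rather than by a whole-number headcount, the awards would not land on multiples of $30. A sketch with invented enrollment figures:

```python
# If the bonus is $30 times an averaged (possibly fractional) register,
# the award need not be a multiple of $30.
# The enrollment figures below are invented for illustration.
RATE = 30  # dollars per student

def bonus(avg_register):
    """Dollar award for a school with the given average register."""
    return round(RATE * avg_register, 2)

print(bonus(148))    # whole headcount: $4,440, a clean multiple of 30
print(bonus(148.6))  # averaged register: $4,458.00, not a multiple of 30
```

A fractional register of 148.6, for instance, is enough to produce an award like the $4,458 figure on the DOE's list; whether that is actually how the DOE computed it is the open question.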

Friday, December 21

State's accountability system has bad news for city schools


The state has released its own list of elementary and middle schools in good standing and in need of improvement under No Child Left Behind — and the news isn't great for the city or its progress reports.

The state removed 18 city elementary and middle schools from the list but added 64, bringing the total number of city schools not in good standing to 318. Many schools that received D's and F's on their progress reports are considered in good standing with the state, including at least two of the schools that the DOE has announced it will close this year. And many other schools that received A's and B's made the state's list of failing schools.

City education officials say there is "correlation" between the two lists because as a school's progress report score gets higher, it is more likely to be considered in good standing by the state. Still, the discrepancy between the two lists makes sense; after all, the two accountability systems focus on different things. No Child Left Behind looks only at the percentage of students scoring at proficiency each year, while the progress reports look at individual student improvement over the course of each year. The higher number of failing schools this year on the state's list could have to do with more students being tested, as the Post suggests, or with the fact that the state's requirements are getting stricter each year as we get closer to 2014, when No Child Left Behind expects every child to be proficient on state tests.
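A toy sketch, with invented scores, of why the two measures can disagree: a proficiency-style rate rewards schools whose students are already above the cut, while a growth-style measure rewards improvement even when scores remain below it.

```python
# Toy comparison of a proficiency-style measure (NCLB-like) and a
# growth-style measure (progress-report-like). All scores are invented.
# Each student is a (last_year_score, this_year_score) pair; cut = 65.

def proficiency_rate(students):
    """NCLB-style: share of students at or above the cut this year."""
    return sum(now >= 65 for _, now in students) / len(students)

def avg_growth(students):
    """Progress-report-style: average year-over-year score change."""
    return sum(now - before for before, now in students) / len(students)

# A school of high scorers whose scores slipped slightly:
high_flat = [(92, 90), (88, 86), (75, 74)]
# A school of low scorers improving fast but still below the cut:
low_rising = [(40, 55), (45, 60), (50, 64)]

print(proficiency_rate(high_flat), avg_growth(high_flat))
print(proficiency_rate(low_rising), avg_growth(low_rising))
```

The first school is 100 percent "proficient" but shows negative growth; the second is 0 percent proficient but shows strong growth. Which list a school lands on depends entirely on which lens you use.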

Thursday, December 20

City-chartered schools getting grades get very good ones


When the progress reports first came out, many, including Regent Merryl Tisch, were not happy that charter schools did not get grades. Chancellor Klein said he didn't have the authority or the data to issue grades for charter schools. But now the city has issued grades for more than a dozen of the schools it chartered, and the results are, unsurprisingly, favorable to the charters. Of the charter grades, 79 percent were A's and B's (compared with 62 percent of other schools), and only one school, Peninsula Preparatory Academy in Queens, received an F. KIPP Academy was among the five schools with A's — guess the staff retreat in the Caribbean paid off!

The charter progress reports are shorter than those for regular public schools, and "environment" is measured solely by attendance. Because of this, the reports clearly note that "it would be inaccurate to make a direct comparison to the grades assigned to non-charter DOE public schools" — but that hasn't stopped the press. The Sun proclaims, "Charter Schools Win Top Grades: Surpass Traditional Public Schools on Progress Reports," and notes that two city-chartered schools had higher numerical grades than any other schools in the city.

For equity's sake, I'm glad the charters are getting grades, but in reality, how much will they matter to the hundreds of families waiting for spaces to open up in charter schools that are often more disciplined and academically oriented than neighborhood schools? The charter schools' strong showing does little to dispel the notion that lots of test prep will equal a high grade in the city's accountability system. As Julie Trott, head of Williamsburg Collegiate Charter School, which got one of the two highest grades in the city, told the Sun, "We just basically are super, super serious about academics and don't play at all." Parents don't need a grade to tell them whether that's an environment they want for their child.

Still, given how little information is available about charter schools that isn't generated by the schools themselves, charter school reports strike me as more useful than those for regular public schools. We'll soon have more information; according to the Sun, the state has agreed to have all charter schools receive grades next year.

Tuesday, December 18

Thorough rundown of progress report issues in City Limits


For a thorough and thoughtful rundown of the issues surrounding the progress reports, check out a new article in City Limits Weekly by regular Insideschools contributor Helen Zelon. Using the City Council hearing on the progress reports as a starting point, Zelon takes a look at how parents use the grades, the relationship between the grades and school closures, and the role of parents in the DOE's reforms generally. (Here's a hint: it's not meaningful.)

Sunday, December 16

Student Thought: The importance of the school progress debate, Part II


By Seth Pearce

As I promised last week, here are the points that New York City Student Union members made when we met with James Liebman to discuss the progress reports.

1) The NYC Student Union supports the progress report program because it adds a sense of accountability and transparency to our schools and gives principals and SLTs important information about how to improve their schools.

2) We believe that students should be involved in revising the surveys to make them more student friendly and informative. In addition, we believe that like the parent survey, the student survey should include a question like "What is the most important thing that could be improved about your school?" We also thought that surveys of teachers, parents and students should carry more weight in the overall school grade.

3) We believe that the Student Progress section should be reduced to at most 50 percent of the grade and more weight should be given to the Learning Environment section.

4) We believe that the weighted Regents pass rate does not say as much about the output of the school as the survey-makers desire and that it should be reduced or eliminated in favor of a larger emphasis on credit accumulation and graduation rates as both of those use Regents scores to determine real student output. It also puts too much emphasis on test prep by giving schools points for trying to make students take Regents earlier.

5) We believe that attendance, though it is a somewhat troublesome factor, should be given more weight because it forces schools to reexamine policies on a day-to-day level and create more incentives for students to come to school. Shanna Kofman, a Staten Island NYCSU representative, pointed out that Staten Island Tech offers SAT tutoring the day before SAT exams so that students won't stay home to study. This is an important example: it happens only a few times a year, but the school cares enough to adapt to its students to keep them in class for those few days.

6) Finally, we suggest that a student or students should be included in the evaluation of data collected from surveys and quality reviews, so that the effect of positive and negative aspects of every school can affect the school's report card grade in a way that accurately reflects the way those aspects affect students. Because schools are made up of people of diverse educational perspectives, the teams that evaluate schools must reflect this diversity, and therefore must include students.

The edu-activist community has, to this point, missed a great opportunity to revise this system and make it a more positive factor in our schools. Instead, it has for the most part condemned the program outright, severing a possible avenue of communication between the various constituents of our school system.

I hope that the education community can eventually use this issue to give parents, teachers, and students more influence on the results-based system that seems soon to overtake American education (i.e. keeping the general program but working to decrease the importance of certain elements like high-stakes testing). By refusing to compromise on this we are decreasing the possibility of working together on the more important issues like class size. In this city, compromise matters.

Thursday, December 13

Earn an A for your progress reports proficiency


Think you know everything there is to know about the progress reports? Take Columbia University's Teachers College quiz to find out for sure.

Celia Oyler, the professor who wrote the quiz, doesn't try to hide her disdain for the reports. Here's the extra credit question:

Since these school grades are: so expensive to produce; not based on many important aspects of what many educators and parents consider central aspects of schooling; do not take into account multiple measures of student progress and school quality; do not take into account standard statistical measures of error; and are based predominantly (in elementary and middle schools) on state tests not designed to be used to make year-to-year comparisons of student growth, why are these school grades being used by the Bloomberg/Klein administration?

Wednesday, December 12

Student Thought: The importance of the school progress debate, Part I


A few days ago, walking to the train after an NYC Student Union meeting with some of my fellow students, it struck me to ask, Why has the debate on the NYC DOE's progress report program garnered so much attention? Why have so many newspaper articles been written on it, so many people been riled up about it? It's just a silly report card program, right? Aren't there so many important issues out there?

Well, yes and no.

While there are more urgent issues facing our schools, especially class size, this issue gains its importance because it very thoroughly defines the main theme of Klein/Bloomberg's tenure running our schools: The Search for Results. Under this administration and probably in many other school systems around the country, the focus of broad educational policy is measurable results. These results set the agenda for individual schools and school systems as a whole.

Hopefully, all of us witnessing and participating in this event can use what has transpired in New York as a learning experience on the short-term future of American education politics. Since the first school Progress Reports were released, many education advocacy groups have viciously attacked the DOE, alleging that the reports are a waste of money and encourage a culture of constant test prep.

Many of these attacks have been directed at DOE accountability czar James Liebman. I personally feel that these were uncalled for. The man is trying to create a system that brings a measure of accountability, transparency and, most important, attention to our schools. In that third category, Liebman has unquestionably succeeded.

The progress report debate has brought education issues into the public eye more than any other issue this year. It has stayed in the paper and on the minds of parents, politicians and plain old people. It has inspired questions to be asked and answers to be given and has gotten more people thinking about their schools. Without the letter grade, bold and big in the top left-hand corner of the progress report (the main qualm for some anti-report card activists), this would have been a non-story and no change would have come of it.

If there's one thing I would like to put out there before the debate begins to die down it is this: The report cards are not inherently evil. They are flawed, but their spirit is important and good. For my school's SLT at least, our Progress Report has given us important information about what can be improved in our schools and has forced us to develop strategies to deal with the areas in which we did not do as well. Hopefully, the progress reports also got more parents informed about what's going on in their children's schools and inspired them to take some action.

As I said, however, the report cards are flawed. Last week several reps from NYCSU went to meet with Mr. Liebman to explain our grievances about the current progress reports. In my next post, I will describe them.

Cross-posted on the NYC Students Blog

Tuesday, December 11

Progress reports (may be*) statistically sound; not enough for Council, parents


After yesterday's excitement, I'm ready to take a more substantive look at the content of the City Council hearing on the progress reports. Jenny Medina at the Times has the best rundown of all of the papers and for an overview of what James Liebman said and how the Council members responded, I would go to her report.

What stood out most to me was that once again the DOE managed to present a compelling initiative in a way that frustrated and angered elected officials and parents. A numbers-oriented friend of mine who shares my interest in education has told me that the progress reports are sound from her vantage point, and from mine, nothing I heard yesterday dissuaded me from thinking that they contain useful information parents ought to be able to find out. Liebman's presentation also helped me understand just how some top schools got low grades by showing how their students' progress, particularly that of their students who began the year in the lowest third, stacked up unfavorably next to other schools with similar students.

So I don't understand why Liebman had to undermine his own hard work: by arguing that the grades are not based almost entirely on single assessments in math and English; by saying that his office had "consulted" with, among many others, an organization whose leader was in the room and later testified that their only conversation was not about the progress reports; and by giving Time Out From Testing the runaround on his way out the door.

I was also relieved to see that in disliking the progress reports, Insideschools readers are more like typical New Yorkers than the Quinnipiac poll would have us think; Council member after Council member commented that their constituents have told them that poor grades are unfairly stigmatizing some good schools, some of which fear that their recent progress could be undercut. Liebman did say, as he has before, that he is open to tweaking the formula used to calculate the grades or even assigning schools multiple grades based on different criteria. But in my view, it's the presentation and the attitude behind it, not the formula, that need a major revision.

*Title updated to reflect an exchange in the comments about the statistical validity of the reports.

Monday, December 10

TODAY (12/10): City Council hearing on progress reports


There's plenty to do this week if you're concerned about the progress report grades that were recently released. (And if you have anything to do with the 13 schools the DOE has already said it will close because of poor grades, you're probably concerned.)

First, this morning the City Council's Education Committee is holding a hearing on the progress report grades. 9:30 a.m., City Hall. Map.

Then, tomorrow Central Park East I and II elementary schools are hosting a forum, featuring Deborah Meier and others, about the grades. Tuesday, 6-8 p.m., CPE I, Manhattan. Map.

And if that's not enough, you can also sign Class Size Matters' online petition against the report cards. Class Size Matters says the millions of dollars that are going into the progress reports would be better used lowering class size and building new schools.

I'll find out the answer to this question tomorrow morning when I see how many people are at the City Council hearing, but I'm curious: Are folks at your school still as worried about the report cards as people were two weeks ago, when most Insideschools readers gave the initiative a "D" or an "F" in our poll? Or have people moved on?

Monday, November 19

Progress reports reduced to haiku at Eduwonkette


Looking for a laugh this Monday morning? Check out the results of Eduwonkette's Report Card Haiku Contest. Here's a taste of what you'll find:

who should get an A?
duck*duck*duck*duck*duck* duck*duck
duck*duck*duck*duck GOOSE!
-eduwonkette

to my son's teachers
to him you're a shining star
not just a C grade
-nyc mom

My school got an A!
And I thought we were failing--
I was almost SURR!
-Anonymous 1:50 PM

There are 68 haiku in all. Download the complete magazine (pdf) for some levity, but be careful — some of the poems are deadly serious.

Monday, November 12

NY Times: Progress report grades "simplistic and counterproductive"


Yesterday, the New York Times went after the progress reports in an editorial titled "Grading the Grades." It said pretty much everything I think (and much of what I said the other day):

The new system "does a valuable service to students, and teachers, by holding schools accountable for both overall performance and for how much progress students make from one year to the next. But Mr. Bloomberg should ditch the simplistic and counterproductive A through F rating system. It boils down the entire shooting match to a single letter grade that does not convey the full weight of this approach and lends itself to tabloid headlines instead of a real look at a school’s problems."

Last week, I heard from a couple of people that I was too generous in my appraisal of the progress report initiative. I don't know that I was. I said basically what the Times said — that the idea is a good one but the execution has big problems — and parent advocates were pleased with the Times editorial. Still, I admit that I am just getting up to speed on the theory and history behind the growth model of evaluating education, which is what the progress reports are based on. But the Times points out that while growth models are currently beloved by education researchers, those researchers expect to see three years of test scores factored into the computations — and the DOE used only one to judge elementary and middle schools. In their haste to show results (or to push initiatives through before they can be challenged?), the chancellor and mayor have compromised their reforms and created what parent leader David Bloomfield suggests could be considered a "crazy experiment gone bad."

Like Seth and the Times editorial board, I do think there is value to the growth model — as I said, parents should be able to know whether their schools are helping their kids make progress. And I still believe that a high-performing school may not score high on a growth-model "improvement index." If a well-designed measure showed that, schools and parents would be more likely to take the news to heart. An "improvement index," if there must be one, would make more sense factored into a school's overall grade at a much lower weight. I do think a single grade is reductive, distracting, and unnecessary. At any rate, I agree with the Times that a "more subtle and flexible" school evaluation system is needed. Given our current leaders' inability to handle even the most reasoned criticism, I'll be pretty surprised if we see that.

Sunday, November 11

Middle School Muddle: Toss the grades


Toss The Grades: For More Details, Try the Quality Review Reports
(If you can slog through them, that is)

When it comes to selecting a District 2 middle school for my fifth-grader, I have no intention of ruling a school in or out based on the latest letter-grade from the New York City Department of Education.

The progress reports and accompanying grades are misleading, difficult to understand and culled from criteria that say much about the DOE’s priorities – improving test scores – and little about mine as a parent.

I had a hard time taking the letter grades seriously when I learned the Tribeca Learning Center, my son’s amazing elementary school, got a C.

And I would happily trade the A at Clinton, where my oldest son attends middle school, for smaller class sizes, a music program, a soccer team, and a well-stocked library that is staffed and open after hours.

So, I’m not disturbed that some of the fine schools we are seriously considering for next year -- like IS 89 -- received a D, or that the impressive Manhattan Academy of Technology, or MAT, got a C on its report card.

I sat down with the report cards this week to see what I might learn. The data and the methodology confused me, although it was clear that heavy penalties fell upon schools where test scores for the lowest performers failed to rise.

I switched to reading the quality reviews, like this one for MAT, and found much of their language unfriendly to parents and filled with jargon: Does the average parent, for example, know what it means to "build and align capacity," or why that matters? Or care whether "professional development activities are in place to address differentiated instruction and to create a seamless curriculum"?

Despite the jargon, overall I found the quality reviews far more useful for parents, because they contain sections titled "What this school does well" and "What this school needs to do to improve."

As for the report cards, here’s a tip I gleaned from my colleague Veronika Denes, a Ph.D. who directs research and program evaluation for the National Academy of Excellent Teaching at Teachers College and understands data better than anyone I know.

Veronika and colleagues spent more than a day trying to understand the methodology behind the reports, until they discovered online a simplifying tool that the DOE created for educators.

It’s 28 pages long.

Read all of Liz Willen's Middle School Muddle

Student Thought: My letter to Chancellor Klein on the new report cards


Dear Chancellor Klein,

My name is Seth Pearce. I am a senior at LaGuardia High School and a member of the NYC Student Union, a citywide, student-run and -created education advocacy organization. I am writing to you to express both my support for your new school progress report program and my criticism of some of its parts.

At last week's NYC Student Union meeting, students from schools around the city discussed the progress reports. Some students supported them and others didn't. There was, however, general agreement on the need for accountability in our schools. These progress reports bring added accountability and transparency to our city's schools. They help give valuable information to our city's parents. The most important benefit of the progress reports might be increased involvement from these parents, who now have a clearer view of what's going on in their children's schools.

While I support the principle of the progress reports, I also believe that the system needs revision. A large problem with your report card is the small amount of influence the Learning Environment section has on the overall score. Attendance is also a major indicator of school performance. Students who go to bad schools will probably go to school less often, and vice versa. If students are in the habit of going to school, it is more likely that they will progress academically and proceed to the next level of education. Surveys should also play a larger role, because parents, students and teachers have the most direct insight into a school's output.

I would also like to say that while standardized test scores deserve a place in the progress report, they are given too much value in this system. While they provide some insight into student performance, they are inadequate and distract from the real business of education: teaching and learning. Emphasis on these tests also devalues the roles of teacher and student. Furthermore, the need for constant progress to succeed on the progress reports is unrealistic for high-performing schools and can actually distract them from the great work they are doing. In my mind, the importance of progress for these purposes should be weighted on a sliding scale determined by a school's previous performance, e.g. progress would be more important for low-performing schools.

Thank you for taking the time to hear a student's opinion. If you ever want to read some student commentary about our school system, check out the NYC Students Blog or stop by at one of our Monday meetings.

Have a nice day,

Seth Pearce
seth@nycstudents.org
http://nycstudents.org

Wednesday, November 7

Progress report grades useful; Insideschools more useful


Eduwonkette is spending the week trying to crunch the numbers on the progress reports and so far, she's found some interesting information about the racial breakdown of schools and scores and about the characteristics, such as the proportion of experienced teachers and the percentage of students receiving part-time special education services, that don't appear to relate to schools' grades.

I'll be paying attention, as always, to Eduwonkette's analysis and what parents have to say, but here's my take on the progress reports: It looks like most crummy schools got mediocre or bad grades, and lots of great schools got good grades. The many outliers suggest that you'd be foolish to treat these grades as anything more than they are -- one more piece of information, among many, to use when looking for a school, and an opportunity to take a look at how your school is helping all kids, or not.

The grades could encourage families who might seek out schools and programs outside their zone to consider their neighborhood schools, which would be good for those schools. But with real choice an illusion except in a very few scenarios, I'm wondering what impact the grades will have on parents. Most schools' reputations are pretty entrenched, although I can imagine that the grades might affect how fast up-and-coming schools up and come. Maybe, as Ms. Frizzle says, the grades are best left as a tool for schools.

I think the grades do say something — about how hard schools are pushing their kids — that no other available data say. Schools that are moving their kids forward despite starting at a disadvantage should be recognized for doing so. And schools that are "coasting" because their kids are middle-class and school-ready should know that that's not enough.

But with 85 percent of the grades based on test scores, it's only improvement on test scores that will count. It's clear that pretty much the only way for schools with B's to get A's next year is to improve kids' test scores, often only marginally. NYC Public School Parents has noted a few schools, such as IS 318 in Brooklyn and IS 289 in Manhattan, that have already said their lower-than-desired grades won't make them add more test prep. But many other schools may make another decision, to their students' possible detriment. And I can imagine that at top-rated schools, glee will soon give way to anxiety as administrators realize that to preserve their grades, they'll have to improve upon already excellent performance.

A comprehensive school grading system that looks at student improvement is itself an improvement over looking at straight percentages of students scoring at various levels. But issuing a single grade based on improvement and test scores is reductive and demeaning to teachers — that's a position I've heard from many people. And it drives to the sidelines discussion of other important features of school performance: Has the school worked to increase parent involvement? Has the principal instituted a more democratic structure that's keeping teachers in the school? Is a new arts program engaging students who feel alienated by too much testing?

I think the new grades make Insideschools' qualitative reviews and parent comments even more vital, and I hope parents aren't so distracted by their school's grade that they lose sight of what schools should do to get kids excited about learning.

Progress reports: "complete and fair" or "just more numbers"?


At this point so much has been said all over the Web about the progress report grades that I don't know what I can add. For once, the Times, the Post, the Daily News, and the Sun were editorially united; they all critiqued the plan by identifying good schools with low grades and lousy schools with high grades. Parents at desirable schools that received low grades are up in arms and the DOE is threatening failing schools with "consequences" that could include closure or principal replacement -- but as the Times notes this morning, despite the chatter, it's not at all clear right now what the grades really mean for parents or even for schools.

Here are a few especially sensible comments I've read about the grades. In the comments, feel free to nominate your own candidates for the Non-Hysterical School Grade Analysis Award.

On the New York Times City Room blog, a teacher writes,

This system for rating schools is the most complete and fair way they have ever been rated by the city or state. ... Grading schools may help the city understand how its educators are faring at the difficult task of bringing the thousands of under-educated students in NYC up to grade level. It does not, however, give parents much relevant information about what school is actually best for their child.
A recent PS 87 parent writes,
I think it’s great that the administration is assessing schools based on various criteria. These evaluations can supplement reputation, impressions based on visiting, and test scores alone for judging the quality of a school. And they’re not only a useful resource for helping parents to evaluate schools -- they should help schools like P.S. 87 identify ways of improving. On the other hand, the emphasis on “consequences” for poorly performing schools is disheartening. Are these schools supposed to be scared into performing better? Shutting down a weak school will only increase the overall quality of NYC’s education if weak administrators, teachers, and students disappear. But that doesn’t happen -- they are simply moved elsewhere!
And one more from the New York Times:
Why is this new grading system the “linchpin” of the Bloomberg/Klein administration? ... The schools, through NCLB, already are measured for Adequate Yearly Progress. So, why millions and millions more that could have been spent IN the classroom, to come up with this incredibly flawed methodology?
Louise at Only the Blog Knows Brooklyn writes,

What does it mean?

Something and nothing. You know your school and you know whether it's any good or not. No report card score is going to tell you anything that you don't already know.

Ms. Frizzle, a blogger who teaches at a middle school that received an "A" grade, writes,
One thing that sticks out in my head is that there is supposedly a computer program designed to help schools analyze their results to determine which actions are likely to achieve the greatest improvements in the data. (The idea is to prevent situations where principals throw a ton of resources at a problem identified in the school environment survey, improve that result, but find out later that because of the weighting it made very little difference in the overall school report). So you need a program to help you analyze the analysis? That seems like a waste of resources to me. Find a way to report data so that it is clear and comprehensible and paints a picture of what needs to change. Otherwise, it’s just more numbers.