Are School Progress Reports a Helpful Tool or a Hammer to Close Schools?

The department used to believe that the best way to roll out “new things” was with a drum roll and flourishes. Hundreds of us sat in an auditorium in Long Island City listening to Chancellor Klein try to motivate the audience; the topic was an explanation of the new School Progress Reports.

Jim Liebman, the former Accountability czar, moved from room to room explaining the new A-to-F Progress Reports.

My notes are clear:

A = 5%

B = 10%

C = 70%

D = 10%

F = 5%

The “grades” would reflect a normal (or Gaussian) distribution, i.e., a bell-shaped curve, which is expressed as follows.
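(For reference, the formula below is simply the textbook normal density with mean $\mu$ and standard deviation $\sigma$; it is the standard definition, not anything specific to the Progress Reports.)

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(x-\mu)^{2}}{2\sigma^{2}}\right)$$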

“A normal distribution is often used as a first approximation to describe real-valued random variables that cluster around a single mean value … The normal distribution is considered the most prominent probability distribution in statistics.”

Over the six years of School Progress Reports it was essential for the department to show “progress,” and the normal distribution curve morphed into subjective judgments and grade inflation.

Among all City schools that received grades this year, including early childhood, elementary, K-8, middle, high, District 75, transfer schools, and Young Adult Borough Centers, the grade distribution was: 28 percent As, 36 percent Bs, 28 percent Cs, 6 percent Ds, and 2 percent Fs.

That’s right: 72% of high schools and 64% of all schools (28 percent As plus 36 percent Bs) received grades of “A” or “B.” Not exactly a normal distribution curve.
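As a quick check against the curve announced at the rollout (the 5/10/70/10/5 split in my notes above), the share of schools at a “B” or better works out to:

$$\underbrace{5\% + 10\%}_{\text{announced A + B}} = 15\% \qquad \text{vs.} \qquad \underbrace{28\% + 36\%}_{\text{actual A + B, all schools}} = 64\%$$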

See department description of methodology here.

Check out all schools on a spreadsheet here.

There is a problem: the New York State Education Department has developed a “college and career readiness” index that is not encouraging.

The new statistics, part of a push to realign state standards with college performance, show that only 23 percent of students in New York City graduated ready for college or careers in 2009.

The School Progress Report/State Education Department college readiness metric mismatch is distressing.

What is even more distressing is a close look at school grades by geographic area.

District 2 (Central Manhattan)

A – 25

B – 17

C – 9

D – 4

F – 1

A single school district, with far fewer students, has many more schools, and 75% of its high schools received grades of “A” or “B.”

Districts 13, 16, 17 and 19 (Bedford-Stuyvesant, Crown Heights and East New York), with many more students, have far fewer schools and a smaller percentage of schools with high grades.

A – 10

B – 17

C – 9

D – 4

F – 1
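Taking the listed counts at face value, the arithmetic behind the comparison is:

$$\text{District 2: } \frac{25 + 17}{25 + 17 + 9 + 4 + 1} = \frac{42}{56} = 75\%, \qquad \text{Districts 13/16/17/19: } \frac{10 + 17}{10 + 17 + 9 + 4 + 1} = \frac{27}{41} \approx 66\%$$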

Is it the quality of the teachers? The principals? Or maybe the levels of poverty? Why is central Manhattan filled with new(er) high schools while an entire swath of Brooklyn has far few(er) high schools?

The Progress Reports should provide information that enables schools to target professional development and specific cohorts of students within a school. The two hundred-plus fully screened schools (schools that select their student bodies) are almost all “A” and “B” schools. The “D” and “F” schools cluster in the highest-poverty zip codes.

Principals have a laser focus on their Progress Report grades, not on improving instruction; in high schools that means accumulating credits, passing Regents exams and offering “college level” courses. An online newspaper reports that a Bronx principal does not offer English or mathematics in the 11th grade so that he can offer “electives” to his most able kids; he is trying to inflate his Progress Report grade, even if it means harming kids.

FDNY School for Fire & Life Safety (Brooklyn) got a B, yet not a single graduate earned a Regents diploma or met CUNY’s basic standards.

Data is important; data can provide us with information to guide policies; but data used as a stick to whip schools, teachers and families is a failed policy.

A principal: “I asked around and found a support organization that taught me how to increase my grade – it has nothing to do with instruction, just data manipulation – survival is the primary rule.”

Although we hear a drumbeat of “college and career readiness,” we rarely hear a discussion of what the term means! David Conley is the leading authority; watch a YouTube video of a panel from June 2012 at which Conley discussed college and career readiness in detail.

College readiness is not just grades on Regents exams; Conley explains:

We describe skills in four categories—think, know, act, go. The more of these skills that a student has, the more post-secondary options are available:

  • Key cognitive strategies (think): problem solving strategies, conducting research, interpreting results, and constructing quality work products.
  • Key content knowledge (know): structure of knowledge in core subjects, the value of career-related knowledge and willingness to expend effort to get it.
  • Key learning skills and techniques (act): ownership of learning, and learning techniques such as time management, note taking, memorizing, strategic reading, and collaborative learning.
  • Key transition knowledge and skills (go): post-secondary aspirations and norms, awareness of postsecondary costs and aid opportunities, knowledge of eligibility and admissions criteria, career awareness, role and identity, and self-advocacy.

Progress reports do not examine whether students have acquired “problem solving skills, conducting research, interpreting results or constructing quality work product.” The State and District leaders will tell you that the new Common Core-referenced exams will test these skills.

Only if the curriculum addresses these skills. Oh, what curriculum? Neither State Ed nor the City has produced a curriculum.

There are schools in which school leaders and teachers are engaged in a teaching/learning process that reflects “thinking, knowing, acting and going,” but there are too few of them.

Whether it was intended or not, the Progress Report, the A-F Report Card, is used to bash teachers and close schools. We have the ability (check out “big data”) to track results in real time, not wait for the end-of-year summative assessment.

Maybe Leonard Cohen has it right: the dice are loaded.

http://www.youtube.com/watch?v=GUfS8LyeUyM

7 responses to “Are School Progress Reports a Helpful Tool or a Hammer to Close Schools?”

  1. The progress reports are inherently unstable metrics. Of the 24 schools the Bloomberg administration wanted to close last year, 17 received grades of A or B this year. They earned these new grades without any change in staff or any assistance from the wizards at Tweed.

    I have seen good schools, where student and teacher morale was high and things were improving, closed, and I know of schools where the opposite is true and the school gets good grades. School leaders who manage to manipulate the data to get high scores are promoted, even when they have cut out Art and Music and killed any sense of joy students might have from being in school in order to get those good grades.

    The drift in the curve is known as “The Lake Wobegon Effect,” from Garrison Keillor’s fictional town “where all the men are strong, all the women good looking, and all the children above average.” This drift is a well-known phenomenon and one that requires continually recalculating the mean performance that anchors the curve.

    However, the more important question about the bell curve in this case is whether we really believe that educational outcomes are “a real-valued random variable.” I don’t think we would agree that is so, nor would we want it to be.

    Education takes place in classrooms, and teachers need the freedom, support and tools to give students the skills and knowledge they need to succeed. Tests don’t tell us if students are reading, only whether they could if they chose to. Which is a better metric of success: a student who is voluntarily reading books (of his/her choice), newspapers, or magazines, or a student who has been test-prepared to get a good score on a reading test that contains no passage longer than 200 words?

    That is not a rhetorical question. It is the essence of the debate between the accountability wizards at Tweed and the teachers in the classroom. Teachers want clear standards, a curriculum to help them help students to meet those standards, and the ability to choose and get the materials that their students need so that each individual student can succeed. Tweed wants to have graphs and data and test scores even when these don’t really indicate that students are succeeding.

    Parents should weigh in on what they want for their children.

  2. Paula Washington

    At our School Leadership Team meeting yesterday, one of the agenda items was the School Progress Report. We received an A overall but, as in prior years, one of the segments on which the SPR is based, the school environment segment of the Learning Environment Survey, was a C. We debated why this was and what we could do to improve it, but one thing stands out in my mind. The principal pointed out that other large high schools, even some of the most respected ones in the city, also received a C or worse on that segment. A look at the segment shows that it measures a number of items, among them student attendance, feelings of safety, trust, and the amount of communication. Whereas all the other segments measure empirical data, these items, except for attendance, are soft. Communication among three thousand people is always problematic. Will the Department of Education use this as an excuse to break large schools into smaller ones?

    • Fed regs require states to identify the bottom 5% of schools in the state measured by performance and growth: persistently lowest achieving (PLA) schools. The state/school district must use one of the federal models to intervene – changing half the staff, removing the principal, closing the school, converting to charter. The School Progress Report “measures” schools in a more nuanced manner – comparing “similar” schools in a peer group. The major metrics are performance and growth. The environmental survey has a minimal impact on the overall grade. It does allow parents, kids and teachers to “vent” about the school. Schools can choose to address the results of the survey or not.

      The only chance of your school closing would be if the viola players were truly terrible.

      • Paul, it may not be beyond our competence, but it is certainly not the focus of the “Progress” Reports.

        What is interesting is that outside groups advising parents on how to pick a school focus not on the letter grade or all the “empirical” stats, but on the Learning Environment Survey and what it says about the tone of the school.

        If the educational deformers (Bloomberg, Klein, et al.) want to make an issue of informed parent choice to justify the progress reports, then they should look to expand the value of the Learning Environment Survey and put more effort into correcting the things that these surveys show are problematic.

      • No way can the viola players be terrible, as Paula is a fine viola teacher.

  3. Education can be defined as a collective common commitment to children for caring and competence. As a society, is it beyond our capability to develop equally significant indices of both caring and competence and develop a Holistic Index?

  4. Marc: You have made a serious error. While all the children in Lake Wobegon are above average, it is the WOMEN who are strong and the MEN who are good-looking!
