In its April 28th edition, the Washington Post has an article about "a Bush administration proposal [that would] require that all states use the same formula to calculate high school graduation rates." The goal is to get a better picture of the drop-out problem in American high schools. The article mentions that states, among other things, would be required to "post the performance of students on national reading and math tests alongside state test scores, which would give parents a sense of the rigor of state assessments" - certainly a step in the right direction. If you've been reading this blog for a while, or only since Sunday, you know I like to keep track of the ways quantitative-minded people can manipulate numbers by picking the model that fits their conclusions (until everyone is quantitative-minded enough to fight back). All of this raised the question: how exactly do the states fudge their numbers?
The Washington Post article remained vague about that point, but the Internet is a wonderful thing, and states' practices when it comes to computing graduation rates are available for all to read in this December 2003 report by The Education Trust ("Telling the Whole Truth (or Not) About High School Graduation") with a June 2005 update, entitled "Getting Honest About Grad Rates: How States Play the Numbers and Students Lose". From the June 2005 document: "Of the states that did provide graduation-rate information, most reported rates that look dubiously high when compared to the results of multiple independent analyses of state graduation rates." This article on Stateline.org summarizes the key points. An excerpt: "New Mexico, for example, reported to DOE a graduation rate of almost 90 percent, one of the nation's highest. However, the state does not track the percentage of freshman who graduate, only seniors. This ignores students who dropped out in the 9th, 10th and 11th grades." Of course, state education officials blame the ridiculously high statistics on their inability to collect more accurate data - let's fault the information systems. This is the same reason invoked by officials in Massachusetts to explain why the state does not keep track of graduation rates at all. Come on, how difficult can this be?
People have fudged statistics for years. When I prepared for the entrance examinations to the "grandes ecoles" (engineering schools) in France, the preparation lasted two years (the equivalent of freshman and sophomore year in college) and started with a common year for everybody, after which administrators assigned students to different sections depending on performance. The sections were called P (physics), P' (a-lot-of-physics), M (math) and M' (a-lot-of-math) - don't ask how people came up with such names; they vaguely reflect course content, and students end up taking different exams depending on their track, math or physics. In terms of prestige, the ranking was M'-P'-M-P. It was common practice for administrators to put weak students, likely to perform poorly on the exams, into the worst section (P), and it just so happened that the Parisian schools with the highest admission rates to the "grandes ecoles" had no P section, so these weak students had to switch schools between their first and second year and become somebody else's problem before they were counted in the admission statistics.
To go back to high school graduation rates, Education Week puts the national average for public school districts at 69.6 percent (rather than the reported average of 83 percent quoted in the Stateline.org article), using something called the Cumulative Promotion Index, which "estimates the probability that a student in the 9th grade will complete high school on time with a regular diploma." States that reported realistic graduation rates in 2005 (Alaska and Washington, with 67 and 66 percent, respectively - from Stateline.org) were initially criticized for their low numbers, until attention was drawn to the fudging tactics of their counterparts. According to the 2003 report (page 4), the worst cheater is North Carolina, with a 29% gap between its publicized and independently estimated graduation rates. The 2005 report has similar numbers, with North Carolina still leading the pack. Table 1 (page 3) in the 2005 report lists 34 states with graduation-rate targets lower than their publicized numbers (drum roll, please! they are doing so well).
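As a rough illustration, here is a minimal sketch of how a cumulative promotion index of this kind can be computed, based on Education Week's description: chain together the grade-to-grade promotion ratios observed between two consecutive school years. The function and variable names are mine, not Education Week's, and the enrollment numbers are made up.

```python
# Minimal sketch: multiply the grade-to-grade promotion ratios observed
# between two consecutive school years (names and numbers are illustrative).

def cumulative_promotion_index(year1, year2):
    """year1, year2: enrollment counts for two consecutive school years,
    e.g. {"9": 1000, "10": 900, "11": 810, "12": 729, "diplomas": 656}."""
    return (year2["10"] / year1["9"]
            * year2["11"] / year1["10"]
            * year2["12"] / year1["11"]
            * year2["diplomas"] / year1["12"])

# A district losing roughly 10 percent of its students at every transition:
fall_2004 = {"9": 1000, "10": 900, "11": 810, "12": 729, "diplomas": 656}
fall_2005 = {"9": 1000, "10": 900, "11": 810, "12": 729, "diplomas": 656}
print(round(cumulative_promotion_index(fall_2004, fall_2005), 2))  # ~0.66
```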
I was about to reach the end of the report, surprised that statisticians had stuck to one performance measure for once, when I stumbled on the competitor of the CPI: the Promoting Power Index, due to researchers at Johns Hopkins University. That index compares the number of students enrolled in 12th grade with the number enrolled in 9th grade, the idea being that students who have already stayed in high school for four years are not going to give up when the end is in sight. I might have joked about every researcher's need to devise his own measure of this or that, until I read that "nationally, more than 2,000 high schools - 18 percent of all high schools - have a Promoting Power Index of less than 60 percent." While state officials are cooking up the numbers and pretending all is well, the dropout problem isn't going away any time soon.
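The promoting-power calculation itself is essentially a single ratio; a quick sketch, again with made-up names and numbers:

```python
# Rough sketch of the promoting-power idea: seniors divided by the freshmen
# who entered the same school (names and numbers are illustrative only).

def promoting_power(freshmen, seniors):
    return seniors / freshmen

# A school that enrolled 400 freshmen and counts 220 seniors four years later
# falls below the 60 percent threshold mentioned above:
print(promoting_power(400, 220))  # 0.55
```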