Newsweek has released its annual list of America’s top high schools, using the same flawed formula as last year. Here’s what I wrote then:
Here is Newsweek’s formula:
“Public schools are ranked according to a ratio devised by Jay Mathews: the number of Advanced Placement and/or International Baccalaureate tests taken by all students at a school in 2004 divided by the number of graduating seniors.”

Both parts of this ratio are suspect. In the numerator, they count the number of students who show up for AP/IB tests, not the number who get an acceptable score. Schools that require their students to take AP/IB tests will do well on this factor, regardless of how poorly they educate their students. In the denominator is the number of students who graduate. That’s right — every student who graduates lowers the school’s rating.
To see the problems with Newsweek’s formula, let’s consider a hypothetical school, Monkey High, where all of the students are monkeys. As principal of Monkey High, I require my students to take at least one AP test. (Attendance is enforced by zookeepers.) The monkeys do terribly on the test, but Newsweek gives them credit for showing up anyway. My monkey students don’t learn enough to earn a high school diploma — not to mention their behavioral problems — so I flunk them all out. Monkey High gets an infinite score on the Newsweek formula: many AP tests taken, divided by zero graduates. It’s the best high school in the universe!
[Note to math geeks annoyed by the division-by-zero: I can let one monkey graduate if that would make you happier.]
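To make the incentive concrete, here is a minimal Python sketch of the ratio with made-up numbers (nothing below comes from the actual Newsweek data): every test sitting raises the score, and every graduate lowers it.

```python
# A sketch of Newsweek's ratio; all numbers are hypothetical.
def newsweek_index(tests_taken, graduating_seniors):
    """AP/IB tests taken by all students, divided by graduating seniors."""
    return tests_taken / graduating_seniors

# An ordinary school: 300 tests taken, 250 graduates.
print(newsweek_index(300, 250))   # 1.2

# Monkey High: 500 tests taken (attendance enforced by zookeepers),
# one graduating monkey (to humor the math geeks).
print(newsweek_index(500, 1))     # 500.0 -- "best high school in the universe"
```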
Though it didn’t change the formula this year, Newsweek did change which schools are eligible to appear on the list. In the past, schools with selective admission policies were not included, on the theory that they could boost their ratings by cherry-picking the best students. This year, selective schools are eligible, provided that their average SAT score is below 1300 (or their average ACT score is below 27).
This allows me to correct an error in last year’s post. Monkey High, with its selective monkeys-only admission policy, would have been barred from Newsweek’s list last year. But this year it qualifies, thanks to the monkeys’ low SAT scores.
Newsweek helpfully includes a list of selective schools that would have made the list but were barred due to SAT scores. This excluded-schools list is topped by a mind-bending caption:
Newsweek excluded these high performers from the list of Best High Schools because so many of their students score well above average on the SAT and ACT.
(If that doesn’t sound wrong to you, go back and read it again.) The excluded schools include, among others, the famous Thomas Jefferson H.S. for Science and Technology, in northern Virginia. Don’t lose heart, Jefferson teachers – with enough effort you can lower your students’ SAT scores and become one of America’s best high schools.
It’s easy to see why they count tests given. And, by that logic, it is especially easy to see why they glorify schools that pay for a large percentage of the student body to sit for the exam. AP is owned by Kaplan, which is owned by the Washington Post Corp. Did you know that in the past five years the newspaper division has seen its operating income cut by more than half, while the educational division, led by Kaplan, makes lots of money on the number of tests paid for, and even more on its sales of study guides and test prep?
The Washington Post’s stock hasn’t done so badly in those five years, either.
I’m not sure which is funnier: the formula they use, or the way your Google ads change to suit the title of the post. Maybe I will just join in on the protection of precious monkey habitats…
The news media sells itself as objective and educational. The truth is, more than anything else they are an entertainment industry. Like Hollywood studios, they only show what turns a profit. Don’t fool yourself otherwise!
> Q. “Why do you count only the number of tests given, and not how well
> the students do on the tests?”
>> I can see the argument against using the fraction of exams passed (exams
>> passed, divided by exams taken). That’s what the FAQ question you quote is
>> arguing against. But it’s not what I am suggesting they do.
I think we’re in agreement. I thought the Q+A was referring to why he used #tests given in the numerator rather than #tests passed. It didn’t even occur to me that one would put #tests taken in the denominator, but you are right – that is the strawman he is attacking. Neither of his attacks has any merit against your system.
Fractionmeister,
I can see the argument against using the fraction of exams passed (exams passed, divided by exams taken). That’s what the FAQ question you quote is arguing against. But it’s not what I am suggesting they do.
What I am suggesting is that they should use the total number of exams passed, per student. That way they don’t get credit for students who take the exams and do poorly, but they’re also not penalized for having more students try the exams.
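Here is a rough Python sketch of the difference between the two metrics, with invented figures for two hypothetical schools (the numbers are purely illustrative, not taken from any real school):

```python
# Compare Newsweek's ratio with a passed-per-student metric; all data made up.
def newsweek_index(tests_taken, graduates):
    return tests_taken / graduates

def passed_per_student(tests_passed, enrolled_students):
    return tests_passed / enrolled_students

# School A pushes everyone into the exam room; few pass.
# School B has fewer test-takers, but most of them pass.
school_a = dict(tests_taken=800, tests_passed=100, graduates=180, enrolled=200)
school_b = dict(tests_taken=300, tests_passed=250, graduates=195, enrolled=200)

for name, s in [("A", school_a), ("B", school_b)]:
    print(name,
          round(newsweek_index(s["tests_taken"], s["graduates"]), 2),
          round(passed_per_student(s["tests_passed"], s["enrolled"]), 2))
# Newsweek's ratio ranks A (4.44) well ahead of B (1.54);
# passed-per-student puts B (1.25) well ahead of A (0.5).
```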
To see how badly their metric can go wrong, see my discussion of Oxnard High in last year’s post. Oxnard made the Newsweek list, even though it was far below the state average in the number of AP exams *passed* per student. Apparently the vast majority of Oxnard students who took AP exams failed them.
But, y’see, then the monkeys can get jobs as statisticians working for MPAA contractors 🙂
Newsweek seems to be borrowing a chapter from the U.S. News and World Report playbook (the magazine ranks colleges annually). The problem with this business model (and Newsweek’s) is that there is little reason for college rankings to fluctuate much from year to year, so it is hard to justify buying a new copy when HYP are always at the top. In 1999, Caltech shot up to the No. 1 spot (it’s normally in the high single digits) but went back down in subsequent issues. Did a sudden (and temporary) burst of bright students and brilliant faculty research boost the numbers? No. There was a drastic formula change that weighted factors in Caltech’s favor (and the magazine admitted as much). Many complained that the magazine did that just to push sales. Maybe this is Newsweek’s strategy: get the formula so wrong you can always justify changes in the formula. After all, who wants to read that TJ is in the lead every year?
Here’s the reason from the FAQ for including all AP exams taken in the numerator (instead of just those that received a passing grade). Does Jay Mathews really not understand how fractions work?
Q. “Why do you count only the number of tests given, and not how well the students do on the tests?”
A. “I decided not to count passing rates in the way schools had done in the past because I found that most American high schools kept those rates artificially high by allowing only A students to take the courses. In some other instances, they opened the courses to all but encouraged only the best students to take the tests.”
The absurdity of counting all AP tests can really be seen when you look at a school like Southside, at number 36, averaging over 4 tests per senior with only 15% of students passing any at all. In other words, if the only students taking tests were the 15% who passed one, each of them would need to average about 26 AP tests. Part of this is that in South Carolina the state pays for every student enrolled in an AP class to take the test, so the failure rate is likely to be much higher than at schools where students opt in.
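The back-of-the-envelope arithmetic behind that 26-tests figure, using the approximate numbers quoted above:

```python
# Rough check of the Southside figures quoted above (both inputs approximate).
tests_per_senior = 4.0        # "averaging over 4 tests per senior"
fraction_passing_any = 0.15   # "only 15% of students passing any at all"

# If the only test-takers were the 15% who passed something,
# each of them would have to account for all the tests:
tests_per_passing_student = tests_per_senior / fraction_passing_any
print(round(tests_per_passing_student, 1))   # about 26.7
```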
I was at least mildly curious to see how many AP tests were taken at my school, but alas, we’re on the list of excluded schools. What is the point of excluding a dozen schools when you’re ranking the top 1,200? Just looking at the names of schools near the top, it doesn’t exactly sound like they’re your standard public schools either, just magnets with slightly lower SAT scores is all.
It is interesting to read the associated article “Why AP Matters” and the FAQ, to see how Newsweek tries to address critics of its methodology. The FAQ in particular has some choice comments about why taking AP tests is important, and why it’s OK if some schools pay for students to take the tests.
It’s been a long, long time since I was in HS, but I don’t remember students routinely taking AP tests, except perhaps in subjects in which they excelled. But apparently, that’s what it takes to make the list under Newsweek’s methodology.
In the UK they measure schools by the exam score of the students, and parents get to pick which schools to send their kids to. Ipso facto, the more mobile and affluent parents, with kids more likely to have a more educationally conducive upbringing, get to send their kids to the schools with the better scores, consequently raising those scores. And vice versa. Schools can’t help but encourage this given their budgets increase with better scores.
With even a little mathematical aptitude, it would seem more logical that schools should be measured by the educational improvement that occurs between students entering and leaving (or within each year). In that case they would be biased toward favouring entrants from less favourable backgrounds, given that those students have more room for improvement.
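As a sketch, such a value-added measure might look something like this in Python, assuming students are tested on entry and again on exit (the scores and scale here are hypothetical):

```python
# Hypothetical value-added measure: average gain per student, entry to exit.
def value_added(entry_scores, exit_scores):
    gains = [exit_ - entry for entry, exit_ in zip(entry_scores, exit_scores)]
    return sum(gains) / len(gains)

# A school admitting weaker entrants but improving them a lot...
school_x = value_added([40, 45, 50, 55], [65, 70, 72, 78])
# ...versus a school admitting strong entrants who improve only a little.
school_y = value_added([80, 85, 88, 90], [84, 88, 92, 93])

print(school_x, school_y)   # 23.75 vs 3.5 -- X looks better on improvement
```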
I’m still mystified as to why RIAA/MPAA believe they can find the greatest moral corruption from IP thievery and its use in funding terrorism in the dens of iniquity that are America’s universities rather than its penitentiaries.
Wow, you hit the nail on the head. I’m from a high school that was ranked very well the year I graduated, and I thought the ranking was garbage then too. Newsweek isn’t trying to produce an insightful and meaningful review of high schools; it’s trying to make a story it can sell to people. They don’t care how poorly this ranking really reflects schools’ performance; they just want a ranking that makes some sense to their readers and will give them something to print and make a profit off of. I think this sort of highlights the problem with media outlets in the United States of late: they’re very profit driven, often to the detriment of their content.