Newsweek just released its list of the top 100 U.S. high schools. Like the more famous U.S. News college rankings, Newsweek relies on a numerical formula. Here is Newsweek’s formula:
Public schools are ranked according to a ratio devised by Jay Mathews: the number of Advanced Placement and/or International Baccalaureate tests taken by all students at a school in 2004 divided by the number of graduating seniors.
Both parts of this ratio are suspect. In the numerator, they count the number of students who show up for AP/IB tests, not the number who get an acceptable score. Schools that require their students to take AP/IB tests will do well on this factor, regardless of how poorly they educate their students. In the denominator is the number of students who graduate. That’s right – every student who graduates lowers the school’s rating.
To see the problems with Newsweek’s formula, let’s consider a hypothetical school, Monkey High, where all of the students are monkeys. As principal of Monkey High, I require my students to take at least one AP test. (Attendance is enforced by zookeepers.) The monkeys do terribly on the test, but Newsweek gives them credit for showing up anyway. My monkey students don’t learn enough to earn a high school diploma – not to mention their behavioral problems – so I flunk them all out. Monkey High gets an infinite score on the Newsweek formula: many AP tests taken, divided by zero graduates. It’s the best high school in the universe!
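The Newsweek ratio is simple enough to compute directly. Here is a minimal sketch in Python; the function name and the sample counts are hypothetical, chosen only to illustrate the ordinary case and the Monkey High degenerate case:

```python
def newsweek_score(ap_ib_tests_taken, graduating_seniors):
    """Newsweek's ratio: AP/IB tests taken divided by graduating seniors."""
    if graduating_seniors == 0:
        # Monkey High: tests taken, nobody graduates -> score blows up
        return float("inf")
    return ap_ib_tests_taken / graduating_seniors

# A typical school: 300 tests taken, 250 graduating seniors
print(newsweek_score(300, 250))  # 1.2

# Monkey High: 500 tests taken, zero graduates
print(newsweek_score(500, 0))    # inf
```

Note that nothing in the formula looks at test *scores*, so the monkeys' failing marks never enter the calculation.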
Why does Newsweek use this formula? There are two reasons, I think. First, they seem to conflate AP courses with AP exams. It is indeed good if more students take genuine AP courses, which teach the most challenging material. But there’s no point in having students take the AP exams if they’re not prepared. Some schools require their students to take AP exams, whether they’re prepared or not. The Newsweek formula rewards those schools. Here’s Jay Mathews, in Newsweek’s online FAQ:
If I thought that those districts who pay for the test and require that students take it were somehow cheating, and giving themselves an unfair advantage that made their programs look stronger than they were, I would add that asterisk or discount them in some way. But I think the opposite is true. Districts who spend money to increase the likelihood that their students take AP or IB tests are adding value to the education of their students. Taking the test is good. It gives students a necessary taste of college trauma. It is bad that many students in AP courses avoid taking the tests just because they prefer to spend May of their senior year sunning themselves on the beach or buying their prom garb. If paying your testing fee persuades you, indeed forces you, to take the test, that is good, just as it is good if a school spends money to hire more AP teachers or makes it difficult for students to drop out of AP without a good reason.
Second, it appears that better data would have been harder to get. Schools report the number of AP tests taken, but it appears that many don’t report anything about the scores their students receive.
Given Newsweek’s questionable formula, is it picking the best schools in the U.S.? Not likely. My wife, on reading Newsweek’s list, was surprised to see Oxnard High (of Oxnard, California) ranked as the 60th best. She was born in Oxnard and went to a nearby high school, and had never thought of Oxnard High as an elite school.
(To be clear: in no way am I comparing Oxnard High to Monkey High. Oxnard High seems like a pretty typical school by U.S. standards. Many of my wife’s friends graduated from Oxnard High. But, despite what Newsweek says, it’s not one of the very best schools in the country.)
Looking at standardized test scores – the actual scores, not the percentage of students who showed up for the test – Oxnard High appears to be a bit below average among California schools. Oxnard High students had an average SAT score of 997, compared to a state average of 1012; and 23% of Oxnard students took the SAT, compared to 37% statewide. 28% of Oxnard students met University of California admissions requirements, compared to 34% statewide.
What really makes Newsweek’s formula look bad is the data on AP test scores. If we use an improved version of Newsweek’s formula – dividing the number of AP scores of 3 or above (on a 5-point scale), by the number of enrolled juniors and seniors – Oxnard High scores 0.08, compared to a state average of 0.24. Many Oxnard High students take AP tests, but few score well. These are not the statistics of a top-performing school.
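The improved ratio is just as easy to compute. A sketch follows; the raw counts are hypothetical, chosen only to reproduce the 0.08 and 0.24 ratios quoted above:

```python
def improved_score(passing_ap_scores, juniors_and_seniors):
    """AP scores of 3 or above, divided by enrolled juniors and seniors."""
    return passing_ap_scores / juniors_and_seniors

# Hypothetical counts that reproduce the ratios in the text:
print(round(improved_score(80, 1000), 2))   # 0.08 -- an Oxnard-like school
print(round(improved_score(240, 1000), 2))  # 0.24 -- the state average
```

Unlike Newsweek's version, this numerator only counts exams the students actually passed, so a school cannot inflate its score by forcing unprepared students to sit for tests.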
Here’s my report card for Newsweek’s high school ratings:
English: Proficient
Math: Needs Work
Do any of you even live in Oxnard? The OHS you remember from the 60’s is long gone. The new school is located about 3-5 miles NW of the old site. It’s an attractive, vibrant school that all the parents and kids rave about. …so good that a friend of mine pulled his kid out of private school last year [St. A’s] to attend the new OHS.
Please cite some examples of what you consider that “amazing” record, because your comments flabbergast me! The real amazing ‘oddity’ to be overcome at Oxnard High over the past 30–40 years is figuring out that you are actually at a public school in the United States of America. The systemic oddity is that there are so many who get so much out of so little, despite being members of a category with a very limited definition of “citizen” (at least as defined in the political arena).
Oxnard High School is so much less than it ever could have been, symbolized, in part, by a depraved Spanish teacher, and leadership who believe that Spanish as a first language is the most honored claim to attention and unequal benefits. Oxnard High was on the brink of very good things, and then forgot its purpose, right around the late 60’s. It could never qualify to be on a list that would require ‘earning’ that right, based on any academic criteria. (Oh, and you are correct; California State University, Channel Islands is a wonderful achievement, and hopefully they will be accredited very soon).
Oxnard High School has an amazing record, over more than 35 years, of graduating students who go on to make contributions to the national and international community. Considering that a state university was established in the area only 5 years ago, its graduating students overcome amazing odds.
Also overlooked are the socioeconomic, racial, and cultural factors that contribute to the statistical outcomes. Many students are of Hispanic origin, for whom English is a second language. Bottom line: Oxnard students and teachers overcome numerous systemic barriers to success in the world with a generosity of spirit that enriches the tapestry of our world.
Newsweek’s methods and competency are both questionable. The references above regarding Oxnard H.S. (60th on last year’s list) are a case in point. The magazine erred in data collection, calculating all of the AP tests in the entire school district and attributing them to the single high school that bears the district name. Everyone associated with the school questioned the ranking and after a few days of celebration, poor Oxnard was deleted from the list completely!
Mathews includes charter and magnet schools, unless the kids are too smart (averaging more than 1300 on the old SAT), in which case those schools are excluded! A brief review of online sources indicates that all of this year’s top 10 are magnet schools with various application, testing, and recommendation hurdles to admission, and I was unable to find any of the top 10 schools with more than 1% English learners among the student body. In what sense are these “public schools”?
Hmmm… old entry, but I just thought I’d like to clarify that many SL-level IB courses are on par with AP courses, such as Math Methods SL and AP Calc AB, or Physics SL and Physics B. IB also requires, beyond the tests, internal assessments (a research paper for history, lab portfolios for sciences, literary analysis for English, spoken orals for foreign language, and so on) as well as a 4000-word extended essay in a subject area, on top of Theory of Knowledge, an epistemology class central to the IB core. I think all this additional work makes IB a far superior program to the APs. I have gone through both the full IB diploma course and a few AP classes, and IB challenged me far more. Many in the US seem to perceive the IB program as some arm of the UN bent on taking over the minds of American students by teaching them such dangerous, globalization-inducing subjects as history that isn’t actually about the American Revolution or, even worse, an actual foreign language as part of the core curriculum.
Also, AP tests can be taken by anyone, even someone who has not taken the actual course, while the IB requires that one must have taken the course, complete with the internal assessments (which are factored into the IB grade), before one can actually take the test at the end of 1 or 2 years. Therefore, wouldn’t that make the IB a much better standard of measurement than AP, since one can be sure the student has sat through the classes (the IB also requires a certain number of teaching hours for each class) and learned the material?
My school is #2 on the 2005 list (wtf, how did Alabama beat us >.>) and I feel IB has made all the difference in the lives of so many people who have gone through it.
Give it up people, IB deserves the exposure it receives.
I would just like to add a note about Jay Mathews’ list in Newsweek. Regarding the ratio he uses to establish rank, it should be noted that IB courses consist of what they call Higher Level and Standard Level. Diploma candidates must take 3 of each level of IB courses. The vast majority of American universities will not award college credit for IB SL courses, as they do for AP exams with scores of 3 or better. By including ALL IB courses in his calculations, Mathews effectively elevates the exposure of IB schools on his list.
It should be further noted that Mr. Mathews has recently published a book about IBO which was co-authored by IB’s Deputy Director General, Ian Hill. Furthermore, the book was published by Open Court, whose owner is Blouke Carus, former Director of IB North America and still a sitting board member.
just a couple of comments in this interesting discussion…
lim x->0 a/x could approach inf, -inf, or 0, depending on the value of a and the direction of approach.
More to the point of the original post, I agree that the formula is basically worthless. I propose a formula based on the % of students entering the HS in 9th or 10th grade (depending on the style of HS [mine was 10-12]) who graduate in the standard amount of time (4 or 3 years respectively), excluding those who transfer to another school, combined with retention and graduation rates in the postsecondary endeavour each student chooses. An example of this would be not only the valedictorian who goes to Yale and graduates in 4 years (on time), but also the student who is last in the class, goes to vocational school to become an electrician, graduates in 2 years, and holds his/her job as a journeyman for 2 years.
just an idea
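The first half of that proposal could be sketched like this; the function name and all the numbers are hypothetical, just to make the idea concrete:

```python
def on_time_graduation_rate(entering, transfers_out, on_time_grads):
    """Fraction of an entering class (excluding transfers) that graduates
    in the standard amount of time."""
    cohort = entering - transfers_out
    return on_time_grads / cohort

# 400 students enter 9th grade, 50 transfer away, 280 graduate in 4 years
print(round(on_time_graduation_rate(400, 50, 280), 2))  # 0.8
```

Unlike the Newsweek ratio, this metric rewards a school for every student it carries through to graduation, rather than penalizing it.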
Wouldn’t the solution here be to standardize the tests being used to calculate the rankings? E.g., every school in every county/state/province, at the same grade level, would make every child take the exact same, unbiased test across the USA (and maybe other countries, like Canada, if you want to compare the two).
Of course, I think no one has come up with such a test, and no one is willing to implement such a “solution”, merely for the sake of ranking schools.
Besides, who says that the performance of a school from year to year isn’t affected by the students? Maybe the top school one year just got really lucky with its kids, versus the last-ranked one, especially if that same top school was in last place the year before.
Ugh. Tons of schools don’t pay for exams at all, but still force all of their AP students to take them – disadvantaging poor kids.
This list is especially ridiculous because seniors already know where they’re going to college and often already know that they won’t be granted credit for certain AP exams. More than anything, a high school’s rank is a function of the idiosyncratic policies of its administration rather than of its intrinsic quality.
Flawed as it might be, a listing of U.S. high schools by mere average SAT score would still do a far better job.
I almost wrote about this when it made the papers because one of our schools, Pasco High, made number 905 – and they are proud of it.
My first thought was that Pasco’s Hispanic population is very, very high. For most of these kids, Spanish is their first language. They can all take the Spanish AP and pass it for college credit. (Or just take it so it counts toward the ratio.)
The problem with canceling by multiplying or dividing both sides ends up being a problem with 0/0 or inf*0, which still can’t be defined.
Meromorphic functions on the Riemann sphere, for example, behave in a well-defined way everywhere. There’s a sensible metric for the sphere that gives it and subsets of it a finite area and arcs on it finite lengths, even when the point at infinity is involved, which allows you to translate C+{inf} into a system that is closed and bounded and has well-defined behavior everywhere. (But you still can’t sensibly deal with 0/0, inf/inf, or inf*0 though…)
This debate about dividing by zero is entertaining, but it’s becoming disconnected from my initial post.
It’s a mistake to think of infinity as being an ordinary number, in the sense that 837 is a number. That said, I think it’s defensible to use “infinite” as an adjective in the way I did, when writing for a general audience.
When somebody says that a quantity is infinite, I tend to interpret that as a statement about what happens as we approach some limit. The precise limit in question is usually clear in context.
Here, it’s really up to Newsweek how to treat a school with many AP exams and no graduates. I think it’s most consistent with their methodology to treat such a school as if its score were infinite.
In writing the original post, I did consider for a moment — but only a moment — saying something more complicated about the score of such a school, or giving a diploma to a single monkey to avoid this issue. But I decided quickly that doing either of those things would just distract readers from the main point of my post. And I figured that very few readers would care.
Sean,
And now it’s me who owes you an apology, since I misread your post to say that *I* should read your reference before you posted it. I read that way too fast 🙂 Sorry about that.
Anyway, the rest of my comment on the projectively extended real numbers stands.
Mike,
It looks like I should read my references before I link to them. I thought the notes went into more depth on the matter. Here is a description of how you can define x/0 = infinity for x nonzero:
http://mathworld.wolfram.com/ProjectivelyExtendedRealNumbers.html
“Actually, that’s what happens when you let people cancel inf or zero from both sides, not when you let a/0 be defined.”
Cancel from both sides in this case is the same thing as letting a/0 be defined.
Strictly speaking, there is no such thing as “cancel from both sides” in math. What you do is add, or multiply, the same quantity in both sides of an equality.
So, consider the equation x*a = y*a. You don’t “cancel out” the “a” from both sides; you divide both sides by “a”.
(x*a)/a = (y*a)/a
x*1 = y*1
x = y
Since a division is involved, it’s required that a is different from zero.
Actually, that’s what happens when you let people cancel inf or zero from both sides, not when you let a/0 be defined.
A couple of clarifications about the Florida schools… I believe all of the Florida schools in the top 10 are IB schools (I went to Eastside, #4). The IB program requires graduates to take a minimum of 6 tests, IIRC, many of which overlap in material with AP exams. Thus, students take 2 exams for what amounts to material covering a single advanced class in their junior and senior years.
When I was a student, my county subsidized exam costs, and actually received money (from the state, I believe) for each student that passed an IB/AP exam. In addition, many of the Florida public universities give full scholarships to IB graduates.
I don’t know if other states are quite like this, but that’s an awful lot of economic encouragement to excel in a metric that Newsweek considers worthy as defining a top public school.
(btw, I think the Newsweek article filters out private schools, not magnet programs)
I finally found the details of something I vaguely remembered. This is an example of what happens when you let a/0 be defined:
http://www.math.toronto.edu/mathnet/falseProofs/first1eq2.html
It’s a proof that 1=2.
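For reference, the classic version of that fallacy (I'm assuming the linked page presents the standard one) runs as follows; the invalid step is dividing both sides by a - b, which is zero:

```latex
\begin{align*}
a &= b \\
a^2 &= ab \\
a^2 - b^2 &= ab - b^2 \\
(a+b)(a-b) &= b(a-b) \\
a+b &= b \quad \text{(dividing both sides by } a-b = 0\text{: invalid)} \\
2b &= b \\
2 &= 1
\end{align*}
```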
a/0 = infinity, according to physicists
a/0 = undefined, according to mathematicians
That’s how I learned it in college, anyway. I’m not sure how statisticians would see it, since physicists’ conclusions are based on real-life examples and mathematicians’ conclusions are based on mathematical proof (i.e., real-life examples show that a/0 acts like infinity, but no one has managed to prove it mathematically).
re: Newsweek’s atrocious math: I remember being told in high school that one particular state requires that all of its juniors take the AP US History test. Seems to me that would really screw up Newsweek’s already shaky methodology.
Sean,
Thanks for your comments. Maybe I should have found a more reliable source. I’ll just say that every book on math that I own or have ever read says a/0 is undefined.
I read the document you suggested. I don’t see where it defines a/0. It defines
a/Inf = 0
but that doesn’t imply a/0 = Inf, since it specifically says: “Notice that nothing is said about the product of zero with either of the special symbols.” In order to go from a/Inf=0 to a/0=Inf, one of the algebraic steps is 0 times Inf, and that is undefined.
Let me show you how I see it:
a) a/Inf = 0 (by definition in the document)
b) a = Inf x 0 (undefined by the document)
c) a/0 = Inf (invalid since the last step is undefined)
Please let me know if I’m still wrong. I’d love to know if I’m mistaken in something so basic.
Mike, you are wrong. Division by zero is defined if you choose to define it. This is commonly done in a way consistent with Edward Felten’s comments. Read Section 1.3.3 of The Field of Reals and Beyond from
http://www.math.ucdavis.edu/~emsilvia/math127/math127.html
Note that the MathWorld site is only hosted by the makers of Mathematica; it isn’t written by them. It has also been known to contain errors and to be misleading in the past. See the Wikipedia entry on MathWorld:
http://en.wikipedia.org/wiki/Mathworld
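A toy sketch of that projective convention in Python (the function is my own illustration, not taken from the linked notes): nonzero/0 yields a single unsigned point at infinity, while the indeterminate form 0/0 stays undefined:

```python
def proj_div(a, b):
    """Division on the projectively extended reals: a/0 = inf for a != 0,
    but 0/0 remains undefined."""
    if b == 0:
        if a == 0:
            raise ZeroDivisionError("0/0 is undefined even projectively")
        return float("inf")  # the single point at infinity
    return a / b

print(proj_div(5, 0))  # inf
print(proj_div(6, 3))  # 2.0
```

This matches the earlier comments: choosing to define a/0 is consistent, as long as you refuse to assign a value to the indeterminate forms.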
Sorry, you’re wrong. An online source you might trust is
http://mathworld.wolfram.com/DivisionbyZero.html
(by the makers of the Mathematica package).
Nope. 0/0 is undefined. a/0, for a positive, is infinity.
Just a nitpick: a/0 is not infinite; it’s undefined. I thought it was funny that a piece that criticizes a math formula includes one of its own.
This is symptomatic of a larger problem. Schools fast-track kids into AP classes because it makes the schools look good: more kids taking more AP classes suggests they’re improving their education. But that doesn’t work, as we find out when the students get to college and can’t hack it in classes whose prerequisites are supposed to be satisfied by AP credit.
That list is batshit. The second Mass. HS on the list is Belmont at #156; state MCAS scores put us at about #32 within Mass., so there are 30 schools missing above us on the list.
My wife, a King (Tampa, FL) HS graduate thinks it is rather improbable that King rates a #36, beating Gunn of Palo Alto at #70 and Palm Harbor University at #83 (by all accounts, a selective and good school, or so say my parents, who live in Palm Harbor).
The large number of Florida schools in the top 100, and the total lack of Massachusetts schools, is definitely weird. My wife and I both went to HS in Florida.
I couldn’t agree more with this assessment of a worthless study. When I saw that 5 of the top 10 schools were in Florida, and that there were none in Minnesota on the whole list, I became very skeptical. Then I saw that my own high school was in the top 50; no way!
I don’t think they excluded magnet schools; I graduated from a magnet school ten years ago and was pleasantly surprised to see it listed 6th (the above critique of the methodology notwithstanding).
The exclusion of academic magnet schools (any schools with “strict academic admissions criteria”) is also a bit problematic. Perhaps if the goal is to say “you should move your family here for the benefit of your average kid”, then it makes sense, but it would at least be worth seeing how the magnets stack up in any given rating system.