Eva Luna
Re: Jericho $599k
Here's how it works:
Frequently Asked Questions About NEWSWEEK’s Best American High Schools
By Jay Mathews
Updated: 3:11 p.m. ET May 8, 2005
1. How does the Challenge Index work? I take the total number of AP or IB tests given at a school in May, and divide by the number of seniors graduating in June. All schools that NEWSWEEK researchers Dan Berrett and Dan Brillman and I found to have achieved a ratio of at least 1.000, meaning they had as many tests in 2004 as they had graduates, are on the list on the NEWSWEEK Web site, and the top 100 schools on that list are named in the magazine. I did similar national lists for NEWSWEEK in 1998, 2000 and 2003. In the Washington Post, I have reported the Challenge Index ratings for every public school in the Washington area every year since 1998.
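To make the arithmetic concrete, here is a minimal sketch of the ratio Mathews describes, using the New Trier and Ottawa Hills figures he cites in question 4. The function name and data layout are illustrative, not part of any official definition of the index:

```python
# Challenge Index sketch: AP/IB tests given in May divided by the number
# of seniors graduating in June. School figures are from question 4 below.

def challenge_index(tests_given: int, graduating_seniors: int) -> float:
    """Tests given at the school divided by graduating class size."""
    return tests_given / graduating_seniors

schools = {
    "New Trier High (Winnetka, Ill.)": (1918, 970),
    "Ottawa Hills High (Toledo, Ohio)": (154, 78),
}

for name, (tests, seniors) in schools.items():
    ratio = challenge_index(tests, seniors)
    on_list = ratio >= 1.000  # the cutoff for making the NEWSWEEK list
    print(f"{name}: {ratio:.3f} (qualifies: {on_list})")
    # New Trier: 1918 / 970 = 1.977; Ottawa Hills: 154 / 78 = 1.974
```

Dividing by class size rather than counting raw tests is what lets a 78-senior school sit beside a 970-senior school on the list.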
2. Why did the number of schools on the NEWSWEEK Web site list in 2003 get larger after the magazine came out? At the top of the Web site list I invite all qualifying schools we may have missed to e-mail me their data so that I can put them on. Eighty-five schools were added to the original list of 739 in 2003. There is no available national database that has the number of AP and IB tests and number of June graduates for each public high school, so I have had to build my own. Often I capture the smaller schools I sometimes miss through the publicity generated by publication of a new list. As before, I will add to the Web site list any schools I missed whose 2004 data qualifies them.
3. Why do you count only the number of tests given, and not how well the students do on the tests? In the past, schools have usually reported their passing rates on AP or IB as a sign of how well their programs were doing. When I say passing rate, I mean the percentage of students who scored 3, 4 or 5 on the 5-point AP test or 4, 5, 6 or 7 on the 7-point IB test. Those scores, the rough equivalent of a C or better on a college course, make the student eligible for credit at many colleges. I do not count passing rates because I found that most American high schools keep their passing rates artificially high by allowing only A students to take the courses. In some cases, they open the courses to all but wrongly encourage only the best students to take the tests.
AP and IB are important because they give average students a chance to experience the trauma of heavy college reading lists and difficult college examinations. Clifford Adelman’s 1999 study for the U.S. Education Department, “Answers in the Tool Box,” showed that the best predictor of college graduation, based on the records of a cohort of 8,700 students, was not good high school grades or test scores, but whether or not a student had an intense academic experience in high school by taking challenging courses. I feel that when schools deny their average students a chance to have that experience, they should not be rewarded with higher ratings because their passing rates are high.
Interviews with hundreds of teachers and students over the last 20 years have convinced me that a student who works hard but struggles in an AP or IB course, and does poorly on the AP or IB test, is still better prepared for college than he would be if he were forced to take an easier course and test. By taking AP or IB, he has gone one-on-one against the academic equivalent of Michael Jordan, and Jordan has beaten him, but he now has a visceral appreciation of what he has to do to play at that level. To send a student off to college without having had an AP or IB course is like insisting that your child learn to ride a bike without ever taking off the training wheels. It is dumb, and in my view a form of educational malpractice. But most American high schools still do it.
The College Board says the AP grade reports that high schools will receive in 2005 will contain a new statistic that will show how well their students are doing on the test without rewarding schools that restrict access to AP. I call it the mastery rate. It will be the percentage of ALL graduating seniors, including those who never got near an AP course, who had at least one score of 3 or above on at least one AP test sometime in their high school careers. A College Board study of 2004 results showed the average mastery rate for schools that had AP was about 13 percent. It will be interesting to see how many schools do better than that modest standard.
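The difference between the two statistics comes down to the denominator: the passing rate divides by test-takers, while the mastery rate divides by all graduating seniors. A hedged sketch with hypothetical numbers (the student data and function names are invented for illustration, not taken from the College Board):

```python
# Passing rate vs. mastery rate. `scores` maps each graduating senior to
# their best AP score, or None if they never took an AP test.

def passing_rate(scores: dict) -> float:
    """Share of test-TAKERS scoring 3+; inflated when access is restricted."""
    takers = [s for s in scores.values() if s is not None]
    return sum(s >= 3 for s in takers) / len(takers)

def mastery_rate(scores: dict) -> float:
    """Share of ALL graduating seniors with at least one score of 3+."""
    return sum(s is not None and s >= 3 for s in scores.values()) / len(scores)

# 100 seniors: a school that lets only its 10 strongest students test...
restricted = {f"s{i}": (5 if i < 10 else None) for i in range(100)}
# ...versus one that tests 60 students, 30 of whom score 3 or better.
open_door = {f"s{i}": (4 if i < 30 else 2 if i < 60 else None) for i in range(100)}

print(passing_rate(restricted), mastery_rate(restricted))  # 1.0 0.1
print(passing_rate(open_door), mastery_rate(open_door))    # 0.5 0.3
```

The restricted school posts a perfect passing rate but a lower mastery rate; that gap is exactly the kind of gaming the new statistic is meant to expose.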
4. Why do you divide by the number of graduating seniors, and does that mean you only count tests taken by seniors? Don’t you know that juniors, and sometimes even sophomores and freshmen, take AP tests? I divide by June graduates as a convenient measure of the relative size of each school. That way a big school like New Trier High in Winnetka, Ill., which gave 1,918 AP tests and graduated 970 seniors for a rating of 1.977 in 2004, will not have an advantage over Ottawa Hills High in Toledo, Ohio, which gave only 154 AP tests but also graduated only 78 seniors for a rating of 1.974. On the new NEWSWEEK list they are right next to each other at number 292 and number 293.
I count all tests taken at the school, and not just those taken by seniors.
5. How can you call these the best schools or the top schools if you are using just one narrow measure? High school is more than just AP or IB tests. Indeed it is, and if I could quantify all those other things in a meaningful way, I would give it a try. But teacher quality, extracurricular activities and other important factors are too subjective for a ranked list. Participation in challenging courses, on the other hand, can be counted, and the results expose a significant failing in most high schools (though not the ones that have made this list). I think that this is the most important quantitative measure one can make of a high school, and I think one of the strengths of this list is the narrowness of my criteria. Everyone can understand what I am doing and discuss it intelligently, as opposed to the U.S. News college list, which has too many factors for me to get my mind around.
As for the words “top” and “best”, they are always based on criteria chosen by the list maker. My list of best directors may depend on Academy Award nominations. Yours may be based on ticket sales. I have been very clear about what I am measuring in these schools. You may not like my criteria, but I have not found anyone who understands how high schools work and does not think AP or IB participation is important. I often ask people to tell me what quantitative measure of high schools they think is more important than this one. Such discussions can be very interesting and productive.
6. Why don’t I see famous public high schools like Stuyvesant in New York City or Thomas Jefferson in Fairfax County, Va., or the Illinois Mathematics and Science Academy in Aurora, Ill., or Lowell High in San Francisco on the NEWSWEEK list? I do not include any high school that accepts more than half of its students into the school based on highly competitive academic criteria like grades and test scores. All of those schools you name are terrific places with some of the highest average test scores in the country, but it would be deceptive for me to put them on this list. The Challenge Index is designed to honor schools that have done the best job in getting average students into college level courses. It does not work with schools that have no, or almost no, average students. I want a list that measures how good the schools are, not just how good their students are.
There are some magnet schools on the NEWSWEEK list, but only those that admit students through lottery-driven, first-come-first-served, ethnic-balance or other formulas that do not draw just the students with the highest grades and scores.
7. But aren’t all the schools on the list doing very well with AP or IB? So why rank them and make some feel bad that they are on the lower end of the scale? That is exactly right. Every school on the list is in the top four percent of American high schools based on this measure. They have all shown exceptional AP and IB strength. I am mildly ashamed of my reason for ranking, but I do it anyway. I want people to pay attention to this issue, because I think it is vitally important for the improvement of American high schools. Like most journalists, I learned long ago that we are tribal primates with a deep commitment to pecking orders. We cannot resist looking at ranked lists. It doesn’t matter what it is—SUVs, ice cream stores, football teams, fertilizer dispensers. We want to see who is on top and who is not. So I rank to get attention, nothing more, in hopes people will then argue about the list and in the process think about the issues it raises.
8. Is it not true that school districts that pay for their students’ exams skew the results of your Challenge Index? Should not an asterisk be attached to those school districts that pay for the AP exam? If I thought that those districts that pay for the test and require that students take it were somehow cheating, and giving themselves an unfair advantage that made their programs look stronger than they were, I would add that asterisk or discount them in some way. But I think the opposite is true. Districts that spend money to increase the likelihood that their students take AP or IB tests are adding value to the education of their students. Taking the test is good. It gives students a necessary taste of college trauma. It is bad that many students in AP courses avoid taking the tests just because they prefer to spend May of their senior year sunning themselves on the beach or buying their prom garb. If paying your testing fee persuades you, indeed forces you, to take the test, that is good, just as it is good if a school spends money to hire more AP teachers or makes it difficult for students to drop out of AP without a good reason. I was happy to see that in the Washington area, when Fairfax County began to pay the test fees and require that the tests be taken, many other districts in the area followed suit.
9. Why don’t you count the college exams that high school students take at local colleges? I would like to. We tried to count what are often called dual enrollment exams this year, but it proved to be too difficult for Berrett, Brillman and me. The problem is that we need to make sure that the dual enrollment final exams we are counting are comparable to the AP and IB exams that define the index. I tried to set a standard—we would only count dual enrollment final exams that were at least two hours long and had some free response questions that required thought and analysis, just as the AP and IB exams do. And I wanted to be sure that the exams were written and scored by people who were not employed by the high school, so that, like AP and IB exams, they could not be dumbed down to make the school or the teacher look good. Some high schools provided us with the necessary information, but most could not. It was too difficult for them to persuade the colleges managing the exams to help them, or they did not have the staff to gather the data we required. We did not want to be counting extra exams only for those schools that could afford extra staff, so we decided to stay with AP and IB and the Cambridge exams, a very small but similar system, while we thought about better ways to count dual enrollment next time. We also need to consider the view of some high school educators and college admissions officers that many dual enrollment courses are not nearly as challenging as AP or IB.
10. Why do some states have so many schools on your list and others so few? The more schools I have examined, the more I have come to believe in the power of high school cultures, which are different in different parts of the country for reasons that often have little to do with the usual keys to high school performance—the incomes and educations of the parents.
California, New York, Texas and Florida lead the nation, in that order, in number of schools on the list. That is no surprise. But it is more difficult to explain why much less populous Virginia and Maryland come right after those mega-states in the number of challenging high schools, and why Iowa, with some of the highest test scores in the country, has only two high schools that meet the criteria. Six states have no schools on the list at all.
My tentative explanation is that some areas have had the good fortune to get school boards and superintendents who saw that they served their students better by opening up AP and IB to everyone. Once a few districts in a state do that, many others follow. And once a state has success with AP or IB, its neighboring states begin to wonder why they aren’t doing the same.
11. Why limit your list to public high schools? Don't you think those of us who pay tens of thousands of dollars to educate our children at private schools are also interested in how our schools measure up? My children attended both public and private high schools, and I share your interest in rating both kinds of schools. The public schools are very quick to give me the data I need. They are, after all, tax-supported institutions. The private schools, sadly, have resisted this and all other attempts to quantify what they are doing so that parents could compare one private school to another. The National Association of Independent Schools has even warned its members against cooperating with reporters like me who might be trying to help what they call consumer-conscious parents like you. They say that parents should reject such numerical comparisons and instead visit each private school to soak up its ambiance. I am all for visits, but I think those private schools are essentially saying that parents like you and me are too stupid to read a list in a magazine and reach our own sensible conclusions about its worth.
Some private schools have shared their data with me, but since the majority are resisting and any list would be incomplete, I have shelved further activity in that area for the moment.
12. Shouldn’t I worry if my child’s high school has dropped in rank since the last NEWSWEEK list? No. Keep in mind, as I said before, that every school on the list is in the top 4 percent of all American high schools measured in this way. If you want to gauge a school’s progress, look at its rating, not its ranking. Many schools have dropped in their ranking since 2003 because there is so much more competition this year, but at the same time raised their ratio of tests to graduating seniors. That means they are getting better, and the rank is even less significant.
I realize it is my fault that people put too much emphasis on the ranks. If I didn’t rank, this would not happen. I was startled that people even remembered what their school’s rank was in previous years. But those who are being so attentive to the past lists should focus more attention on the ratings and how they have changed. Also, there are a few cases of schools with large numbers of both AP and IB tests that were at a disadvantage because of the clumsy way I protected against counting two tests taken after just one course. I have a better system for doing this now, and it has resulted in a gain in the ratings of some deserving schools. But in all cases, the important thing is that your school is on the list, not where on the list it is.
As for why I rank, when it creates so much trouble, see question 7.
13. Why are you making such a big deal out of AP? I hear more and more selective colleges are saying they don’t like the program and are raising the score for which they will grant course credit, and some high schools are dropping AP altogether. I have heard some people say the courses are either watered down, so the schools can stuff in more students and look good on your index, or that they limit a teacher’s ability to be creative. There is a bit, but only a very small bit, of truth in what you have heard. Many selective colleges are making it harder to get credit for taking AP courses and tests in high school, but their reasons for doing so are unclear. Former philosophy professor William Casement, who has analyzed this trend, says he thinks AP courses and tests are not as good as the introductory college courses and tests they were designed to substitute for, and that is why those colleges are pulling back. There is, sadly, almost no evidence to back up his theory. In fact, the colleges have done no research on the quality of their introductory courses, while the College Board has expert panels that regularly compare AP courses to college intro courses to make sure they are comparable. Some educators think the colleges don’t like to give AP credit because it costs them revenue. There is no evidence to support that theory either, but it is clear that selective college admissions offices are very happy to see AP or IB courses on applicants’ transcripts. As for high schools rejecting AP, there are exactly 12 that have acknowledged doing that. They are all private, all very expensive, and represent three hundredths of one percent of the nation’s high schools. Thousands of high schools, by contrast, are opening more AP or IB courses, which they say are the only national programs that provide an incorruptible and high standard for student learning.