We're No. 2! (Now what?)
No, it's not the BCS standings. A new set of university rankings places Berkeley second worldwide. Should we shout the news from the rooftops, or put it gingerly back in Pandora's box?
01 December 2004
It made big news in Malaysia earlier this month when two of that nation's universities came in 89th and 111th out of 200 in a new ranking of universities worldwide by the Times Higher Education Supplement, a weekly British newspaper focusing on higher-ed issues. One Malaysian newspaper proudly pointed out that Penn State and Canada's McMaster University, among other highly regarded institutions, came in behind both Universiti Malaya and Universiti Sains Malaysia in the first-ever THES rankings.
Long faces were seen at Dartmouth, however, which at No. 138 finished 110 places behind the second-lowest-ranked Ivy League school. "There are a lot of different ways to measure success," a campus public-affairs spokesperson gamely told the student paper there, "and Dartmouth does very well in a great many rankings."
And how did Berkeley do? If you haven't yet heard, the campus came in second overall in the Top 200, behind only Harvard University, and ahead of MIT, Caltech, Oxford, Cambridge, Stanford, Yale, Princeton, Columbia, and other perennial ranking favorites.
Both the pride of the Malaysians and the sniffiness of the Big Green are evident on the Berkeley campus, as admissions officials, deans, and even the chancellor take advantage of opportunities to mention our stellar THES showing with pride, while also voicing reservations about both the methodology underlying the study and the subject of university rankings as a whole. Some worry about the THES study's heavy reliance on a university's reputation (as expressed in non-quantitative terms by a sample of 1,300 international faculty) as opposed to a range of other, more readily quantifiable factors. Others are concerned that, with Berkeley nearly at the pinnacle, there may be nowhere to go in future rankings but down.
Good news or bad news?
The THES, which serves roughly the same audience in the U.K. as the Chronicle of Higher Education does in the United States, introduced its World University Rankings with great fanfare earlier this month, proclaiming that they offer "a snapshot of the leading institutions on a set of criteria that are valued around the world." Indeed, says John A. Douglass, senior research fellow in public policy and higher education at Berkeley's Center for Studies in Higher Education (CSHE), "This is the first significant effort at ranking universities internationally, marking the significant global influence of research universities and the increasing sense of competition among leading institutions — for students, faculty, and resources."
|Is the lesson here that only the cream of the crop hold their positions indisputably, while elsewhere, even within the Top 10, the competition is nearly too close to call?|
The criteria employed by the THES differ in ways large and small from many of the ones employed by such well-known ranking enterprises as the National Research Council's once-a-decade survey of Ph.D. programs and U.S. News & World Report's hugely influential (and highly profitable) annual ranking of undergraduate institutions. And to the extent that many informed scrutinizers of the THES rankings have assessed the methodology behind them, their underlying criteria are ripe for criticism — in some cases by the same people who acknowledge, in nearly the same breath, that the rankings are good news for Berkeley and will do much to sustain our reputation as the world's leading research university.
In a nutshell, here's how the THES approached its self-imposed charge to "[apply] a single set of measures consistently across the world." Half of a ranked university's final score was based on its reputation, calculated from "peer review" responses provided by 1,300 academics around the world, who identified the academic subjects and geographical areas on which they felt able to comment and named their top institutions in each. (Berkeley surpassed Harvard in this important metric … though marks in other areas — most notably student/faculty ratio — lowered our overall ranking.)
Twenty percent of the score was based on what THES editors call "a ranking of research impact," which is academese for the volume of citations per faculty member (the data coming from a single source, a database produced by U.S.-based Thomson Scientific). Another 20 percent relied on faculty-to-student ratios, with the editors acknowledging that despite the difficulty of making valid international comparisons on this basis, the indicator "is a simple and robust one that captures a university's commitment to teaching."
The final 10 percent was accorded on the strength of two factors, each relating to an institution's "international orientation": the percentage of overseas students enrolled and the percentage of international faculty employed.
With first-place Harvard awarded 1,000 points on these measures, every other ranked university was, in essence, graded on a curve. Berkeley, in second place, trails Harvard by more than 100 points, and third-place MIT claimed its spot with some 90 points less than Berkeley. While these gaps can't help but emphasize a perceived distance between the top-ranked contenders, they diminish significantly as one moves down the list, with only four points separating No. 10 ETH Zurich and No. 11 London School of Economics, and a fraction of a point separating No. 13 Chicago from No. 14 Imperial College London.
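The weighting and curve-grading described above can be sketched in a few lines of Python. The weights (50/20/20/5/5 percent) come from the article; the institutions, indicator names, and all per-indicator scores below are invented purely for illustration.

```python
# Hypothetical sketch of the THES weighting scheme. Only the weights are
# taken from the published description; every score here is made up.

WEIGHTS = {
    "peer_review": 0.50,    # reputation among the 1,300 surveyed academics
    "citations": 0.20,      # "research impact": citations per faculty member
    "ratio": 0.20,          # faculty-to-student ratio
    "intl_students": 0.05,  # percentage of overseas students
    "intl_faculty": 0.05,   # percentage of international faculty
}

# Invented per-indicator marks on a common 0-100 scale.
universities = {
    "Alpha U": {"peer_review": 100, "citations": 60, "ratio": 55,
                "intl_students": 80, "intl_faculty": 70},
    "Beta U":  {"peer_review": 95, "citations": 70, "ratio": 40,
                "intl_students": 60, "intl_faculty": 65},
}

def weighted_score(indicators):
    """Combine the indicator marks using the fixed weights."""
    return sum(WEIGHTS[k] * v for k, v in indicators.items())

raw = {name: weighted_score(ind) for name, ind in universities.items()}

# As in the published table, the leader is pegged at 1,000 points and
# everyone else is graded on a curve relative to it.
top = max(raw.values())
scaled = {name: round(1000 * score / top) for name, score in raw.items()}
```

Under this scheme a wide lead in "peer review" dominates, since it alone carries half the total weight, which is exactly why critics focus so much attention on how those 1,300 reviewers were chosen.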
Is the lesson here that only the cream of the crop hold their positions indisputably, while elsewhere, even within the Top 10, the competition is nearly too close to call? To conclude as much is both risky and to some extent pointless, as becomes clear in talking with those to whom rankings are at once important tools of institutional promotion and frustrating artifacts of circumstances beyond anyone's control.
Arbitrary and volatile?
Critics of the THES methodology (and that of other similar surveys, such as U.S. News & World Report's) have pointed to what one of their number, Anne Machung of the Planning and Analysis Office at UC's Office of the President, calls "the arbitrary nature of the variables used to construct the rankings, and their volatility."
|'It's going to be a challenge for us financially to maintain our excellence in the face of ever-increasing competition from elite private universities. But we have to do this, and in such a way that we don't compromise our commitment to public service and fulfilling our commitment to the people of California.'|
UC Berkeley Chancellor
The 50 percent of each ranking attributable to a university's "peer reviewed" reputation is questionable, they say, in the absence of any insight into how the 1,300 respondents were selected, how many responded, and how valid their self-identified expertise might be considered. (The THES has so far published only a popular summary of its findings, without the raw data that would shed greater light on the underlying methodology.) The 20-percent weight accorded citations favors universities in English-speaking countries and those with strong capabilities in the natural sciences (as the THES editors acknowledge). Calculations of student/faculty ratios may or may not include the substantial proportion of instruction carried out by GSIs, adjunct faculty, and others; each university provided its own data in this area, with no attempt made by the THES and its consultant, QS Research, to take different measures into account. And the two international factors in the mix each have their critics, particularly among those familiar with the drastic impact on international-student recruitment in the U.S. stemming from various post-9/11 policy changes.
"Change either the variables on the list or the weights assigned them," says UCOP's Machung, "and you will get a new set of rankings." To demonstrate that point, she compared the top 25 North American universities on the THES list with the top 25 U.S. universities in the most recent U.S. News survey and found, "not surprisingly … many of the same universities on both lists, but with their rankings quite jumbled." Berkeley, No. 2 on the THES list, is No. 21 in the U.S. News ranking; even in less-rarefied air, the University of Massachusetts ranks as high as No. 22 and as low as No. 98, respectively.
Does that mean one set of rankings is better than the other? "Patently they are not," says Machung. "They are simply different, using somewhat different variables and assigning their variables different weights. Both lists are arbitrary, and probably neither really speaks to university quality except in a very general sense." Or, as Marc Meredith of Stanford's business school wrote earlier this year: "Academic quality is a difficult concept to quantify."
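Machung's point — that reweighting the same inputs reshuffles the list — can be demonstrated with a toy example. The two institutions, the indicator names, and every score below are fabricated; only the general idea (a reputation-heavy scheme versus a resource-heavy one) is drawn from the article.

```python
# Hypothetical demonstration that changing the weights alone changes the
# ranking, with no change in the underlying institutions. All figures invented.

scores = {
    "Public U":  {"reputation": 95, "citations": 70, "ratio": 40},
    "Private U": {"reputation": 75, "citations": 70, "ratio": 90},
}

def rank(weights):
    """Return institution names ordered by weighted total, best first."""
    totals = {u: sum(weights[k] * v for k, v in s.items())
              for u, s in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

# A reputation-heavy scheme (THES-like) vs. a resource-heavy one.
reputation_heavy = {"reputation": 0.6, "citations": 0.2, "ratio": 0.2}
resource_heavy   = {"reputation": 0.2, "citations": 0.3, "ratio": 0.5}
```

With the first set of weights the reputation-rich public school comes out on top; with the second, the well-resourced private school does. Nothing about either institution changed between the two lists.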
It's pleasant to think that some organization — if not the THES, then perhaps the NRC — would, over time, so improve the relevant inputs and outputs that its findings would become as close to unimpeachable as can be achieved in this contentious universe. And surely that would continue to place Berkeley in the Olympian heights it has come to regard as its natural neighborhood … wouldn't it?
Don't hold your breath: Experience, says Machung, shows that when a news organization or survey group changes its ranking model, the rankings themselves change "irrespective of any actual change in the quality of the universities themselves." Because Berkeley is currently rated No. 2 in the THES survey, with Harvard far out in front of it, such volatility suggests that the campus is more likely to fall in future rankings than to rise.
Reputation and reality
Concern over the future of Berkeley's ranked reputation extends beyond the new-kid-on-the-block THES. The campus's high placement in the most recent (1995) NRC rating of Ph.D. programs has been widely and frequently heralded — understandably, since 35 of 36 graduate programs here ranked in the top 10 in their fields in that study, with six of them ranked No. 1 nationwide in subject areas as disparate as German and chemistry. To what extent the near-universal praise at Berkeley for the NRC study as the Rolls-Royce of such efforts is based on those flattering findings — as opposed to its rigorous methodology (for example, it surveyed more than 8,000 faculty nationwide as opposed to the THES's sample of 1,300 worldwide) — is difficult to pin down. What isn't hard to understand is the danger to Berkeley of any future rankings that take us down a reputational peg … or more.
"That's the underlying worry we all have," says Chancellor Robert Birgeneau. "Can we sustain this long tradition of excellence?" Going forward, he says, "It's going to be a challenge for us financially to maintain our excellence in the face of ever-increasing competition from elite private universities. But we have to do this, and in such a way that we don't compromise our commitment to public service and fulfilling our commitment to the people of California."
|'Every admissions cycle is another test of our reputation — one that we are in danger of failing, unless we develop a long-term strategy … for recruiting and retaining the best grad students in the country.'|
Associate Dean of the Graduate Division
The problem, says Jeff Reimer, associate dean of the Graduate Division, is that there's almost no limit to the impact of circumstances beyond anyone's control on a university's rankings. In this regard, he shares with the chancellor a concern for the long-range effects of the ongoing state budget crisis on Berkeley's reputation.
However, while many in the UC system have expressed repeated concern about the legislature's failure to keep faculty salaries in line with those paid by competitive institutions — viewing that ever-receding parity as a serious threat to faculty recruitment and retention alike — Reimer doesn't see that danger as primary. He's more worried about reduced financial support for the graduate students upon whose work so much of the campus's reputation, as he sees it, depends.
"Our graduate students are the ones doing the cutting-edge research; they set the intellectual climate of the campus, not the faculty. The real question is not faculty salaries. Yes, if the salary system breaks, it's true we'll miss a few attractive candidates, and some faculty will leave. But grad students turn over much more rapidly — which means the real question is how do we support our grad students at a time when the fees they pay are escalating out of control? If we fail to support them, the effect will be seen almost immediately."
Grad-student applicants are "extremely savvy, very attuned to what's in their best interest," Reimer continues. "They'll perceive any change in the academic underpinnings of a university's reputation — and you can bet that if our academic reputation falls in some substantive way, it will lead to an almost immediate domino effect that would ultimately impact the quality of grad students seeking admission." Today's grad students are tomorrow's faculty, here and elsewhere — and the eventual informants for most surveys that aim to measure a university's reputation, whether or not they style themselves as "peer reviewed."
In that very real sense, then, Reimer says, "Every admissions cycle is another test of our reputation — one that we are in danger of failing, unless we develop a long-term strategy … for recruiting and retaining the best grad students in the country."
|'We want each student to collect enough information that the decision they make includes the rankings as one of their criteria, but not the sole one. Rankings have their uses … but they shouldn't be the reason you make your ultimate choice.'|
Asst. Vice Chancellor for Admissions and Enrollment
John Douglass of CSHE believes similarly, though he includes faculty in the scope of his concern. "The elite privates in the U.S. have vast and growing resources," he says, "while major public universities like Berkeley face the prospect of declining funding and a declining ability to compete for top faculty and graduate students unless a strategic approach is found soon."
Berkeley's reputation, as codified by various rankings and studies, is alluded to by those charged with recruiting both graduate and undergraduate applicants. Acting Dean of the Graduate Division Joseph Duggan is a critic of the new THES study, which he calls "not very carefully done," in part because it assesses complex universities as a whole rather than distinguishing among the various disciplines. The NRC studies, he says, are "much more exact and make many more distinctions" (though he has critical comments about their methodology as well). Yet he acknowledges that when called upon to talk about Berkeley's reputation, one is well advised "to take what you get: We can say that the Times Higher Education Supplement calls us No. 2 in the world, but we'll add that in the opinions of our peers we're No. 1. Image is a very important thing."
That is less of a double standard, arguably, than a demonstration of the ability to hold two thoughts simultaneously. Richard Black, assistant vice chancellor for admissions and enrollment, says that even though the THES ranking is "obviously very gratifying," he's resisting the temptation to "do a Sufi dance on Sproul Plaza" in response to it — nor will the Undergraduate Admissions Office use it as the basis of an extensive recruiting campaign, though it will probably update its widely distributed "viewbook" handout to include a low-key mention.
So in its pursuit of top-flight candidates for admission, Berkeley doesn't rely on these rankings to help close the deal? "We look at them," Black says, "but we don't rely on them. We want each student, and his or her parents, to collect enough information that the decision they make includes the rankings as one of their criteria, but not the sole one. Rankings have their uses — they may pique your curiosity about a school you hadn't considered previously — but they shouldn't be the reason you make your ultimate choice."
That said, Black acknowledges, a university's ranking is probably on an applicant's mind early in the process, and is still there when the final choice is made — a suspicion that accords with research conducted at UCLA showing that nearly 80 percent of students at highly selective colleges considered rankings an important element of their decision-making.
In the end, then, how should one regard the THES study? It's hard not to view the results with pride, regardless of quibbles over methodology or hardnosed concerns about the ever-widening GSI budget gap. (As Chancellor Birgeneau told the Berkeleyan wryly, "You have to take these rankings, even the ones whose methodology appears sound, with a grain of salt — unless, of course, you do very well.") Yet, particularly in view of Harvard's apparently unbreakable hammerlock on the top spot, it would seem prudent to take the "nowhere to go but down" threat to Berkeley seriously — if not for the "truth" that any such diminished ranking would embody, then on account of the inescapable human temptation to take any such rankings — be they the Nielsen ratings, America's Top 40, or the 50 Best Mutual Funds — at face value.
The Times Higher Education Supplement's "World University Rankings 2004" is available to registered subscribers to the publication's print edition and website. A 14-day free trial subscription permits access to the report and associated editorial; visit www.THES.co.uk/main.aspx to begin the process.