Canadian Universities Forum (discussion group)

Subject: U of T = No guts, Coward !
MEASURING UP: What university rankings do and don't tell us

Ottawa Citizen Op/Ed
By David Naylor
April 23/06

On Saturday, April 23, the Ottawa Citizen published the following article by President David Naylor, describing U of T's concerns about Maclean's magazine's annual ranking of Canadian universities.

Last month, the heads of four major Canadian universities decided not to participate in Maclean's survey of our recently graduated students. In a letter to the magazine, we raised long-standing concerns about Maclean's practice of ranking universities in a special edition every fall. But we also signaled our willingness to discuss other ways of helping Canadians evaluate their universities in a meaningful manner.
The response from Maclean's thus far has been frustrating. There has been some bluster: The magazine knows more about statistics and survey design than a few old professors, and Maclean's is going ahead regardless. There have been implicit threats: If we won't help Maclean's gather data from our graduates, then Maclean's will find another way to get similar information, even if it drives down our institutional rankings in the process. And there has been some righteous rhetoric: We are all hiding something, and shirking our duty of disclosure to prospective students. And so on.

The reality is a bit different. In fact, more than a dozen institutions coast to coast have quietly declined to participate in the Maclean's graduate survey. And other institutions are now re-thinking the ranking exercise itself. These academic leaders respect Maclean's spring review of campuses and the readable compilation of a wide variety of performance indicators by the magazine. Many of us would happily collaborate with Maclean's if the spring format could be strengthened in some way, perhaps by grading different dimensions of a university's performance. But what is rapidly losing credibility is Maclean's fall ritual of lumping a wide range of very different measures into a single set of rankings and proclaiming each year's "winners" and "losers".

Here's the problem: Rankings and "league tables" are a good measure of success in things like sports and sales, where winning generally comes down to a single number. But no single measure can accurately reflect even a mid-sized university, where hundreds of professors and lecturers teach hundreds of courses across disciplines as varied as engineering and religion.

Such concerns go well beyond Maclean's. They raise an important question for an era that is, rightly, concerned with measurement, accountability and transparency. When does a metric become so oversimplified for the sake of newsworthiness that it is no longer worth using? My institution has found Maclean's useful for one thing only: Marketing. None of us really believes that the ranking has much intellectual rigour. As academics we devote our careers to ensuring that people make important decisions on the basis of good data, analyzed with discipline. But Canadian universities have been complicit, en masse, in supporting a ranking system that has little scientific merit because it reduces everything to a meaningless, average score.

Think of it this way. If one of your hands is plunged in boiling water, while the other is frozen in a block of ice, then the average temperature of your two hands is just fine. That's exactly what happens when a range of data about a university are averaged into a single ranking.

I encountered the danger of such over-simplified data most dramatically well before becoming a university president. As a healthcare researcher, I helped develop systems for measuring performance in hospitals. Imagine a hospital that, on a scale of one to 10, scores "one" for heart surgery but "10" for delivering babies. The combined rating of "5.5" is misleading for heart patients and expectant women alike! Now ask what a single average number means for a university with 17 distinct schools and faculties, three separate campuses, scores upon scores of academic departments, centres and colleges, and hundreds of academic programs.
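The hospital example is simple arithmetic, and it can be sketched in a few lines. The department names and scores below are hypothetical, chosen only to mirror the example; the point is that a mean of wildly different scores lands in a "fine-looking" middle while the spread tells the real story:

```python
# Hypothetical department scores on a 1-10 scale (illustrative only).
scores = {"heart surgery": 1, "obstetrics": 10}

# A single averaged rating hides the variation entirely.
average = sum(scores.values()) / len(scores)          # (1 + 10) / 2 = 5.5
spread = max(scores.values()) - min(scores.values())  # 10 - 1 = 9

print(f"average rating: {average}")  # 5.5 looks "middling"
print(f"score spread:   {spread}")   # 9 reveals what the average hides
```

Reporting each dimension separately, or at least alongside a measure of spread, is the statistical version of the argument the article is making: no heart patient or expectant mother is served by the 5.5.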

Yes, a single number is seductive. It is easy to put out a press release based on a single number, or to splash a few numbers on the cover of a magazine and to market the results as a key decision-making tool. But these league tables are hardly a transparent source of information for young people and their families making important decisions about post-secondary education. The tables mask problems in their top ranks and hide real strengths in their bottom ranks.

In 2005, for example, Maclean's ranked the University of Toronto tied for number one among Canadian universities with medical schools and PhD programs. We've owned or shared that position for 12 years in a row. On the surface, it's a fabulous record, supported by the fact that we are, by a fair margin, the biggest and arguably best research enterprise in Canada. But simply saying that we're "Number One" masks the fact that we urgently need to enhance the undergraduate student experience in some of our largest programs. We identified that problem some years ago, have recently measured it precisely with rigorous new survey measures, publicized the results, pinpointed best practices to improve the undergraduate experience, and are taking serious steps to address those issues.

Dalhousie, on the other hand, has been 12th in the same league table for the last two years. That relatively low ranking is inconsistent with Dal's strong reputation among North American scientists as a site for post-doctoral study. And a Grade 12 student interested in environmental sciences would have no idea, from that ranking, that a Dalhousie biologist is one of Fortune Magazine's "top ten people in the world to watch" because of his leadership in Ocean Studies.

Rankings can't capture those nuances. They make matters murkier, not clearer, because they vary significantly depending on what numbers one adds to the tossed salad of indicators and how one combines them to create a single result. The truth is that there's no truth in a single number.

That's why U of T has worked for years to refine its own performance indicators, which we publish on our website in a report entitled Measuring Up. It compares us to our peers and to our past record, but does not resort to a simple, aggregated "#1" or "#2" or "#3". Instead, it offers students 23 pages of data and analysis for consideration.

In short, measuring the performance of universities doesn't fit neatly into a magazine league table. Just as the different and diverse dimensions of Canadian healthcare institutions are separately and rigorously evaluated by experts, so also do we need public report cards on universities that are truly informative. We would be happy to work with Maclean's in developing one.

I am encouraged by evidence that prospective students actually pay relatively little attention to magazine and newspaper rankings. They talk to friends and relatives, visit our campuses and talk to current students, take the pulse of specific programs that interest them, and weigh a whole series of data points using their own unique decision-making algorithm.

They know, intuitively, that picking a university is a chance to experiment with the very disciplines of complex judgment that a fine university strives to teach.

(in reply to: U of T = No guts, Coward !)
Well, I can certainly understand the perspective from U of T's point of view. They've been ranked in the top 3 almost every year since the ranking's inception and said very little about how useless and potentially misleading these types of rankings can be. Instead, they've been trumpeting their high-ranked position to support recruiting efforts. Naturally they want any graduate survey data released to Maclean's to show them in the best light. Why did it take them so long to speak out? I suspect they participated as long as the results made them look good. And why not?

(in reply to: U of T = No guts, Coward !)
That's why Maclean's sucks. I try to throw out everything I read from Maclean's now.
(in reply to: U of T = No guts, Coward !)
I like how they used the U.S. News method and found that the results stay pretty much the same. Toronto remains #1.

(in reply to: U of T = No guts, Coward !)
The rankings as they currently exist are quite stale and in need of a serious makeover. I used to read them just for fun and to see if my uni had changed. Of course it never does. Now I just flip through a few pages whenever I visit the magazine stand in November and put it back on the shelf. It's kind of boring to see the same old rankings year after year. Maclean's could probably publish it once every three years and nobody would notice any difference. Of course they won't do that, because it's a big revenue generator for them. The annual directory, however, is full of useful information, but the ranking does need an overhaul.
(in reply to: U of T = No guts, Coward !)
Why would you expect a change in the rankings every year?
(in reply to: U of T = No guts, Coward !)
I don't expect the ranking to change unless there is a change in the methodology.
