We ALL WISH UOFT WOULD JUST SHUT UP

Canadian Universities Forum (discussion group)

Subject: We ALL WISH UOFT WOULD JUST SHUT UP
UofT sucks ...
[20-07-2006,23:10]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
^says a uoft reject, what do you ppl have against the greatest university in canada?
[20-07-2006,23:11]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
According to whom, Shanghai Jiao Tong University?

McGill is ranked number one in the medical doctoral category of Maclean's 2005 Canadian university rankings, sharing the top spot with the University of Toronto. Outstanding students and a reputation for the highest quality are among its primary strengths.

McGill is ranked the top Canadian school in the Times Higher Education Supplement ranking of the world's best universities in 2005 and the only Canadian university in the top 25 in the world.

McGill is Research University of the Year 2005 in the medical doctoral category, named by Research InfoSource, Canada's leading research and development consulting firm. McGill earned 97.3 points out of a possible 100 in its ranking system, the highest score in the country.

[20-07-2006,23:13]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
^

You keep talking as if SJTU is not respected and THES is. Do some research; both have been criticised equally. And if SJTU is such BS, then why do all the top institutions use it? Don't be such a crybaby just because McGill placed 67th in the world.

One minute you hate Maclean's and call it biased because it's out of Toronto, and the next you're using it to talk about how great McGill is! Make up your mind, man. If you want to talk Maclean's, then mention how UofT has been number one for the past 12 years.

Buddy, Research InfoSource is the leading R&D consulting firm in the country? Stop making up shit. Research InfoSource is Mickey Mouse compared to the Thomson Corporation.

http://en.wikipedia.org/wiki/Thomson_Corporation

I'm not saying Research Infosource is BS, because I have no reason to think it is, and I have no problem with UofT being ranked 2nd. But don't forget, McGill doesn't even get mentioned in the report by the Thomson Corporation. What kind of "top" university in Canada doesn't even get a mention? UBC, Waterloo, everyone else is there. And it's not just science; it's biz, engineering, education.

[21-07-2006,00:15]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
1. SJTU: not reproducible, see http://www.ad-astra.ro/journal/8/florian_shanghai_irreproducibility.pdf

2. What matters today is that UofT is not currently ranked higher than McGill in Maclean's. I never said I did not like Maclean's; I just wanted to raise the possibility of a conflict of interest.

3. Sorry about my choice of words; I retract my statement that Research Infosource is the leading R&D consulting firm in the country.

4. The Thomson Corporation does not rank universities. The report that you refer to is not a comprehensive study but rather a press release on impact factors in certain areas of science (http://scientific.thomson.com/press/2005/8290754/). The press release is less than 500 words. Come on, you cannot infer much from this! The methodology is not mentioned. No information is provided about what categories were looked at, what time span of publications was included, or, more importantly, the overall impact factor across all disciplines. BTW, Waterloo was not included, only UofT, UWO, and UBC.

[21-07-2006,00:54]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
^ Sorry, it was only mentioned in the body of the text.
[21-07-2006,01:00]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
^

LOL! Are you sure you are even a university student, let alone a McGill student?

1. Just because it is published doesn't mean that it is right. A lot of bad science gets published all the time. SJTU writes a nice reply at the end. To be honest, I haven't read and studied this paper, so I'm not sure if I agree with the guy, and I guarantee you haven't either. So, as it stands, we know that there is criticism, in the same way that there has been criticism of THES for manipulating its methodology to allow more European universities into the top spots.

2. Ok, that's fine. But I don't understand how you KNOW that there is a conflict of interest. 12+ years ago, McGill was ranked number 1 quite a lot. Plus, York and Ryerson tend to do poorly in the rankings, yet they're from Toronto.

3. Cool.

4. It IS a ranking of output and IMPACT. It is a study, and the results are given out to the media and public in the form of a press release. I don't see how the length of the press release affects its usefulness.

lol. The methodology IS mentioned.

Between the years 2000-2004,

- number of papers (easy, eh?)
- IMPACT = number of citations per paper.
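The arithmetic behind those two measures is simple enough to sketch. Here is a minimal illustration with invented paper and citation counts (not Thomson's actual data):

```python
# Hypothetical paper and citation counts, for illustration only.
pubs = {
    "UofT":   {"papers": 23000, "citations": 210000},
    "McGill": {"papers": 15000, "citations": 140000},
}

def impact(school):
    """Impact as defined above: total citations divided by total papers."""
    d = pubs[school]
    return d["citations"] / d["papers"]
```

With these invented counts, UofT's impact would be 210000 / 23000, roughly 9.13 citations per paper.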

Are you brain dead? You can't figure out the categories looked at? They are all listed!

I don't know what you're talking about with overall impact factor, but then again, you've been saying some pretty whacked-out shit, so I'll just ignore that.

And YES, Waterloo is mentioned. Best in Computer Science and Plant/Animal Science for IMPACT.

[21-07-2006,01:07]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
1. Come on, for a study to be valid, one must be able to replicate it. If you read the study, you would know that a different group of researchers used the exact same methodology and got completely different results. It even goes as far as to accuse SJTU of data manipulation.

2. Well, it is a suspicion of mine. When I read Maclean's, it is like reading the National Post and the Globe and Mail, which are supposedly Canada's national newspapers. Most of the articles are about Toronto.

3. I am glad you're happy.

4. Yes, I know what impact factor is. Did they look at last year's publications? This would be more representative of the university's current performance than, say, looking over a span of 10 years. What journals did they include in each category? And more importantly, what university had the highest impact factor across all categories? Why did they not include this? This would be the most interesting finding! McGill could very well have held this if it was consistently high, although never first, across all categories.

[21-07-2006,01:25]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
Okay, fine, they examined between 2000 and 2004. Sorry, I do not have this press release memorized like you do. I would like to see which university had the highest impact factor across all disciplines. You do understand how to calculate the average of a set of numbers, and that if a university scored 1st in some disciplines, it may not score 1st in all disciplines.
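That averaging point can be shown with a few invented numbers: a school that is 1st in one discipline need not be 1st on the cross-discipline average.

```python
# Hypothetical per-discipline impact (citations per paper); invented numbers.
impact_by_field = {
    "UofT":   {"medicine": 12.0, "physics": 8.0, "chemistry": 6.0},
    "McGill": {"medicine": 10.0, "physics": 9.0, "chemistry": 9.5},
}

def overall_impact(school):
    """Plain average of a school's impact across all disciplines."""
    scores = impact_by_field[school].values()
    return sum(scores) / len(scores)
```

Here UofT leads in medicine (12.0 vs 10.0), yet McGill's overall average (9.5) beats UofT's (about 8.67).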
[21-07-2006,01:56]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
Arbitrary and volatile?

Critics of the THES methodology (and that of other similar surveys, such as U.S. News & World Report's) have pointed to what one of their number, Anne Machung of the Planning and Analysis Office at UC's Office of the President, calls "the arbitrary nature of the variables used to construct the rankings, and their volatility."

The 50 percent of each ranking attributable to a university's "peer reviewed" reputation is questionable, they say, in the absence of any insight into how the 1,300 respondents were selected, how many responded, and how valid their self-identified expertise might be considered. (The THES has so far published only a popular summary of its findings, without the raw data that would shed greater light on the underlying methodology.) The 20-percent weight accorded citations favors universities in English-speaking countries and those with strong capabilities in the natural sciences (as the THES editors acknowledge). Calculations of student/faculty ratios may or may not include the substantial proportion of instruction carried out by GSIs, adjunct faculty, and others; each university provided its own data in this area, with no attempt made by the THES and its consultant, QS Research, to take different measures into account. And the two international factors in the mix each have their critics, particularly among those familiar with the drastic impact on international-student recruitment in the U.S. stemming from various post-9/11 policy changes.

"Change either the variables on the list or the weights assigned them," says UCOP's Machung, "and you will get a new set of rankings." To demonstrate that point, she compared the top 25 North American universities on the THES list with the top 25 U.S. universities in the most recent U.S. News survey and found "not surprisingly ... many of the same universities on both lists, but with their rankings quite jumbled." Berkeley, No. 2 on the THES list, is No. 21 in the U.S. News ranking; even in less-rarefied air, the University of Massachusetts ranks as high as No. 22 and as low as No. 98, respectively.
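Machung's "change the weights, change the rankings" point is easy to reproduce with a toy weighted-sum model. All scores and weights below are invented, not the actual THES or U.S. News inputs:

```python
# Hypothetical normalized scores (0-100) on two ranking criteria.
scores = {
    "Univ A": {"peer_review": 95, "citations": 60},
    "Univ B": {"peer_review": 80, "citations": 90},
}

def rank(weights):
    """Order universities by their weighted-sum total, best first."""
    totals = {u: sum(weights[c] * s[c] for c in weights) for u, s in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)
```

With weights of 0.7/0.3 favouring peer review, Univ A comes out on top (84.5 vs 83.0); flip the weights to 0.3/0.7 and Univ B wins (87.0 vs 70.5), with no change at all in the underlying data.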

Does that mean one set of rankings is better than the other? "Patently they are not," says Machung. "They are simply different, using somewhat different variables and assigning their variables different weights. Both lists are arbitrary, and probably neither really speaks to university quality except in a very general sense." Or, as Marc Meredith of Stanford´s business school wrote earlier this year: "Academic quality is a difficult concept to quantify."

It's pleasant to think that some organization (if not the THES, then perhaps the NRC) would, over time, so improve the relevant inputs and outputs that its findings would become as close to unimpeachable as can be achieved in this contentious universe. And surely that would continue to place Berkeley in the Olympian heights it has come to regard as its natural neighborhood, wouldn't it?

Don't hold your breath: experience, says Machung, shows that when a news organization or survey group changes its ranking model, the rankings themselves change "irrespective of any actual change in the quality of the universities themselves." Because Berkeley is currently rated No. 2 in the THES survey, with Harvard far out in front of it, such volatility suggests that the campus is more likely to fall in future rankings than to rise.

Reputation and reality

Concern over the future of Berkeley's ranked reputation extends beyond the new-kid-on-the-block THES. The campus's high placement in the most recent (1995) NRC rating of Ph.D. programs has been widely and frequently heralded, understandably, since 35 of 36 graduate programs here ranked in the top 10 in their fields in that study, with six of them ranked No. 1 nationwide in subject areas as disparate as German and chemistry. To what extent the near-universal praise at Berkeley for the NRC study as the Rolls-Royce of such efforts is based on those salubrious findings, as opposed to its rigorous methodology (for example, it surveyed more than 8,000 faculty nationwide as opposed to the THES's sample of 1,300 worldwide), is difficult to pin down. What isn't hard to understand is the danger to Berkeley of any future rankings that take us down a reputational peg, or more.

"That's the underlying worry we all have," says Chancellor Robert Birgeneau. "Can we sustain this long tradition of excellence?" Going forward, he says, "It's going to be a challenge for us financially to maintain our excellence in the face of ever-increasing competition from elite private universities. But we have to do this, and in such a way that we don't compromise our commitment to public service and fulfilling our commitment to the people of California."

The problem, says Jeff Reimer, associate dean of the Graduate Division, is that there's almost no limit to the impact of circumstances beyond anyone's control on a university's rankings. In this regard, he shares with the chancellor a concern for the long-range effects of the ongoing state budget crisis on Berkeley's reputation.

However, while many in the UC system have expressed repeated concern about the legislature's failure to keep faculty salaries in line with those paid by competitive institutions (viewing that ever-receding parity as a serious threat to faculty recruitment and retention alike), Reimer doesn't see that danger as primary. He's more worried about reduced financial support for the graduate students upon whose work so much of the campus's reputation, as he sees it, depends.

"Our graduate students are the ones doing the cutting-edge research; they set the intellectual climate of the campus, not the faculty. The real question is not faculty salaries. Yes, if the salary system breaks, it's true we'll miss a few attractive candidates, and some faculty will leave. But grad students turn over much more rapidly, which means the real question is how do we support our grad students at a time when the fees they pay are escalating out of control? If we fail to support them, the effect will be seen almost immediately."

Grad-student applicants are "extremely savvy, very attuned to what's in their best interest," Reimer continues. "They'll perceive any change in the academic underpinnings of a university's reputation, and you can bet that if our academic reputation falls in some substantive way, it will lead to an almost immediate domino effect that would ultimately impact the quality of grad students seeking admission." Today's grad students are tomorrow's faculty, here and elsewhere, and the eventual informants for most surveys that aim to measure a university's reputation, whether or not they style themselves as "peer reviewed."

In that very real sense, then, Reimer says, "Every admissions cycle is another test of our reputation, one that we are in danger of failing unless we develop a long-term strategy for recruiting and retaining the best grad students in the country."

John Douglass of CSHE believes similarly, though he includes faculty in the scope of his concern. "The elite privates in the U.S. have vast and growing resources," he says, "while major public universities like Berkeley face the prospect of declining funding and a declining ability to compete for top faculty and graduate students unless a strategic approach is found soon."

Berkeley's reputation, as codified by various rankings and studies, is alluded to by those charged with recruiting both graduate and undergraduate applicants. Acting Dean of the Graduate Division Joseph Duggan is a critic of the new THES study, which he calls "not very carefully done," in part because it assesses complex universities as a whole rather than distinguishing among the various disciplines. The NRC studies, he says, are "much more exact and make many more distinctions" (though he has critical comments about their methodology as well). Yet he acknowledges that when called upon to talk about Berkeley's reputation, one is well advised "to take what you get: We can say that the Times Higher Education Supplement calls us No. 2 in the world, but we'll add that in the opinions of our peers we're No. 1. Image is a very important thing."

That is less of a double standard, arguably, than a demonstration of the ability to hold two thoughts simultaneously. Richard Black, assistant vice chancellor for admissions and enrollment, says that even though the THES ranking is "obviously very gratifying," he's resisting the temptation to "do a Sufi dance on Sproul Plaza" in response to it; nor will the Undergraduate Admissions Office use it as the basis of an extensive recruiting campaign, though it will probably update its widely distributed "viewbook" handout to include a low-key mention.

So in its pursuit of top-flight candidates for admission, Berkeley doesn't rely on these rankings to help close the deal? "We look at them," Black says, "but we don't rely on them. We want each student, and his or her parents, to collect enough information that the decision they make includes the rankings as one of their criteria, but not the sole one. Rankings have their uses (they may pique your curiosity about a school you hadn't considered previously), but they shouldn't be the reason you make your ultimate choice."

That said, Black acknowledges, a university's ranking is probably on an applicant's mind early in the process, and is still there when the final choice is made, a suspicion that corresponds with research conducted at UCLA showing that nearly 80 percent of the students at highly selective colleges considered rankings an important element of their decision-making process.

In the end, then, how should one regard the THES study? It's hard not to view the results with pride, regardless of quibbles over methodology or hardnosed concerns about the ever-widening GSI budget gap. (As Chancellor Birgeneau told the Berkeleyan wryly, "You have to take these rankings, even the ones whose methodology appears sound, with a grain of salt, unless, of course, you do very well.") Yet, particularly in view of Harvard's apparently unbreakable hammerlock on the top spot, it would seem prudent to take the "nowhere to go but down" threat to Berkeley seriously, if not for the "truth" that any such diminished ranking would embody, then on account of the inescapable human temptation to take any such rankings (be they the Nielsen ratings, America's Top 40, or the 50 Best Mutual Funds) at face value.

http://www.berkeley.edu/news/berkeleyan/2004/12/01_rankings.shtml

[21-07-2006,01:57]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
Holy cow, you want me to read this! Forget it. I am done debating with you.
[21-07-2006,01:59]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
^ dude, read it. it might change your opinion of THES. THES is a fucked up ranking, man. I don't care if UofT beats McGill in the next THES ranking, it's still fucked.

Read it.

BTW, the Chancellor of Berkeley is UofT's former President :)

[21-07-2006,02:07]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
"Buddy, Research InfoSource is the leading R&D consulting firm in the country? Stop making up shit. Research InfoSource is Mickey Mouse compared to the Thomson Corporation."

LMAO, this is another case of UofT fanboys trying to say that THEIR source is the gold standard. Such bias! Just because Thomson favours your university doesn't mean it's better. Research Infosource IS a leading R&D consulting firm, and McGill > UofT. Give it up.

[21-07-2006,04:03]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
"If you want to talk Maclean's, then mention how UofT has been number one for the past 12 years."

??? You want to talk about the past?? Alright, next time you mention UofT vs McGill, mention that UofT has been (and still is!) in McGill's shadow for all of its history. Mention that McGill was ranked the number 1 Canadian university in the Gourman Report seven years ago (forgot about that one).

McGill > UofT

[21-07-2006,04:07]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
http://studywonder.com/canada_uni.htm

^ THIS RANKING!?

Dude, everyone thinks this ranking is completely wrong. York is ranked 6th, while Queen's is ranked 21st?? Waterloo 22nd!? lol. ya. ok!

[21-07-2006,21:04]
Anonymous
(in reply to: We ALL WISH UOFT WOULD JUST SHUT UP)
Comparing Research InfoSource to Thomson Scientific is like comparing Pizza Hut to Nobu. One is head and shoulders above the other. Thomson is one of the leaders for R&D in the world, forget Canada.
[21-07-2006,21:07]
Anonymous


