
Posted

Methodology: for psychology, 246 schools were surveyed and only 16% (39 schools) responded; the top 2 and bottom 2 responses were then discarded to get the trimmed mean score (a minimal sketch of this calculation follows below). In other words, the Psychology rankings are based on 35 individuals in the field grading programs on a scale of 1 to 5, with 4 = Strong and 5 = Outstanding. Scores: Rank 1 is 4.8, Rank 29 is 4.0. I guess what I am trying to say is that there is not much to go by.

 

(For the sub-fields it gets even better: only programs nominated by at least 7 of the 39 respondents appear on the lists, in order of the number of nominations.

Consider I/O Psychology: only 4 names made it onto the list.)
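For reference, here is a minimal sketch of the trimmed-mean scoring described above: sort the 39 responses, drop the two highest and two lowest, and average the remaining 35. The scores used here are made up for illustration; the actual survey responses are not public.

```python
# Minimal sketch of the trimmed-mean scoring described above.
# The 39 scores below are invented for illustration; they are NOT the survey data.
scores = [4.8, 4.7, 4.6, 4.5, 4.4] + [4.2] * 30 + [3.0, 2.5, 2.0, 1.2]

def trimmed_mean(values, trim=2):
    """Drop the `trim` highest and `trim` lowest values, then average the rest."""
    kept = sorted(values)[trim:-trim]
    return sum(kept) / len(kept)

plain = sum(scores) / len(scores)
trimmed = trimmed_mean(scores)           # based on 35 of the 39 responses
print(f"plain mean:   {plain:.2f}")
print(f"trimmed mean: {trimmed:.2f}")    # a couple of extreme raters matter less
```

The upshot (and a partial answer to the later question about discarding extremes) is that the trimmed mean is less sensitive to a few unusually generous or harsh raters, at the cost of throwing away part of an already small sample.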

Posted

The social psychology rankings fit with my impressions, but one flaw is that they're U.S.-centric. Canadian programs like the University of Waterloo, the University of Toronto, and the University of British Columbia are great schools but are completely omitted. One or more could arguably be in the top 10... the recent research impact paper by Nosek in PSPB puts all three of those in the top 25 (Toronto: 8, Waterloo: 16, UBC: 22).

Posted

Does anyone know how to interpret the schools at the end without any ranking displayed?

 

My former BA/MS program was on there. It used to be a really great school for clinical psych but it really went downhill badly a few years ago.

Posted

I think the ones without a ranking displayed placed below 200. They explain it somewhere in one of the links.

That being said, not a fan of the methodology at all... Would not read much into these rankings.

Excuse my ignorance, but can someone tell me the advantages/limits of discarding the highest and lowest responses?

Posted

This really misses the specialization in the field of Psychology. A lot of times, who you work for is more important than what program you're in. I'd rather go to #200 and work with an influential person in my field than #5 with someone I'm not as excited about. But oh well, I know some people follow these rankings diligently. :-)

Posted

This really misses the specialization in the field of Psychology. A lot of times, who you work for is more important than what program you're in. I'd rather go to #200 and work with an influential person in my field than #5 with someone I'm not as excited about. But oh well, I know some people follow these rankings diligently. :-)

 

There's research suggesting that departmental prestige is the best predictor of post-PhD employment. So you might have a more enjoyable PhD but, statistically speaking, worse career prospects ;)

Posted

blah blah blah... the world is not about rankings... at least the real world is not about rankings... the real world is about individuals, right?

Posted

There's research suggesting that departmental prestige is the best predictor of post-PhD employment. So you might have a more enjoyable PhD but, statistically speaking, worse career prospects ;)

This statement should be taken with caution. The article suggests that, all else being equal, it is the best predictor; taking it easy in a well-ranked department still will not get you a job.

Posted

This statement should be taken with caution. The article suggests that, all else being equal, it is the best predictor; taking it easy in a well-ranked department still will not get you a job.

 

Not quite

 

"The strongest predictor of employment was department-level rankings even while controlling for individual accomplishments, such as publications, posters, and teaching experience."

Posted

I wish I had a chance to read the full article; I can't help but wonder whether they say anything about the "who you know" factor.

Posted

Does anyone know how to interpret the schools at the end without any ranking displayed?

 

My former BA/MS program was on there. It used to be a really great school for clinical psych but it really went downhill badly a few years ago.

 

"Rank Not Published means that U.S. News did calculate a numerical ranking for that school/program, but decided for editorial reasons that since the school/program ranked below the U.S. News cutoff that U.S. News would not publish the ranking for that school/program."

 

http://www.usnews.com/education/best-graduate-schools/articles/2013/03/11/methodology-best-social-sciences-and-humanities-schools-rankings

Posted

Not quite

 

"The strongest predictor of employment was department-level rankings even while controlling for individual accomplishments, such as publications, posters, and teaching experience."

 

However, the thing to note is that the department-level rankings used here were not the US News rankings but the NRC rankings, which do not publish an absolute ranking but a range of scores for each department depending on a basket of criteria. The resulting output was multiple sets of rankings, and outside of the absolute top tier (say the top 3 schools) the ranges were very wide, e.g. the ranking for the same school varies from #5 to #29, or #9 to #46, depending on the criteria used. Even looking at the most important criteria, the scores awarded for 185 schools fell in the range of 24 (low) to 72 (high). The score range for the top 40 schools was 60 to 72.

 

My point, as in my original post, is that the best way to look at rankings (even when considering that they shape future opportunities, etc.) is in large blocks, i.e. possibly Top 5, 6-30, and so on. Yes, most people would prefer a #5 to a #200 any day, but I would think there is no big difference between #10 and #25, especially if #25 has POIs you would rather work with.

Posted

Too much focus on US News and World Report, too little focus on making your personal contribution to the field. Stop worrying about stats, start thinking about your personal trajectory.

Posted

It would be interesting to correlate students' and professors' attitudes about rankings with the rankings of their particular program. Perhaps we give rankings more weight if ours is high. Or maybe people in middling-to-moderately-high ranked programs value these rankings more, while the top programs don't need rankings to get the word out that they are prestigious. Or there could be no correlation at all! Does anyone know if this has been done? I feel like Lewin might know the answer. :)

Posted

hah hah! I appreciate your confidence but I don't know of any hard data on this question in particular. But there is more general evidence that people are motivated to be selectively skeptical about science or medical tests that present findings they dislike (Peter Ditto's work comes to mind... or Drew Westen if you like fMRI).

Posted

Thanks, Lewin! Yes, that's pretty much the type of work I was thinking of. Westen's quote that people "twirl the cognitive kaleidoscope until they get the conclusions they want" when they have a "vested interest" could apply to rankings: people in high OR low ranked programs might stress different aspects of the rankings process. 

 

I'm pondering this due to introspection, by the way. I noticed in myself a (dangerous) desire to overlook methodological weakness once I saw my future school near the top. 
