antigone56 Posted September 21, 2013 Look at this list, please: http://grad-schools.usnews.rankingsandreviews.com/best-graduate-schools/top-humanities-schools/english-rankings When people on this board refer to "Top 10" schools, do they mean the first ten schools listed, beginning with Berkeley, or any school ranked 1 through 10 (which, because of ties, may amount to 13 or so schools)? Same with references to "Top 20" schools: is that the first twenty schools listed, or any school ranked between 1 and 20, as indicated by the number next to its name? I'd like clarification on this so that I have a realistic sense of my choices of schools.
aGiRlCalLeDApPlE Posted September 21, 2013 I think it should apply to all schools ranked between 1 and 20, even if that comes to more than 20 schools. The same number next to two schools means they share the same rank, so both should be counted, right? However, I'd give you the advice many people here gave me: focus on fit, no matter what the ranking is. You could apply to a school ranked #100, for example, thinking it's a safe choice, and still not get in if you don't fit. My two cents...
antigone56 (Author) Posted September 21, 2013 I am focused on fit, but I'd like to "spread out" my choices somewhat, even with fit in mind. Fit is more important, though, I agree.
asleepawake Posted September 21, 2013 This is not a thing that matters.
antigone56 (Author) Posted September 21, 2013 It does matter! Maybe not to you, but it matters to some. That doesn't mean one makes it the sole criterion for choosing, just one consideration.
antigone56 (Author) Posted September 21, 2013 Also, even people who aren't obsessed with "rankings," as I wasn't when I handed a list to one of my advisors, find it hard not to consider them a little once that advisor starts calling the top schools a long shot no matter what. You do the best you can in deciding, taking the five or six different approaches of various people, advisors, and Grad Cafe'ers into consideration. One of my advisors has absolutely no notion of rankings; another is focused on them, telling me to be realistic. So that's why I was interested in what, technically, counts as top 10 or top 20.
ComeBackZinc Posted September 21, 2013 I think you're conflating a few separate ideas here. 1. The notion of a particular program's relative prestige, exclusivity, and quality. 2. The ranking of those qualities into an ordinal list. 3. The particular rankings of the US News and World Report list. The first matters a great deal on the job market. It isn't the only thing that matters, and people can and do overcome a lower ranking to get hired, but it matters a lot for getting hired. However, the fact that 1 matters does not mean that 2 is necessarily possible, and it certainly does not mean that 3 is accurate. Many people who believe that prestige is important don't think US News and World Report has anything more than a crude idea of the relative prestige of different programs; some of the rankings on that list seem so out of date as to be farcical. In any event, hiring committees are made up of tenured professors who have their own ideas about which programs are the most prestigious (or highest quality, or whatever).
asleepawake Posted September 21, 2013 All top 20 schools (of which there are 21, according to US News) are long shots. It isn't meaningful to draw any kind of line there when so many factors play into whether or not you are accepted. Even if you want to go with raw numbers, there is no magic cutoff between top 10 programs and 11-20 programs in terms of acceptance rate. The only place your question even means anything is at the schools explicitly ranked #10 and #20: tied schools have the exact same score as each other and are arranged alphabetically. Northwestern is 21st on the list only because New York University bests it alphabetically. You should take rankings into account only very generally. If they matter to you, find the lowest ranking you are willing to go to, and the schools you apply to should fall all over the place between there and the highest-ranked schools that are a good fit for you.
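(Aside: the tie mechanics asleepawake describes, where tied scores share a rank and the next distinct score skips ahead, can be sketched in a few lines. The scores below are invented purely for illustration; they are not the real USNWR data.)

```python
# Illustrative only: made-up scores, not actual USNWR figures.
# USNWR uses "standard competition" ranking: schools with equal scores
# share a rank, and the next distinct score skips ahead by the tie count,
# which is how a "top 20" list ends up holding 21 schools.
schools = [
    ("Berkeley", 4.8),
    ("Harvard", 4.8),
    ("NYU", 4.2),
    ("Northwestern", 4.2),
    ("Virginia", 4.1),
]

# Sort by score (descending), breaking ties alphabetically for display,
# as the published list does.
ordered = sorted(schools, key=lambda s: (-s[1], s[0]))

ranks = {}
for name, score in ordered:
    # A school's rank is 1 + the number of schools with a strictly higher score.
    ranks[name] = 1 + sum(1 for _, other in schools if other > score)

print(ranks)
# Berkeley and Harvard share #1; NYU and Northwestern share #3,
# so "schools ranked in the top 3" is actually four schools.
```

So "top N" read as "rank <= N" can cover more than N schools whenever the score at rank N is tied, which is exactly the #20 situation in the thread.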
champagne Posted September 24, 2013 I find it funny that the USNWR rankings are widely dismissed as useless throughout academia, yet every single year when they come out there is a huge scramble to find your program's name on the list, publish it as expediently as possible, and explain why your school sits where it does. I hate the rankings as much as anybody, but there will never be a revolution against them until the leaders and members of these institutions stop the ludicrous farce of virally perpetuating them across the internet the moment they are published, year after year.
Swagato Posted September 24, 2013 Who is making these efforts? University administration, or faculty and students? My impression is that such PR efforts are largely initiated by the former. The only times I hear the latter talk about rankings is when they mean the NRC or Leiter-style rankings.
Two Espressos Posted September 24, 2013 antigone56, I think most people refer to the numerical rank of a given program in the USNWR rankings, since two or more programs are ranked equally when they have the same numerical score and should thereby ostensibly be "equal" in quality (you probably already know this). At the risk of belaboring the issue, I'll say that I think ordinal rankings (even USNWR) are very useful for broad classification: they help capture the intuitive sense of academic prestige. It's when we start splitting hairs (e.g., "Is program x, ranked 30th, better than program y, ranked 35th?") that the usefulness of rankings completely breaks down. That's my two cents, at least.
champagne Posted September 25, 2013 Quoting Swagato: "Who is making these efforts? University administration, or faculty and students?" I witnessed it from all three. Obviously, there are (large) sectors of all three camps that despise the rankings, but I've seen professors, students, and, of course, administration/communications offices discuss them in open forums. Since public relations inherently involves an outward push of this sort of publication, the widest audience catches that version of the message; however, I've personally interacted with students and faculty who seek out the rankings through whatever outlet is available to them. It's an epidemic.
danieleWrites Posted September 28, 2013 USNWR rankings are useless to the academy because they can't measure which school really is better at scholarship. How could that even be quantified accurately? For academics, rankings fall under the prestige thing. For administrators, they fall under the enrollment thing: the higher up the list a school sits, the more high school seniors apply and the more alumni donate. It's money. Have you looked at their methodology? It's ridiculous. They use GRE scores (as if those were a useful predictor of academic success: http://www.fairtest.org/facts/gre.htm check the section about validity, as tested at Yale). They ask administrators at other schools to rate the program (while they may have a decent idea of the scholarship of people in their field, that is, faculty, how much can they accurately assess about the program itself? Do they read a representative sample of current graduate student work?). And finally, they use "output" statistics, like starting salaries for MBAs or bar exam pass rates for JDs. For English, you're looking not at new information but at a combination of data collected in 2008 and 2012. And how do they even measure output statistics for English? These metrics make USNWR ranks useless for measuring program quality. Standardized testing, which predicts success in college about 3% of the time, makes up a significant portion of the ranking metric. It's like using a social security number to decide which states are the best ones to get a job in. To make things worse, two-thirds of the metrics measure the benefits of prestige (the Ivy League et al. have always attracted, and will continue to attract, the students with the best SAT scores, and have always sent, and will continue to send, their graduates into higher-paying jobs), not the quality of the program.
If some Podunk U English program hired the best PhDs from places like Harvard and built one of the academically richest programs in the world, it still could not compete in the rankings with Harvard, because Harvard attracts the people with better numbers regardless of program quality. Now, that's not to say the schools at the top of the USNWR list aren't good schools with great programs. They are. The point is that the rankings don't measure program quality or scholarship. At all. Most of the top scholars in comp/rhet over the past few decades went to state schools. USNWR rankings can be useful, but without knowing what USNWR uses to measure quality, how do you know they're measuring what you think makes a quality program? Does the amalgamation of GRE/LSAT/GMAT scores of applicants and the starting salaries of graduates tell you that one school is better than another? Here's The Atlantic with this year's annual dose of USNWR rankings reality: http://www.theatlantic.com/education/archive/2013/09/your-annual-reminder-to-ignore-the-us-news-world-report-college-rankings/279103/ which includes a link to a more useful ranking system with metrics that make sense: http://www.washingtonmonthly.com/college_guide/index.php If you want a real measure of a school's "rank," read the dissertations of last year's crop of graduates and find out where they're working now (not how much they're making). Of course, that means a lot of personal legwork, and you won't rank schools the way everyone else does, but the only way to measure the quality of a program's scholarship is to look at the program's scholarship. That's too time-consuming and expensive for the ranking industry, so you get schools ranked with metrics that have no real value.
id quid Posted September 28, 2013 As self-defeating and self-perpetuating as this rankings cycle is, it's worth noting that the young faculty at many schools earned their degrees at the same handful of programs, over and over again. From my field, the graduates of the past 10 years teaching at schools from the University of Tennessee to Stanford come from: Yale, University of Pennsylvania, Toronto, Cornell, WashU, Brown, Michigan, Columbia, UCLA, Berkeley, Cambridge, Oxford... Schools outside that list: Colorado ('72). I do not think a sense of "rankings" is going to help anyone here, but it's worth considering your goals when you make your decisions. This is true for every job and skill: look at what the people who are where you want to be did to get there. Things change, of course, and they change more when you do what you can to change them. Ignoring "rankings" because they're "subjective" is no more helpful than taking them as gospel truth and cutting off at an arbitrary point on an ordinal scale.