echo449 Posted March 15, 2017 Just to chime in on the article discussion: In general, you will be able to publish more articles in departments where you have more time off from teaching. Period. There are exceptions to this, but the reason that "elite" programs offer so much time off--whether their students want it or not--is that it gives people time to write, revise, and edit for journal publication. So! If thinking about rankings is too abstract and deterministic, it's more useful to think about the material support that you will be offered. silenus_thescribe and kurayamino 2
orphic_mel528 Posted March 15, 2017 Thought I'd leave this here for consideration as well: http://www.phds.org/rankings/english
silenus_thescribe Posted March 15, 2017 I could be wrong, but it seems to me that with respect to rankings and eventual job prospects, rankings are a reflection of already existing preferences/prejudices that hiring committees would have, rather than things that cause people to believe one thing or another. That is to say, it's not as if hiring committees are going to the rankings after looking through a list of job applicants, and then letting their decision be swayed by whatever the rankings are. If the rankings do factor into the decisions of a hiring committee, they do so based more on general impressions than on a one-to-one comparison. zombiekeats and Glasperlenspieler 2
LurkyLurker Posted March 15, 2017 54 minutes ago, orphic_mel528 said: Thought I'd leave this here for consideration as well: http://www.phds.org/rankings/english How accurate do you guys think these statistics are as far as job placement rates go?
Bumblebea Posted March 15, 2017 1 minute ago, LurkyLurker said: How accurate do you guys think these statistics are as far as job placement rates go? Not very. They're seven years old (I can't emphasize enough how much has changed on the job market since 2010) and the data were compiled even earlier than 2010, so they're probably more like 10 or 12 years old. And I can't really remember this very well, but I believe that when they came out there were a lot of questions surrounding their methodology. Like, it's not clear how job placements were even factored into their ranking system. LurkyLurker and brontebitch 2
orphic_mel528 Posted March 15, 2017 9 minutes ago, LurkyLurker said: How accurate do you guys think these statistics are as far as job placement rates go? I think, aside from the absurdity of the methodologies used in these rankings, that all the prestige and ranking in the world won't help anyone who isn't willing to work hard and demonstrate their viability as a researcher and educator. To be clear, that's very much a general statement, by the way, and not aimed at anyone in particular--certainly not in this forum. However, the amount of anxiety I see over rankings and post-doc job prospects, and the relationship between these two things, is very troubling to me. If I can venture a guess, I would say that attending a top 20 program likely affords you connections and opportunities that others may not have, but there are other ways to make a path for yourself. In 2014, I attended and presented at a conference in Australia at which one of the top-ranking Early Modernists in the world, from a top 10 program, was also in attendance. At the social gathering following the conference, he approached me and said he was impressed with my work. He asked me what doctoral programs I was applying to, and offered to put in a good word for me. At the time, I wasn't 100% sure I was going to go for a PhD, which I told him. I did my MA at a program that isn't even ranked, and I completed my BA at a university I affectionately/begrudgingly call a glorified community college. The point is: talent will out if you show it, and it will be recognized. This anxiety over rankings implies that the only way to job security is to get into a highly ranked program. I simply don't agree with that, and I think to have any anxiety over rankings derived from methodologies as shoddy as these is also unfair to yourselves. Any disadvantage, for lack of a better term, you might perceive you have because you're attending a low-ranked program can be compensated for.
Anyway, that's my word as an Ancient One. I just hate to see anyone biting their nails off, thinking they're going to be unemployed and destitute because they didn't get into UPENN. biyutefulphlower, Warelin, positivitize and 8 others 11
CrunchyMamademic Posted March 15, 2017 8 minutes ago, orphic_mel528 said: I think, aside from the absurdity of the methodologies used in these rankings, that all the prestige and ranking in the world won't help anyone who isn't willing to work hard and demonstrate their viability as a researcher and educator. To be clear, that's very much a general statement, by the way, and not aimed at anyone in particular--certainly not in this forum. However, the amount of anxiety I see over rankings and post-doc job prospects, and the relationship between these two things, is very troubling to me. If I can venture a guess, I would say that attending a top 20 program likely affords you connections and opportunities that others may not have, but there are other ways to make a path for yourself. In 2014, I attended and presented at a conference in Australia at which one of the top-ranking Early Modernists in the world, from a top 10 program, was also in attendance. At the social gathering following the conference, he approached me and said he was impressed with my work. He asked me what doctoral programs I was applying to, and offered to put in a good word for me. At the time, I wasn't 100% sure I was going to go for a PhD, which I told him. I did my MA at a program that isn't even ranked, and I completed my BA at a university I affectionately/begrudgingly call a glorified community college. The point is: talent will out if you show it, and it will be recognized. This anxiety over rankings implies that the only way to job security is to get into a highly ranked program. I simply don't agree with that, and I think to have any anxiety over rankings derived from methodologies as shoddy as these is also unfair to yourselves. Any disadvantage, for lack of a better term, you might perceive you have because you're attending a low-ranked program can be compensated for. Anyway, that's my word as an Ancient One.
I just hate to see anyone biting their nails off, thinking they're going to be unemployed and destitute because they didn't get into UPENN. This! Very eloquently said, Mel.
KikiDelivery (Author) Posted March 15, 2017 28 minutes ago, orphic_mel528 said: I think, aside from the absurdity of the methodologies used in these rankings, that all the prestige and ranking in the world won't help anyone who isn't willing to work hard and demonstrate their viability as a researcher and educator. To be clear, that's very much a general statement, by the way, and not aimed at anyone in particular--certainly not in this forum. However, the amount of anxiety I see over rankings and post-doc job prospects, and the relationship between these two things, is very troubling to me. If I can venture a guess, I would say that attending a top 20 program likely affords you connections and opportunities that others may not have, but there are other ways to make a path for yourself. In 2014, I attended and presented at a conference in Australia at which one of the top-ranking Early Modernists in the world, from a top 10 program, was also in attendance. At the social gathering following the conference, he approached me and said he was impressed with my work. He asked me what doctoral programs I was applying to, and offered to put in a good word for me. At the time, I wasn't 100% sure I was going to go for a PhD, which I told him. I did my MA at a program that isn't even ranked, and I completed my BA at a university I affectionately/begrudgingly call a glorified community college. The point is: talent will out if you show it, and it will be recognized. This anxiety over rankings implies that the only way to job security is to get into a highly ranked program. I simply don't agree with that, and I think to have any anxiety over rankings derived from methodologies as shoddy as these is also unfair to yourselves. Any disadvantage, for lack of a better term, you might perceive you have because you're attending a low-ranked program can be compensated for. Anyway, that's my word as an Ancient One.
I just hate to see anyone biting their nails off, thinking they're going to be unemployed and destitute because they didn't get into UPENN. Thank you for such a helpful response to the question of rankings. These are the sort of responses I was hoping for when I made the thread. :)
imogenshakes Posted March 15, 2017 I agree with @orphic_mel528 on all counts. I don't have the same conference experience, but I landed a prestigious internship at the Folger Shakespeare Library last summer (even, I believe, despite the "low ranking" of my MA program, my age, and my lack of experience in pretty much every professional capacity at the time), in part, I think, because of networking. I met the director of the internship at an event they were hosting that I happened to be at, and because we built a relationship before the internship application was due and he became familiar with my work, I think it supercharged my chances of getting it. It's not the same thing, exactly, but it illustrates the larger point: if you build the relationships, work really hard, produce excellent research, and professionalize, all of this should increase your chances at a job down the road. And fwiw, my mentors advised me to choose a lesser-ranked program in which I feel I could be supported and produce a better dissertation, as opposed to a higher-ranked program that would be more stressful for me and make it difficult to produce the kind of dissertation I'll need to get a job (this was before the new rankings came out and suddenly both of the aforementioned programs ended up tied, ha). Of course, I'm also feeling quite optimistic about all of this. I didn't even think I'd be in a position where I could choose from among several top-20 and top-30 programs, so I'm trying not to worry too much at the moment. orphic_mel528, zombiekeats, HumanCylinder and 1 other 4
KTF87 Posted March 16, 2017 When they say English, do they also mean Cultural Studies? And what about program ranking (not specialty ranking), as opposed to department ranking--are those the same thing? I am asking because the LCS program at Carnegie Mellon is part of the English department, which is, as far as I know, a very, very good department (and yet it is nowhere near where I believe it's supposed to be). Does the 51st position sound fair? And even if it did, would it mean that all programs within the department are subsequently ranked 51st? Sorry for the tons of questions, but I do believe they are valid (although ranking, for me, is the last thing I would consider in my decision making).
jrockford27 Posted March 16, 2017 23 minutes ago, KTF87 said: When they say English, do they also mean Cultural Studies? And what about program ranking (not specialty ranking), as opposed to department ranking--are those the same thing? I am asking because the LCS program at Carnegie Mellon is part of the English department, which is, as far as I know, a very, very good department (and yet it is nowhere near where I believe it's supposed to be). Does the 51st position sound fair? And even if it did, would it mean that all programs within the department are subsequently ranked 51st? Sorry for the tons of questions, but I do believe they are valid (although ranking, for me, is the last thing I would consider in my decision making). I think this speaks to one of the problems I indicated about these types of ratings up thread. It's difficult to know how much knowledge anyone taking these surveys has of what the English department at a particular university encompasses. However, I wouldn't let it dissuade you. People in the know, people who will make hiring decisions, will likely know these types of things.
Dr. Old Bill Posted March 16, 2017 32 minutes ago, jrockford27 said: I think this speaks to one of the problems I indicated about these types of ratings up thread. It's difficult to know how much knowledge anyone taking these surveys has of what the English department at a particular university encompasses. Exactly. I still feel like ranting at length about the rankings, but just to further your point... If most of the students and faculty surveyed about other institutions are in the realm of literature, there's almost no chance that they'll have any clue about the strength of comp-lit in other programs. There's also very little chance that they'll know about the strength of rhet-comp in other programs (otherwise University of Kentucky-Louisville, for instance, would be a lot higher). How about queer studies? Does the average grad student or faculty member have an immediate sense of, say, twenty great programs for that subfield? Not likely. Does the average grad student or faculty member have an immediate sense of strong programs in other eras, let alone other subfields? Doubtful. And when you factor in that only 14% of people surveyed actually responded at all, the picture is so much less accurate as to be useless. Literally useless. It's appalling that these numbers are going to be used by thousands of highly intelligent people -- either to determine the "best" programs (on the applicant end), or to determine the "best" job candidates (on the hiring end). zombiekeats, biyutefulphlower and brontebitch 3
Glasperlenspieler Posted March 16, 2017 (edited) 38 minutes ago, Old Bill said: It's appalling that these numbers are going to be used by thousands of highly intelligent people -- either to determine the "best" programs (on the applicant end), or to determine the "best" job candidates (on the hiring end). To echo the point made by @silenus_thescribe, I think the second part of this remark is reversing the nature of the correlation. It's not so much that hiring committees make decisions based on rankings; it's that the rankings attempt to make apparent the implicit hierarchical structures that are already embedded within academia and that hiring committees use whether there are rankings to make them explicit or not. Do they do a perfect job of this? Hell no! Especially, as you note, when it comes to specific sub-fields. But whether we like it or not, a handful of departments account for a majority of academic hiring (and not just at R1s and elite SLACs). While I certainly understand frustration with rankings, I think getting rid of them would actually make the problem worse. Without rankings, only those students who have advisors who are aware of the fault lines of these embedded structures (and hence are probably already in the upper echelons of them) would have a chance at making an informed decision about the comparative quality of different programs. The rankings, in a sense, attempt to democratize the information, even if they don't democratize the process. There is certainly room for improvement when it comes to rankings, and for that reason they should certainly be taken with a grain of salt and an eye towards how the programs actually fit your interests. This should go without saying, though. It seems as if the only people who think that rankings should be taken as the word of God are the people arguing against them by means of a strawman.
What they do provide is a useful starting point for researching programs, one that roughly maps onto the (perceived) quality of the programs in question, keeping in mind that there is likely a sizable margin of error and a great deal of variation based on specialty and approach. Edited March 16, 2017 by Glasperlenspieler silenus_thescribe, Mippipopolous, Dr. Old Bill and 1 other 4
Bumblebea Posted March 16, 2017 (edited) 1 hour ago, Glasperlenspieler said: To echo the point made by @silenus_thescribe, I think the second part of this remark is reversing the nature of the correlation. It's not so much that hiring committees make decisions based on rankings; it's that the rankings attempt to make apparent the implicit hierarchical structures that are already embedded within academia and that hiring committees use whether there are rankings to make them explicit or not. I agree with this. The rankings are, in part, a reflection of a hierarchical system that already exists. I don't think anyone on a hiring committee actually consults them before making decisions. I don't think any professional in the field worth his or her salt really gives them much thought or takes them seriously. However, the rankings are also a self-perpetuating truism and they become a kind of self-fulfilling prophecy. Berkeley remains the best because everyone assumes that Berkeley is the best. Is it really "the best"? In some things, probably. In other things, absolutely not. My bigger problem with the rankings, though, is the fact that they continue to be published year after year while relying on extremely sloppy methodology. They come from a survey distributed to DGSs and department chairs, and only 14% of those surveyed actually respond. I'm guessing that few of us would pick a hospital for an experimental lifesaving treatment on the basis of rankings where only 14% of those surveyed responded. No--we'd go ask other professionals and seek out additional information. For English grad school, the stakes obviously aren't as high ... but IMO, it's absolutely irresponsible and maybe unethical that USNWR continues to recycle this lousy survey year after year after year and pretend that it represents anything other than a small number of people bothering to respond. And the rankings do matter, to some extent.
Even though most professors and academics probably don't take them seriously, they're still out there. They determine which programs people apply to in greater numbers, and larger applicant pools translate into more competitive cohorts. Today's grad students are tomorrow's professors. I was a grad hopeful almost 10 years ago, and many of the people with whom I went through the process are now professors in a position to admit grad students or participate in a hiring process. Many of these people ten years ago believed that one's professional life would be forever determined by where one got into grad school. The process often "worked" for them, so they were eager to internalize those successes and attribute them to hard work and ability. I'm guessing--well, actually I know--that many of them still hold fast to the belief that programs are self-sorting and that certain jobs are appropriate only for people who went to certain schools. These beliefs don't go away simply because you become more educated. They tend to solidify and manifest in things like confirmation bias. So, I disagree that the rankings are democratizing. I think they're a natural thing--we all want to know where we fall in the hierarchy. But I also think that designing a much more comprehensive ranking system wouldn't be that difficult. Would it still suffer from problems of confirmation bias and perceived prestige? Of course. But it also might get at the nitty-gritty differences between programs in a way that would be more material than speculative. And the fact that no one has pursued this kind of survey is interesting indeed (to me, at least). I believe that there are a lot of people out there who still benefit from the status quo and don't really want a more democratic leveling of graduate programs. Edited March 16, 2017 by Bumblebea anxiousphd, Axil, positivitize and 10 others 13
Dr. Old Bill Posted March 16, 2017 ^ Post of the week, right there. Well put, @Bumblebea.
piers_plowman Posted March 17, 2017 A much more reliable indicator of quality is placement record, which I feel doesn't get discussed as often as it should. How many people a year is the department putting in tenure-track jobs? How many years after graduation does this usually take? Are people netting VAPs and postdocs? What is the ratio of dissertations filed to jobs obtained? And of course, there's plenty of nuance to the raw numbers. How is the placement in your particular subfield? Does it vary by advisor? Are 1 or 2 senior profs placing students at much higher rates, potentially skewing the data? Making this particularly difficult is the selectivity of the data posted; you often can't see what percentage of people are being hired (more important than the raw number, I'd argue), what field they're in, or what kind of position they got, all of which are very important. Websites notoriously display partial data to inflate perceptions of success. Some programs don't post data at all (not a good sign). All this to say that what we usually value (not always, but usually) about ranking isn't the quality of the education we might receive, but how that education becomes a hiring vector, for which placement data provides a much clearer picture than these rankings. Dr. Old Bill 1
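The placement questions above boil down to a few simple ratios. A minimal sketch of how one might tabulate them, with entirely hypothetical numbers (no program publishes data in exactly this shape, and the function name is invented for illustration):

```python
def placement_summary(tt, vap_postdoc, dissertations):
    """Rates matter more than raw counts: all arguments are counts
    over the same window of graduating years for one program."""
    return {
        "tt_rate": tt / dissertations,                    # TT jobs per dissertation filed
        "vap_postdoc_rate": vap_postdoc / dissertations,  # temporary academic positions
        "any_placement_rate": (tt + vap_postdoc) / dissertations,
    }

# Hypothetical five-year window: 40 dissertations filed,
# 12 tenure-track hires, 10 VAPs/postdocs.
summary = placement_summary(tt=12, vap_postdoc=10, dissertations=40)
```

Running the same arithmetic per advisor or per subfield would expose the skew the post mentions, where one or two senior professors do most of the placing.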
zombiekeats Posted March 17, 2017 Really excellent points, @Bumblebea and @orphic_mel528. I hope there comes a time when a more comprehensive rating system is created, not least because almost all of my advisors steered me away from relying totally on the Ivy League--partially (probably) to help preserve my sanity, but also because, in their opinion, the more cutting-edge research gets done in the prestigious research universities, where you're getting the rising stars of your field just as they're beginning their ascent, rather than at their peak or plateau. Speculation on my and my advisors' parts, of course, but I wish there were a better way of actually seeing who's doing the best research, who's placing the most people, etc.
claritus Posted March 17, 2017 2 hours ago, zombiekeats said: Really excellent points, @Bumblebea and @orphic_mel528. I hope there comes a time when a more comprehensive rating system is created, not least because almost all of my advisors steered me away from relying totally on the Ivy League--partially (probably) to help preserve my sanity, but also because, in their opinion, the more cutting-edge research gets done in the prestigious research universities, where you're getting the rising stars of your field just as they're beginning their ascent, rather than at their peak or plateau. Speculation on my and my advisors' parts, of course, but I wish there were a better way of actually seeing who's doing the best research, who's placing the most people, etc. It's obviously not radically efficient, but I've found that going through dissertations on ProQuest—sorted by advisor/committee—gives you a fairly good idea of placement statistics, provided you do a little bit of Google followup. Research quality is a lot harder to figure out, especially since it's subjective, but I try to follow the significant journals and imprints in my field. Special issues of journals and published roundtables are incredibly helpful (in my opinion) because they frame/are framed by immediate scholarly conversations. Obviously there will be bad and good work in each case, but the more important part is having an idea of who is involved in the conversations, and where they're writing from. Bumblebea, chestrockwells and meghan_sparkle 1 2
heliogabalus Posted March 20, 2017 Tulane did fairly well for a school whose English PhD has been suspended for the last 12 years. I mean, it's no Bryn Mawr, but it's nothing to sniff at. anxiousphd 1
Dr. Old Bill Posted March 20, 2017 30 minutes ago, heliogabalus said: Tulane did fairly well for a school whose English PhD has been suspended for the last 12 years. I mean, it's no Bryn Mawr, but it's nothing to sniff at.
Scarlet A+ Posted March 22, 2017 What do they base the rankings on? Research, of course, but what else? Is there any way to find out what percentage of their MA students go on to get into top 20 PhD programs, or what percentage of their PhD graduates go on to gain tenure-track positions?
Dr. Old Bill Posted March 22, 2017 19 minutes ago, Scarlet A+ said: What do they base the rankings on? Research, of course, but what else? Is there any way to find out what percentage of their MA students go on to get into top 20 PhD programs, or what percentage of their PhD graduates go on to gain tenure-track positions? No, and this is precisely the problem. Their stated methodology is that they basically just sent out surveys to graduate programs about other graduate programs. Surveys. Seriously. And a whopping 14% of those who were sent the surveys actually responded. NONE of the things that we would consider to be vital information about a program were taken into account -- it's purely hearsay; informed hearsay at times, I'm sure, but hearsay nonetheless. By way of analogy, their methods aren't too far removed from looking at a teacher's RateMyProfessors profile to determine how strong a teacher he or she is. Actually, even RMP is probably a better indicator, as at least you can assume that the students doing the rating actually interacted with the teacher, which is more than you can say for the USNews rankings. There are thousands of things worth mobilizing against in this day and age, and ranking systems like this are pretty far down the list...but I wish there were some way to expose the system for what it is: specious, misleading, and wholly unrepresentative of what it purports to provide. Scarlet A+, Axil, zombiekeats and 2 others 5
bhr Posted March 23, 2017 They don't disclose, but I'm also willing to bet that there is a relationship between the people who filled out the forms and the schools that are ranked at the top. For example, if I'm the DGS at, say, Stanford, there's a good chance that I rank my own program highly, as well as the programs I feel like I compete against, and the schools where I hire from. I'm willing to bet that most DGSs who filled out their form have programs in the top 25. And, again, the exclusion of cultural studies, digital media, digital humanities, tech comm, and rhet/comp makes these rankings useless for about half the jobs out there. Talking recently to a friend at a very good SLAC that used to hire only from the Ivies, she told me her program has stopped considering them, as they expect all TT hires to teach 1-2 sections of FYW a year, and they would rather hire a Lit person who has a strong pedagogical background in that field. Purdue, for example, may be mid-ranked for Lit, but their folks come out with good FYW and PW experience, plus Writing Center experience with the OWL, which means they fit a lot more job openings than a Lit person from a lot of those top 10 programs. Again, I'm biased, as I'm in a standalone R/C program that consistently places folks in TT positions. What is interesting is that we've only placed a couple of people at R1s in recent years, but have had a handful of grads move to R1s after a few years, which is a trend that seems to be catching on.
Warelin Posted March 23, 2017 On 3/22/2017 at 10:37 PM, bhr said: They don't disclose, but I'm also willing to bet that there is a relationship between the people who filled out the forms and the schools that are ranked at the top. For example, if I'm the DGS at, say, Stanford, there's a good chance that I rank my own program highly, as well as the programs I feel like I compete against, and the schools where I hire from. I'm willing to bet that most DGSs who filled out their form have programs in the top 25. And, again, the exclusion of cultural studies, digital media, digital humanities, tech comm, and rhet/comp makes these rankings useless for about half the jobs out there. Talking recently to a friend at a very good SLAC that used to hire only from the Ivies, she told me her program has stopped considering them, as they expect all TT hires to teach 1-2 sections of FYW a year, and they would rather hire a Lit person who has a strong pedagogical background in that field. Purdue, for example, may be mid-ranked for Lit, but their folks come out with good FYW and PW experience, plus Writing Center experience with the OWL, which means they fit a lot more job openings than a Lit person from a lot of those top 10 programs. Again, I'm biased, as I'm in a standalone R/C program that consistently places folks in TT positions. What is interesting is that we've only placed a couple of people at R1s in recent years, but have had a handful of grads move to R1s after a few years, which is a trend that seems to be catching on. I think you bring up several good points here, including that it is a human tendency to have some bias. However, according to US News: "The questionnaires asked respondents to rate the academic quality of the programs at other institutions on a five-point scale: outstanding (5), strong (4), good (3), adequate (2) or marginal (1). Individuals who were unfamiliar with a particular school's programs were asked to select "don't know."
To me, this sounds like they weren't able to rank their own program. "Scores for each school were determined by computing a trimmed mean – eliminating the two highest and two lowest responses – of the ratings of all respondents who rated that school for the last two surveys; average scores were then sorted in descending order." In theory, this sounds good. However, only 14 percent of people polled responded. There were 155 programs surveyed and 2 people were asked from each university. That would put the number of people asked at 310, and 14 percent of 310 = 43.4 people. Once you consider that most programs had 0 people responding to the survey and others had 2, it's likely that the list is made up of the opinions of people at 22-35 universities. On a side note: You bring up a very good point. I think the Ivies do pay attention to each other's programs and their placement rates. If I were a director, I'd want to know what my competitor was doing, who/why I was losing candidates to them, and how I could improve my own program to have fewer people choose to go somewhere else. I'm not sure if any of those universities could tell you much about the programs elsewhere. I'd imagine it's also true of colleges close to each other (BU, BC, NU, Tufts, Brandeis). However, I don't think that the latter group is necessarily paying attention to the first group. I think this survey would have a way of meaning something if participation were higher and some sort of concrete numbers were thrown into the mix. Do you think that the major jumps in some programs were due to competitors perceiving them to be improving their graduate programs, or from a better perceived placement record or a higher visibility rate? On a side-side note: When people refer to the Ivies in this context, are Brown and Cornell included in that mix? Or are they excluded in favor of Stanford/Berkeley/Chicago to refer to a "top 5" school? It's always confused me because I know a few people use it to exclusively refer to HYP.
zombiekeats and Glasperlenspieler 2
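For concreteness, the scoring procedure US News describes (a trimmed mean over 1-5 ratings) and the response-rate arithmetic above can be sketched as follows; the ratings in the example are invented:

```python
def trimmed_mean(ratings, trim=2):
    """Average after dropping the `trim` highest and `trim` lowest
    responses, as US News describes; falls back to a plain mean when
    there are too few responses to trim."""
    s = sorted(ratings)
    core = s[trim:-trim] if len(s) > 2 * trim else s
    return sum(core) / len(core)

# The thread's arithmetic: 155 programs x 2 questionnaires each.
surveyed = 155 * 2                     # 310 questionnaires sent
respondents = round(surveyed * 0.14)   # ~14% response rate -> ~43 people

# Nine hypothetical 1-5 ratings of a single program.
ratings = [5, 5, 4, 4, 4, 3, 3, 2, 1]
score = trimmed_mean(ratings)  # drops both 5s, the 1, and the 2
```

With so few respondents overall, many programs would be scored on a handful of ratings, which is exactly the small-sample worry raised above.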
WildeThing Posted March 24, 2017 Idea: Ask applicants who are on the verge of receiving responses or who have just finished their applications (so they're at their most informed but not yet tainted by the responses) to write up a list of the top XX schools they'd like to go to, ranked in order as if they had free rein to go wherever they wanted. It's not scientific and parameters will vary wildly, but applicants spend a lot of time researching places and fit and writing specific samples, so they know departments well and will consider all sorts of things for their choices (fit, prestige, placement, city, funding). With enough responses you should get a fairly good idea of which programs are (perceived by the people trying to get in as) the best, and if respondents state their field you can distinguish between subfields. Sure, they generally won't really know what it's like to be a graduate student at those departments, but that problem exists in the current methodology too. At least you avoid personal bias and can expect the respondents to know what they're talking about. If everyone's in agreement, go on and do this. guest56436, bhr and claritus 3
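The thread doesn't specify how those applicant lists would be combined, so the Borda-style scoring below is one assumed method (and the program names are hypothetical): a program ranked k-th on a ballot earns points inversely proportional to k, and totals are sorted.

```python
from collections import defaultdict

def aggregate_rankings(ballots, list_len=10):
    """Borda-style aggregation of ranked ballots: a program in position k
    (0-based) on a ballot earns (list_len - k) points; programs an
    applicant didn't list earn nothing from that ballot."""
    scores = defaultdict(int)
    for ballot in ballots:
        for k, program in enumerate(ballot):
            scores[program] += list_len - k
    # Highest total first; ties broken alphabetically for reproducibility.
    return sorted(scores, key=lambda p: (-scores[p], p))

# Hypothetical ballots from three applicants, each listing a top three.
ballots = [
    ["Program A", "Program B", "Program C"],
    ["Program B", "Program A", "Program D"],
    ["Program B", "Program C", "Program A"],
]
consensus = aggregate_rankings(ballots, list_len=3)
```

Tagging each ballot with the applicant's subfield and aggregating per tag would give the subfield breakdown the post asks for.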