
Foreign Policy's Best IR Schools



Also worth checking out are the contending companion articles: 

It’s Never Been a Better Time to Study IR by Francis Gavin of SAIS

America’s IR Schools Are Broken by Stephen Walt of HKS 

My key reflection on the rankings is to bear in mind that they are compiled from "Responses from 1,541 IR scholars at U.S. colleges and universities." A common thread through both of the accompanying essays is that traditional, academic IR is behind the eight ball to some degree when it comes to understanding and serving the modern requirements of practitioners in the field of IR. 

Given that many (most?) on this forum are interested in professional master's programs in IR/PP/PA, asking allegedly cloistered scholars where one would get the best professional preparation for a career in IR seems a bit self-defeating. 


My take is that the analysis of the "top" programs is sloppy and really nothing more than a popularity contest. Goldgeier's tweets on the subject are, I think, spot on. MA hopefuls should base their decisions on program fit, whether the school offers a program best suited to their career goals, and, of course, how much funding the school is offering. School rankings and prestige matter little, especially if you intend to seek employment with a government agency or NGO, as they do not seem to care where you went so long as you have all the requisite "skills". 

I was surprised to see these rankings come out, as I had heard from someone who works at FP that they were taking a lot of heat from various schools who believed the rankings were hurting their brand, and there was talk of scrapping them altogether.

 

Edited by Nico Corr

The attempt to rank programs that prepare people for such a variety of professions and skillsets is in itself ridiculous. It makes sense to rank history programs, or economics programs, but to rank history and economics programs together and then try to figure out which one of them is best? Bizarre.

I wouldn't bother with the rankings. In the professional world, nobody knows or cares how these programs rank. They know where they went, they know where their colleagues went (and if their colleague who went to SAIS is a blowhard, guess what), and they know who teaches where in their particular policy field. There is also a tacit understanding (ymmv - I'm in a field where you can't learn everything on the job) that the quality of graduates from some of the mammoth programs (SAIS, SIPA in particular) varies widely - to the extent that a lot of people advise getting a specialized master's rather than an MPA. So if you're graduating from one of those, it matters more who you are than where you went (which, idk, would that defeat the purpose of going there?).

Some of the top midwestern and west coast programs are virtually unknown in DC because of how regional even this market is.

18 minutes ago, UrbanPolicy&Development said:

Are there reputable rankings for U.S. domestic or local policy programs? 

"reputable" is subjective. Although I think Foreign Policy is as reputable as one can get, the ranking is arduous and based mostly off of whatever reputation said schools already have with the "practitioners" in the field. 

20 hours ago, ExponentialDecay said:

The attempt to rank programs that prepare people for such a variety of professions and skillsets is in itself ridiculous. It makes sense to rank history programs, or economics programs, but to rank history and economics programs together and then try to figure out which one of them is best? Bizarre.

I wouldn't bother with the rankings. In the professional world, nobody knows or cares how these programs rank. They know where they went, they know where their colleagues went (and if their colleague who went to SAIS is a blowhard, guess what), and they know who teaches where in their particular policy field. There is also a tacit understanding (ymmv - I'm in a field where you can't learn everything on the job) that the quality of graduates from some of the mammoth programs (SAIS, SIPA in particular) varies widely - to the extent that a lot of people advise getting a specialized master's rather than an MPA. So if you're graduating from one of those, it matters more who you are than where you went (which, idk, would that defeat the purpose of going there?).

Some of the top midwestern and west coast programs are virtually unknown in DC because of how regional even this market is.

tl;dr These bad rankings don't make all rankings bad. 

While I would, and did, say that there is a limit to the value of these rankings as measures of the quality of the programs covered, I think dismissing the notion of rankings altogether is rather flippant. Moreover, I think your post misses the point of a ranking by focusing way too much on the expectation of some workplace social capital afforded to those who go to a higher-ranked school, rather than focusing on what people who go to higher-ranked schools actually achieve professionally relative to others. If middling graduates of the top 5 schools still have, on average, better outcomes than those in the next highest five, then you could safely say that the top 5 programs provide better preparation than the other programs, regardless of whether one's colleagues talk in reverent whispers whenever one's top-tier pedigree is discussed.  

These rankings, however, can't be said to measure that very well (probably), because they are merely based on the subjective assessments of scholars in IR. What would be more useful, IMHO, would be rankings based on longitudinal surveys of graduates asking them to reflect on the extent to which they feel they have professional mobility. This could be composed of a battery of questions regarding income, debt burden, the attainment of progressive responsibility/promotions, the opportunity to pursue meaningful work, work/life balance, etc. I think something along these lines would sift through some of the interdisciplinary nature of IR, and the differing personal choices all graduates will make, to implicitly ask what I think is the most important question, "Has your degree empowered you to pursue your chosen path, regardless of what that specific path may be?" Once that question is answered, the major remaining criterion for a prospective student to evaluate is the topical fitness of the program. 
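The survey battery described above could be rolled into a composite score per program. Here is a minimal sketch of that idea; every metric name, scale (0-10), weight, and data point below is invented purely for illustration:

```python
# Hypothetical sketch of a composite program score built from an alumni
# survey battery. All metric names, weights, and data are invented.

def composite_score(responses, weights=None):
    """Weight each alum's 0-10 ratings, then average across alumni."""
    weights = weights or {
        "income": 0.2,
        "debt_burden": 0.2,        # higher = less burdensome
        "promotions": 0.2,
        "meaningful_work": 0.2,
        "work_life_balance": 0.2,
    }
    per_alum = [sum(weights[k] * r[k] for k in weights) for r in responses]
    return sum(per_alum) / len(per_alum)

# Two fictional programs, each with a (tiny) respondent pool.
programs = {
    "Program A": [{"income": 8, "debt_burden": 5, "promotions": 8,
                   "meaningful_work": 9, "work_life_balance": 6}],
    "Program B": [{"income": 6, "debt_burden": 7, "promotions": 6,
                   "meaningful_work": 8, "work_life_balance": 8}],
}
ranking = sorted(programs, key=lambda p: composite_score(programs[p]),
                 reverse=True)
print(ranking)
```

A real version would of course need per-concentration breakdowns and a defensible weighting scheme; the point is only that the components of "better" become explicit and adjustable rather than buried in a reputational gut call.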

I suppose the main point I'm trying to get across is that bad rankings have more to do with poor imagination and design than with some fundamental lack of value in the exercise of ranking, and since it's something we're all going to do all the time anyway, we might as well try to make good, formal, transparent ones available. After all, rankings are models, and as George Box tells us, "All models are wrong, but some are useful."  

21 hours ago, Nico Corr said:

"reputable" is subjective. Although I think Foreign Policy is as reputable as one can get, the ranking is arduous and based mostly off of whatever reputation said schools already have with the "practitioners" in the field. 

Maybe not even practitioners - it's mostly academics I believe. I spoke with one of the survey respondents (who coincidentally convinced his friend to put the survey in Foreign Policy years ago). He unsurprisingly listed SIS, none of the other DC schools, and four other schools he liked. 

Edited by irapplicant1776

@Poli92

I'm not dismissing all rankings (as is evident from my post). I am dismissing policy rankings.

If middling graduates of the top 5 schools still have, on average, better outcomes than those in the next highest five, then you could safely say that the top 5 programs provide better preparation than the other programs, regardless of whether one's colleagues talk in reverent whispers whenever one's top-tier pedigree is discussed.  

You're making a basic logical error. It may be that middling graduates at top 5 schools are better than top graduates at the next 5, and if you put them into boxes marked A and B for 2 years rather than making them attend classes, you'd still get the same outcome. It may be that the middling graduates of top 5 schools are carried by their top graduates, who attend those schools not to learn anything, but because they want to be carried by the reputation of those that came before. How much of education is signaling is unclear: Caplan has a new book out on this subject this month, which is already making the rounds in economics discussions, in which he makes a persuasive argument that education is 80% signal.

I think rankings based on subjective ideas, whether it be what school a professor thinks is best or whether an alum thinks their school empowered them to pursue their chosen path, are useless. You're not getting a standardized answer, as everybody is interpreting the question differently, and, most importantly, you're not getting at the crux of the question: is the degree significant value-added? If you asked me that question, sure, my school empowered me to pursue my path. Could I have pursued the same path had I gone somewhere else? Yes, I could. I'm also not interested in hearing the evaluations of people who don't work in my field. I don't care if School A prepares you well for campaigning or running an education nonprofit. I'd be interested to hear how graduates in the aggregate evaluate their experience, but not as a ranking. And I'd be interested to see how schools rank on objective criteria: funding, attrition, debt at graduation, placement, salaries. I don't think malarkey about whether you feel happy and fulfilled has any place in this decision-making process. 

Even the best designed rankings are questionable, methodologically and, even if they're sound, as to whether they're worth the paper they're printed on.

16 minutes ago, ExponentialDecay said:

I'm not dismissing all rankings (as is evident from my post). I am dismissing policy rankings.

In this case that's a distinction without a difference, as it is still unreasonable to dismiss the exercise of ranking policy programs simply because some rankings will be bad, or because rankings will always be incomplete. It is a necessary reality of decision-making that you must make simplifications in order to feasibly evaluate alternatives. I'm simply proposing one means of doing so that may be an improvement on the FP rankings, by making clear some of the components that would factor into someone labeling program A as better than program B. 

 

20 minutes ago, ExponentialDecay said:

You're making a basic logical error. It may be that middling graduates at top 5 schools are better than top graduates at the next 5, and if you put them into boxes marked A and B for 2 years rather than making them attend classes, you'd still get the same outcome. It may be that the middling graduates of top 5 schools are carried by their top graduates, who attend those schools not to learn anything, but because they want to be carried by the reputation of those that came before.

I believe you've misunderstood, or I've poorly communicated my case. I said that, "If middling graduates of the top 5 schools still have, on average, better outcomes than those in the next highest five, then you could safely say that the top 5 programs provide better preparation than the other programs." In this case, "those" refers to middling graduates of schools ranked 6-10. Of course there will be students who would have succeeded regardless of which school they attended, but across all prospective students, these are probably the exception rather than the rule. Because of that, I think, and I'm guessing many would agree, that it is more valuable to compare the central tendencies of different programs rather than the extremes of one to the central tendencies of another.  
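To make the central-tendency point concrete, here is a toy illustration (all salary-like numbers are invented): comparing medians rather than extremes keeps one exceptional graduate from dominating the comparison.

```python
# Toy illustration of comparing central tendencies rather than extremes.
# All numbers are invented outcome figures for two groups of graduates.
from statistics import median

top5_outcomes  = [55, 70, 72, 75, 78, 120]  # includes one star performer
next5_outcomes = [50, 64, 66, 68, 71, 110]  # also includes a star

# The medians are barely moved by the stars, so the comparison
# reflects the typical graduate of each group.
print(median(top5_outcomes), median(next5_outcomes))
```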

 

43 minutes ago, ExponentialDecay said:

I think rankings based on subjective ideas, whether it be what school a professor thinks is best or whether an alum thinks their school empowered them to pursue their chosen path, are useless. You're not getting a standardized answer, as everybody is interpreting the question differently, and, most importantly, you're not getting at the crux of the question: is the degree significant value-added? If you asked me that question, sure, my school empowered me to pursue my path. Could I have pursued the same path had I gone somewhere else? Yes, I could. I'm also not interested in hearing the evaluations of people who don't work in my field. I don't care if School A prepares you well for campaigning or running an education nonprofit. I'd be interested to hear how graduates in the aggregate evaluate their experience, but not as a ranking. And I'd be interested to see how schools rank on objective criteria: funding, attrition, debt at graduation, placement, salaries

This may just be an intractable difference in opinion between us, but I would still point out that I did include salaries and debt burden in my recommendation. Additionally, because of the interdisciplinary nature of the field, I think it is foolish to ask only those who end up in your field of interest about their experiences. Maybe you went into graduate school knowing exactly what job you wanted and ended up doing exactly that, but in that case you would be somewhat of an exception. An interdisciplinary program should be a time for exploration and, ultimately, a honing of interests, so it should be of some value to prospective students that alumni have found success, broadly defined, across a number of fields, though I would say it would be valuable to see how these metrics break down by concentration, dual-degree status, etc. In general, I'm in favor of more data. 

 

33 minutes ago, ExponentialDecay said:

I don't think malarkey about whether you feel happy and fulfilled has any place in this decision-making process. 

This is troublesome. Why would someone enter this field if not to pursue meaningful work? You're probably not getting rich, famous, or powerful in this field, and there are certainly better fields if one of these is your goal, so what is driving you? 


The methodology behind these rankings is manifestly ridiculous. I reluctantly factored them heavily into my application decisions, only because they are, however flawed, one of the few ways to evaluate the prestige of a program in this saturated field (in hindsight, I would have looked more at overall institutional prestige, since that's what's appreciated by the great majority of the public). That said, these things tend to be self-fulfilling -- higher rankings attract more qualified students, leading to better outcomes and other metrics, boosting school status, and so on.

19 hours ago, Poli92 said:

I believe you've misunderstood, or I've poorly communicated my case. I said that, "If middling graduates of the top 5 schools still have, on average, better outcomes than those in the next highest five, then you could safely say that the top 5 programs provide better preparation than the other programs." In this case, "those" refers to middling graduates of schools ranked 6-10. Of course there will be students who would have succeeded regardless of which school they attended, but across all prospective students, these are probably the exception rather than the rule. Because of that, I think, and I'm guessing many would agree, that it is more valuable to compare the central tendencies of different programs rather than the extremes of one to the central tendencies of another.  

I'm not talking about the shape of the distributions or what points of it you should be comparing. I'm talking about the difference between correlation and causation. It is highly possible that middling graduates at top 5 schools were better applicants a priori than middling graduates at the next 5 (as suggested by the fact that they attend a better school - but we can't reason from the outcome here for the same reason that we can't reason from the outcome previously). In that case, the school value added is negligible.

Maybe you went into graduate school knowing exactly what job you wanted and ended up doing exactly that, but in that case you would be somewhat of an exception.

I'd be an exception if I knew exactly what organization and job title I wanted at graduation and got it (though not a rare exception). I don't think I'm an exception for knowing roughly what kind of work I want to do in what field. Most people don't go in thinking they want to do trade policy and come out doing refugee resettlement.

Why would someone enter this field if not to pursue meaningful work? You're probably not getting rich, famous, or powerful in this field, and there are certainly better fields if one of these is your goal, so what is driving you? 

We're talking about numbers here, so let's not get personal. The problem is that fulfillment means different things to different people, and we can't compare preferences. To some people, money and power are fulfillment - and when they are asked "is your career fulfilling," they hear "does your career get you money and power." That is not what people for whom a low-paid job working with their community is fulfillment will hear, and yet you endeavor to lump them all - and countless others - into the same ranking. If I'm reading such a ranking as an applicant with my own individual opinion on what fulfillment is, what exactly does that tell me? (Although it does tell me something if the fulfillment figure across all programs is about the same - not least that it's a shitty variable.)

More data is good, I suppose. If you know how to use it.

