From the ASA webpage I linked:
"Using the traditional ASA indicator for the acceptance rate (that is, the number of accepted manuscripts divided by the number of overall decisions), ASR’s acceptance rate for the year was 6 percent. (Using the method of calculating the acceptance rate proposed by England [in Footnotes, March 2009], in which acceptances are divided only by final decisions, the ASR acceptance rate was 9 percent.)"
The range I gave accounted for both ways of calculating acceptance rates. No matter how you slice it, any percentage below 50 means your odds favor not getting published, and we're way below 50%. Obviously there are things you can do to make your work more publishable than the average submission, but the "Not exactly difficult if you stick with it" argument doesn't have a leg to stand on. The vast majority of papers submitted to ASR (an example of a highly selective journal) never make it into its pages.
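The two methods quoted above differ only in the denominator. A quick sketch of the arithmetic (the counts below are illustrative assumptions chosen to reproduce the 6% and 9% figures, not ASA's actual numbers):

```python
# Two ways to compute a journal acceptance rate, per the ASA note quoted above.
# These counts are illustrative assumptions, not ASA's actual figures.
accepted = 30
rejected_final = 300        # final rejections
revise_and_resubmit = 170   # non-final decisions (R&Rs still in process)

# Traditional ASA indicator: acceptances divided by ALL decisions
all_decisions = accepted + rejected_final + revise_and_resubmit
traditional_rate = accepted / all_decisions

# England's method (Footnotes, March 2009): acceptances divided by FINAL decisions only
final_decisions = accepted + rejected_final
england_rate = accepted / final_decisions

print(f"traditional: {traditional_rate:.1%}")  # 6.0%
print(f"England:     {england_rate:.1%}")      # 9.1%
```

England's denominator is always smaller (it drops the pending R&Rs), so it can only raise the reported rate; either way the figure stays far below 50%.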
On your other counterarguments, we risk reaching an impasse if neither of us goes out and gets the data. Let me take the first crack. First, there's plenty of research in economics showing that the top journals are dominated by faculty at top ranked departments. Economics is definitely more stratified than sociology, but take it as a strong case. Second, look at the recent hires in sociology this year (scroll down to this list). I can't take the time to randomly sample from this group, but just going from top to bottom I looked up their CVs. The obvious trend is that students who went to higher ranked programs have more publications in better journals and get jobs at higher ranked programs.
1. PhD from top 25 program, hired at elite SLAC, has an AJS, a book, and two other pubs
2. PhD from ~90 program, couldn't find info on pubs
3. PhD from top 50, hired at ~80 public, 5 or so pubs in mid-tier journals
4. PhD from top 5 program, hired at top 25, lots of pubs in ASR, SF, etc. (not a fresh PhD)
5. PhD from top 25, hired at unranked public, 3 pubs in low tier journals
6a. PhD from top 5 program, hired at top 25 program, 7 pubs in top-mid tier journals
6b. PhD from top 5 program, hired at top 25 program, 5 pubs in mid tier journals and some books (an ethnographer)
7. PhD from top 50 program, hired at decent SLAC, 2 pubs in mid tier journals
I'll stop there, but the pattern stays the same. Now we can quibble about how to categorize the Journal of Health and Family (mid tier? low tier?), but the trend would be the same. Finally, and I think this should settle it, let's look at the most recent issues of some top general and specialty journals. What are the author affiliations?
1. ASR April 2014: UC Berkeley, University of Memphis, UC Davis/Penn State, NYU/Penn, ESSEC Business School (France), UNC-CH, Stanford/Chicago
2. AJS January 2014: UCLA, Texas/Iowa, Michigan/Indiana, Columbia/Iowa
3. Social Forces March 2014: Cologne, Oregon State, Trinity College Dublin, Univ Toronto/Northwestern, Utrecht, Indiana, UMass-Boston, Purdue, Penn State, Stanford/Princeton, Chinese University of Hong Kong, UW Madison, UC-Riverside, UWashington/UConn, Arizona
4. Gender & Society June 2014: Marquette, USC, CSU Northridge/UCLA, Penn, Ohio State/ Southwestern, CUNY
5. Soc of Ed April 2014: EUI (Italy), NYU, UW Madison/Iowa, NYU/Penn
So how to interpret these? For one, it's obvious that not every paper comes from a top 25 program. But the top programs are definitely overrepresented here (a "/" indicates multiple authors from different programs; many of the co-authorships are between advisers and former students who landed jobs at lower ranked schools, as is the norm). Also, as you move down a tier from, say, ASR to Social Forces, the field opens up a bit.
So in response to your "I'd like to see a source for this idea that faculty at lower ranked programs aren't publishing as much in top journals": I don't know of any systematic study in sociology to match those done in economics, but a glance at the data supports my (uncontroversial) claim that faculty at lower ranked programs appear less frequently in top journals. Of course, that's not news to anyone who regularly reads the top journals. The more interesting question is whether faculty publishing patterns afford any advantages to students in those programs.
As for GRE/GPA. You say "GRE/GPA doesn't show student quality and most programs know this. Ask them and you will see." You will recall that I said that students with high GRE/GPA aren't somehow inherently "better" students. Just that they select into (and are admitted to) higher ranked programs, and PhDs from higher ranked programs then have more job market success (see the list of recent soc hires above). Thus, the correlation between GRE/GPA and job market outcomes > 0. And why ask professors what they think about GRE/GPA when I could see what they do (we are sociologists, after all; better to observe in situ than take a post-hoc account)? Look at the results board here on Grad Cafe (admittedly not a random sample), and you will note that higher GRE/GPA candidates get accepted at higher ranked programs (and usually choose those over lower ranked programs). To argue otherwise is to defy the data. I also strongly recommend this conversation on Orgtheory.net: http://orgtheory.wordpress.com/2011/05/09/an-inconvenient-truth-about-gre-scores/
As for the more general point: "You also failed to miss my overall point because you were so focused on your statistical argument. It's not the program ranking that matters but the networks and people you become enmeshed with during the process. "
I apologize if I missed your overall point. I was responding to the specific arguments you made to the OP, which were factually incorrect. But it looks like I missed the wood for the trees. On your overall point that "networks matter", I wholeheartedly agree. But the claim that networks matter regardless of rank is of course untenable: the highest-value networks (for getting a job, anyway) accrue at the higher ranked programs. You offer plenty of anecdotal evidence that high-value networks can exist outside the top ranks. I can't dispute that, and it sounds like your program does a great job attracting top candidates and students. But a few negative cases do not disprove an overwhelming trend.
Now let me be quite clear. I don't think rank carries some inherent value. Especially as measured by USNEWS, it's a purely reflexive proxy for prestige, and there's all kinds of status bias there. But the problem is that the status bias in rankings has material consequences for the lives of job candidates. That's nothing to sneeze at. So it's irresponsible to advise people to "ignore rank, focus on all the other stuff" when going to a program with a low rank affects your material wellbeing later on. This is not to say "ignore all the other stuff, just go for the highest rank". Nobody promotes that; me least of all.
And then, following Bourdieu, we know that the symbolic capital of rank can buy other forms of capital: the networks, the talent, and the funding. So rather than X --> Y, where X is the meat-and-potatoes of a program and Y is the rank, we have Z --> Y --> X, where Z is some historical unknown that locked us into path dependency. And finally, none of what I am saying is new or surprising; this conversation has been had many times, e.g. http://asr.sagepub.com/content/69/2/239.short
Finally, a disclaimer: for the sake of argument, it is easiest to operationalize "job market outcomes" as a function of the rank of the hiring program. I don't believe these are the only careers people should aspire to or pursue. The more we diversify, the better the outcomes for all of us. And if a TT job at an R1 (never mind in the top 25 soc programs) is not your career of choice, then it's probably not necessary to go to a top ranked program. But because PhD-granting programs are almost exclusively oriented toward placing PhDs in academic jobs, it makes the most sense to use that benchmark in this argument. Again, that doesn't mean I think everyone needs to go that road.
Edit: spelling corrections