This post, and the one that will follow (on the effects of applying early versus later in the cycle), present data compiled from Law School Numbers (LSN) and analyzed, regressed, and blogged by Daniel Plainview. This is a work in progress, and it has graciously been shared with Spivey Consulting by Mr. Plainview. It covers the law schools ranked 1-14 in the most recent USNWR rankings. The data, analysis, and comments are all expressly from Mr. Plainview. I will add a qualitative analysis and prediction for the 2012/14 law school admissions cycle as the third post in this 3-part series. Enjoy this (to my knowledge) never-before-attempted statistical look at one of the commonly accepted but rarely challenged notions in law admissions: that the guaranteed “yield” of applying to a law school ED gives an applicant a boost in the admissions process.
With all that said, let’s take a look at the numbers. In the first chart, I’m presenting the results for the ENTIRE pool of applicants who were either accepted or rejected at these schools. As always, these are the results after controlling for LSAT score, undergraduate GPA, URM status, timing of the application, nontraditional status, and gender. Our variable of interest here is simply the increase in the likelihood of acceptance for an applicant who applies ED as opposed to RD.
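Mr. Plainview hasn’t shared his exact code, but for readers curious about the mechanics, a minimal sketch of this kind of analysis — a logistic regression with an ED dummy alongside the control variables, fit here on entirely made-up synthetic data (every number below is an illustrative assumption, not LSN data) — might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic (made-up) applicant pool, centered around a hypothetical school's medians.
lsat = rng.normal(0.0, 5.0, n)    # LSAT points above/below the school median
gpa = rng.normal(0.0, 0.25, n)    # GPA above/below the school median
ed = rng.binomial(1, 0.1, n)      # 1 = applied Early Decision

# Assumed "true" model: applying ED multiplies the odds of admission by exp(0.8) ≈ 2.2.
logit = -1.0 + 0.3 * lsat + 2.0 * gpa + 0.8 * ed
admit = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Fit the logistic regression by Newton-Raphson (equivalent to IRLS).
X = np.column_stack([np.ones(n), lsat, gpa, ed])
beta = np.zeros(X.shape[1])
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))          # predicted admit probabilities
    grad = X.T @ (admit - mu)                     # gradient of the log-likelihood
    hess = (X * (mu * (1.0 - mu))[:, None]).T @ X # Fisher information matrix
    beta += np.linalg.solve(hess, grad)

# Exponentiating the ED coefficient gives the "ED boost": how many times higher
# the odds of admission are for an ED applicant vs. an identical RD applicant.
ed_odds_ratio = float(np.exp(beta[3]))
```

The key point is that the boost is read off the ED dummy *after* the controls have soaked up the effects of LSAT, GPA, and the rest, which is what lets us talk about “identical RD candidates” below.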
Notes here:
- The first number is the number of observations the regression is based on, and the second is how many times more likely ED candidates are than identical RD candidates to be accepted (in other words, Michigan ED candidates are 1.166 times more likely to be admitted than identical RD candidates, and Duke ED candidates are 4.222 times more likely to be admitted than identical RD candidates).
- Georgetown actually seems to disadvantage ED applicants, as they are 31% less likely to be admitted than identical RD applicants. I have no idea why this might be. We see the same result at George Washington, but because all ED-accepted applicants to GW are given substantial scholarships, the result there makes perfect sense. For Georgetown, not so much. I’m definitely interested in what hypotheses you all might have about this.
- I have an asterisk beside Northwestern because of Northwestern’s recently instituted policy of giving full rides to accepted ED applicants. These regressions are based on data going all the way back to the 2003/2004 cycle. One would expect Northwestern’s number to be similar to GW’s (which isn’t listed here), but in fact, in the first year that Northwestern implemented the program, their ED boost was enormous.
- N/A is for schools that don’t offer binding ED, and NSS means “not statistically significant” and indicates that a school had no boost associated with ED applications in my analysis.
- The University of Virginia has the biggest ED boost, which is no surprise, and confirms the conventional wisdom: “ED UVA!”
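A quick note on reading these numbers: a multiplier above 1 is a boost and one below 1 is a penalty, and converting between the two phrasings used above — “4.222 times more likely” versus “31% less likely” — is simple arithmetic. A tiny helper (my own hypothetical function, not part of Mr. Plainview’s analysis) makes the mapping explicit:

```python
def boost_pct(odds_ratio):
    """Convert a regression odds ratio into a percent change in the odds of
    admission for an ED applicant vs. an identical RD applicant."""
    return (odds_ratio - 1.0) * 100.0

boost_pct(4.222)  # Duke: odds roughly 322% higher for ED applicants
boost_pct(0.69)   # Georgetown: odds about 31% LOWER for ED applicants
```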
Next, we’ll look at the ED boost associated with non-splitter applicants (applicants whom I classify as neither splitters nor reverse-splitters):
Notes here:
- This time around, it seems that Michigan disadvantages non-splitters who apply ED. Thoughts?
- Northwestern drops off the list.
Next up, the splitters:
Notes:
- The boosts are generally much bigger for splitters, where they actually exist.
- Duke drops off the list here.
- Oh my god, Georgetown.
Finally, the reverse-splitters:
Only one story here, but it’s a big one: UVA gives a massive boost to reverse-splitters who ED. The sample size is reasonably small at 58, but given the relatively small number of variables we are controlling for, the results are valid.