Podcast: A Look at the New ABA 509 Law School Data with Kyle McEntee

In this episode of Status Check with Spivey, Mike has a conversation with Kyle McEntee, LSAC's Senior Director of Prelaw Engagement and the founder of Law School Transparency, centering around the newly released 2024 ABA 509 disclosures and how applicants should consider the data therein. They discuss interpreting class size changes, the ongoing rise of GPAs and grade inflation, LSAT inflation and how the highest LSAT percentiles have changed over time, the new option for law schools to obtain variances from the ABA to go test-optional (plus an explanation of what variances are and how the variance process works), the removal of the non-residents category from 509 reports and what that means for international applicants, diversity data (and how that's impacted by the non-resident recategorization), ordinal rankings (including a discussion of U.S. News and MyRank by Spivey), rising law school tuition and how law schools function financially within universities, and more—plus, what all this means for current and future applicants.

You can read our recent blog post with a breakdown of some of the new 509 data here.

You can listen and subscribe to Status Check with Spivey on Apple Podcasts, Spotify, and YouTube. A full transcript of this episode is below.


Full Transcript:

Mike: Welcome to Status Check with Spivey, where we talk about life, law school, law school admissions, a little bit of everything, you know the drill. Today I'm joined by Kyle McEntee, who's at LSAC, who I've known for about 20 years. Kyle co-founded, after graduating from Vanderbilt Law School, Law School Transparency, which was a massive undertaking and game-changing project, which focused on a number of things but primarily on, how do applicants make informed decisions on where to attend law school vis-a-vis transparent employment data? After that, Kyle joined LSAC, where he is now, as a Senior Director of Prelaw Engagement. Kyle and I talk about those law school matriculation vectors based primarily on the new 509 ABA data reports that came out. We talked a little bit about test-optional, the ABA variance that allows schools to ask for permission to go up to 100% test-optional, whether we think that's a good thing or a bad thing and if it's coming. We end a little bit on the silliness of ordinality, how rankings, primarily U.S. News rankings, are ordered 1, 2, 3, 4, 5, 6, and maybe other ways to look at them; we'll probably be publishing soon not just the projected rankings but another way of looking at them. Without further delay, here's Kyle and me.

Hey, Kyle. Good to see you, as always.

Kyle: Yeah, absolutely.

Mike: We met how many years ago?

Kyle: It would have been in 2007 at the UNC Law Fair, when you convinced me to apply to Vanderbilt.

Mike: That's so interesting because in 2007, that's all I did. I was on a plane meeting college students like you. You were a UNC undergrad. Was Vanderbilt on your radar?

Kyle: No, I actually had never even heard of the school, which I don't know how it's possible, but I was like, "What's a Vanderbilt?" And you told me what a Vanderbilt was, and I was like, that sounds pretty good. Nashville sounds awesome.

Mike: That's before we were a football powerhouse. I would love to talk about the federal court ruling on Vanderbilt's quarterback, but we should probably move to 509 data. So, the ABA releases 509 data every year in December. Why does it matter to applicants, to the market, and why does it matter maybe more this year than ever before?

Kyle: So, I look at ABA data through two lenses. The first is a global lens. For those of us who care about the state of legal education, we can look at the 509 data for trends, for academic and research purposes, or for informing policy recommendations, whether that means taking action or inaction, and that's whether at the regulatory level or at the institutional level. Then there's also the decision-making lens, which is of particular importance to prospective law students. So, there is some information contained in these 509 reports that could and should impact the decision people make to enroll or to continue to stay enrolled.

And as you pointed out, these disclosures are fairly new. A lot of the information contained in the 509 reports has been around for quite a few decades, but we've seen expansions of this data. So, these reports come out in December, but we also see bar passage data come out in February and employment data come out in April, and that's all under the ABA standard 509, which provides consumer information to the public for, again, that global state of legal education, policymaking, and research purpose, but also the informed decision-making purpose.

Mike: We talked about how you applied to law school in 2007. If you were applying now, how much would you be using 509, with all your sophistication and 20 years almost in this space? How would you use 509 data?

Kyle: 20 years, that's kind of crazy. I'd actually start with the employment data personally; that's what I would do. I would look at, "Where do I want to go work?" and then find the schools that have a good track record of putting graduates in those states. And then the additional data from these 509 reports helps fill out that informed decision-making picture. We need information on cost, we need information on environment, we need information on scholarship opportunities and admission chances, and the 509 data really provides insight into, "Where can I get in?" in addition to, "Where should I want to go?"

Mike: Okay, so as a former admissions officer, if you're sitting on a plane with me and you talk about "chance me," I'm going to talk for two hours with you. How does it help, and how does it hurt? And we can juxtapose that with the LSAC volume summary, which is real-time but also changing, and 509s, which are static but also historic.

Kyle: Well, they're a great starting point. I counsel people to use the interquartile ranges, so that's the 25th and 75th percentile LSAT and GPAs, and then the median, which is that halfway point, median GPA, median LSAT, as a starting point to determine which schools might be good safety schools, which might be good target schools, and which might be good reach schools. Numbers are not everything, but they are a lot. This information really gives you a good starting point as to which schools you should be looking to, again within that scope of what schools make sense for my career goals. That said, I think people look at the medians and say, "Oh, well, my score is below median, I just shouldn't apply," and A, that's just not how law school application decisions are made, but also, it's a median, so 50% of people scored at that point or lower, or had a GPA at that point or lower. And so, those numbers, again, they're a starting point, but schools round out their classes, and they say they make holistic admission decisions, and I think that's generally fairly true, but the numbers play a role, and you can't deny that either.

Mike: I would say for many years, the way I describe it is, there's always this tension. There's this tension between, "We want to take this person with the higher numbers, but their application wasn't strong, or their interview was awkward, and they didn't seem very interested in us. But they have high numbers, so we're going to waitlist them because they didn't seem interested." And the person below our medians—not you when you applied to Vanderbilt, but—who, "They're below our medians, but they visited, they seem like a great fit for our culture, they crushed it on the interview." And in the past, I think often the way U.S. News worked out before the change a year ago, often, but not always, we would tip toward the high number person off the waitlist—until we shored up our medians, you know, in July, and then we would start admitting the people we really just liked. But now, more so than ever, there's this opportunity because of the way U.S. News really diminished the weight of the inputs. To your point, outcomes—employment, bar passage—are much more important. There's an ability unlike ever before for law schools to say, "We're going to take that person who's going to be employable and is going to fit in our law school." And we'll see how this shakes out with the upcoming data. But I noted on the 509s, and we predicted this, LSAT was more flat than up or down, and there weren't that many schools up +2 at all.

Kyle: Yeah, that's a great thing, right? I work for LSAC; we make the LSAT, right? We think it's an important part of the admission process. But we also don't think it should be everything or even a lot of the process. It's just one tool in the arsenal of a law school admission committee. And to me, that shows great restraint because there were more people with higher scores. And while enrollment did go up, it didn't go up nearly as much as it could have.

Mike: So, let's talk about that. Enrollment went up what?

Kyle: I think 4.6%.

Mike: For applicants, why does that matter?

Kyle: When you're looking at an individual school's enrollment, it matters because you are trying to forecast, what are your employment opportunities going to be three and a half years from the date you enroll? And so, in the aggregate, which is the sum of all those schools and their decisions, we had 128 schools go up, 58 go down, and I think 10 stayed the same. And so if I'm looking at the 509 data, and I'm looking at the schools that I'm interested in, and I saw a drastic increase in enrollment, I might ask different questions. I might say, "What kind of resources are you pouring into your career services office? How is this going to change my classroom experience? Do you have more faculty? Are you adding an extra section?" That said, if I see the numbers go down, I might say, "Why are they going down? What was your purpose?" And these are questions I would ask the admissions officers, whether at an LSAC forum, whether at a law school fair, whether just calling them up. But overall, I look at these numbers in the context of, "What does this mean for my job prospects?"

Mike: Yeah, it's so interesting, you come from the world of transparency around job prospects. There's two ways of looking at it. I mean, it's robbing Peter to pay Paul, but in reverse. In one sense, and I think the market's aware of this, if we see another 5% increase in enrollment, that kind of makes things less competitive for applicants if schools are admitting more people. Now, there's another huge driver, which is how many people are applying. But more schools taking more people makes, by definition, more admits, makes things less competitive, which if you're just an applicant in a vacuum, you're saying that's good. Your point is, hey, slow down, because it might be less competitive at the global level, but at the school you're looking at, what's going to happen three years down the road?

Kyle: Exactly. Exactly.

Mike: Right. I do think, by the way, if I had to predict, I think we're going to see another bump, because of that 12% class that's dropping off. I think we'll see about a 5% increase in enrollment. So, I think you brought up a really good point. If you're looking at 509s, the volume summary report will tell you a lot, but it's not telling you what schools are considering as far as how many they're going to take. So, look at the 509s, and if you see a trend or an anomaly, an outlier, a school going from 88 to 150, or 150 to 88, ask the admissions office whether that was intentional.

Kyle: Absolutely. And you also have to look at regional context and state context. I haven't looked at the geographic distribution of enrollment data at this point in time, but say you want to work on the west coast and enrollment there is flat, while maybe in the northeast enrollment is up 10% or 15%, or maybe in the southeast it's declined. When you're looking at that overall increase of 4.6% this year, it's the aggregate of all of those localities. And because the job market is very localized, very regional, not very national, the implications just vary, and it's really tough to make too-broad claims about what's going to happen three and a half years down the line, especially because, who can predict what's going to happen three and a half years down the line?

So I'm interested to know that you are predicting a 5% increase in enrollment. If I were predicting, I would actually say we're going to have a decrease, because I think schools, I think they know that, for example, NALP is forecasting employment numbers for the class of 2024 being down. Summer hiring at large firms has been and is increasingly down. And so, I think we're going to see more restraint from law schools. But again, it's all on the individual basis, right? Because these are individual schools making independent decisions. Now, they're related because they're all in a competitive market, but each school is saying, "Hey, I want to increase by 5%," or "I want to stay the same" and then accidentally increasing 5%.

Mike: Yeah, the reason why I would predict a 5% increase is, we, as you know, talk to a lot of law schools, and more and more they're getting pressure from their central administration, the universities: "Hey, you've got to stop taking money from us; you've got to start giving us money, because the demographic cliff is about to hit us." It's so interesting, because I never thought about this until maybe five years in. The pressure on the dean is, if you increase enrollment, you're just making more money, and every entity has to survive. Law schools don't have endowments like many undergrad schools, so you need the net tuition revenue. So that is a pressure. It's a real pressure. When you look at your budget and it's always in the red, you're not sleeping comfortably. On the flip side, to your point, if you decrease your class size, the employment outcomes are going to be a lot more favorable, in all likelihood, for your students—obviously, they're going to be better than if you increase—and you're going to bring in better medians. It's kind of the rising tide that lifts all boats, in some sense.

Kyle: Absolutely. And I think it depends on the type of pressure that the deans are facing too. Is it to increase headcount or is it to increase net revenue? And if it's to increase net revenue, you can keep your class size the same or decrease it and increase your revenue if your scholarshipping is done a little bit differently. And I think as schools have more and more tools to evaluate their scholarshipping strategies, we'll see a little bit more of that shake out. Of course, that's a negative to the student, right? Because that means they're on average paying more for law school as opposed to paying less.

Mike: Yeah, I'd say if you're applying this cycle and maybe next, say to yourself, "This may be more competitive, but there's still more money on the table as far as merit aid and discounting." Talking about predicting the future, I think that we're going to see this thing shake out where a lot of schools are going to have to give up on this arms race of scholarshipping. That's a conversation for another day, but it seems like you have a point.

Kyle: I agree it's a conversation, a long conversation for another day. But I think it goes back to that point I made in the beginning about the two lenses, right? We both enjoy this conversation, this academic conversation about, what is the state of legal education? Where is it going? Should it or should it not go in various directions? And that can be scary for an applicant in the midst of this stressful process, but I would caution them to just take what we're saying, find it interesting, but don't lose sleep over it as an applicant. Just let come what may, right? Because you can't control these larger picture questions. Right? And again, when you say one thing about one school, it might not apply to the school down the road or the school across the country. Because we're not talking about specific schools here. Who knows what schools we're talking about? And so it's just easy to get caught up in, like, the conversation, I think, as an applicant. And I was certainly guilty of that when I was applying to law school. But it's good to take stock of just like, the cycle's going to happen, you're going to do your best, you're going to engage with the admissions professionals, and you're going to try to persuade them to let you in, and you can try to persuade them to give you more scholarship money. But then, at the end of the day, it doesn't really matter what else is going on in the world. It's, does each particular admission and scholarship offer make sense for you? Is this where you need to go to make your best choice for your personal and professional ambitions?

Mike: That's wonderful advice. My favorite podcast is Dr. Peter Attia's The Drive. People are always asking him, "Tell us what supplements we should take." And he's like, well, to be clear, the definition of supplements is they need to supplement a healthy lifestyle. Dr. Attia would say, put clean things into your body, don't put bad things into your body, and move a lot. We're talking about the "supplements." We're not talking about the big things, which are, take your individual preferences—which you just clicked on—find what you want to do, start from there, and then start looking at the 509s.

What else in the 509s stood out to you? We've talked about LSAT, we've talked about enrollment, what else?

Kyle: One number that I always like to point out is the number of people who transfer out of their law school after their first year. So of last year's entering class, 2023, 2.6% of them, or 988 students, transferred out. And the takeaway for me here is, go to a school that you'll be satisfied graduating from, unless you're willing to drop out, because transferring is just not that common, for whatever reason. Some of that is, you do well, you want to stick around because more job opportunities are going to be available to you higher in your class. On the other hand, people get assimilated into the culture of the law school and really enjoy it and find it hard to leave.

Mike: Did you go to Vanderbilt thinking, "I've never heard of this place, but it's a good school"? Because for so many students at Vanderbilt, and then when I went to WashU, you're probably not going to be at your dream school, because your dream school is probably going to be above your numbers; less than 1% of applicants are above every school's medians. If you're choosing based on that, you're probably not at your dream school. And I saw so many times students come to Vanderbilt and WashU wanting to transfer out, and then they never even applied out, because they fell in love with their classmates, their faculty, their culture.

Kyle: My story might be pretty specific to what drew me to Vanderbilt in the first place, but no, I never actually anticipated transferring out. The reason I went to Vanderbilt is because the law school published the job outcomes from the class of 2007, and I was able to see exactly which jobs people were getting. And I could see myself in those jobs. I wanted to work in the Southeast. I was interested in working for a big firm. And so, my actual top choice—I won't say what school it was, where I was waitlisted—I didn't even consider applying as a transfer. And in fact, one of my classmates ended up transferring from that school to Vanderbilt. Which doesn't say anything about either school; it just goes to show, the transfer up and down the prestige ladder goes both ways.

Mike: So, you're looking at attrition, you're seeing that it's a low number; what your point would probably be is, if there's a school with a higher outlier, definitely look at that because there might be something going on.

Kyle: Yeah, exactly. And for that, instead of looking at the transfer out data, I look at the transfer in data and then the net change between transfers out and transfers in. This has impacts on class size, which is relevant if you're concerned about job outcomes and also your classroom experience. But the patterns can be interesting. I do think there's some year-over-year patterns of schools that take huge transfer classes but also lose a lot of transfers. That happens a lot in D.C.; it happens a bit in California. Some data in there that I actually never understood the value of was the GPA data for transfers in, and that's because law school curves are just so vastly different. You can't really compare a 3.3 at one school to a 3.3 at another school without knowing how their curves are set.

Mike: I'm a former admissions officer. We weren't even pinging off of the GPA of the school; we were pinging off, where did you fall in your class?

Kyle: Exactly.

Mike: At a 3.3 you could be in the top 20 percent or literally almost at the 50th percentile or worse.

Kyle: Yeah, or even, like, probably even top 5% at some schools that have, like, a 2.6, 2.7 curve, you know?

Mike: So that segues to undergrad GPA in the 509s. I'm almost disgusted with GPA—that's a strong word; we may or may not play with that in editing—but the GPA inflation is so off the charts. Like, we can talk about ABA 503 variances a little bit, because I want there to be standardized test scores. I don't want people applying without a test score, because it's so hard to differentiate on GPA anymore when everyone has a high GPA. Thoughts on those?

Kyle: Yeah, I'm likewise astonished at the grade inflation, and this is something people have complained about for decades, saying, "Oh, the younger generation, their grades are super inflated." I'm cautious when I read those data, because I'm like, am I being that person, that curmudgeon who looks at the data and says, "Ah, it was harder when I did it"? I really don't think so. I've spent a lot of time reflecting on that, looking at various patterns across undergraduate institutions, and I'm with you, I think it's crazy. Right? And it makes it all the more important—of course, this is a statement with some self-interest—but all the more important to have some standardization through a test. Now, whether that's the LSAT, whether that's the GRE, whether that's something else novel that comes out, innovation is good for legal education. There's no doubt about that. But GPA as a primary measure to be used in a test-optional world, I don't really see it advancing the goals I think a lot of us have of improving the diversity of the legal profession. And I don't mean just racial diversity, I don't mean just gender diversity, I mean diversity from all walks of life, enrolling people who are going to make great lawyers from all kinds of backgrounds, serving all kinds of different people.

Mike: If you want people from all different backgrounds making great lawyers, GPAs are incomparable across schools today—if you go to a service academy and you have a 3.3, you're killing it, but if you go to some other school and you have a 4.0, you're literally right there with 75% of your graduating class—and they're also incomparable across timelines. If you went to school 20 years ago, your GPA is almost guaranteed to be lower than it would be now, which again, I can take out the self-interest. The best scenario for our firm, if we cared about growing and wearing monocles and top hats, would be for there to be no standardized test scores, with everything hinging off your application and your interviews. To your point, I want there to be standardized tests, and also to your fair point, it doesn't necessarily have to be one test. JD-Next came in; that was in the data pool. It was small, but JD-Next is a different option; it might grow. I think 98 percent took the LSAT.

Kyle: Yeah, 98%. I think it was like two dozen who used JD-Next. So it's 38,000-plus versus a few dozen.

Mike: Yeah, introductory year.

If you had to bet—and you know a lot about variances, I'll let you describe it—will the variance lean towards more people just not taking the test at all? I remember applying to both business school and my doctoral programs. Boy, did I not like buying those books and studying for those tests. So if a school said, "Hey, you don't have to take a test, Spivey," I probably would have been pretty interested. Will it trend in that direction? Will tests like JD-Next—like you said, innovation into the market? Or do you think schools are just going to do what's comfortable and stick around the way it is?

Kyle: I ultimately think schools are going to do what's comfortable and stick with what's going on already. And it's not just comfort; I think it's also quality. I think they appreciate the reliability and predictability and the validity of the LSAT. There's a reason I think we saw all these deans and admissions deans clamoring against the ABA trying to change this. The law school community is simultaneously both collegial and self-aware. And that's reasonable. There's a lot of really good people involved in legal education, but there is an uneven track record of doing the right thing, and the ABA has stepped in several times over the past 10 or 15 years to address some of the not-great behaviors at law schools when it comes to a lack of transparency, misleading information, enrolling people who have little chance of passing the bar, etc. Now, the ABA stepped in and addressed a lot of those concerns. And so I think what they're doing here is they're trying to spur innovation, because they have a certain set of goals about what the profession should look like, and they have a certain set of ideas about what's keeping progress toward those goals from being what it could be.

But on 503, I think it is worth pointing out how the variance system works, how the ABA accreditation system works, just to put this a little bit in context. So, the ABA standards are—I don't want to call them unique, but it is uncommon in the accreditation world to require 100% compliance to retain your accreditation. And under the ABA standards, there's two ways around the 100% compliance. The first is that, if a school falls out of compliance, it may submit what's called a "reliable plan" to get back into compliance. And the second way around it is there are variances, of which there's two types—there's the emergency and the experimental. So, on the emergency side, that's what we saw in COVID, allowing schools to do hybrid classes or online classes, even though at the time the ABA standards didn't allow for those proportions of online courses. But then the experimental, this is what I've always found the most interesting. So a school is able to receive an experimental variance when the noncompliance, the permissive noncompliance with the standards, has the potential to improve or advance the state of legal education. So, this is granted to individual schools, not generally, which is why for the 503 variance, schools have to apply for the variance. They can't just say anyone can be out of compliance with standard 503. And it also has to be for a certain term, at least to start. After a certain term, a variance—again, the permission to not be in compliance with the standards—can be made for an indefinite term. And they added that, I think, in August 2024. I'm not a fan of the indefinite nature of variances or the potential for that. Again, that's a topic for a different day.

But what this really is about, to me, is innovation and consumer protection, and the variance system is kind of meant to do both. Right? On the one hand, we want schools to innovate. We want them to try new things. And before you change the accreditation standards, which is a pretty rigorous process, you want to have some sense that the changes are going to either be positive, or at least they're not going to have huge negative externalities. On the other side of things, there's the consumer protection aspect of this. So being a lawyer, it requires rigor. It's high stakes. It costs a lot of time and money. And we don't want to set people up to fail, and certainly not at a huge expense. And so, we need information to help identify people who we believe can do the work and identify who will need extra help in doing so, and that's what the LSAT has traditionally done a really good job at. Without transparency into what's going to happen and, ultimately, does happen with the variances, we do put at risk these individuals who are going through this experimentation. Some cost is justified in innovation, but in my opinion, the ABA and schools need to get informed consent to do so. It's just a basic research principle.

Mike: I would argue we have a test that's incomparable to the last LSAT, the one with logic games. These test-takers are, in some sense, without prior data on a no-logic-games test. They're also an experiment, no different from JD-Next or playing with variances.

Kyle: So, I wouldn't say it was no different, and it's for a few reasons. One is we have the psychometrics to show that the test is as reliable and as valid as it was pre-logic games, and that's because for so many years we had a five-section test with one unscored section, and so that means we had decades of data showing what happens when you remove logic games and have two logical reasoning sections to go along with the reading comp section. And so we were able to show, before we made the decision to remove logic games or not replace it with something else, that actually there wasn't an impact on validity or reliability. And to date, the data over the last six months have borne that out.

Mike: I just haven't seen it. I don't think the market's seen it. Right?

Kyle: Yeah. And I think we've written on it somewhat, and I'm at this point blurred between what I have access to internally versus what's publicly available, which is not to say I can't talk about it, just I can't remember exactly what data we've published at this point in time. But that is data that we routinely publish to law schools, and we routinely publish to pre-law advisors and publish internally. And so, there is data to show that we haven't seen any concerning shifts. And therefore, there are no concerns about the validity or reliability of the test post-change.

Mike: Okay. So fair, we'll get the data eventually. It'll be interesting when schools get it.

There is obviously, I'm sure you saw, LSAT inflation. The top-scoring bands seem to be doing better, with the percentage of scores from 165 to 180 growing more than any other band. As an expert, what's your theory?

Kyle: There's two things that are worth saying first. One is, there's a difference between who takes the test and who applies to law school. The "who applies to law school" is always going to be self-selected, and as there's an increased race to the top, I think that's a factor. I think people are more likely to say, "Hey, I got a high score, this means I should go forward with this," as opposed to maybe doing something else. And related to that, we've seen law schools more or less keep enrollment down. I mean, I know we saw a 4.6% increase this year, but again, it could have been much higher given the increased applicant pool. But I think that as schools are getting more and more competitive and keeping enrollments down, it is making it harder to get into law school, and so I think that's an aspect of it as well.

Mike: Could the number of retakes also be impacting that number?

Kyle: Yeah, I think the number of retakes, I think the increased quality of prep—we have LawHub now, which provides the actual test day interface. I think people might also be a little more comfortable taking it online, and now they've got the option to do it in person or remotely. I also think that people are spending more time prepping. They know the stakes. And so you'd expect that to shift the percentiles a little bit. And we’ve seen some minor increases at the top in the percentiles—like at 170. I think it's almost a 1% increase at that level. But we're seeing a much higher level when you look at the actual applicant pool as opposed to the test-taker pool.

Mike: We covered almost all the 509s. I appreciate this conversation. The one thing we haven't covered, which I think is really important, is what the ABA used to do with non-residents—what you and I would call international students probably—and how they changed it. Do you want to explain that to everyone? Because I think it's important to note that when you're looking at previous 509s, they're actually different than last year's 509s.

Kyle: Yeah, they are. When you look at the previous 509s, you had this category, it said "non-resident alien," I think. They might have improved that language at some point; I don't remember.

Mike: It needed improvement.

Kyle: But as you said, an international group. So, these are people who are not U.S. citizens.

Mike: Or permanent residents.

Kyle: And previously they got their own category, and it didn't matter what their race or ethnicity was. All that mattered was their national origin. Now, that group has been moved from non-residents or international and spread out across all the racial and ethnic groups. And so now there's just everyone in one pile instead of one pile for U.S. citizens with race and ethnicity and one pile for non-residents with no indication as to what their race or ethnicity is.

Mike: Yeah, what are your thoughts on how they used to do it, and what are your thoughts on the change?

Kyle: I always thought it was strange that they separated it. And I don't know the exact politics of it, but I'm sure it's some relic of some Department of Education politics over the last 30 years.

Mike: So you thought it was strange that they reported international students separately. Someone—and LSAC might be the only entity that has this data; we certainly don't have it, and we looked, and we found like some law review article that surveyed law schools, and some responded, some didn't—I would love to know how many students are coming from India versus China versus South Korea versus England, etc., you could fill in the blank. We have no ability to find that.

Kyle: I have not actually looked at our internal data, but I'm sure we have that. But to your point about how you separate the information, the ideal way to do it would be to say, okay, let's look at national origin—U.S. citizens, Indian citizens, U.K., Australia, on and on and on, right? That's one group; they all aggregate up to a hundred percent. Then, on the other hand, you look at race or ethnicity. And there are differences based on your country of origin as to how those groups relate to each other, and that's why our internal data has—I don't know the exact number—a vast number of subcategories within each of the race/ethnicity groups, because it really does make a difference. But we're talking about populations that are so small that it's tough to be meaningful at a school-by-school level. In the aggregate, though, it is pretty interesting information, and that's something worth pursuing for the ABA: segmenting that out and starting to collect that data.

Mike: And so one problem is the media is reporting, "Okay, well post-SFFA"—which people like to refer to as 'the Supreme Court got rid of affirmative action in admissions'; I would have a much more nuanced answer than that, but that's, again, another podcast—"the data shows that, surprise, we have nearly the same number of underrepresented minority students or diverse students applying to law school." But we actually don't know that for a fact, because of the non-resident change. If you're from France and you're Black, previous to this change you were recorded as a non-resident alien. Now you're recorded in this huge glob of all applicants, and you're reported as Black. So we actually don't know if diversity held steady; it's not comparable data.

Kyle: So the ABA data is not comparable, but the LSAC data, which is showing that there wasn't a major loss in diversity, is still showing that it was steady. And that's because we are able to go back and adjust our definitions, I think at least for the last eight years. It might be a little longer than that—since 2014, I believe. Don't quote me on that, anyone. But the point is that we do have nearly a decade of data that we can backdate, so to speak, using the updated definitions to show these trends. And so the ABA could do that as well. They, for whatever reason, decided not to. But at least our data indicate that the losses weren't as significant as we were expecting. And to some extent, it could be that affirmative action didn't receive the kill that people expected, but it could also just be that law schools' missions haven't changed and schools are still taking measures to ensure that they're enrolling the classes that they want.

Mike: To the extent that you have a lot more say than I do in the market, I would like for one of us, or both of us, to advocate for the 509s—since you have it—to show a little bit of international data. International applicants crave that, and they need that. So much of what you're saying, rightfully so, is, "How do you make a decision on where to apply and where to attend?" And I think part of that decision-making for an international student should be, which schools are enrolling international students and which seem to enroll very few or even none? And I think they should know that.

Kyle: I agree. I think they should know that too. And I think law schools would also be interested in understanding the international demographics better, because that helps them more intelligently do their resource allocation in terms of where they're recruiting. And I think it's better if schools are more efficient, because that means they're spending less money, which means they don't have to raise tuition as much.

Mike: I would agree that's better. We can summarize the 509s, and then I have one question for you, because now that we have the 509s, there are things I can do with them.

In summary, I would say it was good news to me—Anna Hicks-Jaco, our firm's president, is sick of me bragging about this, but I did predict that LSAT medians would be more flat than anything else, and people were freaking out, of course, last year, saying the LSAT's going to go crazy. More schools were flat. I like that. The GPA inflation is real. I've done this long enough; we're not being curmudgeonly, and I'm curmudgeonly about many things, but I can look at graphs and say the GPA inflation is real. It seems, based on LSAC's data, diversity maybe dropped a little, but not nearly as much as some people thought.

Kyle: Absolutely.

Mike: I think that's hinged on two things. What I don't think it's hinged on is what people like to say in the media: "Oh, we've developed pipeline programs." Well, pipeline programs take years to work. I think it's more that—you and I pinged on this—the Supreme Court built in the ability for law schools to still read applications and consider factors. I can't speak for minority students; I'm a white male, but if I had to guess, I think minority students are saying, "I'm incentivized to go to law school."

Kyle: That's certainly been my experience talking to prospective law students at LSAC's forums this past year. The commitment is real.

Mike: Now, looking at the volume summary, how much more difficult or competitive do you think this admission cycle might be versus last? I still think we're going to be below 20% up.

Kyle: I hate predictions, you know that.

Mike: And I love them, so...

Kyle: Yeah, I'll leave the predictions to you. I think that's a reasonable read on the data.

Mike: It's got to come down a little bit. We'll see. You're smarter than me; you don't want to make a prediction.

Kyle: Yeah. I mean—well, not "yeah" to I'm smarter than you; "yeah" to I don’t want to make a prediction—one thing on the cycle that I do want to point out is we're still fairly early. Which I think supports your point. Traditionally, the median applicant applies the first week of January, and so that's why the advice of my colleagues here at LSAC is "try to apply by the end of the year." If you do that, your chances aren't really going to go down. Afterward, your chances might go down a little bit, but not as drastically as I think people think. Scholarships are a different story; there are real pressures on schools to not over-allocate scholarships with the net tuition pressures that they face. It’s still not too late.

Mike: The other thing I would hit on is, if in the first week of January we hit the 50% mark of people submitting applications, that means 60 to 75% of admits haven't gone out yet. Some schools haven't even admitted anyone. It's obviously not one-to-one between someone being processed as an applicant and an admit decision going out. The vast majority of admit decisions are going to come in 2025, not 2024. We are very early in the cycle as far as admit decisions.

Kyle: And for anyone looking to apply next cycle, it means you don't need to rush your applications to get them out in September and October or the first week of November, right?

Mike: Amen.

Kyle: People freak out about that. It causes so much stress. I actually posted on Reddit the statistic that I really like about, "the median applicant typically applies the first week of January," and I think it was one of my most upvoted threads ever, because people were like, "Wait, really? This reduces so much of my stress." We need to repeat that over and over again.

Mike: Yeah, every time we say this, someone comes back with, "Well, I looked at the law school data, lsd.[law], and lo and behold, you do get a little bit of a bump if you apply in September or October," and my answer is: those people aren't retaking the LSAT, they're going to have one LSAT—our firm works with all of these people—and they've been working on their applications all summer long. Yeah, they're submitting a killer application in September. Guess what? Submit a killer application in December if you need to retake or need more time to work on your application; you're in no different a group.

Kyle: Yeah, exactly.

Mike: So, we have the 509 data. I can model out, and I have, how schools would be ranked if there were no methodological change. The only piece of data we don't have is the assessment scores; as you know, those are incredibly stable. Our firm's model, barring any metric or weight changes, would be very precise. And here's what's interesting to note—you've written about this and talked about this before—when you look at the raw scores, they are so tightly clustered, I think it's the most clustered I've ever seen them, that if U.S. News changes how they round, rounding down at 5 instead of up, it would totally scramble the order because of that tight clustering. There are many reasons why ordinal rankings are ridiculous. School number 25 is not 3 better than number 28, or 3 worse than number 22. Now, what I'd like to do is—I already put the top 19 on LinkedIn—take these and put them in clusters. How would you do this? Like clustered brackets versus ordinal? And you can explain ordinality.

Kyle: So ordinal just means that 1 is better than 2, 2 is better than 3, right? I hate ordinal rankings so much. I hate the U.S. News rankings so much. I think they poison the decision-making quality of both law schools and pre-law students alike. I'm simultaneously sympathetic to both the institutions and the students. The institutions have to play the game, because everyone's paying attention, and everyone's paying attention because no one wants to be the one not paying attention. And as a student, when you look around and think, "Oh, all these lawyers who went to the schools that I want to go to seem to care about this, so I should care about this," it's tough to take yourself out of that. So I'm sympathetic to both groups while simultaneously absolutely loathing these rankings. Because what they do is set you up to make a decision that just doesn't make sense for anyone's personal or professional goals.

The information out there is sufficient without the rankings to make your decision. And this is not to say that there's no difference between law schools. There are major differences between law schools in terms of their outcomes, in terms of where their reputation is strongest or weakest, or even where the reputation is "I've never heard of that school." But overall, what the rankings do is make you think that 25 is better than 28 instead of asking, "How does this school meet my goals? Are they placing graduates where I want to go work? Is this cost something I'm willing to bear? Is this debt something I'll be able to afford to repay?" If you want to work in California, but you got into a school in New York that was ranked 50 spots higher than the school you got into in California, it just almost never makes sense to go to that New York school. And the rankings tell you that you're wrong to make that decision.

Mike: The way I described it on LinkedIn: I put up the raw scores and then I put up how the schools would be grouped. For example, Yale isn't 1, Stanford isn't 1; they're in "group one," which is a group of three, and the next group is a group of like five. So I put up the groups they would be in, and I think it just highlights that point: maybe these groups are meaningful at a more moderate or even macro level. At the micro level, I would love for more people to use things like—I don't know why others don't come up with more of these—our firm's myrankbyspivey.com, where you just weight what's important to you. Is our thing perfect? No. But is it individualized? Yeah, no one's dictating what the rankings are to you. You're making your own.

Kyle: Do you want me to criticize the MyRank?

Mike: Of course. We like criticism.

Kyle: So my biggest issue with MyRank is that it's still supporting the system of ordering schools 1, 2, 3, 4, 5. And it's not as if individuals who are editing the methodology and creating their own personal ranking are actually informed enough to do so; it's just a broken concept to begin with. To me, it doesn't actually solve anything. Now, if you were to adjust it such that it put schools in tiers and had some kind of employment weight by location factored into it, then I think we might get into something that might be really great. When it comes to the rankings and the methodology, we're still limited by the fact that there are only 196 ABA-approved law schools, and that they operate on local and regional bases for the most part. There are maybe five to 20, depending on where you want to cut it, "national schools"—truly national schools that you can go to and it makes sense for most locations. But beyond that, you've got to look at the region. You've got to look at the state. And even in some cases within a state, there are patterns within upstate New York versus Manhattan, for example, or Northern California versus Southern California, or Southern Texas versus Dallas, right? The patterns within states are even quite meaningful. And so anytime you put schools in an order, I think it's just setting people up to think they've got a better grasp of the information than they actually do.

Mike: Yeah, I’m going to release some form of rankings that shows groups, before March, probably over the holidays, ideally on Christmas, and show the groups. I would push back, although I definitely hear your point—I'm not a fan of ordinal either—but almost by definition you’re making an ordinal decision, because you’re picking a law school. So you are picking, "I like this school better than the other ones I got admitted to," unless you only got admitted to one school and you still decided to go to law school. So the reason why we have this MyRank tool is at least you’re saying, "It’s better to me," and not "Bob Morse at U.S. News, who couldn’t name a 1L class, is telling me this school is better to me." Does that make sense?

Kyle: I understand your point. Where I disagree is that you're still saying, "Okay, I'm going to weight LSAT X%, GPA Y%, employment rate Z%," and I think it’s just creating this sense of control and understanding that’s just not actually there.

Mike: But you can take out LSAT and GPA.

Kyle: Sure, but no matter what you choose, it's still slicing and dicing in a way that I think is a bit incoherent for the purpose. Now, you're absolutely right, you are revealing your preference when you're choosing one school over another. But often when people are doing that, they're not weighing acceptance rates one standard deviation above the average, or two standard deviations above the average, and then aggregating that into some soup that produces a raw score and then an ordinal rank. What's happening is more vibey, right? It's, "This feels good about the environment, this feels good about the cost, this feels good about the job outcomes and the culture of the school," and then they're making their decision. So I think that is not quite ordinal, in a sense, even though there is a preference revealed with the ultimate decision.

Mike: I think another way of saying this is—and then we can end on it—I like making a list with MyRank of schools that might matter to me, like a list of 8, 12, 15 schools. But do I want personally to use just MyRank to pick the school I attend? No, I want to go visit these schools if I can.

Kyle: Absolutely.

Mike: Any final thoughts?

Kyle: I’m just shocked at the tuition, to be honest.

Mike: Yeah.

Kyle: Back to just like a final 509 point, tuition at 25 law schools is over $70,000. At five schools it's over $80,000. Now, top-end salaries have increased a lot over the years, but there is this bimodal distribution of salaries where the left-hand mode has increased too, but not nearly at the pace of that right-hand mode. If anyone's unfamiliar with this, just Google "NALP bimodal distribution." I'm just shocked at it, and it's unfortunate because I think it does limit who considers going to law school. I think we've seen an increase in the number of people who are wealthy attending law school. There are a few pieces of data that support that point of view. One is the data released by U.S. News about the percentage borrowing, and we've seen that number drop from the mid-80s about 15 years ago to as low as 70% borrowing for law school. That means 30% of people are paying these high prices without borrowing. And it shocks me, not that they're making the decision, but that so many of them are able to do so without borrowing. And it's not explained by full scholarships: the percentage of people getting full scholarships is, I think, 4.5%, and full scholarships plus a stipend might be another 2.5% or 3%.

Mike: Yeah, that's 7.5% versus 30%.

Kyle: Exactly. And so, pair that with the racial disparity in borrowing. LSSSE, which is the Law School Survey of Student Engagement, released its 2024 report maybe a week or two ago, and what they did is they looked at data over five-year periods going back to their first year, 20 years ago, in 2004. In 2004, there were only marginal racial disparities in borrowing. Then in 2009, we started to see this revealed a little bit; in 2014 it became stark; by 2019 and 2024, it's dramatic. And this correlates with, I think, the increased sophistication of law schools in gaming the U.S. News rankings. We saw a very similar pattern in the enrollment of women. In like 2001 or so we were at rough parity, and there weren't huge patterns across law schools—women made up roughly 50% at the top schools, the middling schools, and the schools that are much more local, right? Over that same time period, 2009 to 2014, we started to see a disparity where women were enrolling more at the schools with lower job outcomes and lower bar passage rates, and men were more heavily enrolled at schools that were performing better. Now, we've seen that pattern, at least, go away, I think in part because we've seen a drastic increase in the number of women applying to law school. But also, I think schools have been a little bit more intentional over the last seven or eight years about ensuring that they're enrolling gender parity-ous? That's not a real word, but I think people will know.

Mike: We can use it, though.

Kyle: But, again, this also all relates back to the U.S. News rankings and the sophistication and gaming of the rankings. And I just think it has these unfortunate patterns that are both shocking and sad and something for us to all endeavor to change.

Mike: The craziest thing is this—and it's only crazy because we live in a world where profit tends to matter—tuition is increasing, but I don't think profit, for lack of a better word, for these law schools is increasing. They're paying faculty members more money; they're discounting more to certain people with different numbers. Four years ago I saw a Columbia Law professor tweet, "It's great that my school is ranked #4, but not at the expense of a tuition that's $70,000." I so wanted to dive in there, although I wouldn't have made it out alive, and say, "Well, where do you think that money's going?"

Kyle: Right.

Mike: Number one, faculty salaries. That’s the biggest line item expense.

Kyle: It’s a systemic problem, too. You can’t go find an individual at a school to blame for these things. It’s a larger legal ecosystem of prestige obsession, of university and law school politics, of national and state politics, and it makes it really difficult to solve. Transparency has been transformative over the last 12, 13 years, but it can only take you so far. Regulation really is the only way out of this. And some of that regulation may not even be possible given faculty contracts. It’s really tough to shed faculty, which are a huge cost driver.

Mike: Your parting thought was tuition. My parting thought is, there's nothing we can do—or anyone listening to this can do—to get rid of the society of attribute-based esteem that we live in. People like faculty who are at highly ranked schools, who make more money, who drive U.S. News rankings up by peer assessment; there's nothing we can do about that. If you're an applicant, the more you can get this attribute-based pressure out and not look at any rankings ordinally, but say, "What is the best fit school for me, not based on what my parents want me to do," although maybe that should factor in some, "not based on what my friends want me to do, but based on where I want my life to take me, my career, objectively, without these psychological pressures pinging off me left and right from society," the better. Thanks, Kyle. Always a pleasure.

Kyle: Thank you!