Image Credit: Pexels

The U.S. News & World Report top law schools list is arguably the best-known ranking system in legal education. The list, compiled as a weighted average of twelve indicators of quality, includes quantitative indicators such as median undergraduate GPA and each university’s bar passage rate, as well as qualitative indicators such as a peer assessment score that collates law school ratings by faculty. The list’s methodology rests on factors its creators believe students should look for in a law school. But do the results of the U.S. News & World Report top law schools list reflect students’ actual enrollment choices?

In “A Revealed-Preferences Ranking of Law Schools,” an article forthcoming in the Alabama Law Review, Brian L. Frye, Associate Professor of Law at the University of Kentucky College of Law, and Christopher J. Ryan, Jr., 2017-18 American Bar Foundation Fellow (Chicago, IL), consider where the U.S. News & World Report rankings may fall short and offer an alternative approach to ranking law schools based on the revealed preferences of students. Frye and Ryan argue that U.S. News & World Report has likely remained the primary law school assessor as a result of “first-mover advantage,” but that its system is by no means ideal. They hope their alternative approach will offer a truer depiction of what students need from a law school and how they choose among competing offers of admission.

Q&A with Brian L. Frye and Christopher J. Ryan, Jr.

What was your impetus for exploring the law school ranking system and developing an alternative approach?

BLF: Other ranking systems try to provide an objective assessment of quality, either by measuring outputs, like the Above the Law rankings, or by measuring who-knows-what, in the case of the U.S. News rankings. We thought it would be interesting to provide a subjective ranking system by asking what student choices say about what students actually want, as opposed to what they should want.

Why do the U.S. News rankings incite contention? Has your article brought any of this to light?

CJR: For years, the U.S. News system has benefited from the first-mover advantage, despite its use of what many view as a flawed methodology. For example, 40 percent of the U.S. News ranking comes from peer review in some form. I demonstrated in an earlier paper that peer review ratings of law schools are extremely time-invariant and not, in fact, based on year-to-year changes in law school quality. The U.S. News methodology relies on this virtually time-invariant measurement of “quality,” and it also counts measurements such as the total number of volumes and titles in a school’s law library at the end of the fiscal year, which is a poor proxy for a law school’s faculty resources, let alone its quality. That is precisely why the U.S. News rankings incite so much contention. My findings in that earlier paper were the first impetus for my exploration of alternative ranking systems. The article Brian and I wrote explores one such alternative ranking system, based on the revealed preferences of law students.
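
As a toy illustration of what “time-invariant” means here (this is not the analysis from Ryan’s earlier paper, and the scores are invented), the Python sketch below correlates two consecutive years of hypothetical peer assessment scores: when ratings barely move, the year-over-year correlation sits near 1.

```python
# Toy illustration of peer-review "time-invariance" (hypothetical data,
# not the analysis from the earlier paper mentioned above).
from statistics import correlation  # requires Python 3.10+

# Hypothetical peer assessment scores (U.S. News uses a 1-5 scale)
# for the same five schools in two consecutive ranking years.
peer_2017 = [4.8, 3.9, 3.1, 2.6, 2.0]
peer_2018 = [4.8, 3.8, 3.1, 2.7, 2.0]

# A correlation near 1 means the peer scores carry almost no
# information about year-to-year changes in school quality.
print(round(correlation(peer_2017, peer_2018), 3))  # -> 0.998
```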

BLF: Legal academia has a love/hate relationship with the U.S. News rankings. Everyone complains about them, but everyone wants to see their school rise in the rankings. It’s the same with law reviews. Law professors complain about the submissions process, but are still happy when their article is published by a high-ranking journal. People are right to say the U.S. News rankings are mostly nonsense, but they’re wrong to assume that rankings can’t provide useful information to students. Our article and ranking system suggest that participants in the marketplace for legal education may not be fully taking demand into account.

Why do you think alternative rankings have failed to challenge the market dominance of the U.S. News rankings? What’s needed to do so?

BLF: It depends on the ranking system! Some get no traction because they are transparent nonsense, like the infamous Cooley rankings, which curiously gave Cooley a phenomenally high score. Others fail because they are based on criteria that are irrelevant to students, like SSRN downloads of professors’ articles. Most actual law students don’t read or care about law review articles, even when they are editing them. Prospective law students certainly don’t. If students were rational economic actors, they would pay close attention to systems like the Above the Law rankings, which are based on outputs that ought to matter, like bar passage and employment. Our ranking simply asks what students actually choose to do. The fact that it diverges significantly from other rankings suggests that prospective students may care about factors that are not incorporated in other ranking systems.

CJR: Brian’s lawyerly answer of “it depends” is actually quite accurate. It is really hard to upset the gold standard, and for better or worse, U.S. News is the gold standard. As I mentioned earlier, U.S. News was the first mover in this area, and so it reaps the benefits, as most first movers do. Several respectable alternative rankings have been developed over the years by members of the academy (like Black and Caron’s ranking of law schools based entirely on law faculty quality, as proxied by SSRN download and citation counts, and several other ranking systems proposed by Brian Leiter) and especially by other publications, such as Above the Law (which, like U.S. News, has a fairly involved methodology, based mostly on the outcomes of a law school’s graduates) and Vault (which bases its ranking entirely on peer review from practicing attorneys and on current student reviews). The best alternative rankings, and I would argue that Above the Law has a very good one, have failed to dislodge the U.S. News rankings from their prime position simply because the legal academy is still reactive to the U.S. News rankings and because prospective law students know to look at U.S. News but may not know where to find another ranking system. A fair challenge to that market dominance is particularly hard for “new” alternative ranking systems, but I am optimistic that at least one, and perhaps many, can mount it.

How do you think law school rankings should be handled? What should be the primary aim(s) of such lists?

BLF: A ranking system should be designed to present useful information to prospective students. Ideally, a ranking system will be as transparent as possible, not just about its methodology, but also about what kind of information it is designed to convey and, by extension, what kind it is not. In other words, let a thousand flowers bloom! Different ranking systems are good for different purposes.

CJR: I agree. Rankings should be democratic. The primary aim of law school rankings, in my view, should be their signaling function: to indicate quality. A corollary of that function is to mitigate informational asymmetries between providers and consumers of legal education. Ultimately, what a consumer gets out of a ranking system should be whatever that consumer finds relevant. The consumer should be able to weight the categories of law school characteristics according to what the consumer values. Why not? Our ranking, a sort of revealed preference of law students based on the currency of their entering credentials, is just one of many possible ways of operationalizing that notion.

How does your proposed ranking system differ from the U.S. News rankings? What does it take into account?

BLF: Our ranking system simply asks what choices prospective students actually make, rather than what choices they “should” make. In practice, law schools make admission decisions based almost entirely on grades and LSAT scores. So our ranking system effectively asks: what choices do students actually make when confronted with admission offers from competing schools?

CJR: As such, we weight six components equally, at one-sixth each: the 25th percentile, median, and 75th percentile of each measure, which together comprise the composite score we assign to each law school for ranking purposes. While there is not much difference between our ranking and the U.S. News rankings at the very top (only one T-14 law school moved out of that coveted territory) or at the very bottom (where most of the for-profit law schools fall), there is considerable difference in every other tier of law schools. This tells us that, while the U.S. News rankings may identify the overperformers and underperformers fairly well, their reliance on peer review may actually work to the detriment of several very good law schools (or to the benefit of below-average law schools) that are excellent (or unsuccessful, respectively) at attracting top talent in a diminished legal education market.
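
To make the arithmetic concrete, here is a minimal Python sketch of a six-component, equal-weights composite like the one described above, using invented numbers. The interview specifies only that the 25th, 50th, and 75th percentiles of the two entering credentials each carry a one-sixth weight; standardizing each component across schools before averaging is our assumption for illustration, not necessarily the paper’s exact normalization.

```python
# Minimal sketch of a six-component, equal-weights composite score.
# The data and the z-score normalization are assumptions for illustration.
from statistics import mean, stdev

# Hypothetical entering-class credentials: (25th pct., median, 75th pct.)
schools = {
    "School A": {"gpa": (3.4, 3.6, 3.8), "lsat": (158, 161, 164)},
    "School B": {"gpa": (3.2, 3.5, 3.7), "lsat": (155, 159, 162)},
    "School C": {"gpa": (3.6, 3.8, 3.9), "lsat": (165, 168, 171)},
}

def composite_scores(schools):
    scores = {name: 0.0 for name in schools}
    # Six components: 25th/50th/75th percentile of GPA and of LSAT.
    for measure in ("gpa", "lsat"):
        for idx in range(3):
            column = [creds[measure][idx] for creds in schools.values()]
            mu, sigma = mean(column), stdev(column)
            for name, creds in schools.items():
                # Each standardized component carries a 1/6 weight.
                scores[name] += (creds[measure][idx] - mu) / sigma / 6
    return scores

# Rank schools by composite score, highest first.
ranked = sorted(composite_scores(schools).items(),
                key=lambda kv: kv[1], reverse=True)
for rank, (name, score) in enumerate(ranked, start=1):
    print(rank, name, round(score, 3))
```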

Do you intend to promote this alternative ranking approach as a potential competitor to the U.S. News rankings? What are your plans for it?

CJR: We have been blown away by the positive response our article has received, and we believe this underscores the consumer appetite for an alternative to the U.S. News rankings. As such, we have discussed updating the rankings annually, but that’s as far as our discussion has gone. We have no immediate plans to promote it.

BLF: No, because it isn’t designed to replace ranking systems based on outputs and doesn’t provide the right kind of information to do so. Instead, it provides information about whether students are actually making decisions based on the factors other ranking systems consider. To the extent they are not, and our ranking system diverges from those systems, it suggests that prospective students care about factors that other ranking systems are not measuring or taking into account.

Update: Frye and Ryan have released a new article, “The 2018 Revealed-Preferences Ranking of Law Schools,” which compares 2017 and 2018 outcomes.