During the last several days, Conglomerate bloggers have discussed several factors that might influence standing in the law school hierarchy. Gordon discussed scholarship as a driver of reputation and Bernie Black's critique of the new Harvard Record law school rankings, Vic alluded to an "X Factor" (and read the comments!), and Christine, citing some preliminary numbers, suggested that the X Factor might be faculty blogging. My first post also argued that SSRN may be a better barometer of faculty productivity than most people realize.
So let’s think about translating some of these thoughts into a concrete strategy. Personally, I think there is more to a great law school than just its faculty publication record, but I’ll concede that “more and better” scholarship (however one defines it) is an integral part of most law schools’ strategic plans. Thus, assuming scholarship is the goal, here is one approach that is sure to horrify many readers—a numbers-driven appointments committee.
This seemingly radical idea was first suggested by Paul Caron and Rafael Gely in a well-known book review of Moneyball. 82 Tex. L. Rev. 1483 (2004). The operative strategy mirrors Billy Beane’s famous management of the Oakland Athletics: since most baseball teams (law schools) lack the budget of the New York Yankees (Harvard), the general manager (appointments committee) needs to find baseball players (teaching candidates) who are undervalued by the market.
To illustrate how this might play out in the AALS hiring market, consider the relative attractiveness of four hypothetical candidates:
- Candidate A has a JD from Yale, was a Yale Law Journal editor, completed a prestigious judicial clerkship, and has three published articles, including a student note.
- Candidate B has the same sterling credentials as A but only one published article (no student note) and a work-in-progress.
- Candidate C has a JD from Stanford, a PhD in a social science discipline, and a work-in-progress, but no student note or other publications.
- Candidate D went to a Top 15 law school with no clerkship, was an editor on a secondary law journal, but has three published articles, including a student note.
Caron and Gely compared several background characteristics of Leiter’s 50 most-cited young scholars with those of a random sample of professors hired at roughly the same time at comparably ranked law schools. Though their methodology (simple t-tests of means) is too simplistic to yield definitive results, it is worth noting that only two variables had any predictive power for “more and better” scholarship: (1) number of articles before the first tenure-track job, and (2) publication of a student note. Judicial clerkships, law review membership, rank of graduating institution, other graduate degrees, and teaching experience were not significant predictors.
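For readers curious what a "simple t-test of means" looks like in practice, here is a minimal sketch. All the numbers are invented for illustration; they are not drawn from Caron and Gely's actual dataset.

```python
# Hypothetical sketch of a pooled two-sample t-test on mean pre-hire
# article counts, comparing a "most-cited" group to a comparison group.
# All data below are invented for illustration only.
import math
import statistics

most_cited = [3, 2, 4, 3, 5, 2, 4, 3]    # articles before first tenure-track job
comparison = [1, 0, 2, 1, 1, 0, 2, 1]

n1, n2 = len(most_cited), len(comparison)
mean1, mean2 = statistics.mean(most_cited), statistics.mean(comparison)

# Pooled sample variance (classic t-test assumes equal variances)
pooled_var = ((n1 - 1) * statistics.variance(most_cited) +
              (n2 - 1) * statistics.variance(comparison)) / (n1 + n2 - 2)

t_stat = (mean1 - mean2) / math.sqrt(pooled_var * (1 / n1 + 1 / n2))
print(f"t = {t_stat:.2f}")  # a large |t| suggests the group means differ
```

The limitation the post flags is visible here: a t-test looks at one variable at a time, so it cannot tell you whether article counts predict citations after controlling for, say, clerkships or school rank.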
Under the typical hiring committee heuristics, however, Candidate A will get lots of AALS interviews, but there will be significant disagreement over B, C, and D. Many committees will prefer B and C because of pedigree (Top 3 law school, PhD, clerkship, law review, etc.). But Caron and Gely's statistics suggest that A and D, because of their publication records, are better bets than B and C. Thus, a numbers-driven appointments committee would seek out more D's and fewer B's and C's. (Note that many schools will vainly chase after too many A's.)
I suspect that many faculty members would have a hard time biting the credentials bullet—even if that is clearly the source of the market inefficiency. (Remember the catcher in Moneyball with great stats that the scouts didn’t want because he was “huge in the [posterior]”?)
Caron and Gely ultimately need a more sophisticated methodology (e.g., a multivariate regression model with a larger sample and more refined control variables) to better demonstrate the potential of a Moneyball strategy. But their initial analysis suggests that the traditional heuristics may indeed be overvaluing and undervaluing candidates in predictable ways. Further, the Black & Caron study suggests that SSRN downloads may have some ability to predict future citations. See p. 47, tbl. 10.
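To make the "more sophisticated methodology" suggestion concrete, here is a sketch of a multivariate regression on simulated data, where a citation score is regressed on several pre-hire characteristics at once so that each coefficient is estimated holding the others constant. The variable names, coefficients, and data are all hypothetical assumptions, not anything from the Caron and Gely study.

```python
# Hedged sketch: ordinary least squares on simulated data, regressing a
# hypothetical citation score on several pre-hire characteristics at once.
# In the simulated "truth" below, only article count and the student note
# matter; the regression should recover that, assigning near-zero weight
# to clerkships and other degrees.
import numpy as np

rng = np.random.default_rng(0)
n = 200
articles  = rng.poisson(1.5, n)      # articles before first tenure-track job
note      = rng.integers(0, 2, n)    # published a student note?
clerkship = rng.integers(0, 2, n)    # judicial clerkship?
phd       = rng.integers(0, 2, n)    # other graduate degree?

# Simulated outcome: citations driven only by articles and the note
citations = 5 + 4 * articles + 6 * note + rng.normal(0, 3, n)

X = np.column_stack([np.ones(n), articles, note, clerkship, phd])
beta, *_ = np.linalg.lstsq(X, citations, rcond=None)
for name, b in zip(["intercept", "articles", "note", "clerkship", "phd"], beta):
    print(f"{name:10s} {b:6.2f}")
```

Unlike the simple t-tests, this design lets each credential compete with the others simultaneously, which is what it would take to show that publication record, rather than pedigree, is doing the predictive work.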
If Caron & Gely did collect the data for a refined study, would they publish the results in a law review (or, worse yet, a blog)? There might be more at stake here than a burst of SSRN downloads.