November 08, 2005
Why SSRN might be a superior measure of faculty productivity
Posted by Bill Henderson

Many thanks to Gordon, Christine and Vic for inviting me to guest blog at Conglomerate. I really appreciate the opportunity. Without further ado, this post argues that SSRN downloads may be a much better proxy for faculty productivity and performance than most people realize. Let me get a couple of ideas onto the table. 

Bernie Black’s and Paul Caron’s Ranking Law Schools: Using SSRN to Measure Scholarly Performance systematically compares SSRN download data with three other measures of faculty productivity: (1) citation counts, (2) quantity and placement of faculty scholarship, and (3) reputational surveys. Despite “blogger bias” and other factors that can skew SSRN downloads in ways unrelated to merit (however one defines it), Black and Caron show that SSRN downloads are, nonetheless, highly correlated with the traditional measures of faculty performance.

In a short commentary on the Black & Caron study, Ted Eisenberg acknowledges the innovation and efficiency of SSRN’s system. But he also points out that SSRN downloads have lower intercorrelations with the reputation, publication, and citation metrics when the cohort is limited to Top 10 to Top 20 law schools—i.e., peer institutions that actually “compete for students and faculty.” Eisenberg therefore contends that SSRN downloads are missing key information that is captured by the other, more established measures (e.g., citation counts use Westlaw’s TP-ALL, which has broader subject matter coverage).

Here is a comparison of intercorrelations drawn from one of Eisenberg’s key tables.

Measure of Scholarly Performance   | Correlation with SSRN Downloads | Correlation with Leiter Reputation
-----------------------------------|---------------------------------|-----------------------------------
SSRN Downloads                     | 1.00                            | 0.55*
Leiter Reputation                  | 0.55*                           | 1.00
US News Reputation                 | 0.51*                           | 0.96*
Lindgren & Seltzer Top 20 Journals | 0.69*                           | 0.83*
Leiter Top 10 Journals + books     | 0.45*                           | 0.78*
Eisenberg & Wells Citation Count   | 0.50*                           | 0.90*
Leiter Citation Count              | 0.61*                           | 0.88*

* p ≤ .05

Obviously, the SSRN correlation coefficients are lower than those for Leiter’s reputation ranking. The other non-SSRN intercorrelations are similarly high. (See Eisenberg, Table 1C.)
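For readers who want to see how such figures are produced: the coefficients in the table are ordinary Pearson correlations computed across schools. A minimal sketch of the computation, using hypothetical per-school numbers (not the actual Black & Caron or Eisenberg data):

```python
# Sketch: computing an intercorrelation like those in Eisenberg's table.
# The per-school values below are hypothetical illustrations, NOT real data.
import numpy as np

# One entry per (hypothetical) school, ordered the same way in both lists.
ssrn_downloads = [5200, 4100, 3900, 2500, 1800, 900]
leiter_reputation = [4.8, 4.6, 4.1, 3.9, 3.2, 2.7]

# np.corrcoef returns a 2x2 correlation matrix; the off-diagonal entry
# is the Pearson correlation between the two measures.
r = np.corrcoef(ssrn_downloads, leiter_reputation)[0, 1]
print(f"r = {r:.2f}")
```

Each cell in the table is one such pairwise coefficient, computed over whichever cohort of schools the study includes.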

So Eisenberg may be right: SSRN may be missing key information. On the other hand, it is also possible that the other measures of faculty performance—reputation, publications, and citation counts—all share an irrelevant confound that inflates their intercorrelations.

Consider the following causal chain:

  1. Faculty reputation is a “sticky” variable. The first US News ranking in 1987 (limited to the Top 20) was based on a survey of 183 law school deans. Eighteen years later, the same 16 schools are still the Top 16. They’ve played musical chairs a bit, but the inner circle has stayed the same.
  2. Law review editors are relatively risk averse and, all else equal, will favor professors from more prestigious law schools. This is the well-known letterhead bias. And, as just noted, the pecking order hasn’t changed too much over time.
  3. Some law professors, especially those trying to write their way up the hierarchy, will be reluctant to cite articles in less prestigious journals. If professors at the Top 20 (30, 40?) publish in the Top 20 (30, 40?), then many professors will, consciously or unconsciously, favor citations to the Top 20 (30, 40?).

If such a prestige bias exists (#1 affects #2, #2 affects #3, #3 reinforces #1), then the non-SSRN intercorrelations are destined to be inflated.
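The inflation argument can be sketched numerically. Under a purely illustrative model (my own assumption, not fitted to any real data) in which reputation, publications, and citations all load on a shared prestige factor while downloads track mostly underlying quality, the non-SSRN measures intercorrelate more strongly with one another than any of them does with downloads:

```python
# Illustrative simulation of the prestige-bias argument. The model and all
# numeric weights are hypothetical assumptions, not estimates from real data.
import numpy as np

rng = np.random.default_rng(0)
n = 200  # number of hypothetical schools

quality = rng.normal(size=n)                            # true scholarly quality
prestige = 0.7 * quality + 0.7 * rng.normal(size=n)     # sticky, quality-related

def measure(prestige_weight, noise=0.3):
    """A performance measure mixing quality, the prestige confound, and noise."""
    return quality + prestige_weight * prestige + noise * rng.normal(size=n)

reputation = measure(2.0)            # heavily prestige-laden
publications = measure(1.5)          # letterhead bias
citations = measure(1.5)             # citation prestige bias
downloads = measure(0.2, noise=1.0)  # largely free of the prestige channel

corr = lambda a, b: np.corrcoef(a, b)[0, 1]
print("reputation vs citations:", round(corr(reputation, citations), 2))
print("downloads  vs citations:", round(corr(downloads, citations), 2))
```

In this toy setup the shared confound alone drives the reputation-citation coefficient above the downloads-citation coefficient, which is the pattern Eisenberg's table would show even if downloads were the less contaminated measure.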

In contrast, SSRN downloads may not suffer from the same prestige bias, or at least not to the same degree. If the abstract makes an intelligent pitch, the article is in your field, and it has not yet been published by a law review (i.e., there is uncertainty as to its substantive or signaling quality), there is a strong incentive to download it. Timeliness, subject matter relevance, and anonymity encourage scholars to read the work of others at less prestigious schools in order to remain competitive.

In such an environment, SSRN downloads can be a good way to identify underplaced scholars. Further, law schools with SSRN downloads far in excess of their US News rankings (see Black & Caron, table 15) may have figured out markers for productive scholars that more elite schools tend to ignore. And all the data is there, right before our eyes.

In my next post, I will tie this topic into Paul Caron’s and Rafael Gely’s insightful Moneyball analysis.

Law Schools/Lawyering, SSRN


Comments (22)

1. Posted by Michael Guttentag on November 8, 2005 @ 19:17 | Permalink

My chief concern with SSRN, that I expect others share, is that the number of downloads does not seem to provide a reliable indication of the level of scholarly interest in an article. Have you ever done a calculation comparing the total number of downloads and the number of legitimate scholars in a particular field? I look at the download numbers for some papers (not my own, of course), and wonder: do this many people really have serious scholarly interest in this topic, or are they downloading this paper for some other reason? If we had a system that could identify and weigh “appropriately,” whatever that means, the difference between Bernie Black or Bill Henderson downloading a paper, and a student downloading a paper because it was assigned reading or to cite in a term paper, that would really be something. Call it Google meets SSRN.


2. Posted by Gordon Smith on November 8, 2005 @ 22:20 | Permalink

Bill, Another concern, related to Michael's, is the potential for manipulation. Is there any other measure of scholarly success that is so easily manipulable as SSRN?

* I could email everyone in my law school and write, "Support me and support the University of Wisconsin by downloading my article!" That's potentially hundreds of extra downloads right there, if the students get into the act.

* Then we could enlist the alumni in the effort to raise the school's rankings. That's gotta be worth hundreds or thousands of downloads.

* Or I could put a link on my blog that says, "Girls! Girls! Girls!" when it's actually one of my papers.

* Or I could ask my mother to download each of my papers once a day until she just can't spare the effort.

The reason we think this doesn't happen already is that nobody cares that much about SSRN rankings, but with increased exposure and use, methods to exploit the rankings will follow.


3. Posted by William Henderson on November 9, 2005 @ 14:34 | Permalink

Michael, Gordon,

You both identify legitimate shortcomings of the SSRN ranking system. If the unit of analysis were individual faculty members, the potential for error would be higher than when it is the overall faculty, since it is reasonable to assume that serious gaming is, for the most part, randomly distributed among law schools.

Yet, when the unit of analysis is overall faculty, Black and Caron have a very strong argument: If SSRN has truly serious biases--so serious that it cannot be taken seriously--then its correlations with other established measures of faculty quality and productivity (Leiter reputation, US News reputation, the leading citation count studies, the leading publication count studies) would be relatively low. But instead, the correlations are remarkably high and quite comparable across ranking systems.

Eisenberg's critique was an "apples-to-apples" comparison of Top 20 schools that actually compete with each other. Yes, the SSRN intercorrelations are less powerful in this range, but that is exactly the same range where the prestige bias I outlined above is likely to operate with the greatest vengeance. The more valuable information is which schools outside the Top 20 are outperforming their current US News ranking. From there, it may be possible to reverse-engineer effective hiring strategies.

It is easy to point out SSRN’s flaws. But the more relevant analysis is weighing its weaknesses against its strengths. If we understand these relationships, it becomes apparent that SSRN has several advantages that the other ranking systems lack. We can criticize pliers for not being a wrench or a ratcheting socket, but if we possess all of these tools and knowledge of how to use them, we are better equipped to build something substantial.


4. Posted by AnonNotAProf on November 9, 2005 @ 18:58 | Permalink

New poster here. My father is a university professor in a science field, and so I grew up around academics, and I married a history professor, but I ended up the black sheep that went to law school.
Coming from this background, I find discussions about law school rankings and prestige to be fascinating but somewhat strange. And I have something of a chip on my shoulder about it.

As you all know, if one goes to graduate school to get a Ph.D. in the humanities or sciences, you become a protégé of your graduate advisor, doing similar work under his/her tutelage and learning his/her wisdom, as a trainee for the academic priesthood. Particle physicists in major research universities, for example, conduct a good portion of the research on particle physics, and are primary producers of knowledge in the field. It is this knowledge - as captured in peer-reviewed journals - that is the currency of the profession of particle physics; the whole point (or product) of the endeavor.
In such a system, a school's reputation is built on the quality of the professors in terms of research, grants, and productivity in producing this "product" that defines the field (published physics research for physics, published historical scholarship for history, etc.). A good reputation in turn encourages prospective grad students with the best credentials to apply to work for that professor; those students then graduate and populate other universities, which in turn try to attract young professors who attract grants and are the most productive, thus repeating the cycle. To join the priesthood of the academic discipline, you must study under a priest.
In my case, I applied to law school in part because I had an interest in a particular legal subject (let's call it Subject Z). Thinking that "reputation" of law school was like "reputation" in "academic" disciplines, I dutifully pored over the research interests and publications of the professors described in the admissions brochures. I had the stupid idea that if I went to a law school that had a professor who was the most noted scholar in the area of Subject Z, this would be a good way for me to likewise become a scholar -- and possibly a professor -- in the area of Subject Z. I even chose a lower-ranked law school over a higher-ranked one, on the basis of what research the professors were doing and where they were publishing in the area of Subject Z. Of course, that was silly and stupid on my part: The fact that Professor X had published law review articles about Subject Z in the most prestigious journals did not in any way mean that I as a student would have any meaningful opportunity to study or become a protege under a professor in subject Y, much less become a professor myself in that area.
In law school I realized that law professors aren't really the "high priests" of law in the way, for example, that physics professors at major research universities are the high priests of physics. Whereas an article in a peer reviewed scientific journal embodies the ultimate "product" of the endeavor known as "scientific research," the content of a law review article is not the ultimate "product" of the endeavor known as "law."
The analogous "priests" of law, it would appear, are judges. Judges, it seems to me, and especially appellate judges, are the ones who produce the ultimate intellectual product of this field of endeavor called "law." So why not base law school rankings on the number (and appellate level) of alumni who become judges?
(And to finish my story, I ended up as neither a professor nor a judge.)


5. Posted by AnonNotAProf on November 9, 2005 @ 19:05 | Permalink

Oops. Last sentence of 4th paragraph should read, ". . . or become a protege under a professor in subject Z, much less become a professor myself in that area."


6. Posted by Gordon Smith on November 9, 2005 @ 20:31 | Permalink

Bill,

This is really silly. You and Bernie and Brian are missing a huge and obvious point: the correlations between SSRN and other measures of faculty quality may be "remarkably high and quite comparable" now, but if SSRN threatened to become a serious force in the ratings race, the gaming (which is relatively low cost) would destroy its credibility.


7. Posted by William Henderson on November 9, 2005 @ 20:59 | Permalink

Gordon,

Two quick comments:

1) Read the Black & Caron study. It is harder to "game" SSRN than you might think--they have a whole section discussing this point. Downloading an article has an opportunity cost (slight but real) and a monetary cost (usually paper to print it out). I suspect many "web ads" for SSRN articles produce abstract reads but not much more, especially as they become more pervasive. Why should students and alumni be chumps for some egocentric law professor? People download because (a) they learned the article exists, and the web helps that (e.g., Bainbridge.com), and (b) the article looks interesting. It is hard to fudge the latter factor in an abstract.

2) SSRN downloads are becoming more important because law school competition is becoming more intense. Thus, I think your use of the word "if" is misplaced.

AnonNotAProf,

You lay out the peculiar differences between law and other academic disciplines (maybe law is a trade?) in a very clear and interesting way. And your law school selection process is a great story that offers, to me anyway, some valuable perspective.

Re judges being the high priests of the profession: Judge Harry Edwards was a professor at Michigan before his appointment to the DC Circuit in the late 1970s. Edwards has written several articles on the "growing disjunction" between the legal profession and the legal academy. (His most famous essay on this topic appeared in the Michigan Law Review in 1992; it led to a symposium at Michigan the following year.)

In a nutshell, he thinks (present tense; he repeated the thesis two weeks ago when I heard him speak) that the ascendancy of "law and ..." scholarship has largely eclipsed the professional rewards for writing solid doctrinal analysis. Edwards believes that a large proportion of the current professoriate has contempt for the practice of law. As a result, professors write for other professors and not for practicing lawyers. And he also says that BigLaw partners are too focused on profits, to the detriment of professional values.

So, if there is anything to Edwards' thesis, law professors are unlikely to warm to your proposal. But I would be interested in seeing the data on judgeships and law schools tabulated. It seems to me that there would be a lot of noise in the data, because politics, over and above qualifications (however one defines them), is no doubt a major factor in judicial appointments.


8. Posted by Gordon Smith on November 9, 2005 @ 21:22 | Permalink

I did read it, and I think the section on gaming is wishful thinking. Shall we do an experiment right here on the blog to prove it? Would they kick me off SSRN if I could improve my downloads overnight?


9. Posted by William Henderson on November 9, 2005 @ 21:29 | Permalink

Gordon,

I would be interested in any good experiment you can dream up to test the gaming risk. I am willing to concede this point with good evidence. Game away. And if you are right, I hope Bernie is paying attention.

How would you control for interesting scholarship? Gordon, your work might just be worth reading.


10. Posted by Gordon Smith on November 9, 2005 @ 21:38 | Permalink

Bill, Let me sleep on this. I don't want to get Bernie angry because I like him. But I am quite confident that I could improve my own ranking by at least a couple hundred spots quite easily ... with a little help from my friends.

