And Wharton finishes second. The deal is that Wharton is always supposed to be in the top three, so that's nice. The rankings are weird: the methodology changes each time, which is how Duke got put first this year, which is incorrect. Most of the rankings, moreover, are based on student surveys (hmmm) and employer surveys, again with widely varying samples over time. Social science, this is not.
However, perhaps the oddest single table in the rankings comes from what you'd think would be its most objective metric. BW counted the number of articles written by tenure-track faculty in 20 journals, adopted a weird point system for apportioning credit based on length and co-authors (but not totally weird), and must have expressed the final number per capita (it's impossible to imagine that Wharton wouldn't finish first on the list otherwise; the school has 240 tenure-track faculty, way more than anyone else).
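If you want to see how that kind of per-capita tally works mechanically, here is a minimal sketch. BW doesn't publish its exact point system, so the credit rules below (one point per full-length article, half a point for shorter pieces, credit split evenly among co-authors, school total divided by tenure-track headcount) are my own illustrative guesses, not BW's formula.

```python
# Illustrative sketch of a per-capita article count (NOT BW's actual formula).
# Assumed rules: 1 point per article in the journal list (0.5 for a shorter
# piece), credit split evenly among co-authors, school total divided by the
# number of tenure-track faculty.

def article_credit(n_coauthors: int, full_length: bool = True) -> float:
    base = 1.0 if full_length else 0.5
    return base / max(n_coauthors, 1)

def per_capita_score(articles, tt_faculty: int) -> float:
    # articles: list of (n_coauthors, full_length) tuples for one school
    total = sum(article_credit(n, full) for n, full in articles)
    return total / tt_faculty

# Hypothetical example: a big faculty can publish more in absolute terms yet
# still trail a smaller one once the total is divided per capita.
big = per_capita_score([(1, True)] * 60 + [(3, True)] * 30, tt_faculty=240)
small = per_capita_score([(1, True)] * 25 + [(2, True)] * 10, tt_faculty=70)
print(round(big, 3), round(small, 3))  # 0.292 vs. 0.429
```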
Here are the top six (Ohio State is also in the top ten):
1. UC San Diego
2. Duke
3. Cornell
4. Wash. U.
5. Texas at Dallas
6. Maryland
I'm not really sure what to make of such a diverse array of schools, which don't include Chicago, Stanford, &c, &c. It's only through the survey responses that BW ends up with a ranking that looks somewhat "normal."
P.S. If you're wondering what counts as a good business school journal, BW's list isn't bad. It includes:
the Harvard Business Review, Journal of Marketing, Operations Research, Information Systems Research, Journal of Finance, American Economic Review, Journal of Accounting Research, Journal of Financial Economics, Management Science, Academy of Management Review, Journal of Marketing Research, Strategic Management Journal, Accounting Review, Academy of Management Journal, Production & Operations Management, Journal of Business Ethics, Journal of Consumer Research, Review of Financial Studies, Administrative Science Quarterly, Marketing Science.
Permalink | Rankings | Comments (0) | TrackBack (0) | Bookmark
A perspicacious reader tipped me off that Above the Law has a new top law school list out. A few miscellaneous thoughts.
1. Does the world really need another law school ranking system? ATL says what differentiates its methodology is an emphasis on outputs, not inputs. It has a nifty graphic rejecting traditional inputs like entering students' LSAT scores and GPAs in favor of "real law jobs, quality full time positions, school costs, and alumni satisfaction." OK, I kind of get that. Measuring outputs in general is the holy grail for law schools, something everyone wants to do but no one quite knows how to do.
2. How exactly do those outputs get measured and weighted? Here's the breakdown (again, go to ATL for the graphic); a rough sketch of how the weights combine follows the list:
- 7.5% SCOTUS Clerks (adjusted for the size of the school)
- 7.5% Active Federal Judges (currently sitting article III, adjusted for the size of the school)
- 10% ATL Alumni Rating (nonpublic, a product of the ATL insider survey)
- 15% Education Cost (total cost, adjusting the score in some cases for cost of living)
- 30% Employment Score (counting full-time, long-term jobs requiring bar passage, excluding solo practitioners and school-funded positions)
- 30% Quality Jobs score (placement with an NLJ 250 firm plus federal clerkships)
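For the curious, here's the rough sketch promised above of how those weights combine into a composite score. Only the weights come from ATL's published breakdown; how ATL actually normalizes each component (clerks per capita, cost into a score, and so on) isn't spelled out, so the inputs below are hypothetical numbers on a 0-100 scale.

```python
# Minimal sketch of ATL's composite, using the published weights.
# The component scores are hypothetical 0-100 values; ATL's actual
# normalization of each component is not spelled out.

ATL_WEIGHTS = {
    "scotus_clerks": 0.075,
    "federal_judges": 0.075,
    "alumni_rating": 0.10,
    "education_cost": 0.15,
    "employment_score": 0.30,
    "quality_jobs": 0.30,
}

def composite(components: dict) -> float:
    assert abs(sum(ATL_WEIGHTS.values()) - 1.0) < 1e-9
    return sum(ATL_WEIGHTS[k] * components[k] for k in ATL_WEIGHTS)

# Hypothetical school: strong employment numbers, middling everything else.
example = {
    "scotus_clerks": 40,
    "federal_judges": 55,
    "alumni_rating": 90,
    "education_cost": 70,
    "employment_score": 85,
    "quality_jobs": 60,
}
print(composite(example))  # 70.125
```

The point of the exercise is just that the two jobs measures swamp everything else: together they account for 60% of the score.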
3. How did my school do? Well, Georgia Law does well--many of you might think remarkably well. I'm less surprised for two reasons. First, Georgia looks pretty good according to these output measures:
- We are cheap. Georgia residents pay $16,506. (Non-residents pay more than twice that, but qualify for resident status after a year.)
- We have sent 6 graduates to clerk at the Supreme Court in the past 9 years.
- Although this market has been brutal, I think our students have fared relatively well, especially because their relatively low debt burden gives them more flexibility in choice of job.
- Our alumni have an almost cult-like love of the school.
And second, just as most CEOs will tell you their stock is undervalued, most professors probably think their schools are undervalued. Admittedly, I bring some bias to the table!
Here are the top 20 (see here for the full 50):
1. Yale Law
2. Stanford Law
3. Harvard Law School
4. University of Chicago Law
5. University of Pennsylvania Law
6. Duke Law
7. University of Virginia Law
8. Columbia Law
9. University of California, Berkeley
10. New York University
11. Cornell Law School
12. University of Michigan
13. Northwestern Law
14. University of Texas at Austin
15. Vanderbilt Law
16. Georgetown Law
17. University of California, Los Angeles
18. University of Notre Dame Law
19. University of Georgia Law
20. University of Southern California, Gould
h/t Haskell Murray
Permalink | Law Schools/Lawyering| Rankings | Comments (0) | TrackBack (0) | Bookmark
Last week, Christine revisited the debate over ranking law schools by comparing one labor-intensive study evaluating schools of education with the US News approach for law schools. This reminded me of a different attempt to rank undergraduate schools by Washington Monthly (see their 2010 rankings released over the summer here; see a NY Times Economix blog on the ranking criteria here). Washington Monthly attempts to provide students with information missing from US News (and sell magazines) by focusing on the following metrics, among others:
- Measuring how many students and recent graduates pursued public service (by using metrics like Peace Corps and ROTC participation as proxies); and
- Measuring the caliber of research at an institution (this measure differs according to the type of institution being ranked – e.g. liberal arts college vs. national – but includes metrics like dollar figures on total research expenditures).
But what is most interesting to me about the rankings is the attempt to measure “Social Mobility.” The editors state that they are interested in measuring the success of schools at “recruiting and graduating low-income students.” The introduction to the rankings explains more provocatively that the purpose of the guide is to ask:
. . . not what colleges can do for you, but what colleges are doing for the country. Are they educating low-income students, or just catering to the affluent? . . .
To measure social mobility, the rankings list the percentage of students receiving Pell Grants and then compare a predicted graduation rate (based on the magazine’s formula, which factors in the percentage of Pell Grant recipients and SAT scores) with the actual graduation rate. It is hard to tell how much stock to put in the magazine’s formula given the web site’s disclosure on methodology, but the very attempt to develop this metric has a number of fascinating implications for law schools.
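In spirit, the metric is just a residual: predict a graduation rate from Pell share and SAT scores, then credit schools that beat the prediction. Since the magazine doesn't disclose its formula, the sketch below uses a plain least-squares fit on invented data purely as a stand-in.

```python
# Sketch of a Washington Monthly-style "social mobility" residual.
# The magazine's actual formula isn't disclosed; an ordinary least-squares fit
# of graduation rate on Pell share and median SAT stands in for it here.
import numpy as np

# Hypothetical data: one row per school -> (pell_share, median_sat, grad_rate)
schools = np.array([
    [0.45, 1050, 0.62],
    [0.15, 1400, 0.93],
    [0.30, 1200, 0.78],
    [0.50, 1000, 0.66],
    [0.20, 1350, 0.88],
])
X = np.column_stack([np.ones(len(schools)), schools[:, 0], schools[:, 1]])
y = schools[:, 2]

coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
predicted = X @ coefs

# A positive residual means the school graduates more students than its
# Pell share and SAT profile would predict -- the "social mobility" credit.
for (pell, sat, actual), pred in zip(schools, predicted):
    print(f"Pell {pell:.0%}, SAT {sat:.0f}: actual {actual:.2f} vs predicted {pred:.2f}")
```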
Measuring teaching: inputs and outputs
First, law schools could learn much from this attempt to measure how well institutions do in their mission of educating students. There is not a direct parallel to law school – we are thankfully past the day when law schools failed a large portion of the 1L class (“Look to your right, now look to your left, one of those students …"). But there is a need to find metrics of how well law schools do in preparing students.
The ABA created a minor firestorm late last year when it proposed shifting its evaluations of schools from “inputs” (like library volumes and raw expenditures) to “outputs” (most generally – how well are students prepared for the practice of law). Many law deans raised a ruckus when the ABA announced its intention to shift. There are indeed legitimate questions about whether any given output metric will truly capture what it means to be prepared for the practice of law (for example, using bar passage rates assumes that the bar exam is a good gauge of what it takes to be a lawyer). Nevertheless, it does strike me that students would be well served by more attempts to measure teaching quality.
Of course, as one of my co-bloggers gently pointed out to me at a conference in March, many students may care less about metrics of teaching quality than about the signaling value of a degree based on its selectiveness in admissions. In other words, Yale could teach basket weaving, and students would still flock there (to pick a fanciful example). But I still think students would care about measuring teaching because:
(1) the Yale example may apply less when we move to schools down the rankings ladder;
(2) it would be useful to differentiate two closely ranked competitors in the same market – let’s say UCLA and USC; and
(3) given the shake-up in the legal market, I wouldn’t be too confident that even a Yale degree would ensure continued employment – law firms looking to downsize may focus on whether associates can do the work, not just their pedigree.
So, the Washington Monthly endeavor ought to push law schools to develop metrics for measuring educational quality – although the rub is in finding the right metric.
Social mobility
The second interesting facet with the Washington rankings is its goal of measuring whether schools further social mobility. Social mobility is usually not part of what we talk about when we talk about the goals of legal education. Instead, law schools see themselves as preparing excellent lawyers for the practice of law. Admissions officers focus more on diversity than on social mobility.
But shouldn’t we be concerned with social mobility? If not in admissions decisions, at least in measuring and disclosing how education at one school compared with another affects social mobility? Some professors may turn up their noses at this idea, but I guarantee that many students care about this topic intensely. Many see law school either as a ticket into the upper middle class or as a means to stay on that precarious social rung. Do law schools deliver? The crisis in the legal job market combined with high tuition and debt loads has made that question particularly fraught. That doesn't mean we shouldn't try to answer it.
To measure social mobility we would need to look at both the front and back ends of legal education. At the back end, we can start with what we have: post-graduation employment rates and salary information. Of course, the data ought to be both audited and improved (such as by providing information on ranges and standard deviations) to prevent law schools from gaming the numbers and to provide better quality information. In addition, per the Washington rankings, back-end success is more than just salary. Students might care about data on clerkship success and the percentage of students pursuing public or public interest careers. Many alternative rankings now provide bits of this information.
A lot more work could be done on the front end – namely measuring the quality of students entering a law school. Beyond LSAT scores, we might think about standardizing a series of admissions questions designed to look at the socioeconomic status of applicants. How many law schools ask of applicants the question that colleges often ask – namely the highest educational degree obtained by parents? Think about what even zip codes during high school could tell us.
We might disagree on whether social mobility ought to be an objective of law school admissions and education, but having better data on it would be valuable both for applicants and for the swath of the public that cares about legal and professional education.
Permalink | Law Schools/Lawyering| Rankings | Comments (0) | TrackBack (0) | Bookmark
Brian Tamanaha says YES. The reason: "Law schools ... responded to the worst recession in the legal market in at least two decades by letting in more law students."
From the comments, Brian's point has as much to do with the cost of legal education and the resulting debt burdens as with the number of prospective lawyers being admitted to law schools. For a creative, if futile, attempt to deal with law school debt, see this letter from a Boston College student to Interim Dean George Brown.
You might argue -- as have some of the commentators at Balkinization -- that law schools should not be responsible for rationing admissions when demand is high (as demand tends to be in a recession). Brian's response: "I would feel more comfortable with this rationale if law schools were honest and forthcoming [about] employment numbers. That is not the case."
Back to that BC student. His proposal was elegant: "I am willing to leave law school, without a degree, at the end of this semester. In return, I would like a full refund of the tuition I’ve paid over the last two and a half years." Notice this argument in favor of his proposal:
This will benefit both of us: on the one hand, I will be free to return to the teaching career I left to come here. I’ll be able to provide for my family without the crushing weight of my law school loans. On the other hand, this will help BC Law go up in the rankings, since you will not have to report my unemployment at graduation to US News.
But the student is simply wrong, as Elie Mystal observes at ATL:
Here’s where he’s wrong. Boston College can just do what every other law school is doing: artificially inflate its employment statistics with a variety of part-time, temporary, or on-campus jobs. U.S. News has decided to ignore this problem, so there is no downside to the law school if its students can’t find the kinds of legal jobs they expected.
And this kid will probably end up with some kind of job. He’s got a baby on the way! What, he’s just going to sit on the couch and steal used Pampers out of area trashcans? No. He’ll find something. He’ll go back to his prior career in teaching (if he can), he’ll work retail, he’ll clean toilets — he’ll do what he has to in order to provide for his family.
And when, through his hard work and determination, he finds something to bring in a little bit of cash, BAM, BC Law will report him as “employed upon graduation” to U.S. News. Problem solved!
And now you get to the moral of the story: the U.S. News rankings are the root of all evil.
UPDATE: As long as we are featuring cynical thoughts about law school ...
Permalink | Law Schools/Lawyering| Rankings | Comments (0) | TrackBack (0) | Bookmark
Almost all economists, and certainly most of the well-cited ones, are quantitative empiricists now (and a particular kind of empiricist at that: these days they don't run big regressions, they look for instruments or experiments). That's up from circa zero empiricists 50 years ago. Political science is a bit more heterogeneous, but APSR is almost exclusively the domain of quantitative empiricists, leading some in that field to observe, as Brian Leiter did yesterday regarding empirical legal studies, that the field risks becoming arcane and narrow (here's Josh Wright and Professor Bainbridge on it too).
When will law follow suit? I think it will take a while, not least because the state-of-the-art work that people with Ph.D.s are expected to do is a very far cry from the sort of work almost any law professor could be expected to do. I like my colleagues in the Finance Department, in other words, but I don't submit to the Journal of Finance (here's the latest edition; some of the subjects are of interest, but do have a look at the methods sections). In a discipline where only a tiny minority of the faculty have social science Ph.D.s, the tipping point toward technical empiricism is harder to identify than it probably is for economics and political science. And that's not counting the possibility of buyer's remorse in those fields, or, in this one, the conflation of the Ph.D. with the ability or desire to do empirical work, the prospect that a subgroup will go down an unproductive rabbit hole (as far as I can tell, the law and courts people are still trying to decide whether the law constrains people, especially judges), and so on.
But look: corporate law and law and economics have been acquainted with empiricism for as long as any field in the legal academy, and one way of looking at how ELS is doing would be to see how those scholars are being cited. For that, we might consider Leiter's own invaluable empirical research.
Here's the corporate law list:
- John Coffee
- Lucian Bebchuk
- Larry Ribstein
- Stephen Bainbridge
- Roberta Romano
- Ronald J. Gilson
- Reinier Kraakman
- Bernard Black
- Donald Langevoort
- Robert Thompson
Runners-up for the top ten:
- Henry Hansmann
- Mark Roe
- Lynn Stout
- Stephen Choi
- Jill Fisch
Highly Cited Scholars Whose Cites Are Not Exclusively in This Area:
- Jonathan Macey
- Melvin Eisenberg
On which I count between 2 and 4 empiricists. Here's the law and economics list:
- Richard Epstein
- Eric Posner
- Ian Ayres
- Steven Shavell
- Robert Cooter
- Louis Kaplow
- Thomas Ulen
- Christine Jolls
- Einer Elhauge
- George Priest
- W. Kip Viscusi
Runners-Up for the Top Ten:
- Lewis Kornhauser
- Saul Levmore
- A. Mitchell Polinsky
On which I count between 4 and 6 (though I may be missing some). For a total of between 6 and 10 out of 31. This is the senior list, but I've got to tell you, I don't think lists of juniors would be that different (you could look at the youngest members of Leiter's list and consider whether they would characterize themselves as empiricists or not). And those are the most economically-oriented fields. With everyone at every school doing scholarship now, I predict that there will be many scholars who write and cite work that isn't empirical, or the kind of empirical that social scientists understand as empirical, for years. Of course, social scientific empiricists won't care who cites them if they get jobs that they like, but still, you take the point.
Anyway, I'm not sure what to make of this. Nobody wants solely doctrinal scholarship or totally ungrounded theory. I do some "empirical" work, but I do it to be an upstanding member of the community, to take first cracks at developing data that someone else might use, to add context to nonempirical papers, and to steel myself to keep reading that literature. I wouldn't advise anyone else to do anything more than that, unless they've got their Ph.D and go to social science conferences, but your mileage may vary.
Permalink | Empirical Legal Studies| Law & Economics| Rankings | Comments (0) | TrackBack (0) | Bookmark
The specialty rankings are based solely on polling legal educators. As Brian Leiter notes, reputation scores are not "wholly meaningless" (that's a ringing endorsement from him). There are some legitimate questions about what would constitute "business law." Is it more than corporate and securities? Does it include commercial law? Banking? Employment law? Trusts and estates? It seems this definitional issue could still be addressed, and US News could provide information on an area many prospective law students care about.
The upside of not having this information though, is that it may encourage students to seek out better sources - like Leiter's own rankings. Leiter's recent solution to the category question when ranking scholarly impact is to look at "Commercial Law/Bankruptcy" and "Corporate/Securities" as separate categories. In a similar 2007 study of scholarly impact, he had categories for "Business Law" (lumping in Corporate, Securities, Commercial, Bankruptcy and Antitrust), "Labor and Employment" and "Wills, Trusts, and Estates."
Permalink | Law Schools/Lawyering| Rankings | Comments (0) | TrackBack (0) | Bookmark
Are, for 2007:
- Duke
- Northwestern
- Wisconsin
- Chicago
- UCLA
- NYU
- Hastings
- Illinois
- Florida
- USC
- CT
- BC
- ND
- Iowa
- Fordham
- Maryland
- W&L (actually tied with Maryland for 16th, but why mess with the tyranny of Typepad list format?)
- Cornell
- Virginia
- Alabama
And Georgetown is 21, Texas 25, Vanderbilt 32, Michigan 56, California 76. It is an interesting list, and it's one I received in the mail, so I appear to be unable to direct you to a link. I'm also having a hard time thinking of a reason for the breakdown (Midwestern and Southern early ExpressO adoption? More student written articles in the above?). Do let us know whatever conclusions you draw in the comments.
Permalink | Rankings | Comments (10) | TrackBack (0) | Bookmark
Via Tax Prof Blog, U.S. News & World Report is "seriously studying" two suggested improvements to next year's law school rankings. The first is to include only ABA-accredited school graduates who are first-time test takers in calculating the bar passage rate of a jurisdiction. Fine. The second, however, is a little more controversial: use the median LSAT and UGPA for both full-time and part-time students for each ranked school. Hmmm. (Disclaimer: Illinois has no part-time program, so our ranking would not be affected unless other schools' rankings went down because of the change.)
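To see why the second proposal has teeth, it helps to run the arithmetic with made-up numbers: folding a lower-scoring part-time cohort into the pool can pull the reported median down even when the full-time class is unchanged. Everything below is hypothetical.

```python
# Hypothetical illustration of why pooling part-time students matters.
# All numbers are invented; the point is only the arithmetic of a median.
from statistics import median

full_time = [166] * 120 + [163] * 80   # 200 full-time students
part_time = [158] * 80                 # 80 part-time students with lower scores

print(median(full_time))               # 166 (what gets reported today)
print(median(full_time + part_time))   # 163 (the pooled median)
```

With these invented numbers the reported median drops from 166 to 163, which is exactly the kind of swing schools would worry about.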
The reasoning seems to be that some schools are gaming the system by putting low-LSAT/UGPA applicants into the part-time program, where they either get the same education (by being part-time "day" students) or perhaps transfer into the full-time class after a year. The thinking is that these programs aren't really "separate." I actually do not know if this is true. I have taught at two different schools with part-time programs (Houston and Marquette) and served on admissions at both. At both schools, admission to each program was completely separate and ran on a different timetable. Would the medians have been different? Yes. That is not to say that the part-time programs at each did not have standout students who could have gotten into the full-time program at many law schools. However, part-time programs are very attractive for a variety of candidates. I vaguely remember discussions of one or two individuals who may have been rejected for full-time admission but called and made a very strong case for themselves and offered to go to the part-time program. My sense was that these were situations in which the admissions committee was not sure that the candidate was up to the challenge of law school, and the candidate basically offered to go through a probationary period of part-time instruction.
If there is intentional gaming, then I'm not sure how widespread it is. The comments on the USNWR web page seem to accuse schools like Georgetown and GW of gaming the system, but haven't their programs been large since before the USNWR rankings became the ends and not the means?
However, I'm sure that schools are aware that the medians of part-time students are not involved in the calculation. This allows schools to accept more part-time students than they otherwise would, which benefits the school financially and allows non-traditional students an opportunity to go to a good law school on their own schedule. When applicants have been out of school for years, if not decades, assessing them by their UGPA and their performance on an unfamiliar standardized test seems fairly ridiculous anyway. If the rankings are to change, schools will ultimately have to make hard choices about their part-time programs, which may not benefit the schools or the public at large.
Of course, I can't decide whether the better analogy for adjusting the rankings calculations at the margins is the Bluebook's once-every-five-years ritual of a new edition or the phrase "tinkering with the machinery of death."
Permalink | Rankings | Comments (1) | TrackBack (0) | Bookmark
So begins Tracey George's post over at ELS, where she's guest blogging this week. She reminds us that few topics produce more downloads than a paper on law school rankings, and especially one that introduces a new ranking system. Last year, Tracey wisely wrote such a paper, ranking schools based on their empirical legal scholarship (ELS). 860 downloads as of today. Hmmm, not bad. I remember Emory doing relatively well in that ranking, so I circulated the abstract among my faculty when the paper first came out. Tracey has now revised the paper (and assured me that Emory would do even better in this revised version) and set up a tantalizing schedule of upcoming blogging episodes. Over the course of the week, she'll be discussing her three measures of institutional ELS success--professors with social science doctorates, professors with second appointments in social science departments, and articles in ELS-oriented publications. In the spirit of reality TV everywhere, she's going to make us wait until Friday to see the revised rankings. Stay tuned.
Permalink | Empirical Legal Studies| Law Schools/Lawyering| Rankings| SSRN | Comments (0) | TrackBack (0) | Bookmark
Warning to law schools: Tom Bell is watching what you report to the U.S. News. So far, Baylor and Florida have gone under the microscope.
Permalink | Law Schools/Lawyering| Rankings | Comments (0) | TrackBack (0) | Bookmark
For the first time, Business Week has published a ranking of undergraduate business programs. Not surprisingly, my alma mater, BYU, does well, coming in at #8. I graduated from BYU with an accounting degree in 1986, and even back then the program had a strong national reputation. In this survey, BYU ranked #1 in the recruiter survey. BW's comments about BYU: "Stellar accounting program and ethics-based education wow many. But student body is 98% Mormon and may not be for everybody."
May not be for everybody? Nice understatement.
The #1 school in the ranking is Wharton, which is the only school in the survey that did not agree to cooperate with BW. Wisconsin checks in at #27.
Of course, how could we discuss rankings without a nod to methodology? You can find the description of BW's methodology here. Whoever came up with this idea should get a raise because BW will sell lots of issues, and the methodology will produce changes from year to year, which will ensure a captive audience.
UPDATE: Regarding the captive audience, do a Google News search on "Business Week undergraduate" and notice all of the schools touting their rankings.
Permalink | Rankings | Comments (2) | TrackBack (0) | Bookmark

