Like a bunch of other law schools, Michigan has recently posted much more detailed employment statistics for its recent graduates. I don't want to understate the importance of this development: it appears that, at the rate things are going, schools that don't have at least moderately informative employment stats posted on their web sites will be at a significant competitive disadvantage relative to schools that do during this admissions cycle. This is especially the case given the evidence that prospective students are getting much savvier about how inadequate the standard level of law school "disclosure" has been (see, for example, this surprisingly realistic thread at Top Law Schools).
Nevertheless, the new disclosure regime that's beginning to take shape is still far from optimal. For instance, Michigan's stats, characterized by the school's CSO as "comprehensive," don't disclose anything about permanent versus temporary positions, or full-time versus part-time work. Nor does the school report anything about school-funded jobs. This seems like a particularly glaring omission, given that its close peer competitor UVA just revealed that no fewer than forty members of the UVA class of 2010 were in law school-funded "jobs" nine months after graduation. (On the plus side, Michigan lists the precise identity of every judicial clerkship its graduates took over the last three years.)
The most startling gap in Michigan's data is a product of the school's failure to gather salary information regarding its graduates. For the class of 2010, the school reports nine-month salary data for just 56.4% of its graduates. This isn't significantly more than the comparable percentage at Ohio State (45%), and is embarrassingly low in comparison to the figures for Chicago (94%) and Virginia (about 85%).
What could account for these differences in reporting rates? Is Michigan simply not attempting to gather salary data beyond whatever its graduates report in response to the school's post-graduation NALP survey? (It's difficult to believe that anywhere close to 94% of Chicago's 2010 grads actually reported their salaries, and in any case the school's web site says the salary data "is taken from a variety of sources, including: student self-reporting, alumni surveys, employer information, published salary data, and electronic resources.")
If that's the case -- I'm contacting the school's placement director for more information on this point -- then consider what the reported salary numbers most likely encompass. 129 of the school's 374 2010 graduates were working for megafirms of 501+ attorneys, while another 31 were working for firms of between 251 and 500 lawyers. If we assume that 90% of these 160 people reported their starting salaries (a glance at the salary reporting rates for graduates with these sorts of jobs at schools that appear to rely exclusively on self-reporting suggests this is a realistic estimate), that accounts for about 144 of the roughly 211 graduates -- 56.4% of 374 -- for whom the school has salary data at all. That would mean the school has salary data for something on the order of 65 of its 214 graduates who did not take jobs with very large firms. In other words, a top ten law school has failed to gather salary information for fully 70% of the 57% of its graduating class that didn't end up in Big Law. That, I suggest, ought to be unacceptable to prospective law students considering Michigan, given that such people invariably have other at least superficially attractive options regarding where to attend law school.
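For readers who want to check the arithmetic, here is the back-of-the-envelope calculation in a few lines of Python. The 90% reporting rate for large-firm hires is my assumption, not a figure Michigan publishes; everything else comes from the school's posted numbers.

```python
# Estimate of how many non-BigLaw Michigan 2010 graduates have
# reported salary data, given the school's published figures.

class_size = 374
firms_501_plus = 129            # graduates at firms of 501+ attorneys
firms_251_to_500 = 31           # graduates at firms of 251-500 attorneys
overall_reporting_rate = 0.564  # Michigan's reported 56.4%
biglaw_reporting_rate = 0.90    # ASSUMED rate for large-firm hires

biglaw = firms_501_plus + firms_251_to_500                  # 160
total_with_data = class_size * overall_reporting_rate       # ~211
biglaw_with_data = biglaw * biglaw_reporting_rate           # 144

non_biglaw = class_size - biglaw                            # 214
non_biglaw_with_data = total_with_data - biglaw_with_data   # ~67

print(f"Non-BigLaw graduates: {non_biglaw}")
print(f"Non-BigLaw graduates with salary data: ~{non_biglaw_with_data:.0f}")
print(f"Non-BigLaw graduates with NO salary data: "
      f"{1 - non_biglaw_with_data / non_biglaw:.0%}")       # ~69%
print(f"Non-BigLaw share of the class: {non_biglaw / class_size:.0%}")  # 57%
```

The result is not very sensitive to the assumption: even at an 85% or 95% large-firm reporting rate, the school would still be missing salary data for roughly two-thirds or more of its non-BigLaw graduates.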
(A commenter points out that the reporting rate dropped from 79% in 2009. It was also 79% in 2008, and 77% in 2007, although only 68% in 2006. In this broader context, the drop to 56.4% in 2010 certainly seems significant.)
All this raises the broader issue of just how law schools go about gathering employment and especially salary data. It would appear that the effort and resources otherwise comparable schools put into this crucial task vary widely. That is one of the many things about law schools that need to change.