These statistics are important: they're all we have to work with when analyzing employment and salary outcomes. But those of us who pay attention to this stuff need to remind ourselves, and others, that all these numbers must be taken with a pillar of salt. As bad as the story they tell appears to be, these statistics almost certainly paint a far too positive picture of the actual employment situation for recent law school grads.
The NALP statistics are far too complete to be genuinely reliable. That sounds like a paradox; to understand it, we need to look at how these statistics are compiled. Law schools attempt to survey graduates regarding their employment status as follows: in their final semester of law school, students are asked by their school's career services office (CSO) to report where they'll be working after graduation. Students who don't have a post-graduation employer, or who don't report their status, are surveyed again at some point between graduation and the February 15th NALP/ABA reporting date regarding their status nine months after graduation.
There are several things about this process that lead to a systematic exaggeration of positive employment outcomes. Here I'll discuss two:
(1) 3Ls who report prior to graduation that they will be employed are not surveyed again prior to the following February. Unless the CSO receives information to the contrary (which in practice means the graduate reports his or her changed status), 3Ls whose anticipated jobs fall through between the spring of their graduating semester and the following winter, or who lose their jobs, are listed as employed (almost always, of course, in full-time positions requiring bar admission). In other words, the NALP procedures instruct law schools to go to great lengths to determine if the employment status of unemployed graduates changes between graduation and the nine-month reporting date, but they use a default rule which assumes no one has a job fall through, or otherwise becomes unemployed, over this same time frame.
This probably produces a relatively minor positive bias in the statistics. The next factor produces a very major one.
(2) NALP "best practices" encourage schools to fill in the gaps when graduates provide incomplete information, or none at all. This explains the remarkably high reporting rate on these surveys, and the even more remarkable comprehensiveness of the reported data. As DJM noted, employment status was reported for 94% of the national class of 2011. This by itself is an exceptionally high "response" rate in the context of such surveys, but it becomes even more remarkable when we take into account the curious case of the Thomas M. Cooley Law School. Cooley failed to ascertain the employment status of no fewer than 263 of its 2011 graduates. The school by itself accounted for nearly one in every ten 2011 law school graduates whose employment status was unknown. If Cooley is taken out of the equation, the national reporting rate rises to nearly 95%.
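A quick back-of-the-envelope check shows how these figures hang together. Note one assumption on my part: the post gives only percentages, so the national class size of roughly 44,000 ABA graduates is my estimate, and the output is approximate.

```python
# Back-of-the-envelope check of the Cooley figures.
# ASSUMPTION: national 2011 ABA graduating class of ~44,000 (not from the post).
national_grads = 44000
reported_rate = 0.94                                   # status reported for 94%
unknown_national = national_grads * (1 - reported_rate)  # ~2,640 unknowns

cooley_unknown = 263                # Cooley grads with unknown status
cooley_share = cooley_unknown / unknown_national       # Cooley's share of all unknowns

# Cooley's 26.3% non-reporting rate implies a class of about 1,000:
cooley_grads = cooley_unknown / 0.263

# Remove Cooley from both numerator and denominator:
rate_without_cooley = 1 - (unknown_national - cooley_unknown) / (national_grads - cooley_grads)

print(f"Cooley's share of unknowns: {cooley_share:.1%}")
print(f"National reporting rate without Cooley: {rate_without_cooley:.1%}")
```

Under that assumed class size, Cooley alone accounts for roughly a tenth of all graduates with unknown status, and excluding it pushes the national reporting rate toward 95%.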
Now I suspect it's far from coincidental that Cooley failed to report the employment status of 26.3% of its graduates, when the other 200-odd ABA-accredited schools collectively had a 5% non-reporting rate (and even that average is heavily affected by relatively low reporting rates among other bottom-feeder schools; a quick glance suggests that top-100 schools have a median non-reporting rate in the 1%-2% range). Indeed we can, perhaps surprisingly, learn quite a bit from Cooley when trying to figure out just how reliable the NALP statistics really are. Consider this statement on Cooley's webpage reporting employment statistics:
NOTE: The ABA advised schools to determine and report every graduate’s full-time/long- or short-term and part-time/long- or short-term employment status, even if a graduate did not voluntarily supply complete information. Of those 2011 graduates reporting sufficient information for Cooley to make this determination, slightly more than 87 percent reported full-time/long-term employment. Based on this high percentage, the default classification for those lacking complete data was full-time/long-term unless Cooley had evidence to contradict that classification. Note also that graduates working for legal temporary agencies were classified as full-time/long-term due to the nature of their employment contract with the temporary agency.

This reporting policy, we should note, was announced in the midst of a lawsuit accusing Cooley of reporting fraud. It represents, in other words, the considered policy choice of a heavily lawyered-up institution.
Under the circumstances I very much doubt that Cooley would employ this methodology unless it had a good basis for arguing that this remarkably optimistic spin on its employment data followed standard practice. And indeed this turns out to be the case. The default rule law school CSOs follow when reporting employment data is to assume that "employed" means "employed in a full-time long-term position," unless the compiler of the data has some reason to believe otherwise. In other words, graduates who provide incomplete data are assumed to be in the most positive employment situation. The same holds for graduates who report nothing, but whose employment status is determined independently by the CSO -- by, for example, finding evidence of employment status on the internet.
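The fill-in-the-blanks default rule described above amounts to something like the following sketch. This is purely illustrative; the field names and structure are my own invention, not NALP's actual reporting schema.

```python
# Illustrative sketch of the optimistic default rule (hypothetical schema).
# Any field a graduate leaves blank is filled with the most favorable value,
# unless the school affirmatively knows otherwise.
OPTIMISTIC_DEFAULTS = {"hours": "full-time", "duration": "long-term"}

def classify(record):
    """Return a complete record, filling missing fields with the best outcome."""
    out = dict(record)
    for field, best in OPTIMISTIC_DEFAULTS.items():
        if out.get(field) is None:
            out[field] = best   # missing data becomes the most positive answer
    return out

# A graduate who reported only "employed" gets booked as full-time/long-term:
print(classify({"employed": True}))
# → {'employed': True, 'hours': 'full-time', 'duration': 'long-term'}
```

The asymmetry is the point: a downgrade requires affirmative evidence, while an upgrade requires only silence.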
All this helps explain two otherwise inexplicable facts: the apparent comprehensiveness of the NALP employment data, and the astonishingly high percentage of graduates in bar-required positions who are reported to have full-time long-term employment.
Note that the NALP data purports to provide a 100% reporting rate in regard to full-time/part-time/long-term/short-term employment among the 94% -- itself a remarkably high percentage -- of graduates whose employment status is known. The reason there's a 100% reporting rate in regard to these crucial variables is because law schools simply fill in the blanks when it comes to the huge gaps that exist in the data they actually manage to collect -- and they do so using the most optimistic methodological assumptions possible.
Here's just one example of how this works in practice. Consider Brooklyn Law School's class of 2011. The class had 455 graduates, of whom 137 were completely unemployed nine months after graduation. Unfortunately for law schools, there's no way to positively spin flat-out unemployment. But when we look at the 236 BLS graduates who had jobs requiring bar admission, it's a different story. Purportedly, only 21 of these 236 graduates had something other than a full-time long-term job. This improbable figure becomes even more improbable when we discover that 12 of those 21 people were employed by BLS itself in short-term positions. (BLS knew, of course, that these 12 graduates were not in long-term positions, since the school was paying for the jobs.) So according to BLS, 215 of the other 224 BLS graduates (96%) who were doing legal work nine months after graduation were employed in positions that were both full-time and long-term.
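The arithmetic behind that 96% figure is worth making explicit. All numbers here are taken from the reported BLS data discussed above:

```python
# Checking the Brooklyn Law School class-of-2011 figures.
total_grads = 455
unemployed = 137
bar_required = 236              # graduates in jobs requiring bar admission
not_ft_lt = 21                  # of those, reported as other than full-time/long-term
school_funded_short_term = 12   # short-term positions funded by BLS itself

# Set aside the school-funded jobs, which BLS knew were short-term:
remaining = bar_required - school_funded_short_term         # 224
remaining_not_ft_lt = not_ft_lt - school_funded_short_term  # 9
ft_lt = remaining - remaining_not_ft_lt                     # 215

print(f"{ft_lt} of {remaining} = {ft_lt / remaining:.0%}")  # 215 of 224 = 96%
```

In other words, once the school's own temporary hires are removed, the reported data would have us believe that all but 9 of the 224 graduates doing legal work held full-time long-term jobs.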
That, of course, is completely incredible to anyone who has the slightest acquaintance with the actual employment status of current law school graduates from any school this side of the Holy Trinity. (Of course it helps a great deal if you treat, as Cooley openly admits it does, document review gigs as full-time long-term employment). But this incredible stat is far from incredible if you employ a data collection method that simply assumes that every piece of information you don't have actually represents the most positive possible outcome from the perspective of the reporting entity.
There's more to be said about how the reported employment data systematically overstates the actual employment status of new graduates, which I'll leave for another post.