Now, if you're not a law school professor or (especially) dean, moving up from 28th to 23rd in the rankings may not seem like much of a big deal. How many law school grads are going to get hired for a particular job that they wouldn't have gotten if their school had remained at 28th rather than 23rd in these idiotic rankings anyway? It seems implausible that the people doing legal hiring care about this nonsense, or at least care about it enough at the margin to justify, even in purely pragmatic terms, this absurd negative-sum positional game that law schools have gotten themselves into over the past couple of decades.
But law is an especially status- and hierarchy-obsessed profession, so maybe it does make some sort of difference.
Second, this scandal illustrates, at best, the arrogance and cluelessness to which law school administrations are prone. The most charitable interpretation that can be put on the behavior of U of I administrators is that, even after they got caught in a bad admissions scandal due in part to what those charged with investigating it concluded was a lack of sufficient institutional oversight and control over the admissions office, the school apparently did nothing to remedy that situation (the school's defense in regard to the current scandal is that all this fraud was solely the work of one unsupervised admissions office employee).
What is surprising is that the cheating in the law school went unchecked even after the Category I scandal had broken. With a median LSAT score represented as 168, the Class of 2014 was trumpeted as "the most academically distinguished" in the school's history. The actual median score was 163, four points lower than the previous year's median. The less charitable explanation is that Pless's fraud wasn't the work of one poorly supervised rogue employee, but something more troubling.
Nobody at the university was responsible for reviewing the accuracy of those numbers, even though the state commission investigating the first admissions scandal flagged the situation at the law school as worrisome. That commission's report, in 2009, criticized the university for leaving all decision-making authority in the hands of the admissions dean and recommended the school change that policy. It didn't.
Third, whether or not the school's explanation for what happened turns out to be the truth, this latest law school scandal naturally raises the question of what's going on at the other 199 ABA law schools, in regard to how information that affects law school rankings is gathered and reported.
A journalist doing a story on the ABA's oversight of legal education told me earlier this week that she had spoken to the ABA's Section on Legal Education (the body responsible for, among other things, deciding what sort of employment data law schools have to reveal), and that she had been assured that things like the Illinois and Villanova scandals were isolated incidents. I asked her how the law school deans and faculty members who run the Section could actually know that to be the case. She responded that this seemed like a good question. Indeed it does!
As bad as the official law school employment numbers look when you toss out all the stuff that shouldn't really count (temp jobs, part-time jobs, jobs that don't require law degrees, jobs funded by law schools to goose placement numbers), keep in mind that these much more realistic numbers are still every bit as vulnerable to fraud as Illinois's admissions stats were. In fact they are far more vulnerable, because while prospective law students can't report phony GPA and LSAT scores when they apply, law grads are almost completely unconstrained from claiming whatever they want to claim in regard to their employment and salary information. And my preliminary research on the question of whether they report such information accurately indicates that a significant number exaggerate their employment and salary situations.
As to whether career services offices accurately report the numbers they're given by graduates to the ABA and USNWR, my sense is that law school administrators aren't exactly spending a lot of time and money (or indeed any at all) checking up on that matter.
The rationale for all this faith in the accuracy of an almost completely unregulated reporting process, the results of which determine the outcomes of rankings that people who run a multi-billion-dollar-per-year industry obsess over, is . . . what? That law school administrators, law faculty, and other people who work at law schools can be trusted more than, say, ordinary businesspeople?
That is, under the present circumstances, an interesting theory.