Thursday, March 29, 2007
A few more thoughts on the U.S. News Rankings of law schools....
Law professors across the country spent many hours poring over the U.S. News Rankings. They did this because the rankings matter-- to incoming students, to alumni, and to the professors themselves. Yes, the rankings are imperfect, but they do matter.
Here are my thoughts as to some of the results:
1) Baylor
As I said previously, slipping only two notches was pretty good, since this is probably the first time that summer and spring starters were counted in the incoming-student numbers. Our ranking reflects worse numbers for incoming students, but improvements in both academic reputation AND reputation with lawyers and judges.
2) Pepperdine
Pepperdine was the biggest gainer, jumping up 21 places. I think they made some smart moves that led to this jump, and I described those moves over on Law School Innovation.
3) Yale
Yale was #1 again, as it has been every year since about the mid-'80s. Here's the thing, though-- when I was there, I never heard much about the rankings. We didn't talk about them (unless we were with Harvard people). The Dean, Guido Calabresi, always acknowledged the ranking and said something in the magazine about not thinking it was a true valuation of law schools. Even now, in an even more competitive market, the Yale Law web site doesn't seem to make prominent mention of it. Some might say that this is its own kind of arrogance, though.
Comments:
Does SMU have to report its "Night School" part-time law classes? You get a full SMU law degree in four years, and I know people with LSATs in the low 150s who are there as political cases. Shouldn't SMU have to report them?
That's kind of a controversial thing-- I'm pretty sure they only have to report the day students, or at least that was the case until recently.
I really have to question the validity of this ranking as it makes no mention of Central Piedmont Community College's School of Beauty Shop Litigation Theory.
What we at William & Mary's Marshall-Wythe School of Law could never figure out is how and why U.Va had such a good ranking when all they did was play softball and drink.
Well, I don't know about the reasons for UVA law school's ranking, IPLG, but in response to Osler's comment about Yale's downplaying its no. 1 law-school ranking, I'd say it's not necessarily a different brand of arrogance. It's simply the luxury of being highly selective and in high demand. You don't need to tout your status.
As a college counselor of high-school students, I have a real love-hate relationship with rankings. On the one hand, I hate their arbitrariness (I used to work in undergrad admission at UVA, and I know all the factors that can tip them one way or another: which types of students are reported, whose SAT scores are reported, who counts as a full-time student, slipping state funding, all that stuff). For a great and timely article about the numbers that get reported, see the March 11 op-ed piece in the Washington Post by the president of Sarah Lawrence College at:
http://www.washingtonpost.com/wp-dyn/content/article/2007/03/09/AR2007030901836
(I hope that link works.)
Basically, Sarah Lawrence quit reporting any SAT scores, and US News arbitrarily assigns them "an SAT average of one standard deviation (200 points) below the average score of [their] peer group."
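To see what that penalty does in practice, here is a minimal sketch of the imputation rule in Python; the peer-group average of 1250 is a hypothetical number chosen for illustration, and the 200-point deduction is the figure quoted above.

# A sketch of the imputation rule described above: a school that does not
# report SAT scores is assigned an average one standard deviation below
# its peer group's average. The peer-group average (1250) is hypothetical.
def imputed_sat_average(peer_group_average: float, std_dev: float = 200.0) -> float:
    """Return the SAT average assigned to a non-reporting school."""
    return peer_group_average - std_dev

print(imputed_sat_average(1250.0))  # hypothetical peer average -> prints 1050.0

In other words, a non-reporting school is treated as if its students scored 200 points below its peers, no matter what their actual scores were.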
On the other hand, as a high-school counselor, I guiltily admit to referring students to the USNews site simply as a place to begin. Especially for international students, many of whom know only the top five names, which they will never get into, referring them to the top 50 or 100 universities or liberal-arts colleges is generally a reliable way to start, provided I can convince them truly to look beyond the first ten to the first 50 or 75.
Short of visiting them all myself--which really is the best way to fit a student with the right place--the rankings are a blunt instrument that can at least point a student toward the right ballpark.
I suppose the biggest problem is who does the rankings, and what elements go into the methodology. In the UK, there is an independent agency called the QAA (the Quality Assurance Agency), subscribed to by universities and education-funding groups (exactly who, I'm not sure), which evaluates teaching quality at universities and assigns a score. There is also a research quality assessment done every five years or so.
The UK also has rankings done by the two major newspapers, but these rankings include the scores earned through the teaching and research assessments done by the independent body. Well, sort of independent. Maybe it's not terribly different from the "reputation" element of US News, but there seems to be more to it than just check-boxes on a survey sent to faculty.
The US may be too big and too diffuse to use such a system. And universities are scared of having state or federal governments evaluate them. But there must be a better way.
Ummm, ok, you all are way, WAY too smart for me. Thank God tomorrow is Haiku Friday.
--Mrs. CL, Central Piedmont Community College School of Beauty, Class of '74 "Go Rat Tails!!"