Interview with Bob Morse of U.S. News & World Report College Rankings, Part I
Meet the man behind the single most influential list in college admissions. Bob Morse is the Director of Data Research at U.S. News & World Report, the head of its revered college ranking system. As the force behind a series of annual publications that have achieved unanticipated fame within higher education, Bob Morse has helped to create the college ranking system as it exists today. He was nice enough to sit down with Top Test Prep and answer some questions.
Start by telling us a little bit about yourself.
I’ve been at U.S. News since 1976. I have a BA in economics and an MBA in finance, so I have a research and quantitative background. Doing the rankings is a research and quantitative analysis project. It’s not journalism in the sense that even though I do have a blog, the rankings themselves aren’t reporting … they’re creating information, while typical journalism is reporting on an event or analyzing an event or giving context to something that’s happened.
You have a blog?
I write a blog called Morse Code: Inside the College Rankings, posting once or twice a week. Prior to the blog, U.S. News wouldn’t really write about rankings except at the time we published the college and grad rankings, so the blog gives us the ability to make announcements.
How did you get connected to U.S. News & World Report?
I worked briefly on Wall Street in the mid-’70s, at a company called E.F. Hutton. A lot of those firms don’t exist anymore; they were merged away. Then I came to U.S. News, but in another department, a research group called the economic unit. It doesn’t exist anymore either.
In the very beginning, before I was involved, the rankings were done very simplistically, based on reputation only, in 1983 and 1985. U.S. News wanted to move beyond that and make them more sophisticated.
How did the rankings come about?
At the beginning … we didn’t have the thick guidebook and we didn’t have the web, so it was just something that appeared in the weekly magazine in a very limited sense, sort of a top ten list. It was not some guerrilla force in admissions or higher ed, it was just information for consumers and our readers. Nobody thought that it was going to evolve into anything but an occasional feature or cover story. In 1987, I was put in charge (of college rankings). We were going to make it more sophisticated, a combination of reputation and quantitative data, and we were going to start doing this annual guidebook. I got involved in it because they wanted someone with a quantitative research background.
How do you assess a school’s reputation?
It’s become one of the more controversial parts of the rankings; controversial among people in the higher education establishment. The rankings themselves aren’t controversial to the public. The public, obviously, uses them and is attracted to them to a significant degree or otherwise we wouldn’t keep doing them.
We give college presidents and admissions deans and provosts a list of schools and we ask them to rate which ones are excellent and good, so it’s a subjective judgment about the relative standing of schools based on their academic reputations. The academic establishment doesn’t like that – or some of them don’t. Maybe liberal arts schools don’t. I think research universities do.
What’s most interesting to you about the college rankings?
A couple things. One, how it’s become this force in higher education. Some colleges are trying publicly to do better in the rankings and make educational decisions to improve in the rankings. I think that’s pretty interesting.
I think that we’ve filled an informational gap. There’s been a decrease in high school counseling, not at private schools but at public schools, where counseling has been diminished by budget cuts, and the public is really searching for tools to help them decide what’s the best school for them. They’re forced to make decisions on their own and fend for themselves. It’s been satisfying that we’ve been able to fill this informational void. People are becoming more quantitative in judging the best schools.
Another interesting thing is that we’ve been part of this accountability movement. Schools are being held accountable for how they spend money, and whether they’re succeeding in educating students: how well are they doing at what they’re supposed to be doing. So it’s been interesting to be part of all these trends.
Which colleges have seen their university rankings improve the most over the last two or three years?
The rankings are more stable than people think. Typically over a two- or three-year period, the rankings don’t move that much, but two schools, the University of Southern California and Washington University in St. Louis, have over the last decade or so made a strategic decision to improve themselves. Their strategy is across-the-board improvement, step by step. They take small steps each year institution-wide, and that’s the formula to improve in the rankings.
What kind of steps (small or large) are colleges taking to shape their rankings?
They’re not small in the sense that they’re little things. They just do them a little bit each year. For example, a college would raise its SAT average: maybe one year it was 1200, the next year 1225, the next year 1250. But it wouldn’t go from 1100 to 1300 in one year; it would do that over a ten-year period. Or a school would increase the freshman retention rate, putting money into it. The graduation rate would be another one, or faculty salaries. They might put more emphasis on small classes and reduce the number of large classes. They’ll do this a little bit each year, focusing on many factors of the academic environment.
To be continued …
Ross Blankenship is an education and admissions expert who specializes in prep school, college and graduate admissions. He’s also an expert on the US News and World Report College Rankings. To read more about Ross Blankenship, visit TopTestPrep.com.