The Harrison Bergeron approach to education: how university rankings stunt the social revolution
I've been thinking some lately about the odd and confusing practice of comparing undergraduate and graduate programs at American colleges and universities and producing a set of rankings that show how the programs stack up against each other.
One of the most widely cited sets of rankings comes from U.S. News and World Report, which offers rankings in dozens of categories for both undergraduate and graduate-level programs. Here, the magazine offers its altruistic rationale for producing these rankings:
A college education is one of the most important—and one of the most costly—investments that prospective students will ever make. For this reason, the editors of U.S. News believe that students and their families should have as much information as possible about the comparative merits of the educational programs at America's colleges and universities. The data we gather on America's colleges—and the rankings of the schools that arise from these data—serve as an objective guide by which students and their parents can compare the academic quality of schools. When consumers purchase a car or a computer, this sort of information is readily available. We think it's even more important that comparative data help people make informed decisions about an education that at some private universities is now approaching a total cost of more than $200,000 including tuition, room, board, required fees, books, transportation, and other personal expenses.
(To access the entire rankings, developed and produced selflessly by U.S. News and World Report, you need to pay: the Premium Online Edition, the only way to get the complete rankings, costs $14.95.)
The 2009 rankings, released in April, have been in the news lately because of questions about how the magazine gathers data from colleges. As Carl Bialik points out in a recent post at the Wall Street Journal, concerns over how Clemson University set about increasing its rank point to deeper questions about the influence of rankings numbers on university operations. Clemson President James F. Barker reportedly shot for cracking the top 20 (the university was ranked 38th nationally in 2001) by targeting all of the ranking indicators used by U.S. News. Bialik writes:
While the truth about Clemson’s approach to the rankings remains elusive, the episode does call into question the utility of a ranking that schools can seek to manipulate. “Colleges have been ‘rank-steering,’ — driving under the influence of the rankings,” Lloyd Thacker, executive director of the Education Conservancy and a critic of rankings, told the Associated Press. “We’ve seen over the years a shifting of resources to influence ranks.”
Setting aside questions of the rankings' influence on university operations and on recruiting (both for prospective students and prospective faculty), and setting aside too the question of how accurate any numbers collected from university officials themselves could possibly be when the stakes are so high, one wonders how these rankings limit schools' ability to embrace what appear to be key tenets emerging out of the social revolution. A key feature of some of the most vibrant, energetic, and active online communities is what Clay Shirky labels the "failure for free" model. As I explained in a previous post on the open source movement, the open source software (OSS) movement embraces this tenet:
It's not, after all, that most open source projects present a legitimate threat to the corporate status quo; that's not what scares companies like Microsoft. What scares Microsoft is the fact that OSS can afford a thousand GNOME Bulgarias on the way to its Linux. Microsoft certainly can't afford that rate of failure, but the OSS movement can, because, as Shirky explains, open systems lower the cost of failure, they do not create biases in favor of predictable but substandard outcomes, and they make it simpler to integrate the contributions of people who contribute only a single idea.
Anyone who's worked for a company of reasonable size understands the push to keep the risk of failure low. "More people," Shirky writes, "will remember you saying yes to a failure than saying no to a radical but promising idea." The higher up the organizational chart you go, the harder the push will be for safe choices. Innovation, it seems, is both a product of and oppositional to the social contract.
The U.S. News rankings, and the methodology behind them, are anathema to the very notion of innovation. Indeed, a full 25 percent of the ranking system is based on what U.S. News calls "peer assessment," which comes from "the top academics we consult--presidents, provosts, and deans of admissions" and, ostensibly at least, allows these consultants
to account for intangibles such as faculty dedication to teaching. Each individual is asked to rate peer schools' academic programs on a scale from 1 (marginal) to 5 (distinguished). Those who don't know enough about a school to evaluate it fairly are asked to mark "don't know." Synovate, an opinion-research firm based near Chicago, in spring 2008 collected the data; of the 4,272 people who were sent questionnaires, 46 percent responded.
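Just to make concrete what a weight like that means: a composite ranking of this kind boils down to a weighted sum of normalized indicators, with peer assessment carrying a quarter of the total. Here's a minimal sketch of that arithmetic in Python; the indicator names and every weight other than the 25 percent peer-assessment share are invented for illustration, and this is emphatically not U.S. News's actual formula.

```python
# Toy illustration of a weighted composite ranking score.
# Only the 25% peer-assessment share comes from the U.S. News description quoted above;
# all other indicators and weights are hypothetical.

# Hypothetical indicator scores for one school, each already normalized to 0-100.
indicators = {
    "peer_assessment": 78,     # the 1-5 survey average, rescaled to 0-100
    "retention": 85,
    "faculty_resources": 72,
    "selectivity": 80,
    "financial_resources": 65,
    "other": 70,
}

# Hypothetical weights; they must sum to 1.0.
weights = {
    "peer_assessment": 0.25,
    "retention": 0.20,
    "faculty_resources": 0.20,
    "selectivity": 0.15,
    "financial_resources": 0.10,
    "other": 0.10,
}

# The composite score is just the weighted sum; schools are then sorted by this number.
composite = sum(weights[k] * indicators[k] for k in weights)
print(f"Composite score: {composite:.1f}")
```

Once every school is compressed into a single number like this, "rank-steering" of the kind Clemson was accused of reduces to nudging whichever indicators carry the most weight, and a quarter of that weight rests on how "distinguished" one's peers say a program is.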
Who becomes "distinguished" in the ivory-tower world of academia? Those who play by the long-established rules of tradition, polity, and networking, of course. The people who most want to effect change at the institutional level are often the most outraged, the most unwilling to play by the rules established by administrators and rankings systems, and therefore the least likely to make it into the top echelons of academia. Indeed, failure is rarely free in the high-stakes world of academics; it's safer to say no to "a radical but promising idea" than to say yes to any number of boring but safe ideas.
So what do you do if you are, say, a prospective doctoral student who wants to tear wide the gates of academic institutions? What do you do if you want to go as far in your chosen field as your little legs will carry you, leaving a swath of destruction in your wake? What do you do if you want to bring the social revolution to the ivory tower, instead of waiting for the ivory tower to come to the social revolution?
You rely on the U.S. News rankings, of course. It's what I did when I made decisions about which schools to apply to (the University of Wisconsin-Madison [ranked 7th overall in graduate education programs, first in Curriculum & Instruction, first in Educational Psychology], the University of Texas-Austin [tied for 7th overall, 10th in Curriculum & Instruction], the University of Washington [12th overall, 9th in Curriculum & Instruction], the University of Michigan [14th overall, 7th in Curriculum & Instruction, and 3rd in Educational Psychology], Indiana University [19th overall, outside the top 10 in individual categories], and Arizona State University [24th overall, outside the top 10 in individual categories]). Interestingly, though, the decision to turn down offers from schools ranked higher than Indiana (go Hoosiers) wasn't all that difficult. I knew that I belonged at IU (go Hoosiers) almost before I visited, and a recruitment weekend sealed the deal.
But I had an inside track to information about IU (go Hoosiers) via my work with Dan Hickey and Michelle Honeyford. I also happen to be a highly resourceful learner with a relatively clear sense of what I want to study, and with whom, and why. Other learners--especially undergraduates--aren't necessarily in such a cushy position. They are likely to rely heavily on rankings in deciding where to apply and which offer to accept. This not only reifies the arbitrary and esoteric rankings system (the highest-ranked schools get the highest-ranked students), but also stunts the social revolution in an institution that needs revolution, and desperately.
In this matter, it's turtles all the way down. High-stakes standardized testing practices and teacher evaluations based on achievement on those tests limit innovation--from teachers as well as from students--at the secondary and, increasingly, the elementary level. But the world that surrounds schools is increasingly ruled by those who know how to innovate, how to say yes to a radical but promising idea, how to work within a "failure for free" model. If schools can't learn how to embrace the increasingly valued and valuable mindsets afforded by participatory practices, they're failing to prepare their students for the world at large. The rankings system is just another set of hobbles added on to a system of clamps, tethers, and chains already set up to fail the very people it purports to serve.