Editorial: University rankings are misleading. Why are we still using them?

Many high school seniors have opened emails in recent weeks telling them whether they have been accepted into the colleges of their choice. Even as they do, criticism of the published college rankings that may have shaped their preferences is emerging once again.

A math professor at Columbia University has questioned the data that the Ivy League school reported to US News & World Report, data that earned it second place in this year's rankings. The University of Southern California, which seems unable to stay out of trouble for more than a few months at a time, pulled its graduate school of education from the rankings this year after discovering a “history of inaccuracies” in the data the school had reported.

A few weeks ago, in what must be the grandfather of fake-data scandals, the ousted dean of Temple University’s business school received a 14-month prison sentence after being convicted in federal court of sending false information to US News & World Report to boost the school’s reputation. Claremont McKenna College, George Washington University and many other schools have massaged their data to improve their rankings.

But the ultimate problem with the rankings isn’t the scammers. The problem is the rankings themselves. They can be a counterproductive way for families to choose schools; for example, a much cheaper school might offer an equivalent or better education than a higher-ranked but more expensive one.

The pickiest schools, Princeton, MIT and so on, don’t need rankings to improve their reputations or applicant pools. And the differences between a school that might be 70th on the list and one that might be 90th are unlikely to have a major impact on a student’s postgraduate prospects or college experience.

Probably only a few applicants know that the single biggest factor US News uses to rank schools is their reputation among officials at other colleges, who may or may not have in-depth knowledge of the schools they are rating. That accounts for 20% of the score.

The second largest factor is six-year graduation rates. But given that low-income students are far less likely to graduate than middle-class students within that timeframe — or ever — this is indicative of student wealth rather than academic excellence. In fact, it can have the perverse effect of discouraging colleges from accepting more low-income students to keep their graduation rates from declining.

A massive 2017 Gallup poll found that alumni who attended prestigious schools were only slightly more satisfied with their college choices than those who attended schools further down the list. The biggest factor in student satisfaction with college was whether or not they had debt, although US News gives student debt only a 5% weighting in the ranking.

US News has made some positive changes in recent years. It dropped acceptance rate as one of its criteria; because lower acceptance rates corresponded to higher rankings, colleges had heavily recruited applicants even when they had almost no chance of admission. The ranking also began including the percentage of Pell Grant recipients who graduate within six years, a powerful statistic that indicates whether colleges are helping low-income students complete their education.

But many other factors used in ranking schools have little bearing on a student’s experience. The ranking uses alumni donations as a proxy for student satisfaction with their alma mater. That’s a pretty poor way of measuring satisfaction.

What most high school students and parents need to know is whether a college offers a wide variety of courses with good faculty; whether graduates will leave with a load of debt; whether students will feel comfortable and engaged on campus; and whether they are prepared for a fulfilling career.

College administrators lament the rankings but continue to participate. They should stop joining in the charade and instead insist on being partners in developing valid methods for evaluating higher education. How satisfied students and alumni are with their choice should be a decisive factor. Using data from the 2017 survey, Gallup economist Jonathan Rothwell developed an alumni satisfaction ranking, though he published only the 25 schools with the highest satisfaction ratings. Many of them were among the top performers in the published rankings, but there were a few surprises, including the University of La Verne and Azusa Pacific University in Southern California. Rothwell’s study also found that a college’s price does not necessarily correlate with how satisfied its alumni were with it.

However, if colleges and ranking organizations came together, they could create a unified survey process for students and alumni that would be far more useful and better reflect the value of the colleges when combined with other factors.

A new approach could include specific questions that students might find useful, for example: Which schools are more artistically oriented? Which specialize in experiential learning? Which have lots of extracurricular activities or a friendly, accepting campus environment?

Despite years of criticism, US News and other college-ranking publications are not going to abandon one of their most popular and profitable annual features. It is up to colleges to stand up, refuse to support rankings that fall short, and collaborate on a method that will provide students with valuable information for the confusing task of choosing a college.

Grace Reader

