Op-ed: Monitor your math machines

Graphic by Jessica Xing

Number-crunching machines have corrupted the U.S. News college ranking system.

Jethro R. Lee, contributor

The last time I touched an encyclopedia was in sixth grade. Since then, Google and its algorithms have become, as they have for many people, my number one resource for answering questions. In our increasingly innovative world, algorithms that tackle complex problems at the speed of computers are spreading like wildfire. The days when people manually evaluated college essays, resumes, credit scores and other socially pivotal records are increasingly behind us. Human bias has long led to unjust decision-making, and while machines may seem like a compelling solution, their man-made algorithms can still deepen social disparity. For aspiring data scientists like me: rather than letting this knowledge make you regret your career aspirations, as I initially did, let it motivate you to set a precedent in which data is handled responsibly for the benefit of all.

Many of our societal problems do not seem obviously attributable to machines. But number-crunching machines have corrupted a system that, as college students, we are all too familiar with: the U.S. News college ranking system. In “Weapons of Math Destruction,” Cathy O’Neil says U.S. News tried to elevate its declining reputation by creating algorithms that analyzed colleges through factors such as SAT scores, acceptance rates and even the percentage of new freshmen who made it past their first year. U.S. News is prized for providing a national standard for assessing colleges. However, a bad ranking can trap a college in a menacing cycle that further suppresses its reputation. O’Neil says top students and esteemed professors avoid schools with a poor ranking, and alumni feel less inclined to donate to them. She also says that colleges losing top students and professors to higher-ranked schools feel pressured to lower financial aid in order to attract the people who could improve their rankings. This sacrifice harms the people who arguably benefit from the college’s money the most: us, the students.

The fact that one has to pay for an education bewilders many. But the fact that a ranking system is part of why we pay so much for college should motivate us, and college administrators, to accept that a school’s value is not defined by its ranking. Instead, administrators are committing themselves to unnecessary spending on “enhancing” their schools while forgetting their institution’s true purpose: to nourish our love of learning.

The standards imposed by U.S. News have caused colleges to prioritize reputation over constructively improving themselves. Elite small colleges such as Bucknell University and Claremont McKenna College allegedly sent U.S. News inflated data about the SAT scores of their entering freshman classes. It’s disappointing that college administrators exhibit dishonest behavior for the sake of their institution’s appeal, but the United States’ unnecessary fixation on numbers and rankings makes it easy to see why they feel compelled to do so.

O’Neil additionally says tuition and fees are not considered in the U.S. News algorithm, further showing how U.S. News inadvertently widens the gap between rich and poor in access to a quality education. In fact, the reputation of anything evaluated by algorithms, whether businesses, hotels or restaurants, is built when its owners devote themselves to serving an algorithm rather than the people it affects.

Algorithmic oppression also contributes to racism. Safiya Umoja Noble, a UCLA professor, researches how gender, technology and culture shape the way people use the internet. Computers aren’t inherently racist, but the humans who build their algorithms play a considerable role in making them function in racist ways. Searching the keywords “Black girls” on Google used to surface revolting results that fueled egregious racial fetishization. Those results no longer appear, Noble says, but it is unacceptable that it took at least two years for pornography to be removed from a search on “Black girls.” Companies such as Google need to recognize the power and dangers of artificial intelligence, which is becoming today’s most commonly used source of knowledge. The convenience of a Google search, compared with reading an encyclopedia or scrutinizing a database, only raises the chance that our minds will be contaminated by the internet’s misinformation.

The detriments of algorithms don’t just pertain to social harmony; they extend to each individual’s well-being. The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition, or DSM-5, has been used in American psychology for many years to diagnose mental illness. Most patients in the United States receive treatment for a mental condition only if they experience a strict number of symptoms outlined in the DSM-5. While quantitative standards are important, qualitative data should also be used to determine whether a person has a condition worth treating medically. Since qualitative experiences play a crucial role in one’s well-being, the DSM-5 is sufficient only for determining whether a patient should be considered for, not diagnosed with, a mental illness. To account for the qualitative aspects of a patient’s condition, clinicians should place greater trust in their patients and the people around them when deciding whether to prescribe a medication that may be unnecessary and harmful, as American psychologist Allen Frances stresses in “Saving Normal.” Frances oversaw the development of the DSM-IV but has become critical of the DSM-5’s quantitative emphasis.

Living in a quantitatively motivated world has its benefits. College students can worry less about being evaluated by an admissions officer who happens to dislike people of their race. Hotels can be evaluated rapidly for tourists’ convenience on platforms like Yelp. An employer with a job opening doesn’t have to spend hours manually assessing a large pool of applicants. However, bias remains a genuine and concerning product of these seemingly emotionless machines.

As an aspiring data scientist, I yearn to manage data responsibly and undo the crippling effects of corrosive algorithms. Emanuel Derman and Paul Wilmott, financial engineers who have written extensively about human error in financial models, composed an oath describing how we can all responsibly manage the growing reign of mathematics. Above all, we should “make explicit [mathematics’] assumptions and oversights and understand that [our] work may have enormous effects on society and the economy, many of them beyond [our] comprehension.” Carefully measuring the impact of algorithms, acknowledging their limitations and spreading awareness of those limitations will keep math from disrupting our planet, even if it means sacrificing efficiency for equality.

Jethro R. Lee is a second-year data science and psychology combined major. He can be reached at [email protected].