
Speaker Safiya Umoja Noble speaks on AI and its future, Thursday, Oct. 16, Houston, Texas. | Annette Achonwa/The Cougar
From information-based help on artificial intelligence services like ChatGPT or Character.ai to entire “relationships” formed through them, AI has carved out a space in society — seen by some as a benefit and by others as a threat.
This rapidly growing technology is becoming a prominent part of culture, bringing unprecedented assistance and innovation. However, its convenience also comes with unforeseen problems.
Social scientist Safiya Umoja Noble discussed these themes during a lecture at the Rockwell Pavilion in the M.D. Anderson Library.
Noble, a professor of gender studies, African American studies and information studies at UCLA, is well-qualified to discuss AI's societal effects. Her lecture centered on two main points: AI's current and future environmental consequences, and its displays of bias and discrimination.
Discrimination in AI was a recurring theme in her lecture. Contrary to the belief that AI is impartial, Noble said the technology reflects the biases of the people who build it.
Noble delved into this prejudice, saying AI surveillance systems are often discriminatory and bigoted against marginalized groups, intensifying social control, racism and the erosion of civil rights.
Additionally, AI's environmental footprint, from its carbon emissions to its heavy water use, disproportionately pollutes low-income neighborhoods, which often house families of color.
“The more melanized you are, the less reliable AI systems are in being able to recognize you,” Noble said during her lecture.
Human intelligence versus AI was another memorable topic of her lecture. Noble cited an MIT study exploring how AI tools affect users’ cognitive abilities, finding that they can diminish those abilities and leave people less engaged in daily life.
She also scrutinized how AI functions. The data AI draws on is often incorrect, restrictive and outdated, and because the technology is still in its early stages, it can reproduce discriminatory information while lacking the complexity of context and humanity. This is troublesome for people who depend on AI as a replacement for a search engine like Google.
“Power users of ChatGPT are experiencing up to 40% cognitive decline,” Noble said.
Noble called on students to resist AI, citing its inherently biased information and the dulling effect it has on users’ minds.
Finance sophomore Kathy Le agreed with Noble’s sentiment, pointing out the factual and grammatical inaccuracies she has noticed in AI services and companies’ systems.
“I don’t think we should rely too much on AI cause it’s still really inaccurate and could spread misinformation if we’re not careful,” Le said.
news@thedailycougar.com
