Predictive policing systems are built on past and current arrest and conviction data (Zalnieriute and Cutts, 2022).
Surveillance algorithms base their matches on old mugshots already present in national agency databases. New targets can be mistaken for old targets. “Although the accuracy of facial recognition technology has increased dramatically in recent years, differences in performance exist for certain demographic groups,” the United States Government Accountability Office wrote in a report to Congress last year.
Across the U.S., Black people face arrest for a variety of crimes at far higher rates than white people (Crockford, 2020).
Agencies use facial recognition software that identifies people by matching them against existing pictures and mugshots stored in databases. The agencies typically obtain this data from huge private data farms.
The result is a false association between criminality and Black men: predictive policing systems are based on past arrest and conviction data that embed racist decision-making in the process of enforcing the law (Zalnieriute and Cutts, 2022).
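To make the matching step concrete, here is a minimal sketch of a one-to-many face search of the kind described above. It is not any vendor's actual system: the embeddings, record names, and score threshold are all hypothetical. The point is that the search always surfaces a "best" candidate, and a lookalike whose score clears the threshold becomes an investigative lead.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_gallery(probe, gallery, threshold=0.6):
    """1:N search: return the best-scoring gallery identity above the threshold.

    The gallery stands in for a mugshot or driver's-license database; every
    embedding and the 0.6 threshold are invented for illustration.
    """
    best_id, best_score = None, -1.0
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical data: the probe photo belongs to nobody in the gallery, but if
# the model places two different people close together in embedding space,
# the top match can still clear the threshold -- a false positive.
rng = np.random.default_rng(0)
gallery = {f"record_{i}": rng.normal(size=128) for i in range(1000)}
probe = gallery["record_42"] + rng.normal(scale=0.4, size=128)  # a lookalike, not record_42

match, score = search_gallery(probe, gallery)
print(match, round(score, 3))  # reports record_42 even though the probe is someone else
```

The Nijeer Parks case described next is what such a false positive looks like once it leaves the software and enters a police file.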
The image in question pairs a fake driver's license photo (left) with a photo of Nijeer Parks (right). When the license photo was run through facial recognition software, it returned Parks, who was in no way related to or involved with the individual on the license.
"Parks, now 36, spent 10 days behind bars for a January 2019 theft and assault on a police officer that he didn't commit. He said he was released after he provided evidence he was in another city, making a money transfer at the time of the offence. Prosecutors dropped the case the following November, according to an internal police report.
Investigators identified Parks as a suspect using facial recognition technology, according to police documents provided as part of a lawsuit filed by Parks's lawyer against several defendants, including police and the mayor of Woodbridge, N.J. The lawsuit names French tech firm Idemia as the developer of the software" (CBS, 2024)
"Clearview AI is a surveillance technology broker with over 10 billion photos. It provides FRT to law enforcement agencies. Since 2017, it has scraped billions of publicly available images from websites like YouTube and Facebook, and enables customers to upload photos of individuals and automatically match them with other images and sources in the database. The FBI only has about 640 million photos in its databases, compared to Clearview AI’s approximately 10 billion" (Brookings Institute, 2022).
"Buolamwini and Gebru’s 2018 research concluded that some facial analysis algorithms misclassified Black women nearly 35 percent of the time, while nearly always getting it right for white men. A subsequent study by Buolamwini and Raji at the Massachusetts Institute of Technology confirmed these problems persisted with Amazon’s software" (ACLU, 2020)
"Detroit resident Robert Williams was arrested in front of his children and held in detention for a night after a false positive in an AI facial recognition system. Williams had his name in the system for years and had to sue the police and local government for wrongful arrest to get it removed. He eventually found out that faulty AI had identified him as the suspect" (The Conversation, 2024).
Dr. Monika Zalnieriute and Associate Professor Tatiana Cutts, law school faculty members in Melbourne, Australia, write in an academic paper bringing to light the glaring problems of racial discrimination in AI facial recognition software, "[N]ew technologies such as AI and machine learning algorithms reinforce structural racism, whereby certain peoples and communities are treated as having a lower status in society." They go on to give an example of their claim: "For instance, the Suspect Target Management Plan (STMP) is a New South Wales (Australia) Police Force initiative designed to reduce crime among high-risk individuals through proactive policing. Data shows the STMP disproportionately targets young people, particularly Aboriginal and Torres Strait Islander people: of the 73 children under the age of 16 identified as targets, 73% were indigenous, compared with national census data of 3.2%" (Zalnieriute and Cutts, 2022).
Zalnieriute and Cutts also discuss the faulty predictive nature of AI technology in the medical field. Ultimately, the two implore the UNHRC advisory committee to "call for a ban on the use of predictive algorithms in criminal sentencing," and "for the development of binding international human rights law for private actors to remedy the violations of right to freedom from discrimination, especially in a transnational context. The development of such obligations is crucial for eradicating structural, systemic and institutional racism."
What is Techno-racism?
"Techno-racism" refers to the practice, wittingly or unwittingly, of arming artificial intelligence with the capacity to inflict harm on racial minorities because of their skin color. Generally, we define racism as thoughts, beliefs, and actions that emotionally and physically make the lives of those outside of one’s group worse in some way. The United States during slavery and Jim Crow is perhaps the most salient example of how racism manifests in a society. Techno-racism, then, expands upon the traditional mechanisms, practices, and thought forms that act to denigrate human beings and relegate them to a subhuman status. Racism in tech manifests primarily in law enforcement facial recognition programs, healthcare algorithms, and employment candidate screenings.
Historical Origin
Surveillance
The 1950s and 1960s saw the U.S. government initiate the COINTELPRO surveillance program. The FBI coordinated with state law enforcement agencies countrywide to monitor the actions, phone calls, whereabouts, activities, and associates of targeted individuals, people whom government officials like J. Edgar Hoover saw as a threat to the safety of the U.S. Most notable was the use of intrusive surveillance techniques on Malcolm X, Martin Luther King Jr., and other civil rights activists (Brookings, 2022).
In the middle of the 20th century, the U.S. government initiated complete surveillance of Japanese immigrants and Japanese Americans, monitoring their private mail exchanges and bank accounts. This monitoring led to the eventual forced relocation of more than 120,000 people of Japanese descent into internment camps.
The China Initiative was a four-year program running between 2018 and 2022 with the stated purpose of protecting the U.S. from national security threats posed by the Chinese government. In practice, the program fueled massive anti-Asian sentiment. Much as in the American Red Scare waves of the first half of the 20th century, many Chinese immigrants, including academics, professionals, and students, were falsely arrested.
After the 9/11 attacks, TSA officials at major airports harassed and profiled Muslim airline passengers, and the government engaged in massive surveillance of Muslim American communities and individuals.
“[D]uring the Obama and Trump administrations, Immigration and Customs Enforcement (ICE) purchased surveillance technology from private companies like Palantir and Thomson Reuters and used vehicle, insurance, tax, social media, and phone records to track undocumented immigrants throughout the country” (Brookings, 2022).
AI in Law Enforcement and Healthcare
Modern technological developments have seen the creation of companies specifically geared toward the production and sale of facial recognition programs and predictive algorithmic software in a variety of fields, most notably health care and law enforcement.
Optum, a health services company owned by UnitedHealth Group, sells an algorithmic service that provides predictive data to determine which patients may need health care services. Because the program tends to recommend services primarily to white patients while excluding minority patients with the same level of need, it effectively functions as a discriminatory service.
“[Optum] was developed to streamline the process of identifying these beneficiaries and is now applied to more than 200 million people across the US each year. Recent evidence suggests that Optum systematically fails to refer people of colour to the support programmes at the same level of healthcare need as white people” (Zalnieriute and Cutts, 2022).
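A sketch of how a seemingly neutral referral rule can produce this pattern: the program refers everyone whose predicted risk score clears a cutoff, and if the score systematically understates one group's need, that group is referred less often at the same level of need. All data, group names, and the 0.8 scaling factor below are invented for illustration; this is not Optum's actual model.

```python
import random

def referral_rates(patients, cutoff=0.7):
    """Share of patients referred to the care-management programme, per group.

    Each patient is (group, need, risk_score): 'need' is the clinically
    relevant quantity, 'risk_score' is what the referral rule actually uses.
    """
    referred, totals = {}, {}
    for group, need, score in patients:
        totals[group] = totals.get(group, 0) + 1
        referred[group] = referred.get(group, 0) + (score >= cutoff)
    return {g: referred[g] / totals[g] for g in totals}

# Two groups with identical health needs, but the score systematically
# understates one group's need (for example, if it were trained on a proxy
# such as past spending rather than on need itself).
random.seed(0)
patients = []
for _ in range(1000):
    need = random.random()
    patients.append(("group A", need, need))          # score tracks need
    patients.append(("group B", need, need * 0.8))    # score understates need

print(referral_rates(patients))
# group A is referred far more often than group B at the same level of need
```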
In law enforcement, Black men and women are disproportionately identified as criminals. Companies that produce FRT software and sell access to its results build their recognition databases from existing samples of people in prison. Because many of those people are Black, predictive and facial recognition programs falsely identify Black people as criminals far more often than they do white people. This has led to the false arrest, abuse, and harassment of many Black American citizens.
Top companies that manufacture facial recognition tech:
Clearview AI
Palantir
Application in digital media production:
Pictures and personal data may be considered the content for our daily rhetoric. They tell the world who we are. Our understanding of each other is based on the rhetoric our media collectively produces and displays as cultural examples of each of us. We are only seen in the light that the media presents us in.
This means that if we are to change course on the way minorities are seen in this country and abroad, we must start with the kind of content that is produced. Pictures of Black men and women raising children, contributing positively to global culture, art, philosophy, science, and showing up for one another would contribute to a view of us that would ensure predictive software had sufficiently diverse data to make better decisions about Black people.
Sources
ACLU: https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist
CNN: https://www.cnn.com/2021/05/09/us/techno-racism-explainer-trnd/index.html
CBC: https://www.cbc.ca/news/canada/facial-recognition-technology-police-1.7228253