===== Racial bias =====
Racial bias refers to the tendency of machine learning models to produce outcomes that unfairly discriminate against or stereotype individuals based on race or ethnicity. This bias often stems from training data that reflects historical and systemic inequalities. For example, AI systems used in hiring, law enforcement, or healthcare may disproportionately disadvantage certain racial groups by reinforcing existing stereotypes or underrepresenting them in key areas. Such biases can manifest, for example, as facial recognition systems misidentifying individuals of certain racial backgrounds or healthcare algorithms underestimating the medical needs of minority patients. Addressing racial bias requires careful examination of data, improved transparency in algorithmic processes, and efforts to ensure fairness throughout the AI development lifecycle.<ref>{{Cite web |last=Lazaro |first=Gina |date=May 17, 2024 |title=Understanding Gender and Racial Bias in AI |url=https://www.sir.advancedleadership.harvard.edu/articles/understanding-gender-and-racial-bias-in-ai |access-date=December 11, 2024 |website=Harvard Advanced Leadership Initiative Social Impact Review}}</ref><ref>{{Cite journal |last=Jindal |first=Atin |date=September 5, 2022 |title=Misguided Artificial Intelligence: How Racial Bias is Built Into Clinical Models |url=https://bhm.scholasticahq.com/article/38021-misguided-artificial-intelligence-how-racial-bias-is-built-into-clinical-models |journal=Journal of Brown Hospital Medicine |volume=2 |issue=1 |doi=10.56305/001c.38021 |access-date=December 11, 2024|doi-access=free |pmc=11878858 }}</ref>
=== Technical ===
[[File:Three Surveillance cameras.jpg|thumb|upright=1.2|Facial recognition software used in conjunction with surveillance cameras was found to display bias in recognizing Asian and black faces over white faces.<ref name="IntronaWood" />{{rp|191}}]]
Technical bias emerges from the limitations of a program, its computational power, its design, or other constraints on the system.<ref name="FriedmanNissenbaum" />{{rp|332}} Such bias can also arise as a constraint of design: for example, a search engine that shows three results per screen can be understood to privilege the top three results slightly more than the next three, as in an airline price display.<ref name="FriedmanNissenbaum" />{{rp|336}} Another case is software that relies on [[randomness]] to distribute results fairly. If the [[random number generation]] mechanism is not truly random, it can introduce bias, for example by skewing selections toward items at the beginning or end of a list.<ref name="FriedmanNissenbaum" />{{rp|332}}
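The skew described above can be illustrated with a minimal sketch (not drawn from the cited sources): a generator that produces uniform byte values mapped onto a list via the modulo operator. Because 256 is not a multiple of the list length, the mapping itself over-selects items near the beginning of the list, even though each raw value is equally likely.

```python
import collections

# Hypothetical example: ten list items selected by reducing a uniform
# byte value (0-255) with the modulo operator.
items = [f"item_{i}" for i in range(10)]

counts = collections.Counter()
for raw in range(256):  # enumerate every possible byte value once
    counts[items[raw % len(items)]] += 1

# 256 = 25 * 10 + 6, so indices 0-5 are reachable from 26 byte values
# each, while indices 6-9 are reachable from only 25: the selection
# is skewed toward items at the beginning of the list.
print(dict(counts))
```

Even with a perfectly uniform source, a careless reduction step introduces this "modulo bias"; with a weak random number generator the distortion can be larger still.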