{{further|Fairness (machine learning)}}
There have been several attempts to create methods and tools that can detect and observe biases within an algorithm. These emergent fields focus on tools which are typically applied to the (training) data used by the program rather than the algorithm's internal processes. These methods may also analyze a program's output and its usefulness and therefore may involve the analysis of its [[confusion matrix]] (or table of confusion).<ref>{{cite web|url=https://research.google.com/bigpicture/attacking-discrimination-in-ml/|title=Attacking discrimination with smarter machine learning|first1=Martin|last1=Wattenberg|first2=Fernanda|last2=Viégas|first3=Moritz|last3=Hardt|publisher=Google Research}}</ref><ref>{{cite arXiv |eprint=1610.02413|last1=Hardt|first1=Moritz|title=Equality of Opportunity in Supervised Learning|last2=Price|first2=Eric|last3=Srebro|first3=Nathan|class=cs.LG|year=2016}}</ref><ref>{{cite web|url=https://venturebeat.com/2018/05/25/microsoft-is-developing-a-tool-to-help-engineers-catch-bias-in-algorithms/|title=Microsoft is developing a tool to help engineers catch bias in algorithms|date=2018-05-25|first=Kyle|last=Wiggers|website=VentureBeat.com}}</ref><ref>{{cite web |title=Facebook says it has a tool to detect bias in its artificial intelligence |date=2018-05-03 |website=[[Quartz (publication)|Quartz]] |archive-url=https://web.archive.org/web/20230305194710/https://qz.com/1268520/facebook-says-it-has-a-tool-to-detect-bias-in-its-artificial-intelligence |archive-date=2023-03-05 |url-status=live |url=https://qz.com/1268520/facebook-says-it-has-a-tool-to-detect-bias-in-its-artificial-intelligence/}}</ref><ref>{{cite web|url=https://github.com/pymetrics/audit-ai|title=Pymetrics audit-AI|website=GitHub.com}}</ref><ref>{{cite web|url=https://venturebeat-com.cdn.ampproject.org/c/s/venturebeat.com/2018/05/31/pymetrics-open-sources-audit-ai-an-algorithm-bias-detection-tool/amp/|title=Pymetrics open-sources Audit AI, an algorithm bias detection 
tool|date=2018-05-31|first=Khari|last=Johnson|website=VentureBeat.com}}</ref><ref>{{cite web|url=https://github.com/dssg/aequitas|title=Aequitas: Bias and Fairness Audit Toolkit|website=GitHub.com}}</ref><ref>{{cite web|url=https://dsapp.uchicago.edu/aequitas/|title=Aequitas|publisher=Center for Data Science and Public Policy, University of Chicago}}</ref><ref>{{cite web|url=https://www.ibm.com/blogs/research/2018/02/mitigating-bias-ai-models/|title=Mitigating Bias in AI Models|website=IBM.com}}</ref>
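A minimal sketch of the kind of confusion-matrix audit these tools perform, in the spirit of the equal-opportunity criterion of Hardt et al. (2016). The data, group names, and threshold here are invented for illustration and do not come from any of the cited toolkits:

```python
# Audit a classifier's confusion matrix separately per demographic group
# and compare true-positive rates across groups (all data is hypothetical).
from collections import Counter

def confusion_matrix(y_true, y_pred):
    """Count (actual, predicted) pairs: true/false positives and negatives."""
    c = Counter(zip(y_true, y_pred))
    return {"tp": c[(1, 1)], "fp": c[(0, 1)], "fn": c[(1, 0)], "tn": c[(0, 0)]}

def true_positive_rate(cm):
    """TPR = TP / (TP + FN): the rate at which qualified cases are accepted."""
    positives = cm["tp"] + cm["fn"]
    return cm["tp"] / positives if positives else 0.0

# Hypothetical ground-truth labels and predictions, split by a sensitive attribute.
groups = {
    "group_a": ([1, 1, 0, 0, 1], [1, 1, 0, 1, 1]),
    "group_b": ([1, 1, 0, 0, 1], [1, 0, 0, 0, 0]),
}
rates = {g: true_positive_rate(confusion_matrix(t, p)) for g, (t, p) in groups.items()}

# A large gap between the groups' true-positive rates signals a potential
# equal-opportunity violation, even if overall accuracy looks acceptable.
gap = abs(rates["group_a"] - rates["group_b"])
```

Auditing output tables per group, rather than inspecting the model internally, is what lets these methods treat the algorithm as a black box.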
Ensuring that an AI tool such as a classifier is free from bias is more difficult than simply removing the sensitive information
from its input signals, because such information is typically implicit in other signals. For example, the hobbies, sports and schools attended by a candidate may reveal a sensitive attribute to the software even after that attribute has been removed from the input.
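A hypothetical sketch of this proxy effect; the records and feature names below are invented for illustration:

```python
# Even with the sensitive "gender" column removed from a model's inputs,
# a correlated feature such as "hobby" can act as a proxy that recovers it.
from collections import Counter, defaultdict

records = [
    {"hobby": "football", "gender": "M"},
    {"hobby": "football", "gender": "M"},
    {"hobby": "netball",  "gender": "F"},
    {"hobby": "netball",  "gender": "F"},
    {"hobby": "chess",    "gender": "M"},
    {"hobby": "chess",    "gender": "F"},
]

def proxy_accuracy(records, proxy, sensitive):
    """How well the sensitive attribute can be guessed from the proxy alone:
    predict the majority sensitive value observed for each proxy value."""
    by_proxy = defaultdict(Counter)
    for r in records:
        by_proxy[r[proxy]][r[sensitive]] += 1
    correct = sum(c.most_common(1)[0][1] for c in by_proxy.values())
    return correct / len(records)

# Here "hobby" alone recovers "gender" for 5 of the 6 records.
acc = proxy_accuracy(records, "hobby", "gender")
```

This is why bias audits examine model outputs per group rather than trusting that deleting a sensitive column makes the model blind to it.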