The phrases "algorithmic transparency" and "algorithmic accountability"<ref>{{cite journal|last1=Diakopoulos|first1=Nicholas|title=Algorithmic Accountability: Journalistic Investigation of Computational Power Structures.|journal=Digital Journalism|date=2015|volume=3|issue=3|pages=398–415|doi=10.1080/21670811.2014.976411|s2cid=42357142|url=https://www.cjr.org/tow_center_reports/algorithmic_accountability_on_the_investigation_of_black_boxes.php|url-access=subscription}}</ref> are sometimes used interchangeably – especially since they were coined by the same people – but they have subtly different meanings. Specifically, "algorithmic transparency" states that the inputs to the algorithm and the algorithm's use itself must be known, but they need not be fair. "[[Algorithmic accountability]]" implies that the organizations that use algorithms must be accountable for the decisions made by those algorithms, even though the decisions are being made by a machine, and not by a human being.<ref name="Dickey">{{cite news|last1=Dickey|first1=Megan Rose|title=Algorithmic Accountability|url=https://techcrunch.com/2017/04/30/algorithmic-accountability/|accessdate=4 September 2017|work=TechCrunch|date=30 April 2017}}</ref>
Current research around algorithmic transparency is interested both in the societal effects of accessing remote services running algorithms,<ref>{{cite web|title=Workshop on Data and Algorithmic Transparency|url=http://datworkshop.org/|accessdate=4 January 2017|date=2015}}</ref> and in mathematical and computer science approaches that can be used to achieve algorithmic transparency.<ref>{{cite web|title=Fairness, Accountability, and Transparency in Machine Learning|url=http://www.fatml.org/|accessdate=29 May 2017|date=2015}}</ref><ref>{{Cite journal |last1=Ott |first1=Tabea |last2=Dabrock |first2=Peter |date=2022-08-22 |title=Transparent human – (non-) transparent technology? The Janus-faced call for transparency in AI-based health care technologies |journal=Frontiers in Genetics |language=English |volume=13 |doi=10.3389/fgene.2022.902960 |doi-access=free |issn=1664-8021|pmc=9444183 }}</ref> In the United States, the [[Federal Trade Commission]]'s Bureau of Consumer Protection studies how algorithms are used by consumers, both by conducting its own research on algorithmic transparency and by funding external research.<ref name="Noyes">{{cite news|last1=Noyes|first1=Katherine|title=The FTC is worried about algorithmic transparency, and you should be too|url=http://www.pcworld.com/article/2908372/the-ftc-is-worried-about-algorithmic-transparency-and-you-should-be-too.html|accessdate=4 September 2017|work=PCWorld|date=9 April 2015|language=en}}</ref> In the [[European Union]], the data protection laws that came into effect in May 2018 include a "right to explanation" of decisions made by algorithms, though it is unclear what this means in practice.<ref>{{cite journal |title=False Testimony |journal=Nature |date=31 May 2018 |volume=557 |issue=7707 |page=612|url=https://media.nature.com/original/magazine-assets/d41586-018-05285-9/d41586-018-05285-9.pdf}}</ref> Furthermore, the European Union founded the European Centre for Algorithmic Transparency (ECAT).<ref>{{cite web |url=https://algorithmic-transparency.ec.europa.eu/about_en |title=About - European Commission}}</ref>
==See also==