{{Short description|Robotics}}
{{Robotic laws}}
'''Laws of robotics''' are any set of laws, rules, or principles intended as a fundamental framework to underpin the behavior of [[robot]]s designed to have a degree of [[autonomy]]. Robots of this degree of complexity do not yet exist, but they have been widely anticipated in [[science fiction]] and [[movie|films]], and are a topic of active [[research and development]] in the fields of [[robotics]] and [[artificial intelligence]].
# No machine may harm humanity; or, through inaction, allow humanity to come to harm.
This was refined at the end of ''[[Foundation and Earth]]'':
{{ordered list|start=0|A robot may not injure humanity, or, by inaction, allow humanity to come to harm.}}
Adaptations and extensions exist based upon this framework.
=== Additional laws ===
Authors other than Asimov have often created extra laws.
The 1974 [[Lyuben Dilov]] novel ''Icarus's Way'' (also known as ''The Trip of Icarus'') introduced a Fourth Law of robotics: "A robot must establish its identity as a robot in all cases."
Dilov explains the reasoning behind this fourth safeguard as follows: "The last Law has put an end to the expensive aberrations of designers to give psychorobots as humanlike a form as possible. And to the resulting misunderstandings...".<ref>{{cite book
| last = Dilov
| first = Lyuben (aka Lyubin, Luben or Liuben)
| year = 2002
| publisher = Захари Стоянов
| isbn = 978-954-739-338-7}}</ref> In 2024, [[Dariusz Jemielniak]] proposed a Fourth Law of Robotics in an article in [[IEEE Spectrum]]: "A robot or AI must not deceive a human by impersonating a human being."<ref>{{Cite web |title=We Need a Fourth Law of Robotics for AI - IEEE Spectrum |url=https://spectrum.ieee.org/isaac-asimov-robotics |access-date=2025-02-03 |website=spectrum.ieee.org |language=en}}</ref><ref>{{Cite web |date=2025-01-24 |title=A Fourth Law of Robotics {{!}} Berkman Klein Center |url=https://cyber.harvard.edu/story/2025-01/fourth-law-robotics |access-date=2025-02-03 |website=cyber.harvard.edu |language=en}}</ref><ref>{{Cite web |date=2025-01-15 |title=Ki kell egészíteni Asimov robotikai törvényeit az AI miatt |url=https://www.blikk.hu/ferfiaknak/tech/robotika-torvenyei/nxbvh73 |access-date=2025-02-03 |website=Blikk |language=hu}}</ref><ref>{{Cite web |last=Tecnológica |first=Site Inovação |date=2025-01-21 |title=Leis da Robótica de Asimov precisam de atualização para IA |url=https://www.inovacaotecnologica.com.br/noticias/noticia.php?artigo=leis-robotica-asimov-precisam-atualizacao-ia&id=010180250121 |access-date=2025-02-03 |website=Site Inovação Tecnológica |language=pt}}</ref><ref>{{Cite news |last=Jaśkowiak |first=Piotr |date=2025-02-01 |title=Asimovowi zabrakło wyobraźni. Potrzebujemy Czwartego Prawa Robotyki |url=https://ssl.audycje.tokfm.pl/podcast/170407,Asimovowi-zabraklo-wyobrazni-Potrzebujemy-Czwartego-Prawa-Robotyki-a-na-antenie-tworzymy-Piate |work=Radio TokFM}}</ref>
A Fifth Law was introduced by [[Nikola Kesarovski]] in his short story "The Fifth Law of Robotics": "A robot must know it is a robot."
| year = 1983
| publisher = Отечество
}}</ref> The story was reviewed by [[Valentin D. Ivanov]] in the SFF review webzine ''The Portal''.
For the 1986 tribute anthology ''[[Foundation's Friends]]'', [[Harry Harrison (writer)|Harry Harrison]] wrote a story entitled "The Fourth Law of Robotics". This Fourth Law states: "A robot must reproduce. As long as such reproduction does not interfere with the First or Second or Third Law."
# We should consider the ethics of transparency: are there limits to what should be openly available?
# When we see erroneous accounts in the press, we commit to take the time to contact the reporting journalists.
The EPSRC principles are broadly recognised as a useful starting point. In 2016 Tony Prescott organised a workshop to revise these principles, e.g. to differentiate ethical from legal principles.<ref>{{cite journal|date=2017|title=Legal vs. ethical obligations – a comment on the EPSRC's principles for robotics|url=https://philpapers.org/rec/MLLLVE|journal=Connection Science|doi=10.1080/09540091.2016.1276516|author=Müller, Vincent C.|volume=29 |issue=2 |pages=137–141 |bibcode=2017ConSc..29..137M |s2cid=19080722 }}</ref>
==Judicial development==
==Tilden's laws==
[[Mark W. Tilden]] is a robotics physicist who was a pioneer in developing simple robotics.<ref name=wired1>{{cite magazine| url=https://www.wired.com/wired/archive/2.09/tilden.html?pg=1&topic= | magazine=Wired | first=Fred | last=Hapgood | title=Chaotic Robotics | issue = 9 | date = September 1994| volume=2 }}</ref>
Tilden's guiding rules for robots are:
# A robot must protect its existence at all costs.
# A robot must obtain and maintain access to a power source.
# A robot must continually search for better power sources.
==See also==
==References==
{{reflist|30em}}
{{Robotics}}
[[Category:Robotics]]
[[Category:Robotics engineering]]