[[Reinforcement learning|Learning from demonstration]] is a strategy for transferring human motion skills to robots. The primary goal is to identify movement primitives (significant component movements that humans make) from demonstrations and reproduce them so that the robot can perform the motion itself. One persistent issue has been that skills learned from demonstration often fail to transfer to new environments, that is, to conditions that differ from the scenario in which the robot was given its initial demonstrations. This has been addressed with learning models based on nonlinear dynamical systems that encode trajectories as dynamic movement primitives: movement primitives represented by a mathematical equation whose variables change with the environment, altering the motion performed. Trajectories encoded in this way have proven to generalize to a wide variety of environments, making robots more effective in their respective spheres. Learning from demonstration has broadened the applicability of robotics in fields where precision is essential, such as surgical environments.<ref name=":5">{{Cite journal |last=Teng |first=Tao |last2=Gatti |first2=Matteo |last3=Poni |first3=Stefano |last4=Caldwell |first4=Darwin |last5=Chen |first5=Fei |date=June 2023 |title=Fuzzy dynamical system for robot learning motion skills from human demonstration |url=https://linkinghub.elsevier.com/retrieve/pii/S0921889023000453 |journal=Robotics and Autonomous Systems |language=en |volume=164 |pages=104406 |doi=10.1016/j.robot.2023.104406}}</ref>
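The idea above can be sketched in code. The following is a minimal one-dimensional dynamic movement primitive in the style of the widely used spring-damper formulation, not an implementation from the cited paper: it learns a forcing term from a single demonstration, and because the goal appears as a variable in the equation, changing the goal adapts the reproduced motion. The gains, basis-function count, and demonstration trajectory are all illustrative assumptions.

```python
# Minimal 1-D discrete dynamic movement primitive (DMP) sketch.
# Gains, basis counts, and the demo trajectory are illustrative
# assumptions, not values from the cited paper.
import numpy as np

class DMP1D:
    def __init__(self, n_basis=20, alpha_z=25.0, alpha_x=4.0):
        self.alpha_z = alpha_z
        self.beta_z = alpha_z / 4.0        # critically damped spring-damper
        self.alpha_x = alpha_x             # canonical-system decay rate
        # Gaussian basis centers spaced along the canonical variable x
        self.centers = np.exp(-alpha_x * np.linspace(0, 1, n_basis))
        self.widths = n_basis ** 1.5 / self.centers
        self.w = np.zeros(n_basis)
        self.y0, self.g = 0.0, 1.0

    def fit(self, demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        T = len(demo)
        self.y0, self.g = demo[0], demo[-1]
        yd = np.gradient(demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(T) * dt)   # canonical system
        # Forcing term that would make the spring-damper follow the demo
        f_target = ydd - self.alpha_z * (self.beta_z * (self.g - demo) - yd)
        psi = np.exp(-self.widths * (x[:, None] - self.centers) ** 2)  # (T, n)
        s = x * (self.g - self.y0)          # forcing-term scaling
        self.w = (psi.T @ (s * f_target)) / (psi.T @ (s ** 2) + 1e-10)

    def rollout(self, g, dt, steps):
        """Reproduce the motion toward a (possibly new) goal g."""
        y, z, x = self.y0, 0.0, 1.0
        traj = []
        for _ in range(steps):
            psi = np.exp(-self.widths * (x - self.centers) ** 2)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - self.y0)
            z += (self.alpha_z * (self.beta_z * (g - y) - z) + f) * dt
            y += z * dt
            x += -self.alpha_x * x * dt
            traj.append(y)
        return np.array(traj)

# Learn from a smooth reach 0 -> 1, then adapt the same skill to goal 2.0
dt = 0.01
demo = np.sin(np.linspace(0, np.pi / 2, 100))
dmp = DMP1D()
dmp.fit(demo, dt)
adapted = dmp.rollout(g=2.0, dt=dt, steps=400)
```

Because the goal enters the equation as a variable, changing it reshapes the whole trajectory while preserving the demonstrated movement style; the forcing term decays with the canonical variable, so the system is guaranteed to settle at the new goal. This is the property that lets demonstration-taught skills carry over to altered environments.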
In the medical field, SAR technology focuses on taking sensory data from wearable peripherals to perceive the user's state of being. The information gathered enables the machine to provide personalized monitoring, motivation, and coaching for rehabilitation. Intuitive physical HRI and interfaces between humans and robots allow functionalities such as recording a surgeon's motions to infer their intent, determining the mechanical parameters of human tissue, and gathering other sensory data for use in medical scenarios.<ref name=":6">{{Cite journal |last=Okamura |first=Allison |last2=Mataric |first2=Maja |last3=Christensen |first3=Henrik |date=September 2010 |title=Medical and Health-Care Robotics |url=http://ieeexplore.ieee.org/document/5569021/ |journal=IEEE Robotics & Automation Magazine |volume=17 |issue=3 |pages=26–37 |doi=10.1109/MRA.2010.937861 |issn=1070-9932|hdl=1853/37375 |hdl-access=free }}</ref> Biohybrid robotics also has medical applications: biodegradable components allow robots to function safely within the human body.<ref name=":4" />
AI, machine learning, and deep learning have enabled advances in adaptable robotics such as autonomous navigation, object recognition and manipulation, natural language processing, and predictive maintenance. These technologies have been essential in the development of cobots (collaborative robots), robots designed to work alongside humans and adapt to changing environments.<ref name=":7">{{Cite journal |last=Soori |first=Mohsen |last2=Arezoo |first2=Behrooz |last3=Dastres |first3=Roza |date=2023 |title=Artificial intelligence, machine learning and deep learning in advanced robotics, a review |url=https://linkinghub.elsevier.com/retrieve/pii/S2667241323000113 |journal=Cognitive Robotics |language=en |volume=3 |pages=54–70 |doi=10.1016/j.cogr.2023.04.001|doi-access=free }}</ref>
In the industrial field, AI, machine learning, and deep learning can be used to perform quality control checks on manufactured products, identify defects, and alert production teams to make necessary changes in real time.<ref name=":7" />