Adaptable robotics: Difference between revisions

 
=== Soft Robots ===
Robots with soft grippers are an emerging area of adaptable robotics; one design is inspired by the [[Venus flytrap]], in which two soft robotic surfaces provide enveloping and pinching grasp modules. This technology has been tested in a variety of environments to determine the effects of diverse objects, errors in object position, and soft robotic surface installation on grasping capacity.<ref>{{Cite journal |last=Xiao |first=Wei |last2=Liu |first2=Chang |last3=Hu |first3=Dean |last4=Yang |first4=Gang |last5=Han |first5=Xu |date=April 2022 |title=Soft robotic surface enhances the grasping adaptability and reliability of pneumatic grippers |url=https://linkinghub.elsevier.com/retrieve/pii/S0020740322000315 |journal=International Journal of Mechanical Sciences |language=en |volume=219 |pages=107094 |doi=10.1016/j.ijmecsci.2022.107094}}</ref> Untethered actuation is achievable, especially in soft robots made with liquid crystal polymers, a category of stimuli-responsive materials with a two-way shape memory effect. This allows the liquid crystal polymers to generate mechanical energy by changing shape in response to external stimuli, hence untethered actuation.<ref name=":23">Y. Chi, Y. Zhao, Y. Hong, Y. Li, and J. Yin, "A Perspective on Miniature Soft Robotics: Actuation, Fabrication, Control, and Applications", ''Advanced Intelligent Systems'', April 2023, doi: <nowiki>https://doi.org/10.1002/aisy.202300063</nowiki>.</ref>
 
=== Modular Robots ===
== Applications ==
Adaptable robots possess capabilities that make them applicable to many fields, including but not limited to medicine, industry, and experimental research.
 
[[Reinforcement learning|Learning from demonstration]] (LfD) is a strategy for transferring human motion skills to robots. The primary goal is to identify significant movement primitives (MPs), the significant movements humans make, from a demonstration and reproduce these motions so that the robot adapts to them. Robots have had issues adapting skills learned by learning from demonstration to new environments (a change from the scenario in which the robot was given its initial demonstrations). These issues have been addressed with a learning model based on a nonlinear dynamical system that encodes trajectories as dynamic motion primitives (DMPs), which are similar to movement primitives but represent significant movements with a mathematical equation; the equation's variables change with the changing environment, altering the motion performed. The trajectories recorded through these systems have proven to apply to a wide variety of environments, making the robots more effective in their respective spheres. Learning from demonstration has advanced the applicability of robotics in fields where precision is essential, such as surgical environments.<ref name=":5">{{Cite journal |last=Teng |first=Tao |last2=Gatti |first2=Matteo |last3=Poni |first3=Stefano |last4=Caldwell |first4=Darwin |last5=Chen |first5=Fei |date=June 2023 |title=Fuzzy dynamical system for robot learning motion skills from human demonstration |url=https://linkinghub.elsevier.com/retrieve/pii/S0921889023000453 |journal=Robotics and Autonomous Systems |language=en |volume=164 |pages=104406 |doi=10.1016/j.robot.2023.104406}}</ref>
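A commonly used form of dynamic motion primitive in the robot-learning literature (a general formulation, not necessarily the specific model of the cited study) writes a one-dimensional motion <math>y(t)</math> toward a goal <math>g</math> as a damped spring system driven by a learned forcing term:

<math display="block">\tau \dot{z} = \alpha_z \bigl( \beta_z (g - y) - z \bigr) + f(x), \qquad \tau \dot{y} = z, \qquad \tau \dot{x} = -\alpha_x x</math>

Here <math>\tau</math> scales the movement duration, <math>\alpha_z</math>, <math>\beta_z</math>, and <math>\alpha_x</math> are constants, and the phase variable <math>x</math> drives the forcing term <math>f</math> fitted to the demonstration. Changing the goal <math>g</math> or the time scale <math>\tau</math> adapts the reproduced motion to a new target or speed without re-learning the demonstration.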
 
In the medical field, socially assistive robot (SAR) technology focuses on taking sensory data from wearable peripherals to perceive the user’s state of being. The information gathered enables the machine to provide personalized monitoring, motivation, and coaching for rehabilitation. Intuitive physical human–robot interaction and interfaces between humans and robots allow functionalities like recording the motions of a surgeon to infer their intent, determining the mechanical parameters of human tissue, and gathering other sensory data for use in medical scenarios.<ref name=":6">{{Cite journal |last=Okamura |first=Allison |last2=Mataric |first2=Maja |last3=Christensen |first3=Henrik |date=September 2010 |title=Medical and Health-Care Robotics |url=http://ieeexplore.ieee.org/document/5569021/ |journal=IEEE Robotics & Automation Magazine |volume=17 |issue=3 |pages=26–37 |doi=10.1109/MRA.2010.937861 |issn=1070-9932}}</ref> Biohybrid robots have medical applications that use biodegradable components, allowing them to function safely within the human body.<ref name=":4" />
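As a simplified illustration (not a procedure described in the cited source), a linear estimate of tissue stiffness can be computed from an instrument's force sensor reading <math>F</math> and the measured indentation depth <math>x</math> as <math>k \approx F / x</math>; models used in practice are typically nonlinear and viscoelastic rather than this single-spring approximation.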
 
Artificial intelligence, machine learning, and deep learning have enabled advances in adaptable robotics such as autonomous navigation, object recognition and manipulation, natural language processing, and predictive maintenance. These technologies have been essential in the development of cobots (collaborative robots), robots capable of working alongside humans and of adapting to changing environments.<ref name=":7">{{Cite journal |last=Soori |first=Mohsen |last2=Arezoo |first2=Behrooz |last3=Dastres |first3=Roza |date=2023 |title=Artificial intelligence, machine learning and deep learning in advanced robotics, a review |url=https://linkinghub.elsevier.com/retrieve/pii/S2667241323000113 |journal=Cognitive Robotics |language=en |volume=3 |pages=54–70 |doi=10.1016/j.cogr.2023.04.001}}</ref>
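The following minimal sketch (an illustrative example, not code from the cited review; the function name and thresholds are hypothetical) shows one simple way a collaborative robot could adapt to a changing environment by scaling its motion speed according to the sensed distance to a nearby person:

<syntaxhighlight lang="python">
# Minimal sketch: a collaborative robot scaling its speed based on the
# sensed distance to a nearby human, illustrating adaptation to a
# changing environment.  All names and thresholds are hypothetical.

def speed_scale(human_distance_m: float,
                stop_distance_m: float = 0.5,
                full_speed_distance_m: float = 2.0) -> float:
    """Return a speed multiplier in [0, 1] from the sensed human distance."""
    if human_distance_m <= stop_distance_m:
        return 0.0                      # too close: stop motion
    if human_distance_m >= full_speed_distance_m:
        return 1.0                      # far away: full programmed speed
    # linear ramp between the two thresholds
    return ((human_distance_m - stop_distance_m)
            / (full_speed_distance_m - stop_distance_m))


if __name__ == "__main__":
    for d in (0.3, 1.0, 2.5):
        print(f"distance {d} m -> speed scale {speed_scale(d):.2f}")
</syntaxhighlight>

A deployed cobot controller would combine this kind of speed scaling with certified safety functions, sensor fusion, and learned models rather than a fixed linear ramp.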