Robotics Collaborative Technology Alliance

{{short description|United States research program}}
{{AFC submission|t||ts=20181023203200|u=Rafreyna|ns=118|demo=}}
 
<!-- Note: The following pages were redirects to [[Robotics_Collaborative_Technology_Alliance]] before draftification:
*[[Robotics Collaborative Technology Alliance (R-CTA)]]
-->
{{Multiple issues|
{{Orphan|date=January 2022}}
{{COI|date=September 2018}}
{{expert-subject|Robotics|date=September 2018}}
}}
The '''Robotics Collaborative Technology Alliance''' ('''R-CTA''') was a research program initiated and sponsored by the [[US Army Research Laboratory]]. The stated purpose of this alliance was to "bring together government, industrial, and academic institutions to address research and development required to enable the deployment of future military unmanned ground vehicle systems ranging in size from man-portables to ground combat vehicles."<ref name=":0">{{Cite web|url=https://www.arl.army.mil/www/pages/392/RCTA2017-18BPP_013117_R1.1a_signed.pdf|archive-url=https://web.archive.org/web/20180905214921/https://www.arl.army.mil/www/pages/392/RCTA2017-18BPP_013117_R1.1a_signed.pdf|url-status=dead|archive-date=September 5, 2018|title=ROBOTICS COLLABORATIVE TECHNOLOGY ALLIANCE (RCTA): Proposed 2017-18 Biennial Program Plan}}</ref> Collaborative Technology and Research Alliances was a term for partnerships between Army laboratories and centers, private industry, and academia for performing research and technology development intended to benefit the US Army. The partnerships were funded by the US Army.<ref>{{Cite web|url=https://arl.devcom.army.mil/collaborate-with-us/opportunity/collaborative-research-alliance/|title=Collaborative Alliances|website=www.arl.army.mil|language=en|access-date=2018-09-05}}</ref><ref name=":1">{{Cite web|url=https://arl.devcom.army.mil/collaborate-with-us/opportunity/collaborative-research-alliance/|title=Robotics|website=www.arl.army.mil|language=en|access-date=2018-09-05}}</ref>
 
== History ==
 
Since approximately 1992, the Army Research Laboratory (ARL) has formed a number of partnerships involving the triad of industry, academia, and government. One of these was the Robotics Collaborative Technology Alliance, which was formed in 2009. The program was completed in 2018, ending with a capstone event at [[Camp Lejeune]], North Carolina.<ref>{{Cite web|url=https://www.arl.army.mil/www/default.cfm?article=3240|archive-url=https://web.archive.org/web/20180906013941/https://www.arl.army.mil/www/default.cfm?article=3240|url-status=dead|archive-date=September 6, 2018|title=Army's robotics alliance rallies researchers {{!}} U.S. Army Research Laboratory|website=www.arl.army.mil|language=en|access-date=2018-09-05}}</ref>
 
== Objectives ==
 
The goal of R-CTA was the development of unmanned systems with a set of intelligence-based capabilities sufficient to enable the teaming of autonomous systems with soldiers. This included robotic systems capable of reasoning about their missions, moving through the world in a tactically correct way, observing salient events in the world around them, communicating efficiently with soldiers and other autonomous systems, and performing a variety of mission tasks. R-CTA's objective was to move beyond unmanned systems requiring human supervision, such as drones, which were vulnerable due to near-continuous control by a human operator and breakdowns of communications links.<ref name=":1" />
 
== Research thrusts ==
 
The R-CTA program was organized around several research thrusts, including the following areas:<ref name=":0" />
 
* Semantic perception – perception that understands a basic set of object types important to robotics, moving beyond just what is or is not an obstacle.
== Results ==
 
Examples of research results developed by the R-CTA program included the following:
 
* Distributed solutions for efficiently allocating a set of complex tasks to a robot team, by giving individual robots the ability to come up with new ways to perform a task, or by allowing multiple robots to cooperate by sharing the subcomponents of a task, or both.<ref>{{Cite conference |last1=Zlot |first1=R. |last2=Stentz |first2=A. |year=2005 |title=Complex Task Allocation For Multiple Robots |book-title=Proceedings of the 2005 IEEE International Conference on Robotics and Automation |pages=1515–1522 |language=en-US |doi=10.1109/ROBOT.2005.1570329 |isbn=978-0-7803-8914-4 |citeseerx=10.1.1.70.5598 |s2cid=3281638}}</ref>
* Water detection sensor platforms on an XUV vehicle for terrain classification and obstacle detection in natural environments.<ref>{{Cite web |last1=Sarwal |first1=Alok |last2=Nett |first2=Jeremy |last3=Simon |first3=David |date=Dec 2004 |title=Detection of Small Water-Bodies |website=Defense Technical Information Center |url=https://apps.dtic.mil/sti/citations/ADA433004 |language=en}}</ref>
* Short-range sensing for safe driving, including video sensing, laser rangefinders, a novel light-stripe rangefinder, software to process each sensor individually, and a map-based fusion system.<ref>{{Citation |last1=Thorpe |first1=Chuck |last2=Carlson |first2=Justin |last3=Duggins |first3=Dave |last4=Gowdy |first4=Jay |last5=MacLachlan |first5=Rob |last6=Mertz |first6=Christoph |last7=Suppe |first7=Arne |last8=Wang |first8=Bob |title=Safe Robot Driving in Cluttered Environments |date=2005 |work=Springer Tracts in Advanced Robotics |pages=271–280 |publisher=Springer Berlin Heidelberg |language=en |doi=10.1007/11008941_29 |isbn=978-3-540-23214-8 |url=https://ink.library.smu.edu.sg/context/sis_research/article/9239/viewcontent/SAFE_ROBOTS.pdf}}</ref>
* A Geometric Path Planner (GPP) that produces routes for unmanned ground and air vehicles. The GPP generates plans that account for factors such as mobility risk, traversal time, sensor coverage, and stealth.<ref>{{Cite web |author=Juan Pablo Gonzalez |author2=Bryan Nagy |author3=Anthony Stentz |title=The Geometric Path Planner for Navigating Unmanned Vehicles in Dynamic Environments |website=Carnegie Mellon University |s2cid=661252 |url=https://www.ri.cmu.edu/pub_files/pub4/gonzalez_juan_pablo_2006_1/gonzalez_juan_pablo_2006_1.pdf}}</ref>
* A multirobot coordination approach that ensures robustness and promotes graceful degradation in team performance when faced with malfunctions, including communication failures, partial robot malfunction, or robot death.<ref>{{Cite book |last1=Bernardine Dias |first1=M. |last2=Zinck |first2=M. |last3=Zlot |first3=R. |last4=Stentz |first4=A. |year=2004 |chapter=Robust multirobot coordination in dynamic environments |chapter-url=https://ieeexplore.ieee.org/abstract/document/1308785/ |title=Proceedings of the 2004 IEEE International Conference on Robotics and Automation |pages=3435–3442 Vol.4 |language=en-US |doi=10.1109/ROBOT.2004.1308785 |isbn=978-0-7803-8232-9 |citeseerx=10.1.1.58.3576 |s2cid=16607433}}</ref>
* A method and apparatus for error correction in [[speech recognition]] applications through comparison of user utterances.<ref>{{Cite patent |title=Method and apparatus for error correction in speech recognition applications |country=US |number=7756710 |issue-date=2006-07-13 |access-date=2018-09-05}}</ref>
* A method for mobile range sensing, through detecting the range of at least one object of a scene. The method consisted of receiving a set of images of the scene having multiple objects from at least one camera in motion.<ref>{{Cite patent |title=System and method for providing mobile range sensing |country=US |number=8059887 |issue-date=2007-09-25 |access-date=2018-09-05}}</ref>
* A method to combine standard and throat microphone signals for noise-robust speech recognition using an optimum filter algorithm.<ref>{{Cite web |url=https://www.isca-speech.org/archive/interspeech_2004/i04_0809.html |title=Combination of Standard and Throat Microphones for Robust Speech Recognition in Highly Noisy Environments}}</ref>
 
== References ==
{{Reflist}}
 
[[:Category:Military research of the United States]]
[[:Category:Military technology]]
[[:Category:Robotics in the United States]]