The '''Robotics Collaborative Technology Alliance''' ('''R-CTA''') was a research program initiated and sponsored by the [[US Army Research Laboratory]]. Its purpose was to "bring together government, industrial, and academic institutions to address research and development required to enable the deployment of future military unmanned ground vehicle systems ranging in size from man-portables to ground combat vehicles."<ref name=":0">{{Cite web|url=https://www.arl.army.mil/www/pages/392/RCTA2017-18BPP_013117_R1.1a_signed.pdf|archive-url=https://web.archive.org/web/20180905214921/https://www.arl.army.mil/www/pages/392/RCTA2017-18BPP_013117_R1.1a_signed.pdf|url-status=dead|archive-date=September 5, 2018|title=ROBOTICS COLLABORATIVE TECHNOLOGY ALLIANCE (RCTA): Proposed 2017-18 Biennial Program Plan}}</ref> Collaborative Technology and Research Alliances was a term for partnerships between Army laboratories and centers, private industry, and academia for performing research and technology development intended to benefit the US Army. The partnerships were funded by the US Army.
== History ==
Since approximately 1992, the Army Research Laboratory has formed a number of partnerships involving industry, academia, and government. One was the Robotics Collaborative Technology Alliance, formed in 2009. The program was completed in 2018, ending with a capstone event at [[Camp Lejeune]], North Carolina.<ref>{{Cite web|url=https://www.arl.army.mil/www/default.cfm?article=3240|archive-url=https://web.archive.org/web/20180906013941/https://www.arl.army.mil/www/default.cfm?article=3240|url-status=dead|archive-date=September 6, 2018|title=Army's robotics alliance rallies researchers}}</ref>
== Objectives ==
* Distributed solutions for efficiently allocating a set of complex tasks to a robot team, by giving individual robots the ability to come up with new ways to perform a task, or by allowing multiple robots to cooperate by sharing the subcomponents of a task, or both.<ref>{{Cite conference |last1=Zlot |first1=R. |last2=Stentz |first2=A. |title=Complex Task Allocation For Multiple Robots |book-title=Proceedings of the 2005 IEEE International Conference on Robotics and Automation |pages=1515–1522 |language=en-US |doi=10.1109/ROBOT.2005.1570329 |year=2005 |isbn=978-0-7803-8914-4 |citeseerx=10.1.1.70.5598 |s2cid=3281638}}</ref>
* Water detection sensor platforms on an XUV vehicle for terrain classification and obstacle detection in natural environments.<ref>{{Cite web |last1=Sarwal|first1=Alok|last2=Nett|first2=Jeremy|last3=Simon|first3=David|date=Dec 2004|title=Detection of Small Water-Bodies |website=Defense Technical Information Center |url=https://apps.dtic.mil/sti/citations/ADA433004|language=en}}</ref>
* Short-range sensing for safe driving, including video sensing, laser rangefinders, a novel light-stripe rangefinder, software to process each sensor individually, and a map-based fusion system.<ref>{{Citation|last1=Thorpe|first1=Chuck|title=Safe Robot Driving in Cluttered Environments|date=2005|work=Springer Tracts in Advanced Robotics|pages=271–280|publisher=Springer Berlin Heidelberg|language=en|doi=10.1007/11008941_29|isbn=978-3-540-23214-8|last2=Carlson|first2=Justin|last3=Duggins|first3=Dave|last4=Gowdy|first4=Jay|last5=MacLachlan|first5=Rob|last6=Mertz|first6=Christoph|last7=Suppe|first7=Arne|last8=Wang|first8=Bob|url=https://ink.library.smu.edu.sg/context/sis_research/article/9239/viewcontent/SAFE_ROBOTS.pdf }}</ref>
* A Geometric Path Planner (GPP) that produces routes for unmanned ground and air vehicles. The GPP generates plans that account for factors such as mobility risk, traversal time, sensor coverage, and stealth.<ref>{{Cite web |author=Juan Pablo Gonzalez |author2=Bryan Nagy |author3=Anthony Stentz |title=The Geometric Path Planner for Navigating Unmanned Vehicles in Dynamic Environments |website=Carnegie Mellon University |s2cid=661252 |url=https://www.ri.cmu.edu/pub_files/pub4/gonzalez_juan_pablo_2006_1/gonzalez_juan_pablo_2006_1.pdf}}</ref>
* A multirobot coordination approach that ensures robustness and promotes graceful degradation in team performance when faced with malfunctions, including communication failures, partial robot malfunction, or robot death.<ref>{{Cite book|title=Robust multirobot coordination in dynamic environments - IEEE Conference Publication|pages=3435–3442 Vol.4|language=en-US|doi=10.1109/ROBOT.2004.1308785|chapter=Robust multirobot coordination in dynamic environments|year=2004|last1=Bernardine Dias|first1=M.|last2=Zinck|first2=M.|last3=Zlot|first3=R.|last4=Stentz|first4=A.|isbn=978-0-7803-8232-9|citeseerx=10.1.1.58.3576|s2cid=16607433}}</ref>
* A method and apparatus for error correction in [[speech recognition]] applications through comparison of user utterances.<ref>{{Cite patent|title=Method and apparatus for error correction in speech recognition applications|country=US|number=7756710|issue-date=2006-07-13}}</ref>
* A method for mobile range sensing that detects the range of at least one object in a scene. The method consisted of receiving a set of images of the scene, containing multiple objects, from at least one camera in motion.<ref>{{Cite patent|title=System and method for providing mobile range sensing|country=US|number=8059887|issue-date=2007-09-25}}</ref>
* A method to combine the standard and throat microphone signals for noise-robust speech recognition using an optimum filter algorithm.<ref>{{Cite web|url=https://www.isca-speech.org/archive/interspeech_2004/i04_0809.html|title=Combination of Standard and Throat Microphones for Robust Speech Recognition in Highly Noisy Environments}}</ref>
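The multirobot task-allocation item above can be illustrated with a minimal sketch. This single-round greedy auction, in which each robot "bids" its marginal cost for a task and the lowest bidder wins, is a simplification of market-based allocation; the function names, robot positions, and distance-based cost below are invented for illustration, and the cited R-CTA work additionally allows robots to decompose tasks and trade subtasks.

```python
import math

def auction_allocate(robots, tasks, cost):
    """Assign each task to the robot that bids the lowest marginal cost.

    A hypothetical single-round sketch of market-based task allocation,
    not the actual R-CTA implementation.
    """
    assignments = {r: [] for r in robots}
    for task in tasks:
        # Each robot bids its marginal cost of taking on this task,
        # given what it has already been assigned; lowest bid wins.
        winner = min(robots, key=lambda r: cost(r, task, assignments[r]))
        assignments[winner].append(task)
    return assignments

# Toy example: robots at fixed start positions, tasks at target points,
# bid = straight-line distance from the robot's last assignment.
positions = {"r1": (0.0, 0.0), "r2": (10.0, 0.0)}

def travel_cost(robot, task, assigned):
    start = assigned[-1] if assigned else positions[robot]
    return math.dist(start, task)

tasks = [(1.0, 1.0), (9.0, 1.0), (2.0, 2.0)]
result = auction_allocate(list(positions), tasks, travel_cost)
# Nearby tasks cluster with the nearest robot: r1 takes the two
# left-hand targets, r2 takes the right-hand one.
```

More elaborate schemes re-auction tasks as conditions change, which is what gives market-based approaches their robustness to robot failure.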