Robustness (computer science)
{{short description|Ability of a computer system to cope with errors during execution}}
{{Complex systems}}
 
In [[computer science]], '''robustness''' is the ability of a computer system to cope with [[Error message|errors]] during [[Execution (computing)|execution]]<ref>{{cite web|url=http://dl.ifip.org/db/conf/pts/testcom2005/FernandezMP05.pdf |title=A Model-Based Approach for Robustness Testing |website=Dl.ifip.org |access-date=2016-11-13}}</ref><ref name="IEEE">1990. IEEE Standard Glossary of Software Engineering Terminology, IEEE Std 610.12-1990 defines robustness as "The degree to which a system or component can function correctly in the presence of invalid inputs or stressful environmental conditions"</ref> and cope with erroneous input.<ref name="IEEE"/> Robustness can encompass many areas of computer science, such as [[Defensive programming|robust programming]], [[Overfitting|robust machine learning]], and [[Robust Security Network]]. Techniques such as [[fuzz testing]] are essential to showing robustness, since this type of testing involves invalid or unexpected inputs. Alternatively, [[fault injection]] can be used to test robustness. Various commercial products perform robustness testing of software systems.<ref>{{cite journal|url=http://www.stanford.edu/~bakerjw/Publications/Baker%20et%20al%20(2008)%20Robustness,%20Structural%20Safety.pdf |title= On the assessment of robustness|access-date=2016-11-13|doi=10.1016/j.strusafe.2006.11.004|volume=30|year=2008|journal=Structural Safety|pages=253–267 | last1 = Baker | first1 = Jack W. | last2 = Schubert | first2 = Matthias | last3 = Faber | first3 = Michael H.|issue= 3}}</ref>
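Fuzz testing, mentioned above, can be illustrated with a minimal sketch: randomly generated, often invalid inputs are fed to a function, and any exception other than an explicitly anticipated rejection counts as a robustness bug. The `parse_age` function and its validation rules below are hypothetical, for illustration only.

```python
import random
import string

def parse_age(text):
    """Hypothetical input handler: robust code rejects bad input explicitly."""
    if not text.isdigit():
        raise ValueError("age must be a non-negative integer")
    age = int(text)
    if age > 150:
        raise ValueError("age out of plausible range")
    return age

def fuzz(func, trials=1000):
    """Throw random printable strings at func; only ValueError is acceptable."""
    failures = []
    for _ in range(trials):
        candidate = "".join(random.choice(string.printable)
                            for _ in range(random.randint(0, 10)))
        try:
            func(candidate)
        except ValueError:
            pass  # anticipated rejection of invalid input
        except Exception as exc:
            failures.append((candidate, exc))  # unanticipated crash
    return failures

# An empty failure list means the function coped with every random input.
print(fuzz(parse_age))
```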
 
== Introduction ==
 
== Challenges ==
Programs and software are tools focused on a very specific task, and thus are not generalized and flexible.<ref name="MIT" /> However, observations in systems such as the [[internet]] or [[biological system]]s demonstrate adaptation to their environments. One of the ways biological systems adapt to environments is through the use of [[redundancy (engineering)|redundancy]].<ref name="MIT" /> Many organs are redundant in humans. The [[kidney]] is one such example. [[Human]]s generally only need one kidney, but having a second kidney allows room for failure. This same principle may be taken to apply to software, but there are some challenges. When applying the principle of redundancy to computer science, blindly adding code is not suggested. Blindly adding code introduces more errors, makes the system more complex, and renders it harder to understand.<ref>{{cite web|url=http://www.cse.sc.edu/~huhns/journalpapers/V6N2.pdf |title=Building Robust Systems an essay |author=Agents on the wEb : Robust Software |website=Cse.sc.edu |access-date=2016-11-13}}</ref> Code that does not provide any reinforcement to the already existing code is unwanted. The new code must instead possess equivalent [[function (engineering)|functionality]], so that if a function is broken, another providing the same function can replace it, using manual or automated [[software diversity]]. To do so, the new code must know how and when to accommodate the failure point.<ref name="MIT" /> This means more [[logic]] needs to be added to the system. But as a system adds more logic, [[Software component#Software component|components]], and increases in size, it becomes more complex. Thus, when making a more redundant system, the system also becomes more complex and developers must consider balancing redundancy with complexity.
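The idea of redundancy through functionally equivalent code can be sketched as a failover between two diverse implementations of the same function. The function names and the fallback policy below are assumptions for illustration, not a standard API; note how the extra failover logic is exactly the added complexity the paragraph above warns about.

```python
def sqrt_newton(x):
    """Primary implementation: Newton's method."""
    if x < 0:
        raise ValueError("negative input")
    guess = x or 1.0
    for _ in range(60):
        guess = 0.5 * (guess + x / guess)
    return guess

def sqrt_bisect(x):
    """Diverse fallback: bisection -- same functionality, different logic."""
    if x < 0:
        raise ValueError("negative input")
    lo, hi = 0.0, max(1.0, x)
    for _ in range(100):
        mid = (lo + hi) / 2
        if mid * mid < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def robust_sqrt(x):
    """Try each equivalent implementation; fail over on an internal error.

    The failover loop is the added logic (and complexity) that redundancy
    requires: it must know how and when to accommodate a failure.
    """
    for impl in (sqrt_newton, sqrt_bisect):
        try:
            return impl(x)
        except ArithmeticError:  # internal failure, not invalid input
            continue
    raise RuntimeError("all implementations failed")
```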
 
Currently, computer science practices do not focus on building robust systems.<ref name="MIT" /> Rather, they tend to focus on [[scalability]] and [[Algorithmic efficiency|efficiency]]. One of the main reasons robustness receives little focus today is that it is hard to achieve in a general way.<ref name="MIT" />
 
==== Principles ====
; Paranoia: When building software, the programmer assumes users are out to break their code.<ref name="robust_programming" /> The programmer also assumes that their own written code may fail or work incorrectly.<ref name="robust_programming" />
 
; Stupidity: The programmer assumes users will try incorrect, bogus and malformed inputs.<ref name="robust_programming" /> As a consequence, the programmer returns an unambiguous, intuitive error message that does not require the user to look up error codes. The error message should be as accurate as possible without being misleading, so that the problem can be fixed with ease.
 
; Dangerous implements: Users should not gain access to [[Library (computing)|libraries]], [[data structure]]s, or [[Pointer (computer programming)|pointers]] to data structures.<ref name="robust_programming" /> This information should be hidden from the user so that the user does not accidentally modify it and introduce a bug in the code. When such [[Interface (object-oriented programming)|interfaces]] are correctly built, users use them without finding loopholes to modify the interface. The interface should already be correctly implemented, so the user does not need to make modifications. The user therefore focuses solely on their own code.
 
; [[Assertion (software development)|Can't happen]]: Very often, code is modified and may introduce a possibility that an "impossible" case occurs. Impossible cases are therefore assumed to be highly unlikely instead.<ref name="robust_programming" /> The developer thinks about how to handle the case that is highly unlikely, and implements the handling accordingly.
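Several of these principles can be sketched in one small function. The `withdraw` function and its validation rules below are hypothetical; the point is the shape of the checks, not a real banking API.

```python
def withdraw(balance, amount):
    """Hypothetical sketch of the robust-programming principles above.

    Paranoia/stupidity: assume the caller passes bogus input, and answer
    with a clear, unambiguous message rather than a numeric error code.
    (Dangerous implements would mean keeping `balance` behind an interface
    like this instead of exposing the underlying data structure.)
    """
    if not isinstance(amount, int) or amount <= 0:
        raise ValueError("amount must be a positive whole number of cents")
    if amount > balance:
        raise ValueError("amount exceeds the available balance")
    new_balance = balance - amount
    # "Can't happen": the checks above should make this impossible, but the
    # code will be modified over time, so the case is still guarded.
    assert new_balance >= 0, "balance went negative despite validation"
    return new_balance
```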
 
===Robust machine learning===
Robust machine learning typically refers to the robustness of machine learning algorithms. For a machine learning algorithm to be considered robust, either the testing error has to be consistent with the training error, or the performance has to be stable after adding some noise to the dataset.<ref>{{cite web |author=El Sayed Mahmoud |title=What is the definition of the robustness of a machine learning algorithm? |url=https://www.researchgate.net/post/What_is_the_definition_of_the_robustness_of_a_machine_learning_algorithm |access-date=2016-11-13}}</ref> Recently, in line with their rise in popularity, there has been increasing interest in the robustness of neural networks, particularly due to their vulnerability to adversarial attacks.<ref>{{cite arXiv |last1=Li |first1=Linyi |last2=Xie |first2=Tao |last3=Li |first3=Bo |title=SoK: Certified Robustness for Deep Neural Networks |eprint=2009.04131 |date=9 September 2022|class=cs.LG }}</ref>
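The stability-under-noise criterion can be sketched with a toy, hypothetical experiment: a trivial threshold classifier is evaluated before and after Gaussian noise is added to the inputs, and robustness means the two accuracies stay close. The dataset, classifier, and noise level are arbitrary choices for illustration.

```python
import random

random.seed(0)  # deterministic noise for reproducibility

# Toy dataset: 1-D points; the label is 1 when the point is above 0.
data = [(x / 10.0, 1 if x > 0 else 0) for x in range(-50, 51) if x != 0]

def threshold_classifier(x, threshold=0.0):
    """A trivially simple 'model': predict by comparing to a threshold."""
    return 1 if x > threshold else 0

def accuracy(dataset):
    return sum(threshold_classifier(x) == y for x, y in dataset) / len(dataset)

clean_acc = accuracy(data)
# Perturb the inputs with small Gaussian noise and measure performance again.
noisy = [(x + random.gauss(0, 0.05), y) for x, y in data]
noisy_acc = accuracy(noisy)

# A robust model's performance stays stable under the perturbation.
print(clean_acc, noisy_acc)
```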
 
===Robust network design===
=== Robust algorithms ===
 
There exist algorithms that tolerate errors in the input<ref>{{cite book |last1=Carbin |first1=Michael |last2=Rinard |first2=Martin C. |title=Proceedings of the 19th international symposium on Software testing and analysis - ISSTA '10 |chapter=Automatically identifying critical input regions and code in applications |date=12 July 2010 |pages=37–48 |doi=10.1145/1831708.1831713 |publisher=ACM |isbn=9781605588230 |s2cid=1147058 |chapter-url=http://people.csail.mit.edu/rinard/paper/issta10.pdf }}</ref> or during the computation.<ref name="Danglot">{{cite journal |last1=Danglot |first1=Benjamin |last2=Preux |first2=Philippe |last3=Baudry |first3=Benoit |last4=Monperrus |first4=Martin |title=Correctness attraction: a study of stability of software behavior under runtime perturbation |journal=Empirical Software Engineering |date=21 December 2017 |volume=23 |issue=4 |pages=2086–2119 |doi=10.1007/s10664-017-9571-8 |url=https://hal.archives-ouvertes.fr/hal-01378523/document|arxiv=1611.09187 |s2cid=12549038 }}</ref> In that case, the computation eventually converges to the correct output. This phenomenon has been called "correctness attraction".<ref name="Danglot"/>
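Convergence despite a runtime perturbation can be sketched with a hypothetical fault-injection experiment on a fixed-point iteration: the intermediate state of a Newton (Heron) square-root iteration is deliberately corrupted, yet the iteration is attracted back to the correct output. The perturbation size and location are arbitrary choices for illustration.

```python
def heron_sqrt(x, perturb_at=None):
    """Heron/Newton iteration for sqrt(x); optionally inject a runtime fault."""
    guess = 1.0
    for i in range(50):
        if i == perturb_at:
            guess += 100.0  # fault injection: corrupt the intermediate state
        guess = 0.5 * (guess + x / guess)
    return guess

unperturbed = heron_sqrt(2.0)
perturbed = heron_sqrt(2.0, perturb_at=10)
# Despite the injected perturbation, the remaining iterations pull the
# computation back to the correct result ("correctness attraction").
print(abs(unperturbed - perturbed) < 1e-9)
```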
 
==See also==
* [[Fault tolerance]]
* [[Defensive programming]]
* [[Non-functional requirement]]