Robustness (computer science)
{{see also|Fault-tolerant computer system}}
{{Complex systems}}
In [[computer science]], '''robustness''' is the ability of a computer system to cope with errors during execution<ref>{{cite web|url=http://dl.ifip.org/db/conf/pts/testcom2005/FernandezMP05.pdf |title=A Model-Based Approach for Robustness Testing |website=Dl.ifip.org |access-date=2016-11-13}}</ref><ref name="IEEE">1990. IEEE Standard Glossary of Software Engineering Terminology, IEEE Std 610.12-1990 defines robustness as "The degree to which a system or component can function correctly in the presence of invalid inputs or stressful environmental conditions"</ref> and to cope with erroneous input.<ref name="IEEE"/> Robustness can encompass many areas of computer science, such as [[Defensive programming|robust programming]], [[Overfitting|robust machine learning]], and [[Robust Security Network]]. Formal techniques, such as [[fuzz testing]], are essential to demonstrating robustness, since this type of testing involves invalid or unexpected inputs. Alternatively, [[fault injection]] can be used to test robustness. Various commercial products perform robustness testing of software.<ref>{{cite journal|url=http://www.stanford.edu/~bakerjw/Publications/Baker%20et%20al%20(2008)%20Robustness,%20Structural%20Safety.pdf |title=On the assessment of robustness |access-date=2016-11-13 |doi=10.1016/j.strusafe.2006.11.004 |volume=30 |year=2008 |journal=Structural Safety |pages=253–267 |last1=Baker |first1=Jack W. |last2=Schubert |first2=Matthias |last3=Faber |first3=Michael H. |issue=3}}</ref>
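The idea behind fuzz testing can be sketched as follows. This is a minimal illustration, not a real fuzzing tool; the function under test, `parse_age`, is a hypothetical example, and the harness simply feeds it random printable strings and records any exception other than the documented `ValueError`:

```python
import random
import string

def parse_age(text):
    """Hypothetical function under test: parse a human age from a string."""
    value = int(text)  # raises ValueError on non-numeric input
    if not 0 <= value <= 150:
        raise ValueError("age out of range: %d" % value)
    return value

def fuzz_parse_age(trials=1000):
    """Feed random strings to parse_age. A robust implementation rejects
    invalid input with ValueError rather than crashing in some other way."""
    crashes = []
    for _ in range(trials):
        length = random.randint(0, 10)
        text = "".join(random.choice(string.printable) for _ in range(length))
        try:
            parse_age(text)
        except ValueError:
            pass  # expected, documented rejection of invalid input
        except Exception as exc:  # any other exception is a robustness bug
            crashes.append((text, exc))
    return crashes  # an empty list means no robustness bugs were found
```

Real fuzzers such as AFL or libFuzzer add coverage guidance and input mutation, but the robustness property being checked is the same: unexpected input must produce a controlled failure, not an uncontrolled one.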
 
== Introduction ==
In general, building robust systems that encompass every point of possible failure is difficult because of the vast quantity of possible inputs and input combinations.<ref name="MIT">{{cite web|url=http://groups.csail.mit.edu/mac/users/gjs/6.945/readings/robust-systems.pdf |title=Building Robust Systems an essay |author=Gerald Jay Sussman |date=January 13, 2007 |website=Groups.csail.mit.edu |access-date=2016-11-13}}</ref> Since testing all inputs and input combinations would require too much time, developers cannot run through all cases exhaustively. Instead, the developer will try to generalize such cases.<ref>{{cite web|last=Joseph |first=Joby |url=http://www.softwaretestingclub.com/profiles/blogs/importance-of-making |title=Importance of Making Generalized Testcases - Software Testing Club - An Online Software Testing Community |publisher=Software Testing Club |date=2009-09-21 |access-date=2016-11-13}}</ref> For example, imagine inputting some [[Integer (computer science)|integer values]]. Some selected inputs might consist of a negative number, zero, and a positive number. When using these numbers to test software in this way, the developer generalizes the set of all integers into three representative cases. This is a more efficient and manageable method, but it is more prone to failure. Generalizing test cases is just one technique for dealing with failure, specifically failure due to invalid user input. Systems may also fail for other reasons, such as disconnecting from a network.
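The generalization described above can be sketched in a few lines. The function under test, `absolute_value`, is an illustrative stand-in; the point is that the infinite input domain is reduced to one representative per equivalence class (negative, zero, positive):

```python
def absolute_value(n):
    """Illustrative function under test: should never return a negative."""
    return -n if n < 0 else n

# Instead of exhaustively testing every integer, generalize the input
# domain into representative classes: a negative number, zero, a positive.
representative_cases = [-7, 0, 12]

for n in representative_cases:
    assert absolute_value(n) >= 0, "robustness property violated for %d" % n
```

The trade-off is exactly the one the text notes: three cases run quickly, but a bug confined to an untested value (say, an overflow at a type's minimum integer in a fixed-width language) would slip through.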
 
Regardless, complex systems should still handle any errors encountered gracefully. There are many examples of such successful systems. Some of the most robust systems are evolvable and can be easily adapted to new situations.<ref name="MIT" />
== Challenges ==
Programs and software are tools focused on a specific task, and thus are not generalized and flexible.<ref name="MIT" /> However, observations in systems such as the [[internet]] or [[biological system]]s demonstrate adaptation to their environments. One of the ways biological systems adapt to environments is through the use of redundancy.<ref name="MIT" /> Many organs are redundant in humans. The [[kidney]] is one such example. [[Human]]s generally only need one kidney, but having a second kidney allows room for failure. The same principle can be applied to software, but doing so presents some challenges.
When applying the principle of redundancy to computer science, blindly adding code is not suggested. Blindly adding code introduces more errors, makes the system more complex, and renders it harder to understand.<ref>{{cite web|url=http://www.cse.sc.edu/~huhns/journalpapers/V6N2.pdf |title=Building Robust Systems an essay |author=Agents on the wEb : Robust Software |website=Cse.sc.edu |access-date=2016-11-13}}</ref> Code that does not reinforce the existing code is unwanted. The new code must instead possess equivalent [[function (engineering)|functionality]], so that if a function is broken, another providing the same function can replace it, using manual or automated [[software diversity]]. To do so, the new code must know how and when to accommodate the failure point.<ref name="MIT" /> This means more [[logic]] needs to be added to the system. But as a system gains more logic, [[Software component#Software component|components]], and size, it becomes more complex. Thus, when making a system more redundant, the system also becomes more complex, and developers must consider balancing redundancy with complexity.
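The redundancy-with-equivalent-functionality idea can be sketched as follows. This is a toy example of software diversity, with hypothetical function names: two independently written implementations of the same specification (the arithmetic mean), plus the extra logic that decides when to fall back from one to the other:

```python
def mean_streaming(xs):
    """Primary implementation: one-pass accumulation."""
    total, count = 0.0, 0
    for x in xs:
        total += x
        count += 1
    return total / count

def mean_sorted_sum(xs):
    """Diverse, functionally equivalent implementation: sum in sorted
    order (which also happens to reduce floating-point rounding error)."""
    ordered = sorted(xs)
    return sum(ordered) / len(ordered)

def mean_with_fallback(xs):
    """The added logic the text describes: if the primary implementation
    fails, an equivalent one replaces it."""
    try:
        return mean_streaming(xs)
    except Exception:
        return mean_sorted_sum(xs)
```

Note that even this tiny example shows the trade-off from the text: the fallback wrapper is extra logic and extra surface area to understand, which is the complexity cost paid for the redundancy.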
 
Currently, computer science practices do not focus on building robust systems.<ref name="MIT" /> Rather, they tend to focus on [[scalability]] and [[Algorithmic efficiency|efficiency]]. One of the main reasons robustness receives little focus today is that it is hard to achieve in a general way.<ref name="MIT" />
 
=== Robust programming ===
Robust programming is a style of programming that focuses on handling unexpected termination and unexpected actions.<ref name="robust_programming">{{cite web|url=http://nob.cs.ucdavis.edu/bishop/secprog/robust.html |title=Robust Programming |website=Nob.cs.ucdavis.edu |access-date=2016-11-13}}</ref> It requires code to handle these terminations and actions gracefully by displaying accurate and unambiguous [[error message]]s. These error messages allow the user to more easily debug the program.
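The accurate-and-unambiguous error message requirement can be illustrated with a short sketch. The function and configuration keys here are hypothetical; the contrast is between a bare `KeyError: 'username'` and a message that tells the user exactly what was wrong and what was available:

```python
def read_config_value(config, key):
    """Fail with an accurate, unambiguous message instead of a bare KeyError."""
    if key not in config:
        raise KeyError(
            "missing configuration key %r; available keys: %s"
            % (key, ", ".join(sorted(config)))
        )
    return config[key]

config = {"host": "localhost", "port": 8080}

try:
    read_config_value(config, "username")
except KeyError as err:
    # The message names the missing key AND the valid alternatives,
    # so the user can debug without reading the source.
    print(err)
```

The same principle applies to any failure path: the program should state what input it received, what it expected, and, where possible, what the user can do about it.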
 
==== Principles ====
 
===Robust machine learning===
Robust machine learning typically refers to the robustness of machine learning algorithms. For a machine learning algorithm to be considered robust, either the testing error has to be consistent with the training error, or the performance has to be stable after adding some noise to the dataset.<ref>{{cite web|author=El Sayed Mahmoud |url=https://www.researchgate.net/post/What_is_the_definition_of_the_robustness_of_a_machine_learning_algorithm |title=What is the definition of the robustness of a machine learning algorithm? |access-date=2016-11-13}}</ref>
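The second criterion, stability under dataset noise, can be sketched with a minimal learner. This is an illustrative experiment, not a standard benchmark: a closed-form least-squares fit of a line is trained once on clean labels and once on labels perturbed with small Gaussian noise, and the two models are compared on the clean data:

```python
import random

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (closed-form solution)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

def mse(xs, ys, a, b):
    """Mean squared error of the model y = a*x + b on the given data."""
    return sum((a * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

random.seed(0)
xs = [i / 10 for i in range(100)]
ys = [2 * x + 1 for x in xs]          # ground truth: y = 2x + 1

a, b = fit_line(xs, ys)
clean_error = mse(xs, ys, a, b)       # training on clean labels

noisy_ys = [y + random.gauss(0, 0.1) for y in ys]
a2, b2 = fit_line(xs, noisy_ys)       # retrain on perturbed labels
noisy_error = mse(xs, ys, a2, b2)     # evaluate on the clean data
```

If `noisy_error` stays close to `clean_error`, the learner is stable under this perturbation in the sense described above; a learner whose error blows up after a small label perturbation would not be considered robust by this criterion.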
 
===Robust network design===
Robust network design is the study of network design in the face of variable or uncertain demands.<ref>{{cite web|url=http://www-math.mit.edu/~olver/thesis.pdf |title=Robust Network Design |website=Math.mit.edu |access-date=2016-11-13}}</ref> In a sense, robustness in network design is as broad as robustness in software design, because of the vast possibilities of changes or inputs.
 
=== Robust algorithms ===