{{Short description|Type of software bug}}
A '''software regression''' is a type of [[software bug]] where a feature that has worked before stops working. This may happen after changes are applied to the software's [[source code]], including the addition of new [[software feature|features]] and bug fixes.<ref name=wong-issre-97>{{cite book |last1=Wong |first1=W. Eric |last2=Horgan |first2=J.R. |last3=London |first3=Saul |last4=Agrawal |first4=Hira |title=Proceedings of the Eighth International Symposium on Software Reliability Engineering (ISSRE 97) |date=1997 |publisher=IEEE |isbn=0-8186-8120-9}}</ref> Several types of software regressions have been identified in practice:
* ''Local'' – a change introduces a new bug in the changed module or component.
* ''Remote'' – a change in one part of the software breaks functionality in another module or component.
* ''Unmasked'' – a change unmasks an already existing bug that had no effect before the change.
Regressions are often caused by [[Hotfix|bug fixes]] included in [[software patch]]es. One approach to avoiding this kind of problem is [[regression testing]]. A properly designed [[test plan]] aims to catch such regressions before software is released.<ref>{{cite book |last=Richardson |first=Jared |author2=Gwaltney, William Jr |title=Ship It! A Practical Guide to Successful Software Projects |url=https://archive.org/details/shipitpracticalg0000rich/page/32 |year=2006 |publisher=The Pragmatic Bookshelf |___location=Raleigh, NC |pages=[https://archive.org/details/shipitpracticalg0000rich/page/32 32, 193] |isbn=978-0-9745140-4-8 }}</ref> [[Automated testing]] and well-written [[Test case (software)|test case]]s can reduce the likelihood of a regression.
==Prevention and detection==
Techniques have been proposed that try to prevent regressions from being introduced into software at various stages of development, as outlined below.
===Prior to release===
{{main|Regression testing}}
To prevent regressions from reaching the [[end-user]] after release, developers regularly run [[regression tests]] after changes are introduced to the software. These tests can include [[unit tests]] to catch local regressions as well as [[integration tests]] to catch remote regressions.<ref>{{cite book |last1=Leung |first1=Hareton K.N. |last2=White |first2=Lee |title=Proceedings of the International Conference on Software Maintenance |date=November 1990 |publisher=IEEE |___location=San Diego, CA, USA |isbn=0-8186-2091-9}}</ref>
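For illustration, a regression test pins down behavior that is known to work, so that a later change breaking it is reported as a test failure. The following minimal sketch uses Python's built-in <code>unittest</code> module; the function under test and its expected values are hypothetical:

<syntaxhighlight lang="python">
import unittest

# Hypothetical function under test; in a real project it would be
# imported from the application's code base.
def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount."""
    return round(price * (1 - percent / 100), 2)

class DiscountRegressionTest(unittest.TestCase):
    """Pins down previously working behavior, so that a future change
    that alters it shows up as a failing (regressing) test."""

    def test_ten_percent_discount(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

if __name__ == "__main__":
    unittest.main()
</syntaxhighlight>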
To detect performance regressions, [[software performance testing|software performance tests]] are run on a regular basis, monitoring the response time and resource usage of the software after each change.<ref>{{cite journal |last1=Weyuker |first1=E.J. |last2=Vokolos |first2=F.I. |title=Experience with performance testing of software systems: issues, an approach, and case study |journal=IEEE Transactions on Software Engineering |date=December 2000 |volume=26 |issue=12 |pages=1147–1156 |doi=10.1109/32.}}</ref>
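In outline, such a check can compare a measured response time against a stored baseline and fail when the slowdown exceeds a tolerance. The sketch below is illustrative only; the workload, baseline value, and threshold are assumptions:

<syntaxhighlight lang="python">
import statistics
import time

# Hypothetical operation whose response time is monitored.
def handle_request() -> None:
    sum(i * i for i in range(10_000))

BASELINE_SECONDS = 0.005  # baseline from a previous release (illustrative)
TOLERANCE = 1.5           # flag a regression when 50% slower than baseline

def median_runtime(runs: int = 30) -> float:
    """Time the operation repeatedly and return the median, which is
    less sensitive to outliers than a single measurement."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        handle_request()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

if __name__ == "__main__":
    median = median_runtime()
    if median > BASELINE_SECONDS * TOLERANCE:
        raise SystemExit(f"Performance regression: median {median:.4f} s "
                         f"exceeds baseline {BASELINE_SECONDS:.4f} s")
    print(f"OK: median {median:.4f} s within tolerance")
</syntaxhighlight>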
===Prior to commit===
Since [[debugging]] and localizing the root cause of a software regression can be expensive,<ref>{{cite book |last1=Nistor |first1=Adrian |last2=Jiang |first2=Tian |last3=Tan |first3=Lin |title=Proceedings of the Working Conference on Mining Software Repositories (MSR) |date=May 2013 |pages=237–246}}</ref> developers also try to prevent regressions from being committed into the code repository in the first place, for example by automatically running the regression test suite before each commit is accepted.
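As a sketch of this approach, a [[Git]] <code>pre-commit</code> hook (an executable file at <code>.git/hooks/pre-commit</code>) can run the test suite and abort the commit when any test fails, since Git cancels the commit if the hook exits with a non-zero status. The test command below is an assumption and would vary by project:

<syntaxhighlight lang="python">
#!/usr/bin/env python3
# Sketch of a .git/hooks/pre-commit hook: run the regression test suite
# and abort the commit (non-zero exit status) if any test fails.
import subprocess
import sys

# Assumes the project's tests are discoverable by the unittest runner.
result = subprocess.run([sys.executable, "-m", "unittest", "discover"])
if result.returncode != 0:
    print("Regression tests failed; commit aborted.")
    sys.exit(1)
</syntaxhighlight>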
==Localization==
A common technique used to localize functional regressions is [[Bisection (software engineering)|bisection]], which takes both a buggy commit and a previously working commit as input, and tries to find the root cause by doing a binary search on the commits in between.<ref>{{cite book |last1=Gross |first1=Thomas |title=Proceedings of the International Workshop on Automatic Debugging |date=10 September 1997 |publisher=Linköping University Electronic Press |pages=185–191 |url=https://ep.liu.se/en/conference-article.aspx?series=ecp&issue=1&Article_No=15 |language=en |chapter=Bisection Debugging}}</ref> [[Version control]] systems such as Git and [[Mercurial]] provide built-in ways to perform bisection on a given pair of commits.<ref>{{cite web |title=Git - git-bisect Documentation |url=https://git-scm.com/docs/git-bisect |website=git-scm.com |access-date=7 November 2021}}</ref><ref>{{cite web |title=hg - bisect |url=https://www.selenic.com/mercurial/hg.1.html |website=www.selenic.com |publisher=Mercurial |access-date=7 November 2021}}</ref>
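The underlying binary search can be sketched as follows; the commit history and the <code>is_buggy</code> predicate (in practice, running a test against a checked-out commit) are placeholders:

<syntaxhighlight lang="python">
def bisect(commits, is_buggy):
    """Given commits ordered oldest to newest, where the first is known
    good and the last is known bad, return the first bad commit."""
    low, high = 0, len(commits) - 1
    while high - low > 1:
        mid = (low + high) // 2
        if is_buggy(commits[mid]):
            high = mid  # regression introduced at or before mid
        else:
            low = mid   # still works here; the culprit is later
    return commits[high]

# Hypothetical seven-commit history in which "e" introduced the bug.
history = ["a", "b", "c", "d", "e", "f", "g"]
print(bisect(history, lambda c: c >= "e"))  # -> "e", found in O(log n) tests
</syntaxhighlight>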
Other options include directly associating the result of a regression test with code changes<ref>{{cite web |title=Reading 11: Debugging |url=https://web.mit.edu/6.005/www/fa15/classes/11-debugging/ |website=web.mit.edu |publisher=MIT}}</ref> and setting divergence breakpoints.<ref>{{cite book |last1=Buhse |first1=Ben |last2=Wei |first2=Thomas |last3=Zang |first3=Zhiqiang |last4=Milicevic |first4=Aleksandar |last5=Gligoric |first5=Milos |title=Proceedings of the International Conference on Software Engineering: Companion Proceedings (ICSE-Companion) |date=May 2019 |pages=15–18}}</ref>
===Performance regressions===
[[Profiling (computer programming)|Profiling]] measures the performance and resource usage of various components of a program, and is used to generate data useful in debugging performance issues. In the context of software performance regressions, developers often compare the [[call stack|call trees]] (also known as "timelines") generated by profilers for both the buggy version and the previously working version, and mechanisms exist to simplify this comparison.<ref>{{cite journal |last1=Ocariza |first1=Frolin S. |last2=Zhao |first2=Boyang |title=Localizing software performance regressions in web applications by comparing execution timelines |journal=Software Testing, Verification and Reliability |date=2021 |volume=31 |issue=5 |pages=e1750 |doi=10.1002/stvr.1750 |s2cid=225416138 |url=https://onlinelibrary.wiley.com/doi/abs/10.1002/stvr.1750 |language=en |issn=1099-1689|url-access=subscription }}</ref> [[Web development tools]] typically provide developers the ability to record these performance profiles.<ref>{{cite web |title=Analyze runtime performance |url=https://developer.chrome.com/docs/devtools/evaluate-performance/ |website=Chrome Developers |publisher=Google |access-date=7 November 2021 |language=en}}</ref><ref>{{cite web |title=Performance analysis reference - Microsoft Edge Development |url=https://docs.microsoft.com/en-us/microsoft-edge/devtools-guide-chromium/evaluate-performance/reference |website=docs.microsoft.com |publisher=Microsoft |access-date=7 November 2021 |language=en-us}}</ref>
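As a minimal illustration, Python's built-in <code>cProfile</code> and <code>pstats</code> modules can produce such a report; comparing the output for the previous and the current version of a program (the workload below is hypothetical) narrows down where the extra time is spent:

<syntaxhighlight lang="python">
import cProfile
import pstats

# Hypothetical workload; profiling the old and the new version of the
# program and comparing the reports helps locate the slowdown.
def workload():
    return sum(i * i for i in range(100_000))

profiler = cProfile.Profile()
profiler.enable()
workload()
profiler.disable()

# Report the functions with the highest cumulative time; the same
# report for the previous version serves as the baseline.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
</syntaxhighlight>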
Logging also helps with performance regression localization: similar to call trees, developers can compare systematically placed performance logs of multiple versions of the same software.<ref>{{cite book |last1=Yao |first1=Kundi |last2=B. de Pádua |first2=Guilherme |last3=Shang |first3=Weiyi |last4=Sporea |first4=Steve |last5=Toma |first5=Andrei |last6=Sajedi |first6=Sarah |title=Proceedings of the International Conference on Performance Engineering |date=30 March 2018 |publisher=Association for Computing Machinery |isbn=978-1-4503-5095-2 |pages=127–138 |url=https://dl.acm.org/doi/abs/10.1145/3184407.3184416 |chapter=Log4Perf: Suggesting Logging Locations for Web-based Systems' Performance Monitoring|doi=10.1145/3184407.3184416 |s2cid=4557038 }}</ref> A tradeoff exists when adding these performance logs: many logs help developers pinpoint which portions of the software are regressing at finer granularity, while fewer logs reduce the overhead of executing the program.<ref>{{cite journal |title=A Qualitative Study of the Benefits and Costs of Logging from Developers' Perspectives |journal=IEEE Transactions on Software Engineering |date=30 January 2020 |doi=10.1109/TSE.2020.2970422}}</ref>
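One simple way to place such logs, sketched below with Python's standard <code>logging</code> module, is a timing wrapper around suspected sections; the section names and workloads are hypothetical:

<syntaxhighlight lang="python">
import logging
import time
from contextlib import contextmanager

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("perf")

@contextmanager
def timed(section: str):
    """Log how long the wrapped section takes; comparing these logs
    across software versions helps localize a performance regression."""
    start = time.perf_counter()
    try:
        yield
    finally:
        log.info("%s took %.4f s", section, time.perf_counter() - start)

# Hypothetical sections of a larger program.
with timed("load_data"):
    data = [i * i for i in range(200_000)]
with timed("process_data"):
    total = sum(data)
</syntaxhighlight>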
Additional approaches include writing performance-aware unit tests to help with localization<ref>{{cite book |last1=Heger |first1=Christoph |last2=Happe |first2=Jens |last3=Farahbod |first3=Roozbeh |title=Proceedings of the International Conference on Performance Engineering |date=21 April 2013 |publisher=Association for Computing Machinery |isbn=978-1-4503-1636-1 |pages=27–38 |url=https://dl.acm.org/doi/abs/10.1145/2479871.2479879 |chapter=Automated root cause isolation of performance regressions during software development|doi=10.1145/2479871.2479879 |s2cid=2593603 }}</ref> and ranking subsystems based on performance counter deviations.<ref>{{cite book |last1=Malik |first1=Haroon |last2=Adams |first2=Bram |last3=Hassan |first3=Ahmed E. |title=Proceedings of the International Symposium on Software Reliability Engineering |date=November 2010 |pages=201–210}}</ref>
==See also==