'''Data-driven testing''' (DDT), also known as '''table-driven testing''' or '''parameterized testing''', is a [[software testing]] technique in which a table of [[data]] directs test execution by encoding inputs, expected outputs and test-environment settings.<ref>{{cite web |title=golang/go TableDrivenTests |url=https://github.com/golang/go/wiki/TableDrivenTests |website=GitHub |language=en}}</ref><ref>{{cite web |title=JUnit 5 User Guide |url=https://junit.org/junit5/docs/current/user-guide/#writing-tests-parameterized-tests |website=junit.org}}</ref> One advantage of DDT over other testing techniques is the relative ease of covering an additional [[test case (software)|test case]] for the [[system under test]]: a row is added to the table instead of modifying test [[source code]].
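For illustration, a minimal table-driven test might look like the following Python sketch. The function <code>add</code>, the table layout and all names are invented for this example; each row of the table encodes one test case, so adding coverage means adding a row rather than changing the test logic.

```python
def add(a, b):
    """System under test: a trivial function, used here only for illustration."""
    return a + b


# The test table: each row is (case name, input arguments, expected output).
# Inputs typically cover boundary or partition values of the input space.
TEST_TABLE = [
    ("zero operands",     (0, 0),  0),
    ("positive operands", (2, 3),  5),
    ("negative boundary", (-1, 1), 0),
]


def run_table(table):
    """Driver: executes every row and collects (name, expected, actual) failures."""
    failures = []
    for name, args, expected in table:
        actual = add(*args)
        if actual != expected:
            failures.append((name, expected, actual))
    return failures


if __name__ == "__main__":
    # An empty failure list means every tabled case passed.
    assert run_table(TEST_TABLE) == []
```

The same pattern underlies the Go table-driven tests and JUnit parameterized tests cited above, though those frameworks supply the driver loop themselves.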
==Introduction==
In the testing of [[Computer software|software]] or [[Computer program|programs]], several methodologies are available for implementing data-driven testing. These methodologies co-exist because they differ in the effort required to create, and subsequently maintain, the test assets.

DDT involves a [[software framework|framework]] that executes tests based on input data. The framework is a [[software reuse|re-usable]] test asset that can reduce maintenance of a test [[codebase]]. DDT allows anything that has the potential to change (also called "variability", including the environment, end points, test data and locations) to be segregated from the test logic and stored in an external asset, such as a configuration file or test dataset. More data can later be added, or the configuration changed, to reuse the same test logic across multiple data scenarios. The framework might manage storage of tables and test results in a [[database]] such as a [[data pool]], [[Data Access Object|DAO]] or [[ActiveX Data Objects|ADO]] source. An advanced framework might harvest data from a running system using a purpose-built tool (a sniffer) and play the harvested data back for [[regression testing]].

==Methodology Overview==
* '''Data-driven testing''' is the creation of interacting test scripts, together with their related data, resulting in a framework used for the methodology. In this framework, variables are used both for input values and for output verification values: navigation through the [[system under test|program]], reading of the data sources, and logging of test status and information are all coded in the test script. The logic executed in the script is dictated by the data values. Often, a table provides a complete set of stimulus inputs and expected outputs in each row; stimulus input values typically cover boundary or partition input spaces.
* '''[[Keyword-driven testing]]''' is similar, except that the logic for the test case itself is encoded as data values in the form of a set of "action words", and is not embedded or "hard-coded" in the test script itself. The script is simply a "driver" (or delivery mechanism) for the data that is held in the data source.

Automated test suites capture user interactions through the system's GUI for repeatable testing. Each test begins with a copy of the "before" reference database. The user interactions are replayed through the new GUI version, producing a "post-test" database, which is compared with the reference "post-test" database using a comparison tool.<ref>{{cite web |url=http://www.diffkit.org/ |title=Home |website=diffkit.org}}</ref> Differences reveal probable regressions. Navigation of the [[system under test]]'s user interface, reading of data sources, and logging of test findings may all be coded in the table.

The data sources used for data-driven testing can include:
* [[data pool]]s
* [[Open Database Connectivity|ODBC]] sources
* [[Comma-separated values|CSV]] files
* [[Microsoft Office Excel|Excel]] files
* [[Data Access Object|DAO]] objects
* [[ActiveX Data Objects|ADO]] objects
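The separation of test data from test logic can be sketched as follows. This example is illustrative: the file layout, column names and the <code>add</code> function are invented, and an in-memory string stands in for an external CSV asset that would normally be read from disk or a database query.

```python
import csv
import io

def add(a, b):
    """System under test (illustrative)."""
    return a + b

# Stands in for an external test-data file; in practice this would be
# open("testdata.csv", newline="") or rows fetched over ODBC/ADO.
CSV_DATA = io.StringIO(
    "case,a,b,expected\n"
    "small,1,2,3\n"
    "boundary,-1,1,0\n"
)

def run_csv_cases(source):
    """Driver: reads each data row and checks the system under test.

    Returns a mapping from case name to pass/fail, so new scenarios are
    added by editing the data source, not this script.
    """
    results = {}
    for row in csv.DictReader(source):
        actual = add(int(row["a"]), int(row["b"]))
        results[row["case"]] = (actual == int(row["expected"]))
    return results
```

Because the driver script never changes, the same logic can be reused against any number of data files or configurations.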
 
==See also==
{{Portal|Software Testing}}
* {{Annotated link |Control table}}
* {{Annotated link |Keyword-driven testing}}
* {{Annotated link |Test automation framework}}
* {{Annotated link |Test-driven development}}
* {{Annotated link |Modularity-driven testing}}
* {{Annotated link |Model-based testing}}
 
==References==
{{Reflist}}
 
{{refbegin}}
* Carl Nagle: ''Test Automation Frameworks'' [https://safsdev.sourceforge.net/FRAMESDataDrivenTestAutomationFrameworks.htm], Software Automation Framework Support on SourceForge [http://safsdev.sourceforge.net/Default.htm], Data-driven testing approach [https://www.katalon.com/resources-center/tutorials/data-driven-testing/#]
{{refend}}
 
[[Category:Software testing]]