The field of software testing uses an international vocabulary, and the International Software Testing Qualifications Board (ISTQB) plays a role in maintaining a consistent explanation of its terms and concepts. We have built a searchable mechanism that lets you find not only the terms themselves but also search their definitions.
If you find a term or definition missing, please let us know.
Standard Glossary of Terms used in Software Testing
There are 39 terms in this list beginning with the letter I.
IDEAL
An organizational improvement model that serves as a roadmap for initiating, planning, and implementing improvement actions. The IDEAL model is named for the five phases it describes: initiating, diagnosing, establishing, acting, and learning.
input
A variable (whether stored within a component or outside) that is read by a component.
incident
Any event occurring that requires investigation. [After IEEE 1008]
indicator
A measure that can be used to estimate or predict another measure. [ISO 14598]
impact analysis
The assessment of change to the layers of development documentation, test documentation and components, in order to implement a given change to specified requirements.
incident logging
Recording the details of any incident that occurred, e.g. during testing.
incident management
The process of recognizing, investigating, taking action and disposing of incidents. It involves logging incidents, classifying them and identifying the impact. [After IEEE 1044]
incident management tool
A tool that facilitates the recording and status tracking of incidents. They often have workflow-oriented facilities to track and control the allocation, correction and re-testing of incidents and provide reporting facilities. See also defect management tool.
incident report
A document reporting on any event that occurred, e.g. during the testing, which requires investigation. [After IEEE 829]
incremental development model
A development lifecycle where a project is broken into a series of increments, each of which delivers a portion of the functionality in the overall project requirements. The requirements are prioritized and delivered in priority order in the appropriate increment. In some (but not all) versions of this lifecycle model, each subproject follows a ‘mini V-model’ with its own design, coding and testing phases.
incremental testing
Testing where components or systems are integrated and tested one or some at a time, until all the components or systems are integrated and tested.
independence of testing
Separation of responsibilities, which encourages the accomplishment of objective testing. [After DO-178B]
infeasible path
A path that cannot be exercised by any set of possible input values.
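A hypothetical code sketch can make this concrete: in the function below, the two conditions contradict each other, so the path through the second branch's "true" edge can never be exercised.

```python
def label(x: int) -> str:
    """Toy example illustrating an infeasible path."""
    if x > 10:
        big = True
    else:
        big = False
    # This condition can never hold: reaching it with big == True
    # forces x > 10, which contradicts x <= 10. The path through
    # the return below is infeasible, and no test input covers it.
    if big and x <= 10:
        return "impossible"
    return "big" if big else "small"
```

Coverage tools report such paths as uncovered even though no test can ever reach them, which is why 100% path coverage is often unattainable.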
informal review
A review not based on a formal (documented) procedure.
initiating
The phase within the IDEAL model where the groundwork is laid for a successful improvement effort. The initiating phase consists of the activities: set context, build sponsorship and charter infrastructure. See also IDEAL.
input domain
The set from which valid input values can be selected. See also domain.
input value
An instance of an input. See also input.
inspection
A type of peer review that relies on visual examination of documents to detect defects, e.g. violations of development standards and non-conformance to higher level documentation. The most formal review technique and therefore always based on a documented procedure. [After IEEE 610, IEEE 1028] See also peer review.
installability
The capability of the software product to be installed in a specified environment. [ISO 9126] See also portability.
installability testing
The process of testing the installability of a software product. See also portability testing.
installation guide
Supplied instructions on any suitable media, which guide the installer through the installation process. This may be a manual guide, step-by-step procedure, installation wizard, or any other similar process description.
installation wizard
Supplied software on any suitable media, which leads the installer through the installation process. It normally runs the installation process, provides feedback on installation results, and prompts for options.
instrumentation
The insertion of additional code into the program in order to collect information about program behavior during execution, e.g. for measuring code coverage.
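A minimal sketch of the idea in Python: instead of editing the source, we hook the interpreter's trace facility to observe which lines of a function actually run. Coverage tools insert comparable probes into the code itself; the `trace_lines` helper and `absolute` function below are illustrative, not part of any standard tool.

```python
import sys

def trace_lines(func, *args):
    """Run func under a line tracer, collecting executed line numbers."""
    executed = set()

    def tracer(frame, event, arg):
        # Record each "line" event raised while func's frame executes.
        if event == "line" and frame.f_code is func.__code__:
            executed.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        result = func(*args)
    finally:
        sys.settrace(None)  # always remove the hook
    return result, executed

def absolute(n):
    if n < 0:
        return -n
    return n

# Running with a negative input exercises different lines than a
# positive input, which is exactly what coverage measurement reveals.
_, lines_neg = trace_lines(absolute, -3)
_, lines_pos = trace_lines(absolute, 5)
```

Comparing `lines_neg` and `lines_pos` shows which branch each test input covered.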
instrumenter
A software tool used to carry out instrumentation.
intake test
A special instance of a smoke test to decide if the component or system is ready for detailed and further testing. An intake test is typically carried out at the start of the test execution phase. See also smoke test.
integration
The process of combining components or systems into larger assemblies.
integration testing
Testing performed to expose defects in the interfaces and in the interactions between integrated components or systems. See also component integration testing, system integration testing.
integration testing in the large
See system integration testing.
integration testing in the small
See component integration testing.
interface testing
An integration test type that is concerned with testing the interfaces between components or systems.
interoperability
The capability of the software product to interact with one or more specified components or systems. [After ISO 9126] See also functionality.
interoperability testing
The process of testing to determine the interoperability of a software product. See also functionality testing.
invalid testing
Testing using input values that should be rejected by the component or system. See also error tolerance, negative testing.
Ishikawa diagram
See cause-effect diagram.
isolation testing
Testing of individual components in isolation from surrounding components, with surrounding components being simulated by stubs and drivers, if needed.
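A minimal sketch, using hypothetical names: the `OrderProcessor` component depends on a payment gateway, so a stub stands in for the real gateway and driver code exercises the component in isolation.

```python
class OrderProcessor:
    """Component under test; its dependency is injected so a stub can replace it."""
    def __init__(self, gateway):
        self.gateway = gateway

    def checkout(self, amount):
        if amount <= 0:
            return "rejected"
        return "paid" if self.gateway.charge(amount) else "declined"

class GatewayStub:
    """Stub simulating the surrounding payment component."""
    def __init__(self, succeed):
        self.succeed = succeed

    def charge(self, amount):
        return self.succeed  # canned answer; no real gateway involved

# Driver: calls the component directly, covering both stubbed outcomes.
assert OrderProcessor(GatewayStub(True)).checkout(10) == "paid"
assert OrderProcessor(GatewayStub(False)).checkout(10) == "declined"
assert OrderProcessor(GatewayStub(True)).checkout(0) == "rejected"
```

Because the stub's behavior is fixed by the test, failures point to the component itself rather than to its neighbors.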
item transmittal report
See release note.
iterative development model
A development lifecycle where a project is broken into a usually large number of iterations. An iteration is a complete development loop resulting in a release (internal or external) of an executable product, a subset of the final product under development, which grows from iteration to iteration to become the final product.