ISTQB Advanced Test Analyst 6.-7. Defect Management & Test Tools


1. 7. Test Tools

1.1. Test Design Tools

1.1.1. used to help create test cases and test data for testing (see the generation sketch after this list)

1.1.2. often designed and built to work with particular formats and particular products such as specific requirements management tools

1.1.3. Needed to obtain

1.1.3.1. the targeted level of test coverage

1.1.3.2. confidence in the system

1.1.3.3. mitigation of product risks
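
As an illustration, here is a minimal sketch of the kind of generation a test design tool automates: producing test case inputs as combinations of equivalence partitions. The partitions and values are hypothetical, and a real tool would usually reduce the combination set (e.g., pairwise coverage) rather than enumerate it exhaustively.

```python
from itertools import product

# Hypothetical equivalence partitions for a login form.
partitions = {
    "username": ["valid", "empty", "too_long"],
    "password": ["valid", "empty", "wrong"],
    "remember_me": [True, False],
}

# Exhaustive combination of the partition values (3 * 3 * 2 = 18 cases).
names = list(partitions)
test_cases = [dict(zip(names, combo)) for combo in product(*partitions.values())]

for number, case in enumerate(test_cases, start=1):
    print(f"TC-{number:02d}: {case}")
```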

1.2. Test Data Preparation Tools

1.2.1. Can “scrub” or “anonymize” production data where large volumes of realistic data are required (see the sketch below)

1.2.2. Are able to analyze a document such as a requirements document or even the source code to determine the data required during testing
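
A minimal sketch of the “scrubbing” idea, assuming production-like records arrive as a CSV file; the file names and PII columns are hypothetical. Deterministic hashing keeps the anonymized data internally consistent, so relationships between records survive the scrub.

```python
import csv
import hashlib

PII_COLUMNS = {"name", "email"}  # hypothetical columns to anonymize

def pseudonymize(value: str) -> str:
    # Deterministic: the same input always maps to the same token,
    # which preserves joins and duplicates across the data set.
    return hashlib.sha256(value.encode()).hexdigest()[:12]

with open("production.csv", newline="") as src, \
     open("test_data.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        for column in PII_COLUMNS & set(row):
            row[column] = pseudonymize(row[column])
        writer.writerow(row)
```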

1.3. Automated Test Execution Tools

1.3.1. Needed to

1.3.1.1. increase coverage while reducing costs, and

1.3.1.1.1. to run more tests

1.3.1.1.2. to run tests that would be impossible to run manually (e.g., large data validation tests)

1.3.1.1.3. to make test execution more repeatable

1.3.1.1.4. to run the same test in many environments

1.3.2. High ROI for

1.3.2.1. automating regression tests

1.3.2.1.1. low level of maintenance required

1.3.2.1.2. repeated execution

1.3.2.2. automating smoke tests (a minimal pytest sketch follows)
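
A minimal pytest sketch of the smoke/regression split described above; the module under test (calculator) is hypothetical. The regression tests are cheap to re-run unchanged on every build, which is where the repeated-execution ROI comes from.

```python
import pytest

from calculator import add  # hypothetical module under test

def test_smoke_add_returns_number():
    # Smoke test: a fast, shallow check that the build is testable at all.
    assert isinstance(add(1, 2), (int, float))

@pytest.mark.parametrize("a, b, expected", [
    (1, 2, 3),
    (-1, 1, 0),
    (0, 0, 0),
])
def test_add_regression(a, b, expected):
    # Regression test: re-run on every build or iteration.
    assert add(a, b) == expected
```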

2. 6. Defect Management

2.1. DEFECT

2.1.1. found directly by static testing

2.2. FAILURE

2.2.1. caused by a defect and observed during dynamic testing

2.3. ACTIONABLE DEFECT REPORT

2.3.1. Complete - all the necessary info

2.3.2. Concise - no extra info

2.3.3. Accurate - the info is correct and clearly states the expected and actual results as well as the proper steps to reproduce

2.3.4. Objective - the report is a professionally written statement of fact

2.4. Defect Classification (a data sketch follows this list)

2.4.1. newly identified defects

2.4.1.1. Project activity – e.g., review, audit, inspection, coding, testing

2.4.1.2. Suspected cause of defect

2.4.1.3. Repeatability

2.4.1.4. Symptom, e.g., crash, hang, user interface error

2.4.2. investigated defects

2.4.2.1. Root cause

2.4.2.2. Source

2.4.2.3. Type – e.g., logic problem, computational problem

2.4.3. fixed defects

2.4.3.1. Resolution – e.g., code change, documentation change

2.4.3.2. Corrective action – e.g., requirements review, code review, unit test
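
As an illustration, the classification fields above could be captured in a single defect record that is filled in as the defect moves through its lifecycle. The field names and enum values below are one hypothetical scheme, not a mandated ISTQB taxonomy.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Repeatability(Enum):
    ALWAYS = "always"
    INTERMITTENT = "intermittent"
    ONCE = "once"

@dataclass
class DefectRecord:
    # Set when the defect is newly identified.
    project_activity: str                    # e.g., "review", "testing"
    symptom: str                             # e.g., "crash", "hang"
    suspected_cause: str
    repeatability: Repeatability
    # Filled in once the defect is investigated.
    root_cause: Optional[str] = None         # e.g., "code logic error"
    source: Optional[str] = None
    defect_type: Optional[str] = None        # e.g., "logic problem"
    # Filled in once the defect is fixed.
    resolution: Optional[str] = None         # e.g., "code change"
    corrective_action: Optional[str] = None  # e.g., "code review"

bug = DefectRecord(
    project_activity="testing",
    symptom="crash",
    suspected_cause="null reference on empty input",
    repeatability=Repeatability.ALWAYS,
)
```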

2.5. Root Cause Analysis

2.5.1. Typical root causes

2.5.1.1. Unclear / missing / wrong requirement

2.5.1.2. Incorrect design / interface implementation

2.5.1.3. Code logic error

2.5.1.4. Calculation error

2.5.1.5. Hardware error

2.5.1.6. Interface error

2.5.1.7. Invalid data

2.5.2. helps an organization to monitor the benefits of effective process changes

2.5.3. helps to quantify the costs of defects attributable to a particular root cause (see the aggregation sketch below)
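
Once each defect record carries a root cause and an estimated cost, the quantification in 2.5.3 reduces to a simple aggregation. A minimal sketch with hypothetical numbers:

```python
from collections import defaultdict

# Hypothetical (root_cause, cost_in_hours) pairs from a defect database.
defects = [
    ("code logic error", 6.0),
    ("unclear requirement", 14.0),
    ("code logic error", 3.5),
    ("invalid data", 2.0),
    ("unclear requirement", 9.0),
]

cost_by_root_cause = defaultdict(float)
for root_cause, cost in defects:
    cost_by_root_cause[root_cause] += cost

# Rank root causes by total cost to target process improvements.
for root_cause, total in sorted(cost_by_root_cause.items(),
                                key=lambda item: item[1], reverse=True):
    print(f"{root_cause}: {total:.1f} hours")
```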

3. 7. Test Tools -> Improving the Success of the Automation Effort

3.1. Possible benefits

3.1.1. test execution time becomes more predictable

3.1.2. regression testing and defect validation are faster and more reliable

3.1.3. provide better regression test coverage per build or iteration

3.2. Possible risks

3.2.1. may be difficult to maintain

3.2.2. direct tester involvement may be reduced

3.2.2.1. less defect detection

3.2.3. insufficient skills to use the automated tools

3.2.4. irrelevant tests may be automated

3.2.5. pesticide paradox (repeating the same automated tests eventually stops finding new defects)

3.3. Test Levels

3.3.1. commonly used at the system and integration test levels

3.3.2. may also be used at the component test level (e.g., API testing; see the sketch below)
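
A minimal sketch of a component-level API test; the endpoint, port, and response shape are hypothetical, and the requests library stands in for whatever HTTP client the project uses.

```python
import requests

BASE_URL = "http://localhost:8000"  # hypothetical service under test

def test_get_user_returns_expected_fields():
    response = requests.get(f"{BASE_URL}/users/42", timeout=5)
    assert response.status_code == 200
    body = response.json()
    assert {"id", "name", "email"} <= body.keys()  # required fields present
```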

3.4. Keyword-Driven Automation

3.4.1. primary advantages

3.4.1.1. keywords can be defined by domain experts

3.4.1.2. the tests are accessible to people with domain expertise but little programming knowledge

3.4.1.3. easier to maintain

3.4.1.4. test case specifications are independent of their implementation

3.4.2. The Test Analyst is responsible for executing the keyword-driven test cases and for analyzing any failures that may occur (see the interpreter sketch below)
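
A minimal sketch of the keyword-driven idea: the test case is a table of keywords plus data that a domain expert can write, and the framework dispatches each keyword to its implementation. The keywords and actions here are hypothetical; in practice a tool such as Robot Framework plays this interpreter role.

```python
# Keyword implementations, maintained by the automation team.
def login(user, password):
    print(f"logging in as {user}")

def add_to_cart(item):
    print(f"adding {item} to the cart")

def check_total(expected):
    print(f"verifying total == {expected}")

KEYWORDS = {
    "Login": login,
    "AddToCart": add_to_cart,
    "CheckTotal": check_total,
}

# The test case specification: keywords plus data, no code,
# so it stays independent of the implementation above.
test_case = [
    ("Login", ["alice", "secret"]),
    ("AddToCart", ["book"]),
    ("CheckTotal", ["19.99"]),
]

for keyword, args in test_case:
    KEYWORDS[keyword](*args)  # dispatch each keyword to its implementation
```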

3.5. Causes for Failures

3.5.1. insufficient flexibility in the usage of the testing tool

3.5.2. insufficient programming skills

3.5.3. unrealistic expectations