ISTQB CTAL-Advanced Test Analyst
This hands-on course gives test engineers advanced skills in test analysis, design, and execution, along with the ability to define and carry out the tasks required to put the test strategy into action. Attendees will learn how to analyze a system in light of the user's quality expectations, and how to evaluate system requirements in formal and informal reviews, drawing on their understanding of the business domain to judge requirement validity. They will learn how to analyze, design, implement, and execute tests, using risk considerations to determine the appropriate effort and priority for each test. They will be able to report on testing progress and provide the evidence needed to support their evaluations of system quality, and to implement a testing effort that supports both explicit and implicit testing objectives.
By the end of this course, an attendee should be able to:
- Perform the appropriate testing activities based on the software development lifecycle being used.
- Determine the proper prioritization of the testing activities based on the information provided by the risk analysis.
- Select and apply appropriate testing techniques to ensure that tests provide an adequate level of confidence, based on defined coverage criteria.
- Provide the appropriate level of documentation relevant to the testing activities.
- Determine the appropriate types of functional testing to be performed.
- Assume responsibility for the usability testing for a given project.
- Effectively participate in formal and informal reviews with stakeholders, applying knowledge of typical mistakes made in work products.
Created by Rex Black, President of RBCS, past President of the International Software Testing Qualifications Board (ISTQB), past President of the American Software Testing Qualifications Board, and co-author of the ISTQB Advanced Syllabus, this course is also ideal for testers and test teams preparing for certification. It covers the ISTQB Advanced Level Syllabus Test Analyst 2012 and has been accredited by an ISTQB-recognized National Board.
Through presentation, discussion, and hands-on exercises, attendees will learn to:
- Explain how and why the timing and level of involvement for the Test Analyst varies when working with different lifecycle models
- Summarize the activities performed by the Test Analyst in support of planning and controlling the testing
- Analyze a given scenario, including a project description and lifecycle model, to determine the appropriate tasks for the Test Analyst during the analysis and design phases
- Explain why test conditions should be understood by the stakeholders
- Analyze a project scenario to determine the most appropriate use for low-level (concrete) and high-level (logical) test cases
- Describe the typical exit criteria for test analysis and test design and explain how meeting those criteria affects the test implementation effort
- For a given scenario, determine the steps and considerations that should be taken when executing tests
- Explain why accurate test case execution status information is important
- Provide examples of work products that should be delivered by the Test Analyst during test closure activities
- Explain the types of information that must be tracked during testing to enable adequate monitoring and controlling of the project
- Provide examples of good communication practices when working in a 24-hour testing environment
- For a given project situation, participate in risk identification, perform risk assessment and propose appropriate risk mitigation
- Explain the use of cause-effect graphs
- Write test cases from a given specification item by applying the equivalence partitioning test design technique to achieve a defined level of coverage
- Write test cases from a given specification item by applying the boundary value analysis test design technique to achieve a defined level of coverage
- Write test cases from a given specification item by applying the decision table test design technique to achieve a defined level of coverage
- Write test cases from a given specification item by applying the state transition test design technique to achieve a defined level of coverage
- Write test cases from a given specification item by applying the pairwise test design technique to achieve a defined level of coverage
- Write test cases from a given specification item by applying the classification tree test design technique to achieve a defined level of coverage
- Write test cases from a given specification item by applying the use case test design technique to achieve a defined level of coverage
- Explain how user stories are used to guide testing in an Agile project
- Write test cases from a given specification item by applying the domain analysis test design technique to achieve a defined level of coverage
- Analyze a system, or its requirement specification, in order to determine likely types of defects to be found and select the appropriate specification-based technique(s)
- Describe the application of defect-based testing techniques and differentiate their use from specification-based techniques
- Analyze a given defect taxonomy for applicability in a given situation using criteria for a good taxonomy
- Explain the principles of experience-based techniques, and the benefits and drawbacks compared to specification-based and defect-based techniques
- For a given scenario, specify exploratory tests and explain how the results can be reported
- For a given project situation, determine which specification-based, defect-based or experience-based techniques should be applied to achieve specific goals
- Explain by example what testing techniques are appropriate to test accuracy, suitability, interoperability and compliance characteristics
- For the accuracy, suitability and interoperability characteristics, define the typical defects to be targeted
- For the accuracy, suitability and interoperability characteristics, define when the characteristic should be tested in the lifecycle
- For a given project context, outline the approaches that would be suitable to verify and validate both the implementation of the usability requirements and the fulfillment of the user's expectations
- Explain why review preparation is important for the Test Analyst
- Analyze a use case or user interface and identify problems according to checklist information provided in the syllabus
- Analyze a requirements specification or user story and identify problems according to checklist information provided in the syllabus
- Explain how phase containment can reduce costs
- Explain the information that may be needed when documenting a non-functional defect
- Identify, gather and record classification information for a given defect
- Explain the purpose of root cause analysis
- Explain the benefits of using test data preparation tools, test design tools and test execution tools
- Explain the Test Analyst’s role in keyword-driven automation
- Explain the steps for troubleshooting an automated test execution failure
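To give a flavor of the equivalence partitioning and boundary value analysis objectives above, here is a minimal sketch in Python. The age-eligibility function and its valid range of 18 to 65 are hypothetical, invented purely for illustration; they are not taken from the syllabus.

```python
# Hypothetical function under test: accepts applicant ages in [18, 65].
def is_eligible_age(age: int) -> bool:
    return 18 <= age <= 65

# Equivalence partitioning: one representative value per partition.
partitions = {
    "below_range": (10, False),   # invalid partition: age < 18
    "in_range":    (40, True),    # valid partition: 18 <= age <= 65
    "above_range": (80, False),   # invalid partition: age > 65
}

# Boundary value analysis: test on and immediately adjacent to each boundary.
boundaries = {
    17: False, 18: True,          # lower boundary
    65: True,  66: False,         # upper boundary
}

for name, (value, expected) in partitions.items():
    assert is_eligible_age(value) == expected, name
for value, expected in boundaries.items():
    assert is_eligible_age(value) == expected, value
```

In class, attendees derive partitions and boundary values like these from a specification item and then write the corresponding test cases to a defined coverage level.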
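The decision table objective can be sketched the same way. The free-shipping business rule below is an assumption made up for illustration; the point is that every combination of conditions becomes a column in the table, and each column becomes a test case.

```python
# Hypothetical business rule (assumption for illustration only):
# shipping is free when the customer is a member OR the order total
# is at least 100.
def free_shipping(is_member: bool, total: float) -> bool:
    return is_member or total >= 100

# Full decision table: each (condition combination) -> expected action.
decision_table = {
    # (is_member, total >= 100): free shipping?
    (True,  True):  True,
    (True,  False): True,
    (False, True):  True,
    (False, False): False,
}

for (member, big_order), expected in decision_table.items():
    total = 150 if big_order else 50   # pick a value matching the condition
    assert free_shipping(member, total) == expected, (member, big_order)
```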
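For the state transition objective, here is a minimal sketch using a simplified defect-report lifecycle. The state names and allowed transitions are assumptions chosen for illustration, not a prescribed workflow; the tests aim for each-transition (0-switch) coverage plus one invalid transition.

```python
# Hypothetical defect-report lifecycle (assumption for illustration):
# only the transitions listed here are legal.
TRANSITIONS = {
    ("new", "open"),
    ("open", "fixed"),
    ("fixed", "closed"),
    ("fixed", "reopened"),
    ("reopened", "fixed"),
}

class DefectReport:
    def __init__(self):
        self.state = "new"

    def move_to(self, target: str) -> None:
        if (self.state, target) not in TRANSITIONS:
            raise ValueError(f"illegal transition {self.state} -> {target}")
        self.state = target

# Valid path exercising a chain of legal transitions.
report = DefectReport()
for target in ("open", "fixed", "reopened", "fixed", "closed"):
    report.move_to(target)
assert report.state == "closed"

# Negative test: an illegal transition must be rejected.
bad = DefectReport()
try:
    bad.move_to("closed")          # new -> closed is not allowed
    raise AssertionError("expected ValueError")
except ValueError:
    pass
```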
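Finally, the Test Analyst's role in keyword-driven automation can be illustrated with a small sketch: the analyst writes tests as tables of keywords and arguments, while the automation engineer maps each keyword to an implementation function. All names, keywords, and the login rule below are hypothetical, invented for this example.

```python
# Keyword implementations (hypothetical, for illustration): each takes the
# shared application state plus the arguments from the test table row.
def open_app(state):
    state["screen"] = "login"

def enter_text(state, field, value):
    state[field] = value

def press(state, button):
    # Assumed rule: login succeeds only with the correct password.
    if button == "login" and state.get("password") == "secret":
        state["screen"] = "home"

KEYWORDS = {"open app": open_app, "enter text": enter_text, "press": press}

# The Test Analyst authors this table; no programming knowledge required.
test_table = [
    ("open app",),
    ("enter text", "username", "alice"),
    ("enter text", "password", "secret"),
    ("press", "login"),
]

# The framework dispatches each row to the matching keyword function.
state = {}
for keyword, *args in test_table:
    KEYWORDS[keyword](state, *args)

assert state["screen"] == "home"
```

Separating the keyword vocabulary from its implementation is what lets Test Analysts maintain automated tests without editing the underlying scripts.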
The course runs for four days, with three hours set aside on the fourth day for the ISTQB Advanced Test Analyst exam if desired. Each day is about 360 minutes of class time, from 8:00 a.m. to 4:30 p.m. For accredited course offerings, material is covered as described. For custom courses, material may be deleted, added, or expanded upon as needed.
Please note that timings are approximate, depending on attendee interest and discussion. All of the lectures include exercises and/or knowledge-check questions except as noted.
The following shows this session plan in relationship to the chapters and sections of the ISTQB Advanced Syllabus Test Analyst.
Introduction and Review (60 minutes)
1.0 Testing Processes (300 minutes)
1.2 Testing in the Software Development Lifecycle (15 minutes, no exercises)
1.3 Test Planning, Monitoring and Control (15 minutes, no exercises)
1.4 Test Analysis (75 minutes, 1 exercise)
1.5 Test Design (90 minutes, 1 exercise)
1.6 Test Implementation (15 minutes, no exercises)
1.7 Test Execution (60 minutes, 1 exercise)
1.8 Evaluating Exit Criteria and Reporting (15 minutes, no exercises)
1.9 Test Closure Activities (15 minutes, no exercises)
2.0 Test Management: Responsibilities for the Test Analyst (90 minutes)
2.2 Test Progress Monitoring and Control (15 minutes, no exercises)
2.3 Distributed, Outsourced and Insourced Testing (15 minutes, no exercises)
2.4 The Test Analyst’s Tasks in Risk-Based Testing (60 minutes, 1 exercise)
3.0 Test Techniques (825 minutes)
3.2 Specification-Based (585 minutes, 9 exercises)
3.3 Defect-Based (90 minutes, 1 exercise)
3.4 Experience-Based (150 minutes, 2 exercises)
4.0 Testing Software Quality Characteristics (120 minutes)
4.2 Quality Characteristics for Business Domain Testing (120 minutes, 1 exercise)
5.0 Reviews (165 minutes)
5.1 Introduction (15 minutes, no exercises)
5.2 Using Checklists in Reviews (150 minutes)
6.0 Defect Management (120 minutes)
6.1 When Can a Defect Be Detected? (15 minutes, no exercises)
6.2 Defect Report Fields (15 minutes, no exercises)
6.3 Defect Classification (75 minutes, 1 exercise)
6.4 Root Cause Analysis (15 minutes, no exercises)
7.0 Test Tools (45 minutes)
7.1 Test Tools and Automation (45 minutes, no exercises)
The class materials include a bibliography of books related to software testing, project management, quality, and other topics of interest to the test professional.