Quality Lead

1. Characteristics

1.1. Interpersonal

1.1.1. Bold

1.1.2. Upfront

1.1.3. Diplomatic

1.1.4. Understanding

1.1.5. Gels well with the team (Dev and Test)

1.2. Ethical

1.2.1. Follows Process

1.2.2. Admits to mistakes or misses if any

1.2.3. Does not let friendship or interpersonal relations get in the way of work or process

1.3. Qualities

1.3.1. Assertive

1.3.2. Conflict Management

1.3.2.1. Internal

1.3.2.2. External

1.3.3. Customer Centric

2. Product/Application

2.1. Different Systems (internal and external) interacting with the current application

2.2. Identifying Gaps (if any) through the understanding of the interaction points

2.3. Areas the developed application touches (Application, API, Data, UAM, etc.)

2.4. Identifying Software/System dependencies

2.5. Is the Product/Application built for Maersk meeting the MS vision? How to merge business requirements with the MS roadmap and vision

3. Process

3.1. Testing Process Definition

3.2. Triage meeting cadence and decisions (daily, weekly, etc.)

3.3. Communications

3.3.1. Email

3.3.1.1. Determine the recipients - Who is it intended for?

3.3.1.2. Level of detail - Is it enough for the recipient?

3.3.2. Testing Updates - Stand up (Internal and External)

3.3.3. Raising Impediments

3.4. Defect Logging Process and Lifecycle

3.4.1. Internal

3.4.2. External (Maersk/Production Defects)

3.5. Testing Approach (sprint-wise plan for conducting functional and automation testing)

3.6. Scope Changes (How are they being handled?)

3.7. DoD and DoR Definitions (Definition of Done and Definition of Ready) - Are they clear and being followed?

3.8. Requirement Traceability Matrix

3.9. Defect Traceability - Is traceability to test cases and, in turn, user stories available? (See the sketch at the end of this section.)

3.10. Metrics

3.11. Testing Summary - What information to include

3.12. Sprint Demos

3.13. Azure DevOps (CI/CD pipelines etc)

3.14. Dashboard

3.15. DHR Portals, DX Reviews

3.16. Learnings from current sprints and process refinements in future sprints
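
The traceability items above (3.8 and 3.9) come down to maintaining a requirement-to-test-case-to-defect mapping and checking it for gaps. The sketch below is a minimal illustration of that check, assuming traceability is kept as simple Python dictionaries; all story, test case, and defect IDs are hypothetical placeholders.

```python
# Minimal traceability check: user stories -> test cases -> defects.
# All IDs are hypothetical placeholders.

story_to_tests = {
    "US-101": ["TC-001", "TC-002"],
    "US-102": ["TC-003"],
    "US-103": [],  # gap: a story with no test coverage yet
}

test_to_defects = {
    "TC-002": ["DEF-11"],
    "TC-003": ["DEF-12", "DEF-14"],
}

def uncovered_stories(matrix):
    """Return user stories that have no linked test cases."""
    return [story for story, tests in matrix.items() if not tests]

def defects_for_story(story):
    """Trace defects back to a user story via its test cases."""
    return [defect
            for test in story_to_tests.get(story, [])
            for defect in test_to_defects.get(test, [])]

if __name__ == "__main__":
    print("Stories without coverage:", uncovered_stories(story_to_tests))
    print("Defects traced to US-102:", defects_for_story("US-102"))
```

In practice the same data would be pulled from Azure DevOps or JIRA rather than hard-coded; the point is only that coverage gaps are something a script can flag rather than something to eyeball in a spreadsheet.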

4. Testing

4.1. Types of Testing Involved

4.1.1. Functional (UI, API, ETL, etc)

4.1.1.1. Buddy Testing

4.1.1.2. Feature Testing

4.1.1.3. Integration Testing

4.1.1.4. BVT (Build Verification Test) and Sanity Testing

4.1.1.5. Regression Testing
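
One lightweight way to keep the suites above (buddy, feature, BVT/sanity, regression) independently runnable is to tag tests and select them per run. A minimal sketch using pytest markers; the test names and marker names are illustrative assumptions, not an agreed convention.

```python
# Illustrative suite tagging with pytest markers.
# Run a suite with e.g.:  pytest -m sanity   or   pytest -m regression
# Markers should also be registered in pytest.ini to avoid warnings.
import pytest

@pytest.mark.sanity
def test_application_home_page_is_reachable():
    # placeholder standing in for a real BVT/sanity check
    assert True

@pytest.mark.regression
def test_previously_fixed_defect_stays_fixed():
    # placeholder regression check
    assert 2 + 2 == 4
```

Which markers exist and when each suite runs is the process decision in section 3; the tagging itself costs one line per test.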

4.1.2. Automation (UI + API)

4.1.2.1. Framework Decision

4.1.2.1.1. Keyword driven

4.1.2.1.2. Hybrid

4.1.2.1.3. BDD

4.1.2.1.4. Page Object Model

4.1.2.2. Automation Strategy
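
Of the framework options above, the Page Object Model (4.1.2.1.4) is the simplest to show in code. A minimal sketch using Selenium's Python bindings; the URL, locators, and credentials are hypothetical placeholders, and the actual framework choice (keyword-driven, hybrid, BDD, POM, or a mix) remains the strategy decision in 4.1.2.2.

```python
# Minimal Page Object Model sketch (Selenium Python bindings).
# URL, locators, and credentials are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

class LoginPage:
    """Encapsulates locators and actions for a hypothetical login page."""
    URL = "https://example.test/login"

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def login(self, username, password):
        self.driver.find_element(By.ID, "username").send_keys(username)
        self.driver.find_element(By.ID, "password").send_keys(password)
        self.driver.find_element(By.ID, "submit").click()

if __name__ == "__main__":
    driver = webdriver.Chrome()
    try:
        LoginPage(driver).open().login("qa_user", "secret")
    finally:
        driver.quit()
```

The pattern's value is that locator changes stay inside the page class instead of rippling through every test.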

4.1.3. Performance (see the load-test sketch after this list)

4.1.4. Security

4.1.5. Localization (Countries involved?)

4.1.5.1. China

4.1.5.2. Netherlands

4.1.5.3. Any other countries?

4.1.6. Compatibility (Browser and Device if any)

4.1.6.1. Browser

4.1.6.1.1. Chrome

4.1.6.1.2. Firefox

4.1.6.1.3. IE

4.1.6.1.4. Edge

4.1.6.2. Device

4.1.6.2.1. iOS

4.1.6.2.2. Android
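
The browser matrix in 4.1.6.1 can be driven from a single parametrized fixture so every UI test runs once per agreed browser. A rough sketch assuming pytest plus Selenium with local browser installs; a Selenium Grid or a cloud device farm (for the iOS/Android part) would slot in behind the same fixture, and IE is left out here because it needs separate driver setup.

```python
# Cross-browser fixture sketch: any test taking `driver` runs on each listed browser.
# Assumes local Chrome, Firefox, and Edge installs; Selenium Manager resolves drivers.
import pytest
from selenium import webdriver

@pytest.fixture(params=["chrome", "firefox", "edge"])
def driver(request):
    browsers = {
        "chrome": webdriver.Chrome,
        "firefox": webdriver.Firefox,
        "edge": webdriver.Edge,
    }
    drv = browsers[request.param]()
    yield drv
    drv.quit()

def test_home_page_has_a_title(driver):
    driver.get("https://example.test")  # hypothetical application URL
    assert driver.title  # placeholder assertion
```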

4.1.7. Regression - When and how often?

4.1.8. Report
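
For the performance item (4.1.3), the benchmarks agreed under the non-functional requirements (11.2.1) can be expressed as a small load-test script rather than a document alone. A sketch using Locust; the host, endpoint, and think times are hypothetical placeholders, and the user counts would come from the agreed benchmarks.

```python
# Minimal Locust load-test sketch.
# Run with:  locust -f loadtest.py --host https://example.test
# Endpoint and think times are hypothetical placeholders.
from locust import HttpUser, task, between

class PortalUser(HttpUser):
    wait_time = between(1, 3)  # simulated think time between requests

    @task
    def view_dashboard(self):
        self.client.get("/api/dashboard")  # hypothetical endpoint under test
```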

4.2. Testing Schedule

4.2.1. Initial Schedule

4.2.2. Scope Changes vs Schedule Changes

4.2.3. Schedule impacts due to Regression

4.2.4. Does ramp-up impact the schedule? If yes, by how much?

4.3. Risks and Mitigation Plan - Continual process

4.4. Dependencies

4.5. Assumptions

4.6. Scope - Is it updated regularly with scope changes and sent for approval?

4.6.1. In

4.6.2. Out

4.7. Entry and Exit Criteria (defined by phase) - Are they being met?

4.8. Tools and License requirements

4.8.1. VSTS

4.8.2. Selenium

4.8.3. Postman (see the API-check sketch after this list)

4.8.4. DevOps

4.8.5. AppInsights

4.8.6. Defect Management and Testing Tasks Monitoring Tools

4.8.6.1. JIRA

4.8.6.2. VSTS

4.8.6.3. Any other tool options if available
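
As a complement to Postman (4.8.3), the same API checks can be written in code so they run unattended in the Azure DevOps pipeline (3.13). A minimal sketch with the Python `requests` library; the base URL, endpoint, and expected payload are hypothetical assumptions.

```python
# Minimal API smoke check: the code equivalent of a Postman request plus test script.
# Base URL, endpoint, and expected body are hypothetical placeholders.
import requests

BASE_URL = "https://example.test/api"

def test_health_endpoint_returns_ok():
    response = requests.get(f"{BASE_URL}/health", timeout=10)
    assert response.status_code == 200
    assert response.json().get("status") == "ok"  # assumed response contract
```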

4.9. IP Reuse?

5. Clarifications or Queries

5.1. Tool or Technology

5.1.1. Identify the forums

5.1.2. DLs

5.2. Business

5.2.1. Knowledge Transfer sessions

5.2.2. Q & A sessions

5.3. Product - Who to reach out to for product issues or queries if any

5.3.1. Azure Team

5.3.2. DB Team (Cosmos Product Team)

5.3.3. Visual Studio

6. Project Resourcing (Origin, UAM, Data, etc.)

6.1. Internal

6.1.1. Resource Requirements (Functional, Automation, Performance, etc.)

6.1.2. Team Structure

6.1.3. Escalation Process (Internal)

6.1.4. Reviewers

6.1.5. Approvers

6.1.6. Resource Planning post production?

6.1.7. Roles and responsibilities

6.1.8. Review and Revisit Resource Loading

6.1.9. Team Management

6.1.10. Expectation Setting

6.1.11. Resource Skillset assessment

6.1.12. Performance Tracking and Feedback

6.2. External (with Maersk)

6.2.1. Team Structure

6.2.2. Escalation Process (People to reach out to) - External

6.2.3. Introductory Sessions with Team Members

6.2.4. Reviewers and Approvers

6.2.5. Roles and responsibilities

7. Business (Supply Chain Management)

7.1. Business Understanding

7.1.1. Challenges faced earlier

7.1.2. What is the application trying to achieve?

7.1.3. How is the business problem being solved?

7.2. Identifying or understanding upcoming scope

7.3. Stakeholders involved

7.3.1. Origin

7.3.1.1. Stakeholder Management

7.3.1.2. Communication Plan

7.3.2. UAM

7.3.2.1. Stakeholder Management

7.3.2.2. Communication Plan

7.3.3. Other areas?

7.4. User base (Types of Users)

7.4.1. Admins

7.4.2. End Users

7.4.3. Others?

7.5. Customer Satisfaction and Feedback

8. Project Reference Documents

8.1. SOW

8.2. Compass

8.3. Virtuoso

8.4. Architecture Documents

8.5. Resource Loading

9. Methodology

9.1. Agile

9.1.1. Sprint Planning

9.1.2. Backlog Refinement and Prioritization

9.1.3. Daily Stand Ups

9.1.4. Sprint Retrospectives

9.1.5. Sprint Demos

9.2. Waterfall

9.3. Iterative

9.4. Watergile (a Waterfall-Agile hybrid)

10. Defect Management

10.1. Severity and Priority Definition

10.2. Defect Triaging and Communication (Internal and External)

10.3. Defect Lifecycle and Status

10.4. Defects and Sprint backlog relation (how and when defects impact sprint backlogs)

10.5. Defect Leakage (see the sketch at the end of this section)

10.6. RCA

10.7. Defect Tracking

10.8. Conversion of Production or UAT Leaked Defects to Test Cases
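
Defect leakage (10.5) is commonly reported as the share of defects found after the internal test phase, and the number feeds the RCA discussion (10.6). The arithmetic is simple enough to sketch; the counts below are made-up examples.

```python
# Defect leakage: UAT/production defects as a share of all defects found.
# The counts are made-up examples.

def defect_leakage(internal_defects: int, leaked_defects: int) -> float:
    """Return leakage as a percentage of the total defects found."""
    total = internal_defects + leaked_defects
    return 0.0 if total == 0 else 100.0 * leaked_defects / total

if __name__ == "__main__":
    # e.g. 92 defects caught internally and 8 leaked -> 8.0% leakage
    print(f"Defect leakage: {defect_leakage(92, 8):.1f}%")
```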

11. Requirements

11.1. Functional

11.1.1. Epics

11.1.2. Features

11.1.3. User Stories

11.1.3.1. Requirement Analysis and Gap identification

11.1.3.2. Review

11.1.3.3. User Story Update to close identified gaps

11.1.4. Change Requests

11.2. Non Functional

11.2.1. Performance

11.2.1.1. Benchmarks Defined?

11.2.2. Security

11.2.3. Usability

11.2.4. Disaster Recovery

11.2.5. Others applicable?

12. Environments

12.1. Test Environment

12.2. Dev Environment

12.3. Performance Environment

12.4. Automation Environment

12.5. UAT Environment

12.6. Production Environment

12.7. Any other?
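
The environments above usually differ only in base URLs, credentials, and data, so a single configuration map keeps the same test suite portable across them. A small sketch of that idea; every URL is a hypothetical placeholder, and real values would come from pipeline variables or a secrets store.

```python
# Environment selection sketch: choose the target environment via an env variable.
# All URLs are hypothetical placeholders.
import os

ENVIRONMENTS = {
    "dev":  {"base_url": "https://dev.example.test"},
    "test": {"base_url": "https://test.example.test"},
    "perf": {"base_url": "https://perf.example.test"},
    "uat":  {"base_url": "https://uat.example.test"},
}

def current_environment() -> dict:
    """Resolve the target environment from TEST_ENV, defaulting to 'test'."""
    name = os.getenv("TEST_ENV", "test").lower()
    return ENVIRONMENTS[name]

if __name__ == "__main__":
    print("Running against:", current_environment()["base_url"])
```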