Quality Optimization Initiative

1. High Level Team Goals

1.1. The real question

1.1.1. What high level goals and values are we trying to accomplish within our teams/projects that lead to success and value for our customers and for ourselves?

1.1.2. Badly formulated questions I asked developers

1.1.2.1. If you had to pick 5 high level goals that teams should follow, with the hope that following/achieving those goals would lead to things like pair programming, CI, unit testing, integration testing, automated builds/deploys, frequent releases, maintainable code, and simple design, what would those 5 goals be?

1.2. List of Approved Goals to Choose From

1.2.1. Teams can choose their final list

1.2.1.1. not all teams are on call - they may have different goals

1.2.1.2. some teams' output isn't code, so testing is a different type of goal for them, etc.

1.3. Nick's Suggested Goals

1.3.1. TBD Actual Conclusions

1.3.1.1. Build the Right Product

1.3.1.1.1. Proof of Done

1.3.1.2. Confidence

1.3.1.3. Secure Code

1.3.1.4. Bus Factor

1.3.1.4.1. Team Work?

1.3.1.4.2. Something sounds better

1.3.1.4.3. Collaboration

1.3.1.5. Continuous Delivery

1.3.1.6. Effective Environment/Team Dynamics?

1.3.2. Working Ideas

1.3.2.1. Sean's Goals/Areas

1.3.2.1.1. Agile Principles

1.3.2.1.2. Team Dynamics

1.3.2.1.3. Technical Excellence

1.3.2.1.4. Value

1.3.2.1.5. Confidence

1.3.2.1.6. Organizational Support

1.3.2.1.7. Quality

1.4. Survey Questions

1.4.1. Building the Right Product

1.4.1.1. At what level do you believe you have solved the real problem a user had?

1.4.1.2. When a user first sees the outcome of the latest sprint, how do you think they will respond? "Wow, that's nice"; "Hmm, what was that supposed to do?"; "That's not working right"; "WTF?"

1.4.1.3. How happy was the Product Owner with the last sprint?

1.4.1.4. What was the PO's minimum happiness level for all the stories last sprint?

1.4.1.5. Last sprint, how many times did the team interact with the PO, customer, stakeholders, and users in regards to the sprint backlog?

1.4.1.6. How many stories were completed last sprint that didn't meet the Definition of Done?

1.4.2. Continuous Delivery

1.4.2.1. How long does it take for you to validate and verify a story is complete when a change is made? (minutes, hours, days?)

1.4.2.2. How long does it take for you to validate and verify the baseline is potentially shippable when a change is made? (minutes, hours, days?)

1.4.2.3. How long does it take for a story to be in production once it's created?

1.4.2.4. How lean do you feel your team is? 1-5

1.4.3. Confidence

1.4.3.1. Sprint Confidence

1.4.3.1.1. How confident are you that a sprint is complete and bug free?

1.4.3.1.2. How confident are you that at the end of the sprint your build will run on prod the same as it will on test?

1.4.3.1.3. How confident are you that this sprint has not broken any previous stories?

1.4.3.1.4. If the last sprint's work was going to be used to identify a bomb target, how comfortable would you feel going to production tomorrow?

1.4.3.2. Release

1.4.3.2.1. From the last three sprints, how many bugs per sprint do you think the team introduced?

1.4.3.2.2. How confident are you that your team will be able to accomplish the next release in the planned time period?

1.4.4. Bus Factor

1.4.4.1. How many people would your team have to lose to really stop progress on any story/feature/baseline/project?

1.4.4.2. How many people last sprint would have to miss work for any single story to stop progress?

1.4.4.3. When a new feature is added to the project how many sprints will it take for the rest of the team to be able to work on it without help from the original implementers?

1.4.4.4. How many people would have to be out of the country for your On Call Return to Service Time to double?

1.4.4.5. How many roadblocks/impediments arose last sprint due to vacation/sickness/missing/absent team members?

1.4.5. Secure Code

1.4.5.1. At what level do you feel the code for this sprint is secure?

1.4.5.2. How confident are you that your code will maintain its level of security with the current DoD and team practices?

1.4.6. Effective Environment?

1.4.6.1. How many times last sprint would a tool have led to solving a problem sooner or more effectively?

1.4.6.2. How effective is the environment you work in at helping you accomplish your job?

1.4.6.3. How sustainable do you believe your development process / progress is?

1.4.6.4. How well is your team reflecting on how to become more effective and adjusting to those reflections?

1.4.6.5. How much do you feel like your team is a real team, on the same side, striving for the same goals and accomplishments? 1-5?

1.4.6.5.1. Cohesiveness?

1.4.6.6. How happy is your team?

1.4.6.7. How often is communication face to face?

1.4.6.8. How often is communication not face to face?

1.4.6.9. Organizational Support

1.4.6.9.1. How do you feel management is supporting your team? 1-5?

1.5. gathered ideas / random things

1.5.1. Team Work

1.5.1.1. Pair Programming

1.5.1.2. Peer Reviews

1.5.1.3. Sprint Planning

1.5.1.3.1. Task Breakdown/Design concurrence

1.5.2. Test yourself before you wreck yourself

1.5.3. Always Deliver Value

1.5.3.1. Spend more time developing features

1.5.3.1.1. less time fixing bugs

1.5.4. Maintainable

1.5.5. Sustainable

1.5.6. Ensure seamless flow between development and downstream activities (testing, deployment)

1.5.6.1. Sustained Flow

1.5.7. Ensure good health of the codebase

1.5.8. Build the right product

1.5.8.1. Simple design

1.5.8.2. good task breakdowns

1.5.8.2.1. push back on "requirements" to understand the real needs and values

1.5.8.2.2. ask the hard questions

1.5.8.3. stakeholder feedback

1.5.8.4. user feedback

1.5.9. Do things you don't already know how to do

1.5.9.1. verticality

1.5.10. Secure Code

1.5.11. Pillars of Agile

1.5.11.1. Self Improvement

1.5.11.2. Product Sense

1.5.11.3. Bus. Value

1.5.11.4. Technical Excellence

1.5.11.5. Confidence

1.5.11.6. Supportive Culture

1.5.11.7. Collaboration

1.5.12. Efficiency

1.5.13. Production Stability

2. Nirvana

2.1. Real Answer

2.1.1. DoD that includes all of our brainstormed goals above

2.1.2. Teams are focusing on gathering data related to the goals and identifying their own survey questions / metrics to improve on them

2.1.3. Teams have automated, continuous delivery

2.1.3.1. Proof of the "right product" can be done against the software running in Production

2.1.3.2. small, frequent releases

2.2. Personal Answer

2.2.1. development practices

2.2.1.1. tdd

2.2.1.1.1. unit testing

2.2.1.1.2. code-level integration testing

2.2.1.2. atdd

2.2.1.2.1. functional testing

2.2.1.2.2. system level testing

2.2.1.3. pair programming on almost everything

2.2.1.4. peer reviews

2.2.1.4.1. 2 people who didn't work on it

2.2.1.4.2. full code walk through for each person

2.2.1.5. simple design

2.2.1.6. continuous integration

2.2.2. Continuous Delivery

2.2.2.1. small, frequent releases

2.2.2.2. push all day long

2.2.2.3. every push has unit level and integration level tests run

2.2.2.4. every push is a build and deploy to test

2.2.2.4.1. every deploy has functional/system level tests run

2.2.2.5. Every build gets a red X or green checkmark indicating whether it could go to prod

2.2.2.5.1. every green checkmark has a button that pushes it to prod
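The push-to-prod flow above can be sketched as a simple gate: every push runs each stage in order, any failure is a red X, and a fully green run only enables (not triggers) the prod button. This is a minimal illustration; the stage names and result shapes are hypothetical, not a real pipeline configuration.

```python
# Minimal sketch of the push-to-prod gate described above.
# Stage names and push fields are hypothetical examples.

def run_pipeline(push, stages):
    """Run each stage in order; any failure marks the build with a red X."""
    for name, stage in stages:
        if not stage(push):
            return {"build": push, "status": "red", "failed_stage": name}
    # Green checkmark: the build *could* go to prod; a human presses the button.
    return {"build": push, "status": "green", "failed_stage": None}

# Stand-ins for real unit/integration/deploy/system test runs.
stages = [
    ("unit_tests", lambda p: p["unit_ok"]),
    ("integration_tests", lambda p: p["integration_ok"]),
    ("deploy_to_test", lambda p: p["deploy_ok"]),
    ("system_tests", lambda p: p["system_ok"]),
]

result = run_pipeline(
    {"unit_ok": True, "integration_ok": True, "deploy_ok": True, "system_ok": True},
    stages,
)
```

The key design point from the map: green never auto-deploys to prod; it only makes the prod button available.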

2.2.3. scrum for all development teams

2.2.3.1. kanban for incredibly high performing teams (including scrum master and product owner, not just dev team members)

2.2.3.2. kanban for operational support teams

3. Improved Definition of Done

3.1. goal

3.1.1. let the team find best implementations to achieve high level goals

3.1.2. help set a standard for long term improvement, not just short term help

3.2. Conditions of Acceptance

3.2.1. proof a story is done

3.2.2. didn't break previous stories

3.2.3. Peer reviewed

3.2.4. Pair programmed

3.2.5. Stakeholder feedback

3.2.6. User Experience Testing

3.2.7. User feedback

3.3. The teams, with the input of stakeholders, create the Definition of Done, not someone else

3.3.1. A Definition of Done created by someone else means I am not accountable for meeting it, even if that someone else says I am. A Definition of Done that I create and that results in low quality is my problem to solve.

3.3.2. multi team projects need to share and agree to the DoD

3.4. If a team cannot meet this new DoD

3.4.1. Product Owners, stakeholders and management create an effort budget to solve those shortfalls of doneness so that future iterations can meet the Definition of Done.

3.5. Display the DoD

3.5.1. The Definition of Done is not only on Confluence.

3.5.2. It is on your team wall.

3.5.3. It is reviewed at every retrospective.

3.5.4. It is known to your manager and her manager and his manager

3.5.5. Everyone that knows and cares about the product knows what done means.

3.6. Management Announcement

3.6.1. "Our customers (or users) expect the highest reliability and a smooth experience from our products. Technical quality and user experience quality must be built in every iteration. To this end, we will create and follow a Definition of Done. Every product, and every team on every product, shall have a Definition of Done. The Quality Optimization Team of X, Y, and Z and their Agile colleagues will help you create a Definition of Done for your product. All work shall meet your Definition of Done. We require this be completed and put into action 30 days from today."

3.7. reference: Google+ post by Alan Dayley

4. Actionable Help

4.1. Setup Redefinition of Done meetings

4.1.1. Each team works with the QOI team to redefine their Definition of Done to align with the team's real world situation and the enterprise quality goals

4.2. a team will identify an inability to meet a goal or DoD Condition of Acceptance

4.2.1. How can the enterprise help teams help each other with the experience it already has?

4.3. Bootstrap Days

4.3.1. What Is It?

4.3.1.1. One team (Team A) identifies a need

4.3.1.1.1. QOI? Enterprise? Identifies a team that is successful in that area and helps them lead a bootstrap day

4.3.1.1.2. Maybe these are identified at Retrospectives while a team tries to meet DoD or improve on goals

4.3.1.2. Team B leads a 30m-1hr brown bag on what practices they are using to accomplish this need

4.3.1.2.1. Maybe just a few people from team B

4.3.1.2.2. The entire Team A

4.3.1.2.3. focus on how it fits into their process and how it accomplishes the goal

4.3.1.2.4. This is not so much a teaching how to do X exactly, but a teaching of how X solves their problem and how they use it in their process

4.3.1.3. Same day - Team B and Team A participate in a 4-6 hour cooperative bootstrap activity

4.3.1.3.1. This is to bootstrap Team A with the practices successful to Team B so that A can actually start doing these practices in their next sprint / when they decide

4.3.2. Examples

4.3.2.1. If Team A has slow, complicated builds, with no automated feedback

4.3.2.1.1. Together both teams convert 2 (of the 20) projects to use maven

4.3.2.1.2. automatically run any unit tests

4.3.2.1.3. building in jenkins

4.3.2.1.4. direct feedback to all developers

4.3.2.2. If Team A has little to no automated tests

4.3.2.2.1. Together they add 5 unit tests to the baseline

4.3.2.2.2. 1 simple test

4.3.2.2.3. 1 test using mocking

4.3.2.2.4. 1 test covering one of the hardest parts to test

4.3.2.2.5. automatic running in the baseline

4.3.2.2.6. integrated into jenkins builds, failing a build if a test fails
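The first two bootstrap tests above ("1 simple test" and "1 test using mocking") might look like the following sketch, here using Python's stdlib `unittest` and `unittest.mock` for illustration. The `apply_discount`, `OrderService`, and `PaymentGateway`-style names are hypothetical stand-ins, not the real baseline's code.

```python
# Sketch of the first two bootstrap tests: one simple, one using a mock.
# apply_discount and OrderService are hypothetical example names.
import unittest
from unittest import mock

def apply_discount(price, percent):
    """A pure function, the easiest kind of code to unit test."""
    return round(price * (1 - percent / 100), 2)

class OrderService:
    def __init__(self, gateway):
        self.gateway = gateway  # external dependency, mocked in tests

    def checkout(self, amount):
        return self.gateway.charge(amount)

class BootstrapTests(unittest.TestCase):
    def test_simple(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_with_mock(self):
        gateway = mock.Mock()
        gateway.charge.return_value = "ok"
        service = OrderService(gateway)
        self.assertEqual(service.checkout(42), "ok")
        gateway.charge.assert_called_once_with(42)
```

The point of the pairing: the simple test shows the mechanics, the mocked test shows how to isolate code from an external dependency, which is usually the blocker on untested baselines.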

4.3.2.3. If Team A is manually deploying to test

4.3.2.3.1. Together they convert 2 projects to automatic deployment to test

4.3.2.3.2. Jenkins builds so they can know when a build successfully deployed to test (or failed)

4.4. Focused Retrospectives

4.4.1. Help train scrum masters on facilitating retrospectives to help their team accomplish their goals

4.4.2. Maybe perform a goal focused retro every other time, or every few times

4.4.2.1. SM should be the one to help the team decide how often, but we need to ensure it's not as infrequent as twice a year

4.4.3. retro ideas

4.4.3.1. Happiness Radar

4.4.3.2. Pillars of Agile Spiderweb

4.4.4. Retros can conclude in identifying the need for a bootstrap day

4.4.4.1. How can they ask for this help?

5. Proof of Value

5.1. How do we know we are making progress in the right direction?

5.2. Value is complicated

5.2.1. as teams start to improve quality, automate processes, and deliver higher business value, traditional metrics will suffer and will not be valuable

5.2.1.1. velocity will decrease

5.2.1.1.1. The team will be accomplishing stories better, more tested stories, more maintainable stories

5.2.1.1.2. this takes longer than not testing

5.2.1.2. number of tracked bugs will increase

5.2.1.2.1. because teams are testing better and finding more (existing) bugs, but can't fix everything that's existed for years in 1 sprint

5.2.1.3. time of verification and validation will increase

5.2.1.3.1. we may have been doing the testing ice cream cone

5.2.1.3.2. we might now be testing 50% of the product

5.2.1.3.3. we might now be testing as a team instead of just 1 person

5.2.1.4. return to service time will increase

5.2.1.4.1. a team may want to investigate a problem before restarting everything, since restarting loses the ability to do root cause analysis

5.2.2. As teams develop their own successful practices, traditional metrics will improve and be valuable

5.2.2.1. As the teams automate things that have always been manual

5.2.2.1.1. velocity will become more consistent

5.2.2.2. as automated tests, pair programming and improved peer review practices are in place

5.2.2.2.1. number of bugs will decrease

5.2.2.3. as automated tests get implemented and easy to do

5.2.2.3.1. time of verification and validation will decrease

5.2.2.4. return to service time will decrease

5.2.2.4.1. as problems disappear or automated practices take place of manual processes

5.2.2.4.2. potentially, number of outages might be a better number, as maybe they are eliminated altogether

5.2.3. When does SEMS or AFWA measure business value?

5.2.3.1. Number of hours worked != business value

5.2.3.2. Story points worked != business value

5.2.3.3. We want teams to accomplish more business value in fewer points

5.2.3.4. We want teams to ask the hard questions to understand the real problems and develop the right solutions.

5.2.3.4.1. How can we measure that so teams don't ignore it, or teams get credit for excelling here?

5.3. 2 fold measurement of value

5.3.1. team value

5.3.1.1. Identify goals that represent the enterprise's defined value

5.3.1.1.1. how we measure these goals is already arbitrary and subjective

5.3.1.1.2. measuring within that first level of subjectivity can lead to quicker identification of progress and solutions, adding real value early

5.3.1.2. Temperature of Goals/Value to the team

5.3.1.2.1. Can we let the team measure/rate themselves on high level goals or the DoD Conditions of Acceptance?

5.3.1.2.2. Ways of measuring
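One concrete way of measuring: collect each member's 1-5 self-rating per goal and surface the coldest goals first, as input to a focused retro or bootstrap day. A minimal sketch; the goal names and ratings below are hypothetical examples.

```python
# Sketch of a "goal temperature": average each member's 1-5 self-rating
# per goal and list goals coldest first. Data below is made up.

def goal_temperature(ratings):
    """ratings: {goal: [1-5 score per team member]} -> (goal, avg) sorted ascending."""
    averages = {goal: sum(scores) / len(scores) for goal, scores in ratings.items()}
    return sorted(averages.items(), key=lambda item: item[1])

temps = goal_temperature({
    "Confidence": [4, 3, 5, 4],
    "Bus Factor": [2, 1, 2, 3],
    "Continuous Delivery": [3, 3, 4, 2],
})
# temps[0] is the coldest goal, a candidate for a focused retro or bootstrap day
```

Because the ratings are subjective, the trend across sprints matters more than any single number, which fits the note above that measuring within the first level of subjectivity still identifies progress early.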

5.3.2. traditional value

5.3.2.1. Once a team starts to reach 'knees' in their progress toward goals, start measuring and improving on quantified value

5.3.2.1.1. maybe this 'knee' is when a team rates itself above 2 on the 1-5 scale above

5.3.2.1.2. maybe when the team is no longer in the unhappy row in the happiness radar

5.3.2.2. Identify ways of tracking traditional value from the goals

5.3.2.2.1. help teams start tracking and pulling information out of these statistics

5.3.3. none of these measurements are to be used negatively against a team

5.3.3.1. these are used to help guide the team toward solving the right problem.

5.3.3.1.1. tracking more bugs means understanding that quality is not as high as the team wants, something not known before

5.3.3.2. lets the team know they are making progress and are successful

5.3.3.3. gives the team confidence in making the right choices because the enterprise recognizes these values and supports them

5.4. references

5.4.1. Scaled Agile Framework Metrics