
Technical Excellence Health Radar

Continuous Attention

Technical

Continuous Integration, What percentage of the team commits code on the same day they change it, What percentage of team members commit to the baseline, How easy are your merges (1-5), How effective are your mitigation processes when a feature needs to be disabled/removed from an implementation (1-5)
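
The two percentage questions above can often be answered from version control history rather than gut feel. A minimal sketch, assuming a local git clone of the baseline; the TEAM roster, branch name, and look-back window are illustrative assumptions, not part of the radar:

```python
# Sketch: how much of the team committed to the baseline recently, from git log.
import subprocess
from collections import defaultdict

TEAM = {"Alice", "Bob", "Carol"}  # hypothetical roster

def commits_by_day(branch="main", since="14 days ago"):
    """Map each day to the set of team members who committed that day."""
    out = subprocess.run(
        ["git", "log", branch, f"--since={since}",
         "--pretty=format:%an|%ad", "--date=short"],
        capture_output=True, text=True, check=True,
    ).stdout
    per_day = defaultdict(set)
    for line in out.splitlines():
        author, day = line.split("|", 1)
        if author in TEAM:
            per_day[day].add(author)
    return per_day

if __name__ == "__main__":
    per_day = commits_by_day()
    committers = set().union(*per_day.values()) if per_day else set()
    print(f"Committing to the baseline: {100 * len(committers) / len(TEAM):.0f}% of the team")
    for day, authors in sorted(per_day.items()):
        print(f"{day}: {100 * len(authors) / len(TEAM):.0f}% of the team committed")
```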

Pair Programming, What percentage of the sprint does your team pair, What percentage of the team participates in pairing, How frequently do you rotate individuals to pair with (1-5), How frequently do you rotate driver & navigator (1-5), How effective is your team at handling conflict within pairs (1-5), How focused is the navigator on average (1-5)

Peer Reviews, How helpful are peer reviews (1-5), How lean are your peer reviews (1-5), What percentage of committed code gets peer reviewed, How confident are you that the peer reviewers actually understood and followed the code they reviewed (1-5)

Refactoring, How effective is the frequency of refactoring (1-5), How effective is the average size of refactoring (1-5), How effective are the team's refactoring efforts (1-5)

Iteration

Sustainable Pace, What percentage of the team does not work overtime, On average, what percentage of each sprint gets done, How sustainable do you believe your development process is (1-5), How well is the team taking on the right amount of work (1-5)

Focus, How well known is the sprint goal (1-5), What percentage of the sprint revolves around the sprint goal, What percentage of time during the sprint does the team spend on sprint work, How well does the team handle distractions and interruptions during the sprint (1-5)

Technical Debt, How well known is the project's technical debt (1-5), How effective is the team's process in paying off technical debt (1-5), How effective is the current rate of technical debt payment (1-5), What percentage of the baseline is not dead code

Development

Habits

Simple/Emergent Design, What percentage of work is done to accomplish the stories at hand and not potential future stories, What percentage of time does the team do the simplest thing that could possibly work next instead of fully designing up front, How testable is the code/design (1-5), How understandable is the code/design (1-5), How browsable is the code/design (1-5), How explainable is the code/design (1-5)

Maintainability, How easy is it to maintain the baseline (1-5), How easy is it to maintain new features in the baseline (1-5), What percentage of builds are static analysis tools run against, What percentage of code is not repeated lines of code, How effective is your use of the output of static analysis tools (1-5), How well known is the code complexity to the team (1-5)
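
For the "not repeated lines of code" percentage, a crude line-level count can serve as a first approximation until a proper clone detector is wired into the build. A rough sketch, assuming Python sources under ./src (the path, file suffix, and normalization rules are assumptions, and real duplication tools compare blocks rather than single lines):

```python
# Sketch: percentage of (non-blank, non-comment) lines that are not repeated.
from collections import Counter
from pathlib import Path

def non_repeated_percentage(root="src", suffix=".py"):
    counts, total = Counter(), 0
    for path in Path(root).rglob(f"*{suffix}"):
        for raw in path.read_text(errors="ignore").splitlines():
            line = raw.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            counts[line] += 1
            total += 1
    repeated = sum(c for c in counts.values() if c > 1)
    return 100 * (total - repeated) / total if total else 100.0

if __name__ == "__main__":
    print(f"{non_repeated_percentage():.1f}% of lines are not repeated")
```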

Clean Code, How well are newcomers able to read the code that has been written (1-5), What percentage of comments answer the "why" not the "how", What percentage of methods have fewer than 3 arguments, What percentage of methods are fewer than X (7?) lines of code, What percentage of classes are fewer than 1000 lines of code, What percentage of your tests are written to the same level as production code, What percentage of the code follows an agreed-upon coding standard
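
Several of these Clean Code percentages can be computed directly from the source tree instead of being estimated. A small sketch, assuming a Python codebase; the 7-line threshold simply mirrors the "X (7?)" placeholder above and is not a settled number (note it also counts `self` as an argument):

```python
# Sketch: percentage of methods with fewer than 3 arguments and under a line limit.
import ast
from pathlib import Path

MAX_ARGS = 3
MAX_LINES = 7  # placeholder threshold from the question above

def method_stats(root="src"):
    small_args = short = total = 0
    for path in Path(root).rglob("*.py"):
        tree = ast.parse(path.read_text(errors="ignore"))
        for node in ast.walk(tree):
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                total += 1
                n_args = len(node.args.args) + len(node.args.kwonlyargs)
                if n_args < MAX_ARGS:
                    small_args += 1
                if node.end_lineno - node.lineno + 1 <= MAX_LINES:
                    short += 1
    return small_args, short, total

if __name__ == "__main__":
    small_args, short, total = method_stats()
    if total:
        print(f"{100 * small_args / total:.0f}% of methods take fewer than {MAX_ARGS} arguments")
        print(f"{100 * short / total:.0f}% of methods are {MAX_LINES} lines or fewer")
```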

Test Driven Development, How often do you write a test before the code (1-5), How often do you write the right tests (1-5), How often do you have to fix a failing test because it was written incorrectly (1-5), How often do you run all the class tests after changing code (1-5), What percentage of the team participates in TDD

Security, How secure is the baseline (1-5), How mindful is the team of vulnerabilities in the system (1-5), How confident are you that your code will maintain its level of security with the current DoD and team practices (1-5)

Confidence

Automated Builds, From a fresh clone how easy is it to build the project (1-5), How efficient are your builds (1-5), A question that suggests builds aren't slowing down the team and aren't a big impediment to improving velocity or quality, How long does the build take? Is it less than 10 minutes?
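
The 10-minute question is easy to answer mechanically by timing the build command itself; a quick sketch, assuming the whole build starts from a single command (the command shown is a hypothetical placeholder):

```python
# Sketch: time the build and compare against the 10-minute target.
import subprocess
import time

BUILD_CMD = ["./gradlew", "build"]  # hypothetical build command
THRESHOLD_SECONDS = 10 * 60

start = time.monotonic()
result = subprocess.run(BUILD_CMD)
elapsed = time.monotonic() - start

print(f"Build exited with code {result.returncode} after {elapsed / 60:.1f} minutes")
print("Under the 10-minute target" if elapsed < THRESHOLD_SECONDS else "Over the 10-minute target")
```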

Unit Tests, How empowered and willing is the team in writing unit tests (1-5), When a test fails how easy is it to identify why (1-5), How much closer did your last sprint get you to your team's unit test goals (1-5), How efficient are your unit tests (1-5), How fast are your unit tests (1-5), What percentage of unit tests are automated

Code Coverage, What percentage of test code coverage is your project at, What percentage of branch coverage is your project at, How much closer did your last sprint get you to your team's code coverage goals (1-5)
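
Both coverage percentages are usually better pulled from tooling than estimated. A sketch assuming a Python project measured with coverage.py and pytest; the tool choice and the JSON field names (taken from coverage.py's JSON report) are assumptions for illustration:

```python
# Sketch: gather line and branch coverage percentages with coverage.py.
import json
import subprocess

subprocess.run(["coverage", "run", "--branch", "-m", "pytest"], check=True)
subprocess.run(["coverage", "json", "-o", "coverage.json"], check=True)

with open("coverage.json") as fh:
    totals = json.load(fh)["totals"]  # field names per coverage.py's JSON report

line_pct = totals["percent_covered"]
branch_pct = (100 * totals["covered_branches"] / totals["num_branches"]
              if totals["num_branches"] else 100.0)
print(f"Line coverage:   {line_pct:.1f}%")
print(f"Branch coverage: {branch_pct:.1f}%")
```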

Integration Tests, How empowered and willing is the team in writing integration tests (1-5), How often are integration tests run (1-5), How much of your system is covered by integration tests (1-5), What percentage of your integration tests are automated, How fragile are your integration tests (5-1), How often do you have false positives (5-1)

Acceptance Tests, How empowered and willing is the team in writing acceptance tests (1-5), How confident are you that a sprint is complete and bug free (1-5), How easy is it to validate and verify a story is complete when a change is made (1-5), How confident are you that at the end of the sprint your build will run on prod the same as it will on test (1-5), What percentage of acceptance tests are automated

Potentially Shippable

Right Product

Regression Tests, How confident are you that this sprint has not broken any previous stories (1-5), What percentage of regression bugs are found during the sprint, What percentage of regression tests are automated

User Feedback, At what level do you believe you have solved the real problem a user had (1-5), How effective was the interaction with the PO, customer, stakeholders and users last sprint (1-5), What percentage of stories in the sprint involved user feedback

User Acceptance Testing, When a user first sees the outcome of the latest sprint, how do you think they will respond? (wtf, that's not working right, hmm what is that supposed to do, wow that's nice), When a user first sees the outcome of the latest sprint how well understood will the work be (1-5), How effective do you believe the UAT testing processes are (1-5), How efficient do you believe the UAT testing processes are (1-5), How happy was the Product Owner with the last sprint (1-5), What was the PO's minimum happiness level for all the stories last sprint (1-5)

Working Product

System Level Tests, How quickly can you validate and verify the baseline is potentially shippable when a change is made (1-5), If the last sprint's work was used to run a LASIK machine, how comfortable would you feel going under the laser tomorrow (1-5), How sturdy (non-fragile) are the team's system level tests (1-5), How easy is it to add new system level tests (1-5)

Performance, How well known is it if the last sprint improved or hurt performance (1-5), What percentage of stories passed the team's minimum performance requirements, How confident are you that the product will be able to support all the users in production (1-5)

Security Scans, What percentage of builds are security scanned, How timely is your security scan feedback (1-5), What percentage of security scans are automated

Operations

Performance

Stability, # of critical defects - need better questions somehow, How much feedback on critical defects do you get (1-5), How frequently do you address critical outages (1-5), How long does the average critical defect live before being fixed, What is the uptime percentage of your system, How effective is the team at fixing critical production bugs (1-5), How efficient is the team at fixing critical production bugs (1-5)
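
The uptime percentage and the "how long does the average critical defect live" questions reduce to simple arithmetic once incident start and resolution times are available from whatever tracker the team uses. A minimal sketch; the incident data below is made up:

```python
# Sketch: uptime percentage and mean critical-defect lifetime over a 30-day window.
from datetime import datetime, timedelta

WINDOW = timedelta(days=30)

# (opened, resolved) pairs for critical incidents in the window; hypothetical data
incidents = [
    (datetime(2024, 6, 3, 9, 0), datetime(2024, 6, 3, 11, 30)),
    (datetime(2024, 6, 18, 22, 0), datetime(2024, 6, 19, 1, 0)),
]

downtime = sum((end - start for start, end in incidents), timedelta())
uptime_pct = 100 * (1 - downtime / WINDOW)
mean_lifetime = downtime / len(incidents) if incidents else timedelta()

print(f"Uptime over the last 30 days: {uptime_pct:.2f}%")
print(f"Average critical defect lifetime: {mean_lifetime}")
```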

Real User Monitoring (RUM), How quickly would you know when a system is degraded (1-5), How easy is it to gather new application and user statistics (1-5), How well does the team utilize RUM stats in making decisions (1-5), How well known are the most used and least used features of your system (1-5)

Verification

Root Cause Analysis, How empowered and willing is the team in performing root cause analysis when a problem is identified in production (1-5), How easy is it to identify what a problem is in production (1-5), How effective are the tools and processes in automatically identifying the root cause of a production problem (1-5)

Implementation, Is the software running in production the same software built by the build tool after the initial commit?, What percentage of your production configurations are managed by source control, How quickly can you verify the system is functioning correctly after an implementation (1-5), How successful was the last implementation (1-5), How successful is the average implementation (1-5), How repeatable is your deployment process (1-5), How similar are your test and prod deployment processes (1-5)
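
The "is production running what the build tool built" question can be answered by comparing checksums rather than by recollection: the CI build records a hash for each artifact, and the deployed artifact is hashed and compared at verification time. A sketch; the paths and the manifest format are assumptions:

```python
# Sketch: confirm the deployed artifact matches the checksum recorded by the CI build.
import hashlib
import json
from pathlib import Path

def sha256(path):
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

deployed = Path("/opt/app/current/app.jar")                     # hypothetical deploy location
manifest = json.loads(Path("build-manifest.json").read_text())  # written by CI (assumed)

if sha256(deployed) == manifest["app.jar"]:
    print("Production artifact matches the CI-built artifact")
else:
    print("Mismatch: production is NOT running what the build produced")
```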

Health Radar draft

Continuous Delivery

Environment, Control of Test, rename?, ability to deploy to test, empowered by the business to call a test build production ready, Hardware to Support, rename?, something that says do you have the hardware to run continuous builds, integration testing etc, Tools to Support

Development, Commit Often, Continuous Integration, Automated Unit Tests, do we need word "automated"?, if so should others list it?, Code Coverage, Tests Run Quickly, Tests Quickly Identify What is a Problem, rename?, not just that a problem exists, specifically where the problem is, quick, useful feedback, Automated Builds

Testing, Automated Deployments to Test, do we need word "automated", Automated Security Scanning, Integration Tests, System Level Tests, User Acceptance Testing

Production Deployment, Potentially Shippable, User Feedback, rename?, a way to gather feedback once in production, a primary point of continuous delivery is continuous feedback... not as useful to development without feedback, (still useful to business value and users, just not development)

Good Design

Simple Design, Test Driven Development, Focus, rename?, focus on only the sprint goals and sprint backlog, Clean Code, rename?, Maintainability, maybe a separate category?

Quality, Pair Programming, Refactoring, Tools and Resources, Peer Reviews, rename?, maybe not peer reviews specifically, some kind of checks/consensus within the team, Maybe: Team Consensus?, Security

Confidence

Stability, rename?, regression.. mm don't like this, Automated Regression Tests, Technical Debt, metrics, lean project? Reevaluate need of features, Number of Critical Defects, rename?, something that shows defects are high/low

Proof of Done, Automated Acceptance Tests, Definition of Done, Verification

Production, Stability, Root Cause Analysis, Performance, Real User Monitoring, Validation, rename?

People

Where do we discuss servant leadership? I'd like to see a "QA" manager who takes the QA folks and makes sure they are getting the appropriate training and tackling the right issues with the team. Of course this goes for UX, Java devs, Python devs, etc.

Skills, Cross Functional, Empowered, rename?, basically does the enterprise allow them to do everything they need to do their job or is something holding them back, procedural somethings, Learning/Applying New Skills, Community of Practice, Innovation

Whole Team, Everyone Commits, hg commit, Collective Ownership, Bus Value, rename?, It surprises me how often people haven't heard this term, not sure what else concise to call it, this often gets confused with business value, Team Improvement, rename?, retrospectives, I believe technical excellence is iterated on, much from the retrospective meeting, and that should be tracked, Sustainable Pace

Where Do These Go?

Lean Project Feedback, when do we cut features?, maybe a measurement of something else, maybe delete?

Technical Excellence

Definitions

Excellence: the state, quality, or condition of excelling; superiority. "Excel" is defined as "to be better than; surpass", i.e. to surpass or do better than others. (Webster)

Agile Principle #9, "Continuous attention to technical excellence and good design enhances agility.", Continuous Attention, Beware of routine in your daily business! Keep your concentration high, stay focused on the things you're working on., Values, Focus, Principles, Opportunity, Accepted Responsibility, Reflection, Practices, Motivation, Technical Excellence, Gather the right staff with the right skills to create the product. Cultivate life-long learning to acquire new knowledge and improve your skills., Values, Courage, Simplicity, Focus, Principles, Reflective Improvement, Amplify Learning, Accepted Responsibility, Quality, Practices, Testing, Sprint Retrospective, Motivation, Good Design, Don't create a product which looks nice outside but is a piece of crap inside. Follow a professional honor and build your software product in a way you could be proud of in every detail., Values, Simplicity, Principles, Improvement, Quality, Baby Steps, Incremental, Practices, Pair Programming, Test-Driven Development, Refactoring, Enhances Agility, If we do not have the right skills and the right attitude, we will get stuck in our agile journey. This is the reason why many teams do not proceed after their initial agile success., Values, Openness, Courage, Principles, Reflective Improvement, Practices, Sprint Retrospective

Technical excellence means building quality in.

Agile Principle #9 - Cloud Space. Not Business Value: working features, ability to add new features quickly, daily updates on progress, regular planning meetings. Technical Value: clean and easy-to-understand code, automated tests, clear direction, good tools, snacks (half joke, half serious). Examples: Clean Code, Technical Debt (debt isn't bad, a school loan makes sense; you just have to have a plan to work it off and do just that).

Metrics to Gather for each Category

hmmm should this just be bubbles off the health diagram so they don't have to be synced up???
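
Whatever the final shape, the questions only roll up into a radar if every answer lands on a common scale. One approach is to map 1-5 answers linearly onto 0-100, flip the reversed "(5-1)" questions first, and average each category into its spoke. A sketch with made-up answers:

```python
# Sketch: normalize mixed 1-5 and percentage answers into one 0-100 score per category.
def to_score(value, kind, reversed_scale=False):
    """Normalize an answer to 0-100. kind is 'scale' (1-5) or 'percent'."""
    if kind == "scale":
        if reversed_scale:        # questions marked (5-1) above
            value = 6 - value
        return (value - 1) / 4 * 100
    return float(value)           # already a percentage

answers = {  # hypothetical answers for one category
    "Continuous Integration": [
        to_score(80, "percent"),  # commit the same day they change it
        to_score(100, "percent"), # commit to the baseline
        to_score(4, "scale"),     # ease of merges
        to_score(3, "scale"),     # mitigation effectiveness
    ],
}

for category, scores in answers.items():
    print(f"{category}: {sum(scores) / len(scores):.0f}/100")
```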

Team

Core

Empowered, To what level do you feel you are empowered to do what needs to be done to ensure quality (1-5), What percentage of the Definition of Done and team processes were defined by the team, How lean do you feel your team is (1-5), How well do you feel management is supporting your team (1-5)

Collective Ownership, How empowered and willing is the team to make changes on any project within the team (1-5), How empowered and willing is the team to make design decisions on any project within the team (1-5), To what level are individual team members' mistakes seen as the team's problems (1-5), When work is held up (by non-external dependencies), how empowered and willing is the team to continue with that work (1-5)

Cross Functional, For any given feature what percentage can your team accomplish without external help/dependencies?, What percentage of all the roles needed to accomplish any given story are team members, What percentage of roles on the team can all the team members effectively participate in and accomplish, What percentage of roles on the team are all the team members empowered to participate in and accomplish

Bus Factor, What percentage of the team would you have to lose to really stop progress on any story/feature/baseline/project, What percentage of the team would have to miss work for any single story to stop progress, What percentage of team members would have to be out of the country for your On Call Return to Service Time to double, What percentage of roadblocks were not due to vacation/sick/missing/absent team members, How many roadblocks/impediments arose last sprint due to vacation/sickness/missing/absent team members?, When a new feature is added to the project how many sprints will it take for the rest of the team to be able to work on it without help from the original implementers?
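
A rough, objective companion to these questions is an authorship-based bus factor: the fewest people whose combined "primary ownership" covers more than half of the files. A sketch, assuming a file-to-primary-committer mapping has already been built (for example from `git log --name-only`); the ownership data and team size below are made up:

```python
# Sketch: crude bus factor from per-file primary authorship.
from collections import Counter

ownership = {  # file -> primary author (hypothetical)
    "billing/api.py": "Alice",
    "billing/models.py": "Alice",
    "search/index.py": "Bob",
    "search/query.py": "Bob",
    "ui/app.ts": "Carol",
}
team_size = 5  # hypothetical

files_per_author = Counter(ownership.values())
half = len(ownership) / 2

# Smallest set of authors whose primary ownership covers more than half the files.
covered = bus_factor = 0
for _, count in files_per_author.most_common():
    covered += count
    bus_factor += 1
    if covered > half:
        break

print(f"Bus factor: {bus_factor} of {team_size} people "
      f"({100 * bus_factor / team_size:.0f}% of the team)")
```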

Definition of Done, How well does the team know the definition of done (1-5), How well does the team commit to the definition of done (1-5), How useful is the definition of done (1-5), On average what percentage of the definition of done are stories being accepted at, On average how many stories a sprint are being accepted that meet the definition of done

Adapting

New Skills, How adaptive is your team in learning/applying the required skills to accomplish a sprint (1-5), How adaptive is your team in learning/applying new skills to better meet the definition of done (1-5), How effective are Communities of Practice to your team (1-5), How helpful is your team in presenting at Communities of Practice (1-5), At what level is conference attendance being effectively utilized (1-5)

Team Improvement, How well is your team reflecting on how to become more effective and adjusting to those reflections (1-5), How effective are your team retrospectives in producing positive changes to the team processes (1-5), What percentage of the team dedicates themselves to honestly attempting retrospective items throughout the sprint, What percentage of the team helps each other out in any circumstance when it's needed or could offer substantial help

Innovation, How empowered and willing is the team to innovate (1-5), How innovative is your team (1-5), How innovative is your surrounding environment (1-5)

Support/ Environment

Resources, To what degree do you feel you have enough hardware to be technically excellent (1-5), How effective are the resources available to you in accomplishing the sprint (1-5), How easy is it to get new resources to support the success of the sprint (1-5), What percentage of the sprint couldn't have been solved sooner or more effectively with more resources

Tools, How effective are the tools available to you in accomplishing the sprint (1-5), How easy is it to get new tools to support the success of the sprint (1-5), How well utilized are the current tools (1-5), What percentage of the sprint couldn't have been solved sooner or more effectively with more tools

Take the Quiz

Some Hard Work