Maturity Models

1. Software Engineering Institute (SEI)

1.1. Capability Maturity Model (CMM)

1.1.1. http://www.sei.cmu.edu/cmmi/

1.1.2. The ‘parent’ (or ‘grandfather’) of the majority of maturity models is the Capability Maturity Model (CMM), developed from 1986 onwards by the Software Engineering Institute (SEI), based at Carnegie Mellon University in Pittsburgh, US.

1.1.3. 5 Levels of maturity

1.1.3.1. Level 1: Initial

1.1.3.1.1. Processes are ad hoc, even chaotic, and few processes are defined.

1.1.3.2. Level 2: Repeatable

1.1.3.2.1. Basic processes are established, and there is a level of discipline to adhere to them.

1.1.3.3. Level 3: Defined

1.1.3.3.1. All processes are defined, documented, standardized and integrated with each other.

1.1.3.4. Level 4: Managed

1.1.3.4.1. Processes are measured by collecting detailed data on process performance and quality.

1.1.3.5. Level 5: Optimising

1.1.3.5.1. Continuous process improvement is in place, driven by quantitative feedback and by piloting new ideas and technologies.
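
1.1.4. Illustrative sketch (not part of the CMM itself): the five levels above expressed as a small Python enum, together with a hypothetical staged rating in which overall maturity is the lowest level fully achieved across the assessed process areas. Names and the rating rule are assumptions for illustration only.

    # Minimal, illustrative sketch only -- the level names mirror the list above,
    # but the rating rule is an assumed convention, not an SEI definition.
    from enum import IntEnum

    class MaturityLevel(IntEnum):
        INITIAL = 1      # ad hoc, chaotic processes
        REPEATABLE = 2   # basic, disciplined processes
        DEFINED = 3      # documented, standardized, integrated processes
        MANAGED = 4      # measured processes with detailed quality data
        OPTIMISING = 5   # continuous, quantitatively driven improvement

    def overall_maturity(process_area_levels: dict) -> "MaturityLevel":
        """Hypothetical staged rating: the organization is only as mature
        as its weakest assessed process area."""
        return min(process_area_levels.values())

    # Example with fictional assessment data:
    assessment = {
        "requirements management": MaturityLevel.DEFINED,
        "project planning": MaturityLevel.REPEATABLE,
        "quality assurance": MaturityLevel.MANAGED,
    }
    print(overall_maturity(assessment))  # MaturityLevel.REPEATABLE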

1.2. Capability Maturity Model Integration (CMMI)

2. itSMF (IT Service Management Forum)

2.1. IT Service Management Capability Maturity Model (ITSM-CMM)

2.1.1. aligned to ITIL®

3. Sun Microsystems

3.1. Information Lifecycle Management (ILM) Maturity Model

3.1.1. 5 Levels of maturity

3.1.1.1. Level 1: Chaotic

3.1.1.2. Level 2: Reactive

3.1.1.3. Level 3: Proactive

3.1.1.4. Level 4: Optimized

3.1.1.5. Level 5: Self-aware

4. Oracle

4.1. Oracle SOA Maturity Model

4.1.1. 6 Levels of maturity

4.1.1.1. Level 0: No SOA

4.1.1.1.1. There is no SOA approach being taken. SOA is not underway.

4.1.1.2. Level 1: Ad Hoc

4.1.1.2.1. Awareness of SOA exists and some groups are embarking on building services. There is no SOA plan being followed.

4.1.1.3. Level 2: Opportunistic

4.1.1.3.1. An approach has been decided upon and is being opportunistically applied. The approach has not been widely accepted or adopted. It may be informally defined or, if documented, may exist primarily as “shelf ware”.

4.1.1.4. Level 3: Systematic

4.1.1.4.1. The approach has been reviewed and accepted by affected parties. There has been buy-in to the documented approach and the approach is always (or nearly always) followed.

4.1.1.5. Level 4: Managed

4.1.1.5.1. The capability is being measured and quantitatively managed via some type of governance structure. Appropriate metrics are being gathered and reported.

4.1.1.6. Level 5: Optimized

4.1.1.6.1. Metrics are being consistently gathered and are being used to incrementally improve the capability. Assets are proactively maintained to ensure relevancy and correctness.

4.2. Oracle Cloud Computing Maturity Model

4.2.1. 6 Levels of maturity

4.2.1.1. Level 0: None

4.2.1.1.1. There is no Cloud approach being taken. No elements of Cloud are being implemented.

4.2.1.2. Level 1: Ad Hoc

4.2.1.2.1. Awareness of Cloud Computing is established and some groups are beginning to implement elements of Cloud Computing. There is no cohesive Cloud Computing plan being followed.

4.2.1.3. Level 2: Opportunistic

4.2.1.3.1. An approach has been decided upon and is being opportunistically applied. The approach has not been widely accepted, and redundant or overlapping approaches exist. It may be informally defined or, if documented, may exist primarily as “shelf ware”.

4.2.1.4. Level 3: Systematic

4.2.1.4.1. The approach has been reviewed and accepted by affected parties. There has been buy-in to the documented approach and the approach is always (or nearly always) followed.

4.2.1.5. Level 4: Managed

4.2.1.5.1. The capability is being measured and quantitatively managed via some type of governance structure. Appropriate metrics are being gathered and reported.

4.2.1.6. Level 5: Optimized

4.2.1.6.1. Metrics are being consistently gathered and are being used to incrementally improve the capability. Assets are proactively maintained to ensure relevancy and correctness. The potential for market mechanisms to be used to leverage inter-cloud operations has been established.

4.3. Oracle Master Data Management (MDM) Maturity Model

4.3.1. 4 Levels of maturity

4.3.1.1. Level 1: Marginal

4.3.1.2. Level 2: Stable

4.3.1.3. Level 3: Best Practice

4.3.1.4. Level 4: Transformational

5. The Open Group

5.1. Open Group Service Integration Maturity Model (OSIMM)

5.1.1. 7 Levels of maturity

5.1.1.1. Level 1: Silo

5.1.1.2. Level 2: Integrated

5.1.1.3. Level 3: Componentized

5.1.1.4. Level 4: Service

5.1.1.5. Level 5: Composite Services

5.1.1.6. Level 6: Virtualized Services

5.1.1.7. Level 7: Dynamically Re-Configurable Services

5.1.2. official website

5.1.2.1. http://www.opengroup.org/soa/source-book/osimmv2/

5.2. Open Information Security Management Maturity Model (O-ISM3)

5.2.1. 5 Levels of maturity

5.2.1.1. Level 1: Initial

5.2.1.2. Level 2: Managed

5.2.1.3. Level 3: Defined

5.2.1.4. Level 4: Controlled

5.2.1.5. Level 5: Optimized

5.2.2. official website

5.2.2.1. http://www.ism3.com/

5.2.3. official publication

5.2.3.1. https://www2.opengroup.org/ogsys/jsp/publications/PublicationDetails.jsp?publicationid=12238

6. Digital Asset Management

6.1. DAM Maturity Model (DAM3)

6.1.1. 5 Levels of maturity

6.1.1.1. Level 1: Ad Hoc

6.1.1.1.1. Exposure to the application of DAM technologies, including managing repositories and workflow systems.

6.1.1.2. Level 2: Incipient

6.1.1.2.1. Casual understanding of DAM technologies, often starting in the form of content management systems and centralised document repositories.

6.1.1.3. Level 3: Formative

6.1.1.3.1. Demonstrated experience with implementation of named DAM systems and core competencies, such as ingestion, cataloging, transformation, transcoding, distribution, etc.

6.1.1.4. Level 4: Operational

6.1.1.4.1. Managing repositories and workflow systems is core to IT with organised knowledge transfer.

6.1.1.5. Level 5: Optimal

6.1.1.5.1. Understanding and participating in forecasting enterprise DAM needs in preparation for future business requirements.

7. RIMS

7.1. RIMS Risk Maturity Model for Enterprise Risk Management

7.1.1. 5 Levels of maturity

7.1.1.1. Level 1: Ad Hoc

7.1.1.2. Level 2: Initial

7.1.1.3. Level 3: Repeatable

7.1.1.4. Level 4: Managed

7.1.1.5. Level 5: Leadership

8. European Foundation for Quality Management (EFQM)

8.1. EFQM Excellence Model (a.k.a. the INK model, “Instituut voor Nederlandse Kwaliteit”)

9. Information Security

9.1. Security Awareness Maturity Model

9.1.1. 5 Levels of maturity

9.1.1.1. Level 1: Non-Existent Program

9.1.1.2. Level 2: Compliance Focused

9.1.1.3. Level 3: Promoting Awareness & Change

9.1.1.4. Level 4: Long Term Sustainment

9.1.1.5. Level 5: Metrics

9.1.2. http://www.securingthehuman.org/blog/2012/05/22/security-awareness-maturity-model

9.1.3. https://www.youtube.com/watch?v=qopLSlEYv9Q&list=UUYzwGkfOqrevO-4TuTjPLwQ

9.2. Information Security Management Maturity Model (a.k.a. ISM-cubed or ISM3)

9.2.1. http://www.lean.org/FuseTalk/Forum/Attachments/ISM3_v2.00-HandBook.pdf

9.3. "Pragmatic Security Metrics – Applying Metametrics to Information Security" - Brotby & Hinson, 2013 p. 47

9.3.1. 5 Levels of maturity

9.3.1.1. Level 1: Ad hoc

9.3.1.1.1. Information security risks are handled on an entirely informal basis. Processes are undocumented and relatively unstable.

9.3.1.2. Level 2: Repeatable but intuitive

9.3.1.2.1. There is an emerging appreciation of information security. Security processes are not formally documented, depending largely on employees’ knowledge and experience.

9.3.1.3. Level 3: Defined process

9.3.1.3.1. Information security activities are formalized throughout the organization using policies, procedures, and security awareness.

9.3.1.4. Level 4: Managed and measurable

9.3.1.4.1. Information security activities are standardized using policies, procedures, defined and assigned roles and responsibilities, etc., and metrics are introduced for routine security operations and management purposes.

9.3.1.5. Level 5: Optimized

9.3.1.5.1. Metrics are used to drive systematic information security improvements, including strategic activities.

10. ASL BiSL Foundation

10.1. BiSL® Maturity Model

10.1.1. 5 Levels of maturity

10.1.1.1. Level 1: Initial

10.1.1.1.1. The organization does not have a stable environment in which Business Information Management processes are executed. There are, however, some attempts, and activities are sometimes executed in order to acquire insight and knowledge. The results and outcomes of these activities are usually unpredictable.

10.1.1.2. Level 2: Repeatable

10.1.1.2.1. The organization executes activities repetitively. Previous experience and ways of working are used for the execution of activities. Signs of a standard way of working are appearing.

10.1.1.3. Level 3: Defined and managed

10.1.1.3.1. The activities and processes are defined and documented. The processes have been well thought through. The processes have also been designed and implemented to provide quantitative and qualitative indicators that the organization can use for control and adjustment.

10.1.1.4. Level 4: Optimizing

10.1.1.4.1. The organization is characterized by continual process improvement. Mechanisms and processes have been developed to enable ongoing and controlled improvements to the process.

10.1.1.5. Level 5: Chain

10.1.1.5.1. During the design and implementation, improvement, and mutual adjustment of processes, the organization focuses on increasing the added value within the process chain in which it participates.

10.2. ASL®2 Maturity Model

10.2.1. 5 Levels of maturity

10.2.1.1. Level 1: Initial

10.2.1.1.1. The organization does not have a stable environment in which Business Information Management processes are executed. There are, however, some attempts, and activities are sometimes executed in order to acquire insight and knowledge. The results and outcomes of these activities are usually unpredictable.

10.2.1.2. Level 2: Repeatable

10.2.1.2.1. The organization executes activities repetitively. Previous experience and ways of working are used for the execution of activities. Signs of a standard way of working are appearing.

10.2.1.3. Level 3: Defined and managed

10.2.1.3.1. The activities and processes are defined and documented. The processes have been well thought through. The processes have also been designed and implemented to provide quantitative and qualitative indicators that the organization can use for control and adjustment.

10.2.1.4. Level 4: Optimizing

10.2.1.4.1. The organization is characterized by continual process improvement. Mechanisms and processes have been developed to enable ongoing and controlled improvements to the process.

10.2.1.5. Level 5: Chain

10.2.1.5.1. During the design and implementation, improvement, and mutual adjustment of processes, the organization focuses on increasing the added value within the process chain in which it participates.

11. Pink Elephant

11.1. ITIL Process Maturity

11.1.1. 6 Levels of maturity

11.1.1.1. Level 0: Absence

11.1.1.1.1. “There is absolutely no evidence of any activities supporting the process”

11.1.1.2. Level 1: Initiation

11.1.1.2.1. “There are ad-hoc activities present, but we are not aware of how they relate to each other within a single process”

11.1.1.3. Level 2: Awareness

11.1.1.3.1. “We are aware of the process but some activities are still incomplete or inconsistent; there is no overall measuring or control”

11.1.1.4. Level 3: Control

11.1.1.4.1. “The process is well defined, understood and implemented”

11.1.1.5. Level 4: Integration

11.1.1.5.1. “Inputs from this process come from other well controlled processes; outputs from this process go to other well controlled processes”

11.1.1.6. Level 5: Optimization

11.1.1.6.1. “This process drives quality improvements and new business opportunities beyond the process”

11.1.2. http://www.pinkelephant.com/DocumentLibrary/UploadedContents/PinkLinkArticles/ITIL%20Process%20Maturity.pdf

12. OMG

12.1. Business Process Maturity Model (BPMM)

12.1.1. 5 Levels of maturity

12.1.1.1. Level 1: Initial

12.1.1.2. Level 2: Managed

12.1.1.3. Level 3: Standardized

12.1.1.4. Level 4: Predictable

12.1.1.5. Level 5: Innovating

13. Search Enginuity

13.1. SEO Maturity Model

13.1.1. 5 Stages of maturity

13.1.1.1. Stage 1: Initial / Ad-hoc

13.1.1.2. Stage 2: Repeatable

13.1.1.3. Stage 3: Defined

13.1.1.4. Stage 4: Managed & Measured

13.1.1.5. Stage 5: Optimized

14. ION Interactive

14.1. Search Marketing Maturity Model

14.1.1. 5 Levels of maturity

14.1.1.1. Level 1: Ad hoc

14.1.1.1.1. No management, no budget, no real structure.

14.1.1.2. Level 2: Engaged

14.1.1.2.1. Some executive awareness, basic keyword research, limited testing of landing pages.

14.1.1.3. Level 3: Structured

14.1.1.3.1. Actual budget, management responsibility, more detailed keyword research, some competitive research/insight.

14.1.1.4. Level 4: Managed

14.1.1.4.1. Active executive participation, daily management, professional bid management of keywords, competitive benchmarking.

14.1.1.5. Level 5: Optimized

14.1.1.5.1. Strategic executive participation, significant budget, integrated multi-channel analytics, segmentation of audiences etc.

15. This freeware, non-commercial mind map was carefully hand-crafted with passion and a love for learning and constant improvement. Please share, like and give feedback - your feedback and comments are my main motivation for further elaboration. Thanks!

15.1. Questions, issues or errors? What do you think about my work? Your comments are highly appreciated. Please don't hesitate to contact me. :-) Mirosław Dąbrowski, Warsaw, Poland.

15.1.1. http://www.miroslawdabrowski.com

15.1.2. http://www.linkedin.com/in/miroslawdabrowski

15.1.3. https://www.google.com/+MiroslawDabrowski

15.1.4. https://play.spotify.com/user/miroslawdabrowski/

15.1.5. https://twitter.com/mirodabrowski

15.1.6. miroslaw_dabrowski

16. TMMi Foundation

16.1. Testing Maturity Model (TMM)

16.1.1. 5 Levels of maturity

16.1.1.1. Level 1: Initial

16.1.1.2. Level 2: Definition

16.1.1.3. Level 3: Integration

16.1.1.4. Level 4: Management and measurement

16.1.1.5. Level 5: Optimisation

17. Martin Hopkinson

17.1. The Project Risk Maturity Model (RMM)

17.1.1. 4 Levels of maturity

17.1.1.1. Level 1: Naïve

17.1.1.1.1. Although a project risk management process may have been initiated, its design or application is fundamentally flawed. At this level, it is likely that the process does not add value.

17.1.1.2. Level 2: Novice

17.1.1.2.1. The project risk management process influences decisions taken by the project team in a way that is likely to lead to improvements in project performance as measured against its objectives. However, although the process may add value, weaknesses with either the process design or its implementation result in significant benefits being unrealised.

17.1.1.3. Level 3: Normalised

17.1.1.3.1. The project risk management process is formalised and implemented systematically. Value is added by implementing effective management responses to significant sources of uncertainty that could affect the achievement of project objectives.

17.1.1.4. Level 4: Natural

17.1.1.4.1. The risk management process leads to the selection of risk-efficient strategic choices when setting project objectives and choosing between options for project solutions or delivery. Sources of uncertainty that could affect the achievement of project objectives are managed systematically within the context of a team culture conducive to optimising project outcomes.

18. Forrester

18.1. IT Governance Maturity Model

18.2. Information Security Maturity Model

19. Berenschot

19.1. Project Excellence Model® (PEM)

19.1.1. Date: 2003

19.1.2. Derived from EFQM / INK model

19.1.2.1. EFQM / INK model is suitable for traditional organizations but not for project organizations.

19.1.2.2. Berenschot adapted the EFQM / INK model to a model that is appropriate for project organizations.

19.1.3. PEM consists of 12 key areas, divided across:

19.1.3.1. organization area

19.1.3.1.1. consists of critical success factors (CSFs) in the project organization

19.1.3.2. results area

19.1.3.2.1. consists of project success criteria

19.1.4. PEM consists of 2 types of result areas:

19.1.4.1. narrow (easily tangible and measurable elements)

19.1.4.1.1. ‘classic’ areas like time, costs and quality

19.1.4.2. broad (elements that cannot be translated so easily into measurable terms)

19.1.4.2.1. client

19.1.4.2.2. project personnel

19.1.4.2.3. contracting partners

19.1.4.2.4. users

19.1.4.2.5. stakeholders

20. AXELOS

20.1. Relationships between P2MM, P1M3, P2M3, P3M3.

20.2. PRINCE2® Maturity Model (P2MM)

20.2.1. PRINCE2® Maturity Model (P2MM)

20.2.1.1. http://www.p3m3-officialsite.com/nmsruntime/saveasdialog.aspx?lID=462&sID=210

20.2.2. PRINCE2® Maturity Model (P2MM) Self-Assessment

20.2.2.1. http://www.p3m3-officialsite.com/nmsruntime/saveasdialog.aspx?lID=469&sID=210

20.2.3. Derived from P3M3®

20.2.4. Dedicated to PRINCE2®

20.3. Project Management Maturity Model (P1M3)

20.3.1. P2MM focuses specifically on the PRINCE2 project management method, so it is only applicable to organizations that use PRINCE2. To overcome this limitation, the Project Management Maturity Model (P1M3) was developed as an abstraction of P2MM.

20.3.2. As an abstraction of P2MM, P1M3 is method-independent and can be applied in any organization; it is not restricted to organizations that use PRINCE2, as P2MM is.

20.3.3. P1M3 is restricted to project management.

20.4. Project and Programme Management Maturity Model (P2M3)

20.4.1. P2M3 is an extension of P1M3.

20.4.2. Supports project and programme management.

20.5. Portfolio, Programme, Project Management Maturity Model (P3M3®)

20.5.1. P3M3 is an extension of P2M3 and P2MM.

20.5.2. v1 was published in 2006.

20.5.3. v2 was published in 2008.

20.5.4. Official website

20.5.4.1. http://www.p3m3-officialsite.com/

20.5.5. P3M3® Introduction and Guide

20.5.5.1. http://www.p3m3-officialsite.com/nmsruntime/saveasdialog.aspx?lID=456&sID=235

20.5.6. Viewed independently for:

20.5.6.1. Portfolio (PfM3)

20.5.6.1.1. http://www.p3m3-officialsite.com/nmsruntime/saveasdialog.aspx?lID=457&sID=210

20.5.6.1.2. P3M3® Portfolio Management Self-Assessment

20.5.6.2. Programme (PgM3)

20.5.6.2.1. http://www.p3m3-officialsite.com/nmsruntime/saveasdialog.aspx?lID=464&sID=210

20.5.6.2.2. P3M3® Programme Management Self-Assessment

20.5.6.3. Project (PjM3)

20.5.6.3.1. http://www.p3m3-officialsite.com/nmsruntime/saveasdialog.aspx?lID=458&sID=210

20.5.6.3.2. P3M3® Project Management Self-Assessment

20.5.7. 5 Levels of maturity

20.5.7.1. Level 1 - Awareness of process

20.5.7.2. Level 2 - Repeatable process

20.5.7.3. Level 3 - Defined process

20.5.7.4. Level 4 - Managed process

20.5.7.5. Level 5 - Optimised process

20.5.8. 7 Perspectives

20.5.8.1. Management Control

20.5.8.2. Benefits Management

20.5.8.3. Finance Management

20.5.8.4. Risk Management

20.5.8.5. Organisation Improvement

20.5.8.6. Organisation Governance

20.5.8.7. Resources

20.5.8.8. Each perspective describes the processes and practices that should be deployed at each level of maturity

20.5.8.9. Perspectives group together a range of key characteristics

20.5.9. 5 Common attributes

20.5.9.1. Planning effectively

20.5.9.2. Stakeholder involvement

20.5.9.3. Information & Configuration Management

20.5.9.4. Quality

20.5.9.5. Capability building

20.5.9.6. Common attributes are themes that are common in each perspective and can be found at each level

20.5.10. Watch: P3M3® Self Assessment Tool
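
20.5.11. Illustrative sketch (not an official AXELOS tool): a hypothetical P3M3-style self-assessment record in Python, scoring each of the seven perspectives listed above from level 1 to level 5 and, as an assumed convention, reporting the lowest perspective score as the overall maturity level.

    # Minimal, illustrative sketch only -- the perspective names come from the list
    # above, but the scoring convention is an assumption, not an AXELOS definition.
    PERSPECTIVES = (
        "Management Control",
        "Benefits Management",
        "Finance Management",
        "Risk Management",
        "Organisation Improvement",
        "Organisation Governance",
        "Resources",
    )

    class P3M3SelfAssessment:
        def __init__(self):
            self.scores = {}  # perspective name -> maturity level (1-5)

        def rate(self, perspective: str, level: int) -> None:
            if perspective not in PERSPECTIVES or not 1 <= level <= 5:
                raise ValueError("unknown perspective or level out of range")
            self.scores[perspective] = level

        def overall(self) -> int:
            # Assumed convention: overall maturity is the weakest perspective score.
            return min(self.scores[p] for p in PERSPECTIVES)

    # Example with fictional scores:
    sa = P3M3SelfAssessment()
    for p in PERSPECTIVES:
        sa.rate(p, 3)
    sa.rate("Risk Management", 2)
    print(sa.overall())  # 2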

20.6. Management of Value (MoV®) Maturity Model

20.6.1. Derived from P3M3®

20.6.1.1. as stated in official publication:

20.6.1.1.1. "The MoV maturity model mimics the P3M3 structure outlined in Figure D.1."

20.6.2. 5 Levels of maturity

20.6.2.1. Level 1 – Awareness process

20.6.2.2. Level 2 – Repeatable process

20.6.2.3. Level 3 – Defined process

20.6.2.4. Level 4 – Managed process

20.6.2.5. Level 5 – Optimized process

20.7. Management of Risk (M_o_R®) Maturity Model

20.7.1. Derived from P3M3®

20.7.2. 5 Levels of maturity

20.7.2.1. Level 1: Initial

20.7.2.2. Level 2: Repeatable

20.7.2.3. Level 3: Defined

20.7.2.4. Level 4: Managed

20.7.2.5. Level 5: Optimised

20.8. Portfolio, Programme, Project (P3O®) Offices Maturity Model

20.8.1. Derived from P3M3®

20.8.1.1. as stated in official publication:

20.8.1.1.1. "P3M3 provides a very useful input into the development of a P3O blueprint in terms of current and target portfolio, programme and project management capabilities."

20.9. ITIL® Maturity Model

20.9.1. http://www.axelos.com/gempdf/ITIL_Maturity_Model_v1_2W.pdf

20.9.2. Maturity level definitions are aligned with COBIT® and CMMI® definitions

20.9.3. 5 Levels of maturity

20.9.3.1. Level 1: Initial

20.9.3.2. Level 2: Repeatable

20.9.3.3. Level 3: Defined

20.9.3.4. Level 4: Managed

20.9.3.5. Level 5: Optimized

21. PMI

21.1. Organizational Project Management Maturity Model (OPM3®)

21.1.1. 4 Levels of maturity

21.1.1.1. Level 1: Standardize

21.1.1.2. Level 2: Measure

21.1.1.3. Level 3: Control

21.1.1.4. Level 4: Continually Improve

21.1.2. official website

21.1.2.1. http://opm3online.pmi.org/

22. Harold Kerzner

22.1. Kerzner Project Management Maturity Model (KPM3 or K-PMMM)

22.1.1. https://www.iil.com/kpm3/

22.1.2. 5 Levels of maturity

22.1.2.1. Level 1: Common Language

22.1.2.2. Level 2: Common Process

22.1.2.3. Level 3: Singular Methodology

22.1.2.4. Level 4: Benchmarking

22.1.2.5. Level 5: Continuous Improvement

23. Gartner

23.1. Gartner Program and Portfolio Management (PPM) Maturity Model

23.2. Web Analytics Maturity Model

23.2.1. 5 Levels of maturity

24. PMO Tools

24.1. The PMO Maturity Cube

24.1.1. http://pmotools.org/pmocube/index.php/homeController

24.1.2. 3 Maturity Models

24.1.2.1. Service: A.1.7 - Managing one or more portfolios (Scope: Enterprise/Approach: Strategic)

24.1.2.1.1. Level 0: The PMO does not provide this service.

24.1.2.1.2. Level 1: The PMO maintains a list of active projects throughout the organization.

24.1.2.1.3. Level 2: The PMO maintains a list of active projects and programs throughout the organization and establishes their prioritization but does not follow a structured portfolio management process.

24.1.2.1.4. Level 3: The PMO maintains a list of active projects and portfolios, prioritizes them throughout the organization, and establishes formal processes, acting as facilitator in the definition (identification, categorization, evaluation, selection), development (prioritize, balance and commitment) and implementation (monitoring, review and change management) of the portfolio.

24.1.2.1.5. Level 4: The PMO maintains a list of active projects and portfolios, prioritizes them throughout the organization, and establishes formal processes, acting as facilitator in the definition (identification, categorization, evaluation, selection), development (prioritize, balance and commitment) and implementation (monitoring, review and change management) of the portfolio. The PMO uses an integrated system to automate the organization's portfolio management process.

24.1.2.2. Service: A.2.1 - Develop and implement the project management methodology (Scope: Enterprise/Approach: Tactical)

24.1.2.2.1. Level 0: The PMO does not provide this service.

24.1.2.2.2. Level 1: The PMO has developed a basic methodology for the organization, but it is not used consistently on all projects.

24.1.2.2.3. Level 2: The PMO has developed a standard methodology for the organization, aligning possible existing methodologies in different areas, and the methodology used in most projects in the organization.

24.1.2.2.4. Level 3: The PMO has developed a standard methodology for the organization, and it is used by all projects as it is mandatory unless a specific waiver is requested and approved.

24.1.2.2.5. Level 4: The PMO has developed and improved the standard methodology for the organization focusing on best practices and continuous improvement.

24.1.2.3. Service: A.3.3 - Monitor and control project/program performance (Scope: Enterprise/Approach: Operational)

24.1.2.3.1. Level 0: The PMO does not provide this service.

24.1.2.3.2. Level 1: The PMO monitors and controls the project/program performance considering time, cost, quality and customer satisfaction and provides follow-up reports without analysis upon request.

24.1.2.3.3. Level 2: The PMO monitors and controls the project/program performance considering time, cost, quality, and customer satisfaction and analyzes the available data.

24.1.2.3.4. Level 3: The PMO monitors and controls the project/program performance considering time, cost, quality, and customer satisfaction, analyzes data, and takes preventive and corrective actions, working proactively with the project/program manager and senior management.

25. PM Solutions

25.1. PM Solutions’ Project Management Maturity Model (PMS-PMMM)

25.1.1. Follows the Software Engineering Institute's (SEI) Capability Maturity Model (CMM®)

25.1.2. Examines maturity development across ten knowledge areas in the Project Management Institute's (PMI®) A Guide to the Project Management Body of Knowledge (PMBOK® Guide)

25.1.3. http://www.pmsolutions.com/resources/view/what-is-the-project-management-maturity-model/

25.1.4. 5 Levels of maturity

25.1.4.1. Level 1: Initial process

25.1.4.2. Level 2: Structured process and standards

25.1.4.3. Level 3: Organizational standards and institutionalized process

25.1.4.4. Level 4: Managed process

25.1.4.5. Level 5: Optimizing process

25.2. PM Solutions’ Project Portfolio Management Maturity Model (PMS-PPMMM)

25.2.1. 5 Levels of maturity

25.2.1.1. Level 1: Initial process

25.2.1.2. Level 2: Structured process and standards

25.2.1.3. Level 3: Organizational standards and institutionalized process

25.2.1.4. Level 4: Managed process

25.2.1.5. Level 5: Optimizing process

26. MINCE2 Foundation

26.1. Maturity INcrements IN Controlled Environments (MINCE)

26.1.1. http://www.mince2.com/

26.1.2. http://www.mince2.org/

26.1.3. 5 Levels of maturity

26.1.3.1. Level 1: Activities

26.1.3.2. Level 2: Processes

26.1.3.3. Level 3: Systems

26.1.3.4. Level 4: Supply Chain

26.1.3.5. Level 5: Quality

26.1.3.6. Levels are derived from the EFQM Excellence Model.

26.1.4. Tower I: People

26.1.5. Tower II: Methods and techniques

26.1.6. Tower III: Customer

26.1.7. Tower IV: Realization

26.1.8. Tower V: Knowledge

26.1.9. Tower VI: Supporting services

27. University of California (Berkeley)

27.1. Project Management Process Maturity (PM)² Model

27.1.1. http://www.ce.berkeley.edu/~ibbs/yhkwak/pmmaturity.html

27.1.2. 5 Levels of maturity

27.1.2.1. Level 1: Ad-hoc

27.1.2.1.1. No PM processes or practices are consistently available

27.1.2.1.2. No PM data are consistently collected or analyzed

27.1.2.1.3. Functionally isolated

27.1.2.1.4. Lack of senior management support

27.1.2.1.5. Project success depends on individual efforts

27.1.2.2. Level 2: Planned

27.1.2.2.1. Informal PM processes are defined

27.1.2.2.2. Informal PM problems are identified

27.1.2.2.3. Informal PM data are collected

27.1.2.2.4. Team-oriented (weak)

27.1.2.2.5. Organizations possess strengths in doing similar work

27.1.2.3. Level 3: Managed at project level

27.1.2.3.1. Formal project planning and control systems are managed

27.1.2.3.2. Formal PM data are managed

27.1.2.3.3. Team-oriented (medium)

27.1.2.3.4. Informal training of PM skills and practices

27.1.2.4. Level 4: Managed at corporate level

27.1.2.4.1. Multiple PM (program management)

27.1.2.4.2. PM data and processes are integrated

27.1.2.4.3. PM process data are quantitatively analyzed, measured, and stored

27.1.2.4.4. Strong teamwork

27.1.2.4.5. Formal PM training for project team

27.1.2.5. Level 5: Continuous learning

27.1.2.5.1. PM processes are continuously improved

27.1.2.5.2. PM processes are fully understood

27.1.2.5.3. PM data are optimized and sustained

27.1.2.5.4. Project-driven organization

27.1.2.5.5. Dynamic, energetic, and fluid organization

27.1.2.5.6. Continuous improvement of PM processes and practices

27.1.3. Date: 2002

27.1.4. Authors:

27.1.4.1. Young Hoon Kwak, Ph.D

27.1.4.2. C. William Ibbs, Ph.D