
NASCIO represents state chief information officers and information technology executives and managers from state governments across the United States. For more information visit www.nascio.org.

Copyright © 2009 NASCIO All rights reserved

201 East Main Street, Suite 1405
Lexington, KY 40507
Phone: (859) 514-9153
Fax: (859) 514-9166
Email: [email protected]

Data Governance Part II: Maturity Models – A Path to Progress


Introduction

In the previous report on Data Governance,1 an overview of data governance was presented describing the foundational issues that drive the necessity for state government to pursue a deliberate effort for managing its key information assets. Data governance, or governance of data, information and knowledge assets, resides within the greater umbrella of enterprise architecture and must be an enterprise-wide program. There is a significant cost to state government when data and information are not properly managed. In an emergency situation, conflicts in information can jeopardize the lives of citizens, first responders, law enforcement officers, fire fighters, and medical personnel.

Redundant sources for data can lead to conflicting data, which can lead to ineffective decision making and costly investigative research. If data from different sources conflicts, then the decision maker must research and analyze the various data and the sources for that data to determine or approximate what is true and accurate. That exercise burns time and resources. Accurate, complete, timely, secure, quality information will empower decision makers to be more effective and expeditious. More effective decision making leads to higher levels of enterprise performance. The ultimate outcome is better service to citizens at a lower cost.

When government can respond effectively and expeditiously to its constituents, it gains credibility with citizens. The opposite is also true. When government can't respond, or responds incorrectly or too slowly, because of inaccurate information or a lack of data consistency across agencies, government's credibility suffers.

This research brief will present a number of data governance maturity models2 which have been developed by widely recognized thought leaders. These models provide a foundational reference for understanding data governance and for understanding the journey that must be anticipated and planned for achieving effective governance of data, information and knowledge assets. This report continues to build on the concepts presented in Data Governance Part I. It presents a portfolio of data governance maturity models. Future publications will present other important elements that comprise a full data governance initiative. These other elements include frameworks, organization, delivery processes, and tools.

Maturity models provide a means for seeing "what are we getting into?" The higher levels of maturity present a vision or future state toward which state government aspires, and correspond not only to a mature data governance discipline but also to a mature enterprise architecture discipline.

Data Governance Part II: Maturity Models – A Path to Progress

March 2009

NASCIO Staff Contact: Eric Sweden, Enterprise Architect, [email protected]


The case has already been made in Data Governance Part I that state government will never be able to effectively respond to citizens without properly governing its information and knowledge assets.

In early 2009, the states are under severe economic stress: major revenue shortfalls, growing deficits and reduced public spending. State governments expect continued expenditure pressures from a variety of sources including Medicaid, employee pensions and infrastructure. Experts predict even more economic troubles for the states in fiscal year 2010 and beyond. A key ingredient for establishing strategies for dealing with a continuing fiscal crisis is the ability to effectively harvest existing knowledge bases. Those knowledge bases must provide reliable, up-to-date information in order to enable judgment, discernment and intuition. These comprise what might be termed wisdom. Even with perfect information, wisdom is still required to make the right decisions and to execute on those decisions. State leaders will be forced to make tough decisions in the months ahead, certainly requiring wisdom. This research brief will focus on that first key ingredient: knowledge. So, state government must make the commitment to begin now to manage and govern its information and knowledge assets. Maturity models assist in helping state government prepare for the journey, and that is what this report is intended to present. Governance will not happen overnight; it will take sustained effort and commitment from the entire enterprise.

As state government moves up the maturity curve presented by these models, there will be technological and business process ramifications. However, nothing will compare to the organizational fallout. It will take commitment and leadership from executive management to bring the enterprise along in a way that makes it a positive experience for government employees and citizens.


See NASCIO's publication "Transforming Government through Change Management: The Role of the State CIO" for further discussion of organizational change management.3

The growing importance of properly managing information and knowledge assets is demonstrated by a number of predictions regarding data and data governance by the IBM Data Governance Council.4

IBM Data Governance Council Predictions
• Data governance will become a regulatory requirement.
• Information will be treated as an asset and included on the balance sheet.
• Risk calculations will become more pervasive and automated.
• The role of the CIO will include responsibility for data quality.
• Individual employees will be required to take responsibility for governance.

As described in the previous issue brief on this subject, the delivery process must begin with an understanding of what the end result will look like, and what value a data governance initiative will deliver to state government. Value is defined by executive leadership and depends on the vision, mission, goals and objectives executive leadership has established for the state government enterprise. The value delivery process must also provide methods and procedures for monitoring how well state government is currently performing and the incremental steps for reaching the desired level of performance.

The process for establishing and sustaining an effective data governance program will require employing the following enablers:

• Strategic Intent: describes WHY data governance is of value, the end state that government is trying to reach, and the foundational policies that describe the motivation of executive leadership. This strategic intent should be described in the enterprise business architecture.


If state government does not have quality data and information, it will not achieve its objectives. Flawed data and information will lead to flawed decisions and poor service delivery to citizens.

• Data Governance Maturity Model: describes the journey from the AS IS to the SHOULD BE regarding the management of data, information and knowledge assets. In parallel to this journey regarding data governance is the journey that describes a maturing enterprise architecture operating discipline. State government must understand where it is today and where it needs to go. This is an important step in planning the journey in managing information as an enterprise asset. Data governance maturity models provide the means for gauging progress. By presenting intermediate milestones as well as the desired end state, maturity models assist in planning HOW state government will reach the next level of effectiveness, as well as WHEN and WHERE within state government.

• Organizational Models, Roles and Responsibility Matrices (RASIC Charts)5: defines WHO should be involved in decision making, implementing, monitoring and sustaining (see the illustrative sketch that follows this list of enablers). Organizational models are a component of the enterprise business architecture. An enterprise wide initiative will require the authority of executive leadership and buy in from all participants. Proper representation from stakeholders is also necessary for managing risk. Collective wisdom can avoid missteps and false starts. Stakeholders and decision rights will vary depending on the specific issue or the nature of the decision.

• Framework: describes WHAT is governed, including related concepts, components and the interrelationships among them. Decomposition of frameworks will uncover the necessary artifacts that comprise the components of the framework. The framework for data governance will co-exist with other frameworks that describe other major components of the state government enterprise architecture.

• Methodology for Navigating the Framework: describes the methods and procedures for HOW to navigate through the framework, create the artifacts that describe the enterprise, and sustain the effort over time. This methodology will co-exist within the enterprise architecture methodology and touch on business architecture, process architecture, data architecture, organizational governance, data/knowledge management processes, and records management processes.

• Performance Metrics: measure and evaluate the progress and efficacy of the initiative. These are traceable back to strategic intent and related maturity models. These metrics need to be continually evaluated for relevancy.

• Valuation and Security of State Government Information Assets: as presented in the previous issue brief on data governance, proper valuation of data and information will determine the level of investment to ensure quality and appropriate security throughout the information asset lifecycle. This is where the data architecture and security architecture domains touch within state government enterprise architecture.
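
To make the RASIC idea above concrete, the minimal sketch below shows one way a responsibility matrix might be captured. The activities, roles, and assignments are purely hypothetical illustrations, not NASCIO guidance; the point is only that each governance activity maps every stakeholder to a single RASIC code.

```python
# Hypothetical RASIC responsibility matrix for two example data governance
# activities. Roles, activities, and assignments are illustrative only.
# R = Responsible, A = Accountable, S = Supporting, I = Informed, C = Consulted

rasic_chart = {
    "Define enterprise data standards": {
        "State CIO": "A",
        "Enterprise Data Architect": "R",
        "Agency Data Steward": "C",
        "Program Manager": "I",
    },
    "Approve cross-agency data sharing agreement": {
        "State CIO": "A",
        "Agency Data Steward": "R",
        "Agency Legal Counsel": "C",
        "Security Officer": "S",
    },
}

def roles_with(code, activity):
    """List the roles assigned a given RASIC code for an activity."""
    return [role for role, c in rasic_chart[activity].items() if c == code]

print(roles_with("R", "Define enterprise data standards"))
# ['Enterprise Data Architect']
```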

This research brief will focus on presenting various data maturity models. Future briefs or webinars will treat other foundational aspects of data and information governance. Some common themes presented by the variety of maturity models and their associated migrations to the higher levels of maturity can be described as follows, and also reflect a maturing enterprise architecture:

• From reactive to proactive understanding of the management of data and information


• From point solutions to managed enterprise solutions
• From "siloed" data to synchronized data and information (i.e., consistent, quality data)
• From localized systems with inconsistent levels of data classification and security to consistent data classification and standards-based managed security
• From a myopic approach to data management to an enterprise wide view of information
• Migration to the capability to build efficient information and knowledge management

Implementing Data Governance – Maturity Models

There are a number of data governance maturity models that can assist in the planning and implementation of data governance. Each has strengths, brings a valuable perspective, presents characteristics, and can form the foundation for subsequently planning a data governance delivery process. Reviewing and evaluating maturity models should occur early in the process in order to establish an understanding of the end state. This understanding is necessary to properly plan a data governance program. It must be understood that the delivery process is an ongoing enterprise operating discipline which fits under the greater umbrella of enterprise architecture. As with many of the concepts presented by NASCIO, successful implementation of data governance requires an enterprise perspective. This perspective will be portrayed in the higher stages of the maturity models presented in this report.

It should be expected that data governance maturity models will also "mature" as industry, government and society continue to "learn" how to manage and leverage data, information and knowledge, and most importantly act on that learning. Geospatial resources and social networks are but two examples of change.

GIS and geospatial resource management initiatives have demonstrated the value of high-level associations and correlations. Social networks have demonstrated the value of group knowledge and mass collaboration.

The current issues in information management began with the way systems were developed. Application teams worked in isolation. Applications were built for immediate return. And project teams were incented and pressured to deliver immediate results without proper consideration for long term enterprise value and cost. Point solutions contributed greatly to the current circumstances described in Data Governance Part I. Data governance initiatives must anticipate the necessity of dealing with the data fragmentation that exists as an aftermath of these circumstances.

Current federal programmatic funding guidelines and restrictions have not contributed toward creating enterprise wide initiatives such as data governance. Therefore, funding for enterprise wide initiatives must come from a state general or technology fund. Data governance, including master data management, should be factored into every project, including those that are federally funded. Reviews should be conducted to ensure projects and programs are in compliance with state government principles, standards and methods. Federal funding reforms should take into account the level of effort associated with such compliance and provide the latitude and flexibility to invest responsibly at an enterprise level so state government can do what it needs to in order to build long term value for the state. This will require strong partnering and collaboration between state and federal government.

There is the need for proper governance structures that provide appropriate representation, decision rights, and renewed methods and procedures to ensure state government is not simply responding to federal mandates and restrictive reporting requirements, and that state government isn't forced into "siloed" solutions because of funding restrictions.

Strategic Planning Assumption: Through 2010, more than 75% of organizations will not get beyond Levels 1 and 2 in their data quality maturity (0.8 probability).

Strategic Planning Assumption: Through 2012, less than 10% of organizations will achieve Level 5 data quality maturity (0.8 probability). – Gartner6


Rather, state and federal government should work together to develop funding mechanisms that give states the flexibility they need to build long term value, shareable resources and increased efficiency. Such an approach should also allow or even encourage collaboration between state and local government on joint initiatives.

As state government begins to think of data, information and knowledge as one of the most critical enterprise assets, the use of maturity models provides a means for assessing where the organization is today and what will be required to migrate to the desired end state. Maturity models also assist in setting expectations. The journey the enterprise must take in developing the capabilities to properly manage, and harvest value from, its knowledge assets will not be an easy trip. Maturity models also assist in planning what is feasible in the near term, particularly when state government is facing severe fiscal stress. Nevertheless, even during times of fiscal stress, state government must make progress so it can better manage limited resources in the near term, and emerge from such times ready to move forward. It will require constancy of purpose8, consistency in executive support, and a sustained effort by the entire enterprise. One other aspect to this subject is the need to view data, information and knowledge from the citizens' perspective versus an agency specific perspective. The citizen would like to "see" one state government, not a collection of agencies.

This research brief will look at a sampling of data governance maturity models and draw some conclusions regarding the role of maturity models in developing data governance within state government.

One of the key drivers of EIM [Enterprise Information Management] is to overcome decades of "silo-based," application-centric development, in which each system maintained its own version of data and process rules to suit local performance needs. This resulted in duplication and a lack of agility within the organization. – Gartner7


DataFlux

The DataFlux Data Governance Maturity Model is very comprehensive. As the enterprise moves through the sequence from stage one to stage four, the value harvested increases and the risk associated with "bad data" decreases.

Tony Fisher, President and General Manager of DataFlux, presents an excellent overview of information governance maturity on the SAS website.9 Fisher is speaking about "Data Maturity" in that presentation. The scope of his discussion is relevant to the subject of this research brief: the broader view of data governance maturity models. Again, the terms can get fuzzy in different conversations: data governance, data management, knowledge management, data assets and information assets.

"Process failure and information scrap and rework caused by defective information costs the United States alone $1.5 trillion or more." – Larry English, Information Impact International, Inc.10

DataFlux has recently modified its maturity model to emphasize a business perspective that drives the need for managing data as an enterprise asset, and the employment of means such as organization, process, and technology to achieve the necessary levels of data quality. The phases in the DataFlux model are presented here (Table 1).

DataFlux developed each stage of data maturity by describing the characteristics of each phase of maturity and how to move to the next phase. These characteristics are formulated into four major dimensions that must be addressed as state government matures its data governance. The dimensions are: people, policies, technology and risk.

1 Undisciplined (Think Locally, Act Locally): There are few defined rules and policies about data quality and integration. There is much redundant data, differing sources, formats and records. The existing threat is that bad data and information will lead to bad decisions, and lost opportunities.

2 Reactive (Think Globally, Act Locally): This is the beginning of data governance. There is much reconciliation of inconsistent, inaccurate, unreliable data. Gains are experienced at the department level.

3 Proactive (Think Globally, Act Collectively): It is a very difficult step to move to this phase. The enterprise understands the value of a unified view of information and knowledge. The enterprise begins thinking about Master Data Management (MDM). The organization is learning and preparing for the next stage. The culture is preparing to change.

4 Governed (Think Globally, Act Globally): Information is unified across the enterprise. The enterprise has a sophisticated data strategy and framework. A major culture shift has occurred. People have embraced the idea that information is a key enterprise asset.

TABLE 1: DataFlux DATA GOVERNANCE MATURITY MODEL


DataFlux has presented that although there is no single path to reaching the higher levels of data governance, whatever path is taken, it will require careful attention to these four dimensions. Each stage has an associated profile detailing these four dimensions. One of the strengths of this maturity model is these detailed descriptions. The descriptions are self-evident regarding how to move up the maturity ladder. As an example, the four dimensions that apply to the target maturity level, Level Four, can be characterized as shown in Table 2.11 Further detail on this model and the profiles for the other levels of maturity can be found in the article cited.

People
• Executive sponsorship.
• Data consumers actively participate in strategy and delivery.
• Roles are established, such as data steward.
• A data governance expertise center exists.
• The organization truly embraces data quality and adopts a "zero defect" policy for data collection and management.

Policies
• New project framing embraces a portfolio perspective considering the full impact on existing data infrastructure.
• Automated policies are implemented to ensure data consistency, accuracy and reliability across the enterprise.
• Service Oriented Architecture (SOA) approaches have been employed to manage metadata including data quality, data classification, identity management, and authentication.
• Policy perspective is preventative rather than reactive.

Technology
• Data quality and data integration tools are standardized across the enterprise.
• Data monitoring is continuous, proactive and preventative, involving appropriate metrics.
• The enterprise has established its master data model (or The Enterprise Data Model); data models are maintained using consistent approaches and standards.
• Data models capture semantic business rules that provide the business understanding and technical details of all enterprise data.

Risk
• Enterprise risk management is proactive, providing proper balance across the enterprise portfolio.
• Master data is tightly controlled across the enterprise but allows the enterprise to be dynamic.
• Enterprise data is consistent, reliable, and available to enable effective decision making.

TABLE 2: DIMENSIONS IN THE DataFlux DATA GOVERNANCE MATURITY MODEL: LEVEL FOUR – "GOVERNED"


The latest version of the DataFlux data governance framework is shown in Figure 1. This framework also presents the technology adoption that characterizes the various phases. Each level of maturity has associated "business capabilities" or business behaviors, and examples of technologies employed. As the enterprise progresses to the higher levels of data governance maturity, there is greater reward (return on information and knowledge assets) and a parallel reduction in risk.

FIGURE 1: DataFlux DATA GOVERNANCE MATURITY MODEL


EWSolutions

EWSolutions presents a maturity model they title the "EIM Maturity Model," which presents five phases. EIM refers to Enterprise Information Management. The phases in the EWSolutions model are presented in Table 3. The full presentation of this model is presented in EWSolutions course materials.12 EWSolutions presents its maturity model early in its training on data governance and stewardship, which demonstrates the value of maturity models as a communication and planning tool.

1 Informal Processes: Reactive, dependent on a few skilled individuals, responsibilities assigned across separate IT groups, few defined IT roles, data regarded as a minor by-product of business activity. Redundant, undocumented data, disparate databases without architecture, minimal data integration and cleansing, point solutions.
• Little or no business metadata
• Diverging semantics
• Some commonly used approaches but with no enterprise-wide buy in
• Little or no business involvement, no defined business roles
• Reactive monitoring and problem solving

2 Emerging Processes: Beginning to look at enterprise wide management and stewardship, no standard approaches, early enterprise architecture, growing intuitive executive awareness of the value of information assets.
• Initial forays in data stewardship and governance, but roles are unclear and not ongoing
• Initial efforts to implement enterprise-wide management, but with contention across groups with differing perspectives
• Enterprise architecture and master metadata management projects are underway
• Some processes are repeatable

3 Engineered Processes: Standard processes, enterprise information architecture, active executive sponsorship, central metadata management, periodic audits and proactive monitoring.
• Ongoing, clearly-defined business data stewardship
• Central enterprise data management organization
• Enterprise data architecture guides implementations
• Quality service level agreements are defined and monitored regularly
• Commitment to continual skills development

4 Controlled Processes: Measurable process goals are established for each defined process.
• Quantitative measurement and analysis of each process occurs
• Beginning to predict future performance
• Defects are proactively identified and corrected

5 Optimized Processes: Quantitative and qualitative understanding used to continually improve each process.
• Value is monitored continuously
• Understanding of how each process contributes to the strategic business intent

TABLE 3: EWSolutions DATA GOVERNANCE MATURITY MODEL


FIGURE 2: Gartner EIM MATURITY MODEL


Gartner

Gartner introduced its enterprise information management maturity model in December of 2008 (Figure 2).13 Gartner makes the point that enterprise information management (EIM) is not a single project. Rather, it is a program that evolves over time.

Gartner presents that "managing information as an asset" has gained new attention from top management. Further, over the next five years industry will focus on managing information as a strategic asset. Gartner developed its maturity model to provide guidance to organizations that are serious about managing information assets. It is important to understand that this maturity model accompanies Gartner's definition of EIM. This maturity model also presents action items for each level of maturity (Table 4). Gartner's EIM concept presents an integrated, enterprise wide approach to managing information assets and has five major goals that comprise an EIM discipline (Figure 3).


0 Unaware
• Strategic decisions made without adequate information
• Lack of formal information architecture, principles, or process for sharing information
• Lack of information governance, security and accountability
• Lack of understanding of metadata, common taxonomies, vocabularies and data models
Action Item: Architecture staff and strategic planners should informally educate IT and business leaders on the potential value of EIM, and the risks of not having it, especially legal and compliance issues.

1 Aware
• Understanding of the value of information
• Issues of data ownership
• Recognized need for common standards, methods and procedures
• Initial attempts at understanding risks associated with not properly managing information
Action Item: Architecture staff needs to develop and communicate EIM strategies and ensure those strategies align with [the state government] strategic intent and enterprise architecture.

2 Reactive
• Business understands the value of information
• Information is shared on cross-functional projects
• Early steps toward cross-departmental data sharing
• Information quality addressed in reactive mode
• Many point-to-point interfaces
• Beginning to collect metrics that describe current state
Action Item: Top management should promote EIM as a discipline for dealing with cross-functional issues. The value proposition for EIM must be presented through scenarios and business cases.

3 Proactive
• Information is viewed as necessary for improving performance
• Information sharing viewed as necessary for enabling enterprise wide initiatives
• Enterprise information architecture provides guidance to the EIM program
• Governance roles and structure become formalized
• Data governance integrated with systems development methodology
Action Item: Develop a formal business case for EIM and prepare appropriate presentations to explain the business case to management and other stakeholders. Identify EIM opportunities within business units [agencies and divisions].


4 Managed
• The enterprise understands information is critical
• Policies and standards are developed for achieving consistency; these policies and standards are understood throughout the enterprise
• Governance organization is in place to resolve issues related to cross-functional information management
• Valuation of information assets and productivity metrics are developed
Action Item: [Agency and division] information management activities should be inventoried and tied to the overall [state government] EIM strategy. EIM must be managed as a program, not a series of projects. Chart progress using a balanced scorecard for information management.

5 Effective
• Information value is harvested throughout the information supply chain
• Service level agreements are established
• Top management sees competitive advantage to be gained by properly exploiting information assets
• EIM strategies link to risk management and productivity targets
• The EIM organization is formalized using one of several approaches similar to project management; the EIM organization coordinates activities across the enterprise
Action Item: Implement technical controls and procedures to guard against complacency and to sustain information excellence even as the [state government] changes.

TABLE 4: Gartner EIM DATA GOVERNANCE MATURITY MODEL

FIGURE 3: Gartner EIM GOALS – Integrated Master Data Domains; Seamless Information Flows; Metadata Management and Semantic Reconciliation; Data Integration Across the IT Portfolio; Unified Content.



IBM

Data governance has risen to such prominence that IBM has created a Data Governance Council.14 One of the initiatives from this council is a data governance maturity model based on the Software Engineering Institute (SEI) Capability Maturity Model (CMM).15, 16 The Data Governance Council's Maturity Model defines a set of domains that comprise data governance. Review of these domains is a first step in understanding the IBM maturity model. The 11 domains reside within four major groupings: Outcomes, Enablers, Core Disciplines, and Supporting Disciplines. Interactions among these groupings are depicted in Figure 4.

Business outcomes require enablers. Enablers are supported through core and supporting disciplines. Each of the domains or disciplines depicted can be further broken down into multiple components. This paper won't explore this model in depth but will present the definitions of each domain (Table 5). The maturity of each domain is evaluated and assessed individually on a scale from 1 to 5. The intent is not to score, but rather to determine the AS IS and manage progress through the various maturity levels.

In concert with this framework, IBM developed the maturity model presented in Figure 5.

The maturity model is the "yardstick" for assessing and measuring progress within each of the 11 domains. The referenced report that presents this maturity model was published in October of 2007.

FIGURE 4: IBM Data Governance Council – DATA GOVERNANCE DOMAINS (Elements of Effective Data Governance). Outcomes: Data Risk Management & Compliance; Value Creation. Enablers: Organizational Structures & Awareness; Policy; Stewardship. Core Disciplines: Data Quality Management; Information Life-Cycle Management; Information Security and Privacy. Supporting Disciplines: Data Architecture; Classification & Metadata; Audit Information Logging & Reporting. Outcomes require Enablers, which are supported by the Core Disciplines and Supporting Disciplines.


Data Risk Management & Compliance: The methodology by which risks are identified, qualified and quantified, avoided, accepted, mitigated or transferred out.

Value Creation: The process by which data assets are qualified and quantified to enable the business to maximize the value created by data assets.

Organizational Structures & Awareness: Description of the level of mutual responsibility between the business and IT, and the recognition of the fiduciary responsibility to govern data at different levels of management.

Policy: A description of the desired organizational behavior(s).

Stewardship: A quality control discipline designed to ensure custodial care of data for asset enhancement, risk management, and organizational control.

Data Quality Management: Methods to measure, improve and certify the quality and integrity of production, test and archival data.

Information Lifecycle Management: A systematic, policy-based approach to information collection, use, retention, and deletion.

Information Security & Privacy: The policies, practices and controls used by the organization to mitigate risk and protect data assets.

Data Architecture: The architectural design of structured and unstructured data systems and applications that enable data availability and distribution to appropriate users.

Classification & Metadata: The methods and tools used to create common semantic definitions for business and IT terms, data models, data types, and repositories. Metadata that bridges human and computer understanding.

Audit Information, Logging & Reporting: The organizational processes for monitoring and measuring the data value, risks, and efficacy of governance.

TABLE 5: IBM Data Governance Council – DATA GOVERNANCE DOMAIN DEFINITIONS


In July of 2008, the Council announced its plans to develop a data governance framework based on this maturity model.

As described earlier, maturity models and frameworks are necessary members of the data governance toolbox. The maturity model describes the milestones in the journey. The framework presents concepts and the most prominent of the relationships among the concepts. A methodology will describe how to navigate the framework in order to travel up the maturity model.

One of the values of maturity models is that in describing the characteristics of each stage, they describe enterprise characteristics sought, independent of any maturity model. State government does not have to follow a linear path through these stages. A foundational concept used in the NASCIO Enterprise Architecture Maturity Model was that the various domains of enterprise architecture will naturally, and most likely, be at different levels of maturity. That brings up a differentiating property of the IBM maturity model: it is used to assess the individual maturity of 11 separate domains.
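
As an illustration of how such a per-domain assessment might be recorded, the minimal sketch below rates each of the 11 IBM domains individually on the 1-to-5 scale and reports the gap to a target level. The AS IS ratings and the target are hypothetical; the structure simply reflects the point made above that domains are assessed separately rather than rolled into a single score.

```python
# Minimal sketch of a per-domain assessment in the spirit of the IBM Data
# Governance Council model: each of the 11 domains is rated 1-5 on its own,
# and the aim is to record the AS IS and track progress, not to compute a
# single overall score. All ratings below are hypothetical examples.

AS_IS = {
    "Data Risk Management & Compliance": 2,
    "Value Creation": 1,
    "Organizational Structures & Awareness": 2,
    "Policy": 3,
    "Stewardship": 2,
    "Data Quality Management": 2,
    "Information Lifecycle Management": 1,
    "Information Security & Privacy": 3,
    "Data Architecture": 2,
    "Classification & Metadata": 1,
    "Audit Information, Logging & Reporting": 2,
}

TARGET_LEVEL = 4  # hypothetical SHOULD BE for every domain

# Report the lowest-rated domains first so the biggest gaps stand out.
for domain in sorted(AS_IS, key=AS_IS.get):
    gap = TARGET_LEVEL - AS_IS[domain]
    print(f"{domain}: AS IS {AS_IS[domain]}, gap to target {gap}")
```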

Knowledge Logistics

The Commonwealth of Kentucky has launched a data governance initiative using a maturity model designed by Knowledge Logistics (Table 6). This model also follows closely the CMM levels of maturity. As with the other maturity models presented, characteristics change or evolve from reactive, independent activities to very sophisticated leverage of information assets, not only for historical analysis but also for predictive activities.

FIGURE 5: IBM Data Governance Council MATURITY MODEL

• Up to 75% of information workers have made decisions that turned out to be wrong due to flawed data.
• As much as 30% of the work week is spent verifying the accuracy and quality of data.
• Only 10% of knowledge workers believe they always have all the information needed to confidently make effective business decisions.17


1 Initial: Entrepreneurial; Individual; Fragmented; Chaotic; Idiosyncratic; Few Users; Rules Unknown; Variable Quality; Costly

2 Repeatable: Departmental; Consolidation; Reconciliation; Internally Defined; Reactive; Local Standards; Internal Data Quality; Specialist Users; Local Process; Costly

3 Defined: Integration; Enterprise View; Data Accountability; Strategic Alignment; Standards; Sharing & Reuse; Centralized Data Quality; Planned & Tracked; Wide Data Usage; Metadata Management; Common Technology; Efficient

4 Managed: Quantitative Control; Closed Loop; Low Latency; Interactive; Unstructured Data; Collaborative; Process Efficiency & Effectiveness; Built-in Quality; Extended Value Chains; High Availability

5 Optimized: Improvement & Innovation; Real-time; Extensive Data Mining; Knowledgeable; Competitive Intelligence; Data Assets Valued; Self-managing

TABLE 6: Knowledge Logistics DATA GOVERNANCE MATURITY LEVELS


MDM Institute

The MDM Institute (formerly known as the CDI Institute) presents the data governance maturity model18 shown in Table 7. This model provides an excellent starting point for initiating the conversation about data governance. The essence of this model is a migration from the initial state, which is described as reactive, with no control, and application and project driven, to a formalized approach. The MDM Institute emphasizes leveraging service oriented architecture (SOA) as a foundational approach for planning, designing and implementing enterprise services, including data and information services. The MDM Institute's definition of data governance also has a Master Data Management (MDM) focus in level 3 and an SOA flavor to distribute the governed master data across the enterprise in level 4.

"The formal orchestration of people, processes, and technology to enable an organization to leverage data as an enterprise asset." – The MDM Institute definition of data governance

This model is phased with fewer steps, but is based on the same concept of an evolving maturity. At the higher levels, the business side of the organization is playing an active role.

1 Basic ("anarchy"): Application-centric approach; meets business needs only on a project-specific basis.

2 Foundational ("IT monarchy"): Policy-driven standardization on technology and methods; common usage of tools and procedures across projects.

3 Advanced ("business monarchy"): Rationalized data, with data and metadata actively shared in production across sources.

4 Distinctive ("Federalist"): Based on service-oriented architecture (SOA) with modular components, an integrated view of compliance requirements, a formalized organization with defined roles and responsibilities, clearly defined metrics, and an iterative learning cycle.

TABLE 7: MDM Institute DATA GOVERNANCE MATURITY LEVELS

When organizations articulate a desire to "manage information as an enterprise asset," they often don't know how to begin. – Gartner19


Oracle Corporation

Oracle is well known for its emphasis on a well designed underlying data architecture. Oracle Corporation maintains an expertise in data governance consistent with the definitions for data governance presented in NASCIO's Data Governance Part I issue brief. Effective data governance must correctly align people, processes, and technology to convert data into strategic information and knowledge assets for the state government enterprise.

It is important to understand that the data that needs governing resides across a wide variety of heterogeneous applications and business intelligence systems. Most data quality problems begin in these fragmented applications. The very nature of this data makes it difficult to manage and creates challenges for data governance.

Becoming an organization that fully controls and leverages its key data assets is an evolutionary process. Oracle's Data Governance Maturity Model is intended to be used to determine what steps an enterprise will need to take to improve its data governance capabilities. That intention is in direct support of the rationale and intended outcome of this research brief and provides validation of our approach.

As described in the introduction, Oracle Corporation also believes that a data governance maturity model will assist the enterprise in determining where it is in the evolution of its data governance discipline and will identify the short-term steps necessary to get to the next level (Figure 6 and Table 8). Each step on the journey has associated measurable key performance indicators with real return on investment that justifies the cost.

A Survey of Progress in Data Governance

So where do organizations stand in their progress? A survey was conducted by Aiken, Allen, Parker, and Mattia that explored the current standing of data management practice maturity.20

FIGURE 6: Oracle DATA GOVERNANCE MATURITY MODEL
Stage I – MARGINAL: Manually maintain trusted sources. Inconsistent, siloed structures with limited integration. Gaps in automation.
Stage II – STABLE: Tactical implementations, limited in scope and targeting a specific division. Limited scope and stewardship capabilities. Typically done to gain experience.
Stage III – BEST PRACTICE: Process automation and improvement. An enterprise business solution, which provides a single version of the truth, with closed-loop data quality capabilities. Driven by an enterprise architecture group.
Stage IV – TRANSFORMATIONAL: Data governance is quantitatively managed and is integrated with Business Intelligence, SOA, and BPM. Data governance is leveraged in business process orchestration.


Although this research brief is focused on data governance, the research from Aiken et al. is very relevant to our discussion because of the consistency in the outcomes sought. This research brief, the research by Aiken et al., and the summary of each of the maturity models are all directed at the same outcome: managing data, information and knowledge as enterprise assets in order to achieve enterprise intent.

A total of 175 organizations were assessed during the period 2000 to 2006 with the intent of determining the maturity of data management practices. Such a study provides a general understanding of the progress made in truly managing information as an enterprise asset and how carefully it is harvested for value. This study also provides state government with assistance in determining how it stacks up against the rest of the "world" regarding its management of information assets: is state government ahead, behind, or on par with industry, federal government, etc.

The results are consistent with where states currently reside on any maturity scale. However, the point made by Aiken et al. is that, armed with this information, many organizations will see the opportunity for competitive advantage by deliberately directing resources and incentives to pursue higher levels of maturity in managing enterprise information assets.

1 Marginal: Stage I reflects an organization that has started to understand the need for data governance. It will need to expand the scope of ongoing data quality initiatives and add data stewardship capabilities.

2 Stable: Stage II is characterized by division wide data governance initiatives with data governance teams in place. Socializing the successes achieved at this level helps drive increased demand for further progress. Enterprise wide teams need to be formed, cross divisional conflicts around data ownership and access rights need to be resolved, and Master Data Management solutions need to be deployed.

3 Best Practice: Stage III organizations are running best data governance practices across their enterprise. Data governance policies are executed automatically by Master Data Management execution engines, with feedback loops that report results directly back to the governance committees.

4 Transformational: Stage IV integrates the proven quality data in the applications and business intelligence tools directly into all business processes to achieve transformational status for the organization.

TABLE 8: Oracle DATA GOVERNANCE MATURITY LEVELS


State government isn't necessarily subject to the competitive forces that characterize most markets. However, state government is under unprecedented pressure to make gains in effectiveness while facing an ongoing fiscal crisis. In this way, competitive forces are turned inward: state agencies may eventually be evaluated for effectiveness and may in the future compete for limited internal resources. Therefore, the pressure is still on government, and even non-profit organizations, to effectively manage enterprise information assets.

Figure 7 presents a profile of the organizations that participated in this survey.

The maturity model used is based primarily on the Carnegie Mellon University Software Engineering Institute's Capability Maturity Model Integration (CMMI)21 and resembles maturity models presented earlier in this report. The rationale presented by Aiken et al. is the adaptation and prevalent usage of CMMI maturity levels in other areas of software engineering. The data maturity levels are presented in Table 9. This approach provides the ability to compare the maturity of data management with other domains within enterprise architecture.

The assessment evaluated 5 predefined data management processes (adapted from Parker22). (See the appendix for definitions of these processes and the statistical results from this survey.) Per CMMI practice, overall ratings for participants in the self-assessment were based on the lowest rating achieved on the 5 data management processes. In other words, if an organization achieved individual ratings of 1, 2, 2, 3, and 2, the overall rating for that organization would be 1. Assessment scores adjusted for self-reporting inflation indicate that the participants were somewhere between "Initial" and "Repeatable" on the maturity model used. As stated by the researchers, the results may be a motivator for organizations to actively pursue the higher levels of maturity. State government is very early in terms of data governance maturity. However, this study by Aiken et al. suggests that state government isn't necessarily in "catch up" mode. It can be anticipated, though, that organizations will become more prudent in their management of information assets.
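
The scoring rule described above is easy to state in code. A minimal sketch follows, using the hypothetical ratings from the worked example; the five process names are placeholders, since the assessed processes are defined in the appendix of the cited study.

```python
# Overall maturity is the lowest rating achieved across the five assessed
# data management processes, per CMMI practice. Process names and ratings
# are placeholders matching the worked example in the text.

process_ratings = {
    "process_1": 1,
    "process_2": 2,
    "process_3": 2,
    "process_4": 3,
    "process_5": 2,
}

overall_rating = min(process_ratings.values())
print(overall_rating)  # 1: the weakest process caps the organization's rating
```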

The Value of Maturity Models

Better data leads to better information, which will lead to better informed decision makers. Better decisions will necessarily lead to better service to citizens. Proper data governance leads to state government becoming less reactive and more predictive in its activities toward serving citizens. Proper data governance leads to state government acting as "one government" rather than a collection of independent agencies. Proper management of data, information and knowledge assets provides economic gains, and compliance with security and privacy requirements. Data governance maturity models are an important tool state government can use to chart and evaluate its progress in improving the quality of its data and information.

FIGURE 7: ORGANIZATIONS IN SURVEY


1 Initial
Practice: The organization lacks the necessary processes for sustaining data management practices. Data management is characterized as ad hoc or chaotic.
Quality and results predictability: The organization depends entirely on individuals, with little or no corporate visibility into cost or performance, or even awareness of data management practices. There is variable quality, low results predictability, and little to no repeatability.

2 Repeatable
Practice: The organization might know where data management expertise exists internally and has some ability to duplicate good practices and successes.
Quality and results predictability: The organization exhibits variable quality with some predictability. The best individuals are assigned to critical projects to reduce risk and improve results.

3 Defined
Practice: The organization uses a set of defined processes, which are published for recommended use.
Quality and results predictability: Good quality results within expected tolerances most of the time. The poorest individual performers improve toward the best performers, and the best performers achieve more leverage.

4 Managed
Practice: The organization statistically forecasts and directs data management, based on defined processes, selected cost, schedule, and customer satisfaction levels. The use of defined data management processes within the organization is required and monitored.
Quality and results predictability: Reliability and predictability of results, such as the ability to determine progress or six sigma versus three sigma measurability, is significantly improved.

5 Optimizing
Practice: The organization analyzes existing data management processes to determine whether they can be improved, makes changes in a controlled fashion, and reduces operating costs by improving current process performance or by introducing innovative services to maintain its competitive edge.
Quality and results predictability: The organization achieves high levels of results certainty.

TABLE 9: Aiken et al. – DATA GOVERNANCE MATURITY LEVELS


A number of maturity models have been presented. Much value is brought to the enterprise by examining these structures. The organization will understand the complexities of data governance and begin to explore what it will take to develop a sustained, successful data governance effort. Management and technical staff will gain an appreciation of the components, scope and depth, and level of effort required to initiate a data governance program, and of the fact that it will take time to achieve the higher levels of maturity. The state government enterprise can adapt its own maturity model and framework from this mix of ideas.

Common across these maturity models is the progressive maturing from strictly reactive to predictive. It is the predictive nature that is the intended long term capability sought: not only to manage risk, but to anticipate, uncover and prepare for opportunities and threats. This predictive capability will include identifying potential opportunities and threats and the impact of these vectors on state government. Understanding of impact then leads to proactive development of effective response. Because the future cannot be predicted with certainty, stochastic modeling, or probability analysis, can be employed to present multiple outcome scenarios. The enterprise architect would create these scenarios based on the analysis of information assets from inside the enterprise and leveraging the information assets of its partners. The outcome sought is government that is no longer simply reacting, but is prepared for any foreseeable circumstance. At this point the enterprise is truly dynamic, agile, fluid, adaptive and spontaneous.
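
As a simple illustration of the kind of stochastic modeling mentioned above, the sketch below projects a single hypothetical demand figure under uncertain annual growth and reports a range of outcome scenarios rather than one point estimate. The quantities, growth rates, and distribution are invented for illustration only and are not drawn from any of the cited models.

```python
# Illustrative Monte Carlo sketch: generate many outcome scenarios for a
# hypothetical demand figure instead of a single forecast. All numbers are
# invented for illustration.
import random

def project_demand(base, growth_mean, growth_sd, years):
    """Project a demand figure forward under random annual growth."""
    value = base
    for _ in range(years):
        value *= 1 + random.gauss(growth_mean, growth_sd)
    return value

random.seed(1)
scenarios = sorted(project_demand(100_000, 0.03, 0.05, years=5) for _ in range(1000))

# Present a spread of plausible outcomes (10th, 50th, and 90th percentiles).
for label, index in (("low", 100), ("median", 500), ("high", 900)):
    print(f"{label}: {scenarios[index]:,.0f}")
```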

Managing data, information and knowledge assets in this way is not strictly an IT initiative; this is an enterprise initiative demonstrating strong collaboration across business and technology, strategists and implementers, policy makers and citizens, career government employees and elected officials. This also demonstrates government that has created successful collaboration across multiple jurisdictions and levels of government. The citizen eventually sees "one government." All of these behaviors and characteristics are founded on proper management of state government data, information and knowledge assets, with the ultimate outcome of benefiting the citizen.

The language and progressive dynamic used in maturity models facilitate conversation and understanding among technical staff, business staff, upper management, and strategic partners. Seeing the relationships among the various components of data governance helps develop the necessary understanding and prepares the organization to begin developing a delivery process to launch and sustain a data governance initiative. Frameworks and maturity models can also be used in conversation with partners to compare and contrast various approaches to, and sequencing of, data governance.

The elements in the data governance framework and maturity model will depend most on what the enterprise is trying to accomplish and how information assets can enable that intent. State government will be most interested in data quality, properly managing citizen information, and using business intelligence and analytics to predict trends, assess the impact of those trends, and determine the state government response.

It is expected that state government will not necessarily have the resources to create a separate governance structure for managing data and information. However, some state governments have established the roles of data stewards, data architects, data analysts, and database administrators. State government may also have existing enterprise IT governance and would be best served by incorporating data governance into this existing governance structure.


Conclusions and Observations

• In an effort to better serve the citizen through increased efficiencies and a common viewpoint, data must be managed.

• Some of the rationale for data governance is to gain the capability to respond strategically and tactically to business challenges; respond immediately in an emergency; and ensure government responses are orchestrated through collaborative information sharing. Without enterprise data governance, state government is crippled in its ability to respond to opportunities and challenges: response will be inconsistent, arbitrary and ineffective across agencies.

• Data governance encourages the measurement of successes and failures. Goals, objectives and strategies cannot be defined, understood, communicated, or measured without quality data.

• Maturity models provide a measure for the state enterprise to gauge its success in managing data and information as an enterprise asset.

• Data governance maturity models can be used as references in communication, awareness building, and the marketing of data governance.

• The states must face the challenge of stove-piped federal program funding, which creates "islands of data." Solutions developed under this funding will also be stove-piped. State government must continue to reach out to federal agencies through NASCIO, NGA, and NCSL to move the federal government toward reform of current federal funding restrictions and reporting requirements so that they actually encourage enterprise-wide solutions that touch multiple government lines of business. State government must have the ability to access, share, and analyze information from across state agencies and programs in order to effectively deliver services, identify fraud, avoid redundant investment and service delivery, and provide a "one state government" view to citizens.

• Data governance will not happen without the support of government leadership. The state CIO is in the best position organizationally and technically to initiate and champion data governance, and to understand the importance of data, information and knowledge assets in achieving a vision for eDemocracy and 21st century government.


Calls to Action for the State CIO and State Government

1. Begin now to develop expertise and governance for managing data, information and knowledge assets.
• Given current economic stresses, focus on those areas of data governance most relevant to enabling effective tactical and strategic response.
• Begin to develop a library of case studies that present the economics of data governance, real outcomes, and illustrative consequences that resonate with policy makers. Examples include: fraud detection and prevention; avoidance of redundant or duplicative citizen assistance; improved business processes and decision making; consequences of poor or conflicting information for decision making during a crisis; potential and real outcomes when first responders, including firefighters, law enforcement officers, and paramedics, don't have complete information when entering an emergency situation; and the cost of research that becomes necessary when information from various sources is in conflict.

2. Begin to build awareness through communications and marketing initiatives.
• The intent of these initiatives is to move the culture and organization of state government toward understanding the necessity of managing information as a state government enterprise asset.
• Consider the cost of unreliable or conflicting information from different sources and how it hampers state government's ability to gather and analyze state data, particularly in responding to the current economic crisis.

3. Understand the scope of data governance.
• Identify opportunity areas for early initiatives.
• Scope management will be critical – targeted initiatives must be carefully selected.
• Begin to identify strategic partnerships that are necessary for implementing an effective, sustained effort (e.g., private industry and public entities; intergovernmental agencies; counties, cities and states).

4. Ensure that data governance has appropriate representation from business stakeholders, i.e., the real owners of the information.
• Data and information only have value to the extent that they enable the business units within state government and their partners.
• Any effort to develop effective data governance must involve close collaboration between the business unit partners and IT that recognizes the decision rights associated with the various roles in state government.

5. Implement data governance within existing enterprise and data architecture practice.
• Data governance is not a separate activity. Rather, it is an important mechanism for managing enterprise information and knowledge assets within the scope of enterprise architecture.


Appendix A

Data Management Processes defined by B. Parker.23 The five data management processes evaluated by Aiken et al. were as follows and are further described in the article cited.

(For each process, the research results give the maturity level range and average reported by respondents, unadjusted for self-reporting inflation.24)

Data Program Coordination
Description: Provide appropriate data management processes and technological infrastructure.
Focus: Direction
Data type: Program data – descriptive propositions or observations needed to establish, document, sustain, control, and improve organizational data-oriented activities such as vision, goals, policies and metrics.
Maturity level range / average: 2.06 to 3.31 / 2.71

Organizational Data Integration
Description: Achieve organizational sharing of appropriate data.
Focus: Direction
Data type: Development data – descriptive facts, propositions, or observations used to develop and document the structures and interrelationships of data – for example, data models, database designs, and specifications.
Maturity level range / average: 2.18 to 2.66 / 2.44

Data Stewardship
Description: Achieve business-entity subject area data integration.
Focus: Direction and Implementation
Data type: Stewardship data – descriptive facts about data documenting semantics and syntax – such as name, definition, and format.
Maturity level range / average: 1.96 to 2.40 / 2.18

Data Development
Description: Achieve data sharing within a business area.
Focus: Implementation
Data type: Business data – facts and their constructs used to accomplish enterprise business activities – such as data elements, records and files.
Maturity level range / average: 1.57 to 2.46 / 2.12

Data Support Operations
Description: Provide reliable access to data.
Focus: Implementation
Maturity level range / average: 2.04 to 2.66 / 2.38

Data Asset Use
Description: Leverage data in business activities.
Focus: Implementation
Maturity level range / average: (no data)


Appendix B: Acknowledgements

Tom Baden, Enterprise Architect, The State of Minnesota
Scott Batchelor, Marketing Communications Director, DataFlux | A SAS Company
Dr. Jim Bryant, Chief Engineer and Scientist, Joint Medical Information Systems, Lockheed Martin Information Systems and Global Services
Dave Butler, Master Data Management Initiative, Oracle Corporation
Micheline Casey, Director of Identity Management, Governor's Office of Information Technology, The State of Colorado
Anthony Collins, Chief Enterprise Architect, The State of Delaware
Robert Culp, Alliance Manager, ESRI Strategic Alliance, IBM
Mike Dunham, Senior Principal Consultant, Keane Federal Systems, Inc.
Lauren Farese, Director, Public Sector Solution Architects, Oracle Corporation
Michael Fenton, Director of Enterprise Architecture, The State of North Carolina, Office of Information Technology Services
Stephen Fletcher, Chief Information Officer, State of Utah, Co-Chair of the NASCIO Enterprise Architecture Committee
Christopher Ipsen, Chief Security Officer, The State of Nevada
Tom McCullough, Principal Architect, Software Systems, Lockheed Martin
David Newman, Vice President, Research, Gartner, Inc.
Jeanne Owings, Partner, Program Management, Crowe Horwath LLP
Doug Robinson, Executive Director, NASCIO
Bill Roth, Chief Technology Architect, The State of Kansas
Jim Salb, Enterprise Architect, The State of Delaware
Tricia Anne Saunders, Data Architect, The State of Delaware
Dr. Anne Marie Smith, Principal Consultant, Director of Education, EWSolutions, Inc.
Daniel Teachey, Senior Director of Marketing, DataFlux | A SAS Company
Glenn Thomas, Director of IT Governance, The Commonwealth of Kentucky
Christopher Traver, Senior Policy Advisor, Bureau of Justice Assistance, U.S. Department of Justice
Chuck Tyger, Associate Director, Enterprise Architecture, The Commonwealth of Virginia
Chris Walls, Senior Website & Publications Coordinator, AMR Management Services
Tom Walters, Division of Data Architecture, The Commonwealth of Kentucky


Appendix C: Resources

NASCIO www.nascio.org

IT Governance and Business Outcomes – A Shared Responsibility between IT and Business Leadership
http://www.nascio.org/committees/EA/download.cfm?id=98

Data Governance – Managing Information As An Enterprise Asset Part I – An Introduction
http://www.nascio.org/committees/EA/download.cfm?id=100

Enterprise Architecture: The Path to Government Transformation
http://www.nascio.org/committees/EA/

Call for Action, A Blueprint for Better Government: The Information Sharing Imperative
http://www.nascio.org/advocacy/dcFlyIn/callForAction05.pdf

PERSPECTIVES: Government Information Sharing Calls to Action
http://www.nascio.org/publications/index.cfm#19

In Hot Pursuit: Achieving Interoperability Through XML
http://www.nascio.org/publications/index.cfm#21

We Need to Talk: Governance Models to Advance Communications Interoperability
http://www.nascio.org/publications/index.cfm#50

A National Framework for Collaborative Information Exchange: What is NIEM?
http://www.nascio.org/publications/index.cfm#47

List of NASCIO Corporate Partners
http://www.nascio.org/aboutNascio/corpProfiles/

List of NASCIO Publications
http://www.nascio.org/publications

List of NASCIO Committees
http://www.nascio.org/committees

The Data Administration Newsletter
http://www.tdan.com/index.php

Presents 8 chapters that describe how to implement data governance.

The Data Governance Institute
http://datagovernance.com/

DGI created a poster on data governance that can be downloaded or ordered in hardcopy online.

The Data Management Association International – DAMA – www.dama.org

The Data Management Body of Knowledge (DMBOK), including a framework of data management functions and environmental elements.
http://www.dama.org/i4a/pages/index.cfm?pageid=3364

The IT Governance Institute (ITGI)
http://www.itgi.org/

Information Systems Audit and Control Association (ISACA)
http://www.isaca.org/

Certification in Governance of Enterprise IT (CGEIT) from ISACA
http://www.isaca.org/Template.cfm?Section=Certification&Template=/TaggedPage/TaggedPageDisplay.cfm&TPLID=16&ContentID=36129

The Center for Information Systems Research (CISR)
http://mitsloan.mit.edu/cisr/

The National Information Exchange Model (NIEM) www.niem.gov


Global Justice Reference Architecture for SOA
http://www.it.ojp.gov/topic.jsp?topic_id=242

Global Justice Reference Architecture (JRA) Specification: Version 1.7
http://www.it.ojp.gov/documents/JRA_Specification_1-7.doc

The Global Justice Reference Architecture (JRA) Web Services Service Interaction Profile Version 1.1
http://www.it.ojp.gov/documents/WS-SIP_Aug_31_version_1_1_FINAL(3).pdf

The Global Justice Reference Architecture (JRA) ebXML Messaging Service Interaction Profile Version 1.0
http://www.it.ojp.gov/documents/ebXML_SIP_v01_Final_Version_100407.pdf



Appendix D: Endnotes

1 “Data Governance – ManagingInformation As An Enterprise Asset Part I –An Introduction”, NASCIO, April, 2008,available at www.nascio.org/publications.

2 Note: this report has been liberal regard-ing the inclusion of maturity models. Thewords “data management” or “enterpriseinformation management” may appear inthe title of a specific maturity model.Nevertheless, if a maturity model capturesthe essence relevant to this report, then itwas included.

3 “Transforming Government throughChange Management: The Role of theState CIO”, April 2007, NASCIO,www.nascio.org/publications.

4 “IBM Council Predicts Data Will Becomean Asset on the Balance Sheet and DataGovernance a Statutory Requirement forCompanies Over Next Four Years”, IBMpress release, ARMONK, NY - 07 Jul 2008,see http://www-03.ibm.com/press/us/en/pressrelease/24585.wss.

5 RASIC charts present assigned roleswithin a project team: responsible,approving, supporting, informed, andconsulting.

6 Gartner’s Data Quality Maturity Model,February 7, 2007, ID Number G001139742.

7 Newman, D., Blechar, M.J., “PuttingEnterprise Information Management inContext”, Gartner, June 1, 2007, ID Number:G00148273.

8 The first of Dr. W. Edwards Deming’s 14Points. Deming, W.E., Out of The Crisis,Massachusetts Institute of Technology,Center for Advanced Engineering Study,1986. ISBN: 0911379010.

9 “The Four Stages of Data Maturity”,DataFlux, Tony Fisher, retrieved onNovember 10, 2008, fromhttp://www.sas.com/news/sascom/2007q4/column_tech.html.

10 “The Four Stages of Data Maturity”, page2; English, Larry. “Plain English aboutInformation Quality: Information QualityTipping Point.” DM Review, July 2007.

11 Adapted from “The Data GovernanceMaturity Model”, DataFlux Corporation.This paper presents these characteristics atthe various stages of data governancematurity. Retrieved on March 11, 2008,from http://www.dataflux.com/resources/resource.asp?rid=184.

12 See course materials from EWSolutions,“Enterprise Data Governance andStewardship”, available for purchase atwww.EWSolutions.com.

13 Newman, D., Logan, D., “GartnerIntroduces the EIM Maturity Model”,Gartner Research, ID Number: G00160425.

14 See IBM Data Governance Council,http://www-01.ibm.com/software/tivoli/governance/servicemanagement/data-governance.html.

15 See Carnegie Mellon SoftwareEngineering Institute at www.sei.cmu.edu.

16 IBM Data Governance Council MaturityModel, October 2007, retrieved on May 12,2008, from http://www-935.ibm.com/services/uk/cio/pdf/leverage_wp_data_gov_council_maturity_model.pdf.

17 Thomas, G.J., “Application of the DMBOKin an Enterprise Data Architecture”, presen-tation at 2008 DAMA Conference.

18 “Corporate Data Governance BestPractices, 2006-07 Scorecards for DataGovernance in the Global 5000, The CDIInstitute, April 2006, www.The-CDI-Institute.com.


19 Newman, D., Blechar, M.J., page 8.

20 Aiken, P., Allen, D., Parker, B., Mattia, A., "Measuring Data Management Practice Maturity: A Community's Self-Assessment", Computer, Vol. 40, Iss. 4, April 2007, pp. 42-53.

21 Carnegie Mellon University Software Engineering Institute, Capability Maturity Model: Guidelines for Improving the Software Process, 1st ed., Addison-Wesley Professional, 1995.

22 Parker, B., "Enterprise Data Management Process Maturity", Handbook of Data Management, S. Purba, ed., Auerbach Publications, CRC Press, 1999, pp. 824-843.

23 Ibid.

24 Aiken, et al., page 49.

Disclaimer

NASCIO makes no endorsement, express or implied, of any products, services, or websites contained herein, nor is NASCIO responsible for the content or the activities of any linked websites. Any questions should be directed to the administrators of the specific sites to which this publication provides links. All critical information should be independently verified.

This report and the NASCIO Enterprise Architecture Program are funded by a grant from the Bureau of Justice Assistance, Office of Justice Programs, U.S. Department of Justice.

The opinions, findings, conclusions, and recommendations contained in this publication are those of the contributors, and do not necessarily reflect the official positions or policies of the Department of Justice.

This project was supported by Grant No. 2007-RG-CX-K020 awarded by the Bureau of Justice Assistance. The Bureau of Justice Assistance is a component of the Office of Justice Programs, which also includes the Bureau of Justice Statistics, the National Institute of Justice, the Office of Juvenile Justice and Delinquency Prevention, and the Office for Victims of Crime. Points of view or opinions in this document are those of the author and do not represent the official position or policies of the U.S. Department of Justice.
