Network Computing is part of the Informa Tech Division of Informa PLC


A Beast of Burden: 5 Reasons Your Data Governance Projects Have Stalled

(Image source: Pixabay)

To say data governance is complicated is an understatement. According to one survey, the average organization draws on 400 data sources, and nearly one-quarter of organizations manage data from 1,000 or more sources. Across these systems, enterprises generate an estimated 6.4 zettabytes of new data per year. Corporate data is exploding, and all the while, the parameters around it are closing in, with more regulations, more security threats, more privacy laws, more blurring of geographical and jurisdictional lines, and more pressure for employees to extract value from data.

In response to these pressures, enterprises (roughly 90% according to Gartner) have increasingly embraced the Chief Data Officer (CDO) role over the last several years. The CDO, an executive position that typically sits under the CEO, CIO, or CTO, is responsible for overseeing an organization’s data assets and their governance. Yet, even with a CDO in place, many data governance initiatives are so burdensome that they can stall or fail to deliver on expectations.

This is due in part to the fact that data governance means different things to different groups and is often not clearly defined across an organization. Different stakeholders also view data governance with varying levels of urgency. The objectives and value sought will vary across an enterprise, which makes the task of communicating the business case a challenge. For example, business users may buy in to the program based on the opportunity for more reliable and better-organized data, while IT and database administrators will likely be driven by the opportunity to retire redundant and outdated systems, and security and legal teams are motivated by improving risk management. The resources and time required to implement these initiatives often fall more heavily on some groups than others, or in some cases, may overlap or infringe on existing initiatives. This requires careful planning and project management to ensure clear communication and objectives across all stakeholders.

While data governance will look different for every organization, there are a number of challenges that arise in most initiatives. If CDOs and other data governance stakeholders can plan for these and address them at the outset of a new initiative, their projects will have a much greater chance for success. Common issues include:

Resource fatigue. In almost every data governance initiative, the group owning the project will have large dependencies on other business units and data stewards from a variety of groups, all of whom have their own unique sets of priorities. Project leaders will be working with many different groups across countless moving parts, often requiring repeat touch points to gather information, clarify misunderstandings, obtain sign-off on decisions, and provide project updates. In most cases, business stakeholders are being asked to fulfill these requests on top of their regular day jobs. Teams may uncover new issues or be forced to change timelines in response to unexpected changes in the organization. This can be exhausting for the project team and for the organization as a whole. Projects can fail, budgets may be cut, and stakeholders may simply give up out of frustration.

Implementing a comprehensive data governance program can be a multi-year effort, and without clearly defined short-term and long-term milestones and timelines, teams can feel like they are on a hamster wheel. To avoid resource fatigue, it’s critical to set objectives that meet the needs of all stakeholders and define which groups and roles are responsible for the heavy lifting. Planning thoughtfully from the beginning, acknowledging successes along the way, and maintaining transparency between stakeholders will help alleviate the strain on various teams.

Decision overload. If a project has too broad a scope, unclear priorities, or an excess of software solutions, teams will become paralyzed. Many outside providers will conduct an assessment and follow it up with an extensive list of nebulous recommendations and models to choose from, making it nearly impossible to know how to move forward. Individuals who are steeped in data governance planning often use terms that are confusing to anyone unfamiliar with governance frameworks. While there should be some effort to educate the business on the concepts, the team must ultimately speak the language of the business. Clear guidance and alignment on who is responsible for key decisions is also essential. Teams must clearly define their pain points and objectives up front in a way that is meaningful for the intended audience and can use time-boxing to keep work on track. Recommendations must map directly to the problems stakeholders agreed to solve.

Assessment dead ends. Conducting an assessment is the easy part. Often organizations get stuck after the assessment once the time comes to dig into the tactical, technical next steps. Many teams will go back to previous data governance plans that have sat untouched for years and try to revive them. But modernizing stale assessments and plans and making them actionable may not be feasible, depending on how the organization’s data and risk landscape has changed.

To successfully move past the assessment phase, a strategic formula or playbook is needed, but it must be accompanied by a detailed project roadmap that fits the current needs of the company and its employees. Moreover, the assessment must make the business case for the project—in other words, identifying gaps and examining why those gaps are a problem for the organization.

Lack of technical resources. Deploying technology solutions, analytics, and mechanisms to support a data governance program requires deep technical expertise. Beyond the inherent complexity of managing data, the market is saturated with data governance technology platforms and solutions. Organizations should take the time to define their technology requirements, perform a market analysis, understand and sample various technologies, evaluate vendors, identify the support options for each and compare contract terms before attaching their program to a particular tool or product. It’s also critical to support the project with developers who understand how the systems are interconnected and architected. Without a proficient team of technical experts, organizations will struggle to lay a strong foundation for their overall data governance program. Outside experts who are familiar with the industry can be useful guides throughout technology selection and implementation.
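The evaluation steps described above can be made concrete with a simple weighted scoring matrix, a common way to compare vendors against agreed requirements. This is an illustrative sketch only; the vendor names, criteria, weights, and scores below are all hypothetical and not drawn from the article.

```python
# Hypothetical weighted scoring matrix for comparing data governance platforms.
# Criteria mirror the evaluation steps named in the text; weights must sum to 1.
CRITERIA_WEIGHTS = {
    "meets_requirements": 0.4,   # fit against defined technology requirements
    "integration_effort": 0.2,   # ease of connecting interarchitected systems
    "vendor_support": 0.2,       # available support options
    "contract_terms": 0.2,       # favorability of contract terms
}

# Scores on a 1-5 scale, gathered during market analysis and sampling.
vendors = {
    "Vendor A": {"meets_requirements": 4, "integration_effort": 3,
                 "vendor_support": 5, "contract_terms": 2},
    "Vendor B": {"meets_requirements": 5, "integration_effort": 2,
                 "vendor_support": 3, "contract_terms": 4},
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores into one weighted total."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

# Rank candidates from strongest to weakest overall fit.
ranked = sorted(vendors, key=lambda v: weighted_score(vendors[v]), reverse=True)
print(ranked)  # ['Vendor B', 'Vendor A']
```

The value of this exercise is less the arithmetic than the forcing function: stakeholders must agree on the criteria and weights before a favorite product can dominate the discussion.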

Cost. Traditional end-to-end data governance solutions can be expensive, and sticker shock is a common cause for projects being canceled. This is why it’s so important for stakeholders to agree upon the business case and long-term vision for the project. Data governance frameworks and technologies can deliver a positive network effect, wherein the more applications that feed into it, the more data it manages, and the more people use it, the more powerful it becomes. Stakeholders need to understand this and be prepared to demonstrate the value to secure funding.

Across all of these challenges, clear ownership is key. We often help clients build upon the Responsible, Accountable, Consulted, Informed (RACI) matrix, so the overall data governance initiative can be operationalized without excess overhead. The Mutually Exclusive, Collectively Exhaustive (MECE) framework can also provide a strong foundation, along with understanding how the tools that will be used meet each of the initiative’s objectives. With an actionable go-forward strategy and a source of truth for the organization’s information, teams can avoid common pitfalls and ultimately add value—in the form of improved agility, mitigated risk, and a foundation for extracting value from a vast universe of corporate data.
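A RACI matrix can be kept as a lightweight, checkable artifact rather than a slide. A minimal sketch follows, assuming hypothetical task names and role assignments (none of these come from the article); the check encodes the standard RACI rule that each task has exactly one accountable owner.

```python
# Hypothetical RACI matrix for data governance tasks.
# Tasks and role names are illustrative placeholders.
RACI = {
    "Define data quality standards": {
        "Responsible": ["Data Stewards"],
        "Accountable": ["CDO"],
        "Consulted": ["Legal", "Security"],
        "Informed": ["Business Users"],
    },
    "Retire redundant systems": {
        "Responsible": ["Database Administrators"],
        "Accountable": ["CIO"],
        "Consulted": ["CDO"],
        "Informed": ["Business Users", "Legal"],
    },
}

def accountable_for(task: str) -> list:
    """Look up who ultimately owns a given task."""
    return RACI[task]["Accountable"]

def unowned_tasks(raci: dict) -> list:
    """Flag tasks that violate the one-accountable-owner rule."""
    return [t for t, roles in raci.items() if len(roles["Accountable"]) != 1]

print(accountable_for("Define data quality standards"))  # ['CDO']
print(unowned_tasks(RACI))  # [] -> every task has exactly one owner
```

Keeping the matrix in a reviewable format like this makes ownership gaps visible early, before they surface as stalled decisions mid-project.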

Adam Ingber is a Senior Director in the FTI Technology practice.  

Bryce Snape is a Senior Director at FTI Consulting.