Data Governance At GE: Driving Business Outcomes

Tony Chacko

Part of the “Digital CIO” series

Going back a decade or so, businesses’ focus on data governance was typically catalyzed by external forces: Sarbanes-Oxley, Basel I and II, HIPAA, and now GDPR, to name a few. Regulatory compliance was not the only reason to improve data governance, quality, and accessibility, but the need to meet these requirements made it imperative for companies to fund and staff these initiatives.

At GE, my data governance journey began as a regulatory need for the financial services arm of our corporation (GE Capital). This required us to build robust data management capabilities to measure various risks on our balance sheet, such as liquidity and asset-liability mismatch. We developed a data management infrastructure and governance framework that was “commensurate with the size and complexity of the organization” (to quote a regulatory phrase).

With this effort, our executives began to realize the value of data as a strategic asset. Governance soon became a broader corporate-level initiative. The company created a new role, chief data officer (CDO), to take ownership of data-related issues. With the support of our board, we began building a framework and business processes for data governance, data quality, and master data management. Today the CDO team manages the people, processes, and technology needed to ensure high data quality throughout the data lifecycle. This work was both essential and valuable, and it resulted in data being treated as the strategic asset it is.

For data to be meaningful, and as CIOs help their companies become intelligent enterprises, data management should be prioritized and linked to tangible business outcomes. For example, data should facilitate accurate measurement of key business metrics such as inventory, cash, and spend. Further, it should enable accurate analytics to facilitate decision-making and accurate, timely reporting, both externally and internally.

Another key aspect is the democratization and curation of data. In other words, there needs to be an operational framework to drive transparency and consistency around the accessibility, quality, and usability of data. An analogy we like to use is that of water from a faucet. You need to know where the faucet is, which way to turn the knob, and have an indication of the quality of the water from “potable” to “I wouldn’t water my lawn with it!”

Overcoming common challenges

What impedes organizations from using data governance to achieve these business outcomes?

In my experience, some senior leaders still hesitate to acknowledge data as a priority. They fail to understand the effort required to answer basic questions when data is in silos, spread among disparate systems, or owned by multiple parties. The attitude can be “it works for me, so it’s not really a problem.” CDOs can gain buy-in from these executives by exposing the number of hidden data “factories” and the outsized effort required to cleanse the data to produce the requested information. Cost, effort, and productivity are powerful levers to drive change.

Another challenge is the sheer breadth of the data stored by most enterprises. Companies have so many reports, data warehouses, data lakes, and applications – each with potentially hundreds of attributes – across the IT landscape. I think about my own computer – my eighth or ninth in 20+ years at GE. Like many people, I keep a lot of stuff. Someone recently asked me for a file from 2008, and I still had it and luckily could find it. Multiply my information glut across the enterprise, and you can see why volumes, access methods, and ability to find what one is looking for can be overwhelming.

Finally, users increasingly demand data without knowing the business outcome they want to drive. They see that information is available, so they assume they should have access to it. Users ask for a new report, and IT brings those attributes into the data lake. Maybe the user never generates that report again. Yet enterprises keep piling up the data, never removing assets that are no longer useful.

A closer look might reveal that only a small percentage of the data is being used at all, and even less of it is helping us drive business outcomes. This is why I feel it is critical to challenge users to clearly articulate why they need the data and what business outcome it drives. One of my philosophies has been that if you cannot explain in less than a couple of minutes why something is required, you probably do not need it. Also, periodic spring cleaning is essential to remove anything that is out of date or unused. This needs to be mandated and institutionalized.
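The mandate to institutionalize spring cleaning could start with even a simple audit script. Here is a minimal sketch, assuming a hypothetical metadata catalog that records each asset's last use and the business outcome its owner registered for it (the catalog structure, field names, and thresholds are illustrative, not a description of GE's actual tooling):

```python
from datetime import datetime, timedelta

# Hypothetical catalog entries: asset name, the last time anyone queried it,
# and the business outcome its owner registered (None if never articulated).
catalog = [
    {"name": "inventory_daily", "last_used": datetime(2024, 5, 1), "outcome": "inventory accuracy"},
    {"name": "legacy_report_2008", "last_used": datetime(2019, 1, 15), "outcome": None},
    {"name": "spend_cube", "last_used": datetime(2024, 4, 20), "outcome": "spend analytics"},
]

def spring_cleaning_candidates(catalog, today, max_idle_days=365):
    """Flag assets that are stale or have no stated business outcome."""
    cutoff = today - timedelta(days=max_idle_days)
    return [asset["name"] for asset in catalog
            if asset["last_used"] < cutoff or asset["outcome"] is None]

print(spring_cleaning_candidates(catalog, datetime(2024, 6, 1)))
# → ['legacy_report_2008']
```

The point of the sketch is the two tests it encodes: an asset survives only if someone has used it recently and can name the outcome it drives — exactly the two questions users should be able to answer in a couple of minutes.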

Realizing our data governance goals

For us at GE, building a framework around processes and standards has been an important step in our data evolution. By first understanding what we need from our data, gaining a better understanding of the data landscape, determining where the data is available, making sure the data is accessible, and operationalizing a data quality program, we’re getting closer to achieving our goals. By tying our governance processes to business outcomes, we are able to deliver small wins, build credibility for the program, and make data governance relevant to users. The next step is to make data a source of true competitive advantage!

Of course, we can’t discount the positive impact of advanced technologies. Increased computational power and new technologies including the cloud can help with managing large amounts of data, and better visualization capabilities allow users to access and report on their data more efficiently. I expect that robotic process automation will standardize repeatable mechanical processes, and artificial intelligence, machine learning, and data science can drive uses that have hitherto been just concepts.

To read more about executive leadership in data management, see “How the CDO Can Improve the Customer Experience.”



About Tony Chacko

Tony Chacko is responsible for Master Data Management and Governance at GE Power. Tony has been with GE for 22 years and spent most of that time in various roles in the financial services arm of the company. His foray into the world of data began when he was responsible for building a data management platform and analytical tools for Liquidity Risk Management to comply with management expectations and regulatory requirements. That work brought GE Capital's balance sheet together for the first time into a single, cohesive data model and implemented the chosen solution to generate controlled, automated liquidity stress testing, management metrics, and regulatory reporting output. It led to a deep interest in data and, ultimately, to his current role. Tony has a master's degree in Economics from Purdue University and an International Business degree from the University of South Carolina. He lives in Atlanta with his wife, two daughters, dog, and cat and loves traveling, eating, and the outdoors.