How the Best Chief Data Officers Create Value

Data is fast becoming an integral element of every decision, customer touchpoint, and internal process. The opportunities that come with greater data and analytics capabilities are apparent across the entire value chain — from customer-facing algorithmic tools that yield personalized recommendations to enterprise tools that automate applicant résumé screening. With data and analytics becoming important drivers for increased efficiency, better decision-making, and greater innovation, CEOs have started to invite a new member to their C-suite: the chief data (and analytics) officer (CDO).

Despite the rapidly increasing prominence of data and analytics functions, the majority of CDOs fail to value and price the business outcomes created by their data and analytics capabilities. It comes as no surprise then that many CDOs fall behind expectations and have short tenures.

As part of a research project at the Digital Value Lab at Harvard’s Digital, Data, and Design Institute, we conducted 17 in-depth interviews with CDOs who are largely considered to be at the frontier of the role. Based on the interviews, we synthesized where CDOs can create value and how they can measure and price it. Beyond strategies for creating and demonstrating value, we provide insights into qualitative and quantitative measurements that data analytics leaders are currently adopting.

How CDOs Create Value

CDOs typically influence four interconnected areas, and depending on how federated data responsibilities are, they wield more or less influence over each. The first sphere of influence is directly linked to business value and can be termed data products tied to specific use cases. These products are developed, maintained, and continuously improved (predominantly by the CDO unit) to solve a specific business problem for an internal or external customer.

Second, CDOs typically curate the organization’s data assets and platforms (sometimes also referred to as multi-purpose data products): essentially reusable assets that allow teams across the organization to easily and efficiently access data and to build use case–driven data products (including reports) on top of them. For most companies, the first such asset is the “Customer 360,” which brings together all information collected on a customer across business units.

The third sphere of influence, which conceptually sits below the first two, comprises the organization’s data architecture and governance. It hosts the underlying technical and administrative infrastructure needed for the data product teams to operate efficiently and in a standardized way.

The CDO’s fourth sphere of influence is the organization and, more specifically, its people. The organization reflects the foundation for any data initiative as it provides and develops both general data literacy and specialized data talent.

Understanding the playing field is important. Businesses typically hire a CDO with a portfolio of diverse expectations, including being a technology leader for data and analytics, a visionary for the overall business, a cultural revolutionary to promote data-driven decision-making, a data governance expert, and so on. Yet success requires a concerted effort between these different roles. An emboldened CDO needs to propose value, measure value, and claim value for each sphere of influence.

Value Proposition for Data Products

Data products are applications that are developed and managed by the CDO to provide the right data of the right quality at the right moment for a specific business process. Their objective is to generate actionable insights to improve internal operations, enhance product or service offerings, create an entirely new offering, or make data available to external partners. The CDO unit should mirror each line of business and then, for each one, develop a data product roadmap (jointly with the business side), prioritize the most impactful use cases, and deploy interdisciplinary teams to develop, maintain, and improve the data product following the principles of agile product development and DataOps.

Each of these teams should have all the necessary capabilities to run independently, from understanding the business problem to creating the data science models and visualizations to building the data pipelines. These lighthouse projects are important at a tactical level: They inspire buy-in and trust from other C-level executives and business units, and they demonstrate the feasibility and impact of data initiatives.

Mohammed Aaser, CDO of Domo Inc., a data and analytics platform provider, says, “CDOs are fighting a two-front war. In one case, they have to show value rapidly. At the same time, they have to build platforms that can endure and can create value. The reality is, if you start with the platform, it will take you two years, you will not show much value, and you will not have a job.”

Value Proposition for Data Assets and Platforms

The second sphere of influence comprises data assets and platforms that can be leveraged to build data products for all kinds of challenges. They integrate different data sets and elements from source systems across the business into a single asset or platform, which, taken together, gives a better view of the issue at hand. Their objective is to enable scale: The data asset or platform makes data easily and quickly available to anyone in the organization (through data products or self-service APIs).

CDOs need to align with unit heads on which data asset or platform to prioritize. Then, dedicated teams own each asset or platform and use an agile product-centric delivery approach (similar to that of data products). They start by compiling user requirements, prioritizing features, and developing, maintaining, and improving the asset. This includes measuring adoption and collecting user feedback. As these assets span multiple lines of business, they typically require the alignment of definitions and calculations of central business metrics — a powerful demonstration of how the cross-functional role of the CDO can help lines of business speak the same language and forge a holistic understanding of the business.

Value Proposition for Data Architecture and Governance

In most businesses, the proliferation of data has led to a proliferation of systems, practices, and policies. Accordingly, it is one of the CDO’s core responsibilities to manage the data architecture and governance — what we consider the third sphere of influence.

The objective of both data architecture and governance is to manage the basics of the data organization by providing the required data systems in a cost-efficient way and by ensuring data is produced, documented, and consumed in a streamlined manner. Regarding the data architecture, the CDO’s value proposition is to define and align which systems and tooling (e.g., security) are required in the target data architecture to support the desired data products and then acquire or build and run these systems.

Additionally, the CDO should centrally manage licenses and instances of often extremely costly data-specific systems and define usage criteria. For data governance, the CDO’s value proposition includes the development of data policies with regard to accessibility, quality, interoperability, and security, as well as the development of a data catalog as the single source of truth of how, where, and which data is stored in the data assets and platforms. Data governance should also include central best practices and standards for data operations and the use of algorithms and AI.

Value Proposition for Organizational Data-Readiness

The CDO’s objective should be to develop the required data-specific roles and organization-wide competencies to support all other data-related activities. While this requires that sufficient data talent is hired and retained, real success comes from the activation of the entire organization: Leading CDOs define different internal data personas, categorize every role in the organization, and map professional development journeys for them.

Upskilling employees outside of the CDO unit, for example, through “data translator” training for business leaders and subject-matter experts, creates strong demand for data products. Upskilling also creates local champions who ensure the embeddedness and adoption of data products in their business units. Frontline employees need to understand the basics (e.g., how a metric is calculated, how to de-average and identify lower-level trends); otherwise, the fear of making a mistake, pulling the wrong number, or misinterpreting it makes them revert to their old processes.

How Can CDOs Measure, Show, and Price Their Value Contribution?

For CDOs, clearly defining the value proposition across all spheres of influence is a necessary precondition to measure, demonstrate, and sometimes price their value. Leading CDOs have started to develop qualitative and quantitative measurements that can prevent organizations from under- or over-investing in their data initiatives.

One CDO claimed that business units would no longer (or less frequently) request or use data products if they had to consider the associated price of the data teams and the underlying data asset/platform, which may explain why some organizations shy away from internal chargeback models. However, pricing data allows business leaders to better evaluate the most efficient use of their budget. Additionally, if business units allocate funds from their own budgets, they have skin in the game and want to achieve measurable results with it.

Data leaders often start with a “shadow” chargeback (i.e., the theoretical internal transfer price) and a “shadow” revenue account (i.e., the enabled revenue increment) for every service, product, or asset owned by the CDO unit — with “shadow” indicating that it is not (yet) a formal profit and loss record. The shadow chargeback helps put a price tag on CDO projects, which is important to know regardless of who pays for them. Both the CDO and business units often tap into CFO-controlled central budgets to cover the chargeback, given the strategic relevance of data initiatives. The upside of (shadow) chargebacks and revenue accounts? Resources are allocated efficiently, business units have a deeper interest in driving adoption and success, CDOs are incentivized to go into selling mode, and CDOs can more easily show their value contribution.
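As a rough illustration of the shadow accounting described above, the sketch below pairs a shadow chargeback with a shadow revenue account per data product. All product names and figures are hypothetical.

```python
# Minimal sketch of a "shadow" P&L for CDO-owned data products.
# Product names and figures are hypothetical illustrations, not real data.

products = {
    "churn_predictor": {"shadow_chargeback": 400_000, "enabled_revenue": 1_200_000},
    "customer_360":    {"shadow_chargeback": 900_000, "enabled_revenue": 2_500_000},
    "pricing_engine":  {"shadow_chargeback": 250_000, "enabled_revenue": 150_000},
}

def shadow_pnl(products):
    """Return each product's shadow margin and ROI (enabled revenue / chargeback)."""
    report = {}
    for name, p in products.items():
        margin = p["enabled_revenue"] - p["shadow_chargeback"]
        roi = p["enabled_revenue"] / p["shadow_chargeback"]
        report[name] = {"margin": margin, "roi": round(roi, 2)}
    return report

for name, r in shadow_pnl(products).items():
    print(f"{name}: margin ${r['margin']:,}, ROI {r['roi']}x")
```

Even without a formal P&L, a report like this surfaces products whose shadow margin is negative, which is exactly the signal that makes chargebacks useful for budget decisions.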

Value Measurement of Data Products

The CDO role requires a great deal of persuasion and storytelling. Demonstrating the potential of data products for concrete business use cases is pivotal in this regard. CDOs should measure at least the following set of indicators:

Business impact of lighthouse data products

  • Quantitative measurement: Adoption rate of the data product for internal customers (e.g., frontline employees) or external customers
  • Quantitative measurement: Improvement of the data product against a pre-implementation baseline or reference case (through A/B testing), e.g., churn reduction

Operational efficiency and maturity of data product teams

  • Quantitative measurement: Resources used in the development and operations of the data products (e.g., person-hours per data product or feature, additional CPU needed)
  • Qualitative measurement: Ability to support business, responsiveness to feedback, and SLA fulfillment through surveys
  • Qualitative measurement: Maturity of the agile data product teams (e.g., sophistication of agile ceremonies) through surveys

Most business leaders still struggle to understand how data can change the way their core processes work. Showcasing high-value lighthouse data products and their business impact helps business unit leads get a sense of what is possible.

However, sensible investment decisions require an estimation of costs. Before deploying their teams, CDOs need to engage them in project-sizing exercises to determine which roles and how many person-hours are needed for potential data products. For each line of business, the CDO unit should keep a data product roadmap that mirrors and follows the respective business strategy and objectives/key results. One CDO told us that he holds an annual executive value-engineering workshop where participants boil down a long list of 60 to 70 data product ideas to 10 candidates, and then to the four data products that will be realized in the upcoming year. The decision is based on a 2×2 matrix with estimated business impact on one axis and estimated resource requirements (i.e., costs) on the other.
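That 2×2 exercise lends itself to a simple sketch. Assuming impact and cost are each scored on a 1–10 scale (the ideas, scores, and cutoffs below are hypothetical placeholders, not from any interviewed CDO), the shortlisting might look like:

```python
# Sketch of a 2x2 prioritization: estimated business impact vs. estimated
# resource requirements. Ideas, scores, and cutoffs are hypothetical.

ideas = [
    ("demand_forecast",   9, 3),   # (name, impact 1-10, cost 1-10)
    ("churn_model",       8, 4),
    ("report_automation", 4, 2),
    ("realtime_pricing",  9, 9),
    ("legacy_migration",  3, 8),
]

def prioritize(ideas, impact_cutoff=5, cost_cutoff=5, top_n=4):
    """Keep high-impact/low-cost ideas, then rank by impact-to-cost ratio."""
    quick_wins = [i for i in ideas if i[1] >= impact_cutoff and i[2] <= cost_cutoff]
    quick_wins.sort(key=lambda i: i[1] / i[2], reverse=True)
    return [name for name, _, _ in quick_wins[:top_n]]

print(prioritize(ideas))  # ['demand_forecast', 'churn_model']
```

In practice the scores come from the project-sizing exercises and business-impact estimates described above; the code only makes the selection rule explicit and repeatable.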

One CDO pioneer, Sebastian Klapdor, CDO of Vista, a marketing and design company, established various metrics to capture the value created by his team. Through the adoption of data products and pre- and post-implementation measurements, Vista has been able to generate an incremental $90 million in profits with a large share of it recurring on an annual basis.

Value Measurement of Data Assets and Platforms

Mohammed Aaser of Domo Inc. describes multi-purpose data assets and platforms as the “assembly line” that allows for the development of data products at scale across the organization. The CDO develops and continuously improves the assembly line, relentlessly removing all bottlenecks and incorporating user feedback. However, value is added by the teams building data products on top of data assets and platforms, which, with increasing maturity, may become increasingly decentralized and performed by the respective lines of business.

Therefore, CDOs need to implement qualitative and quantitative measurements to demonstrate and claim the value they provide with data assets/platforms.

Business impact of multi-purpose data products/platforms

  • Quantitative measurement: Usage and engagement of the data asset and platform (e.g., number of users, data volume accessed)
  • Quantitative measurement: Processing time of queries using the data asset and platform
  • Qualitative measurement: Anecdotal stories of the improvements for the use case–driven data product development process (from business unit and CDO unit teams)

Operational efficiency of data product/platform team

  • Quantitative measurement: Resources used in the development and operations of data asset/platform (e.g., person-hours per data asset/platform)
  • Quantitative measurement: Speed of the feature development and source system integration

Building a multi-purpose data asset/platform such as the “Customer 360” asset requires a significant budget and usually has a long development time (one to three years, depending on the business).

Prior to development, CDOs should prioritize the most relevant multi-purpose data assets and platforms with the business unit leads. For each domain, they should estimate the associated value pool based on the 30 to 40 data products it could enable within five years (a back-of-the-envelope estimation is suitable here).

CDOs should size the investment based on the average ROI of existing data products, minus a flat-rate reduction for the additional development needed to build on top of these multi-purpose data assets/platforms. Given that such assets and platforms are cross-functional and have strategic value, the CFO most often functions as the project sponsor. Nonetheless, CDOs should implement an internal (shadow) chargeback once the assets are developed, using the estimated impact the business unit aspires to generate and the average ROI of use case–driven data products. Alternatively, CDOs can define user- or usage-based pricing models.
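One plausible reading of that pricing rule, sketched with hypothetical figures and a hypothetical flat-rate discount (the interviewed CDOs did not prescribe exact parameters):

```python
# Back-of-the-envelope chargeback sizing for a multi-purpose data asset.
# One possible interpretation of the rule in the text: the business unit's
# aspired impact, divided by the average ROI of existing data products,
# less a flat-rate discount for the extra development needed to build on
# top of the asset. All figures are hypothetical.

def asset_chargeback(aspired_impact, avg_product_roi, development_discount=0.2):
    """Return a shadow chargeback for a data asset, in the same currency
    as aspired_impact."""
    return aspired_impact / avg_product_roi * (1 - development_discount)

# A unit aspires to $5M of impact; existing data products average a 4x ROI:
print(asset_chargeback(5_000_000, 4))  # 1000000.0
```

A usage-based alternative would replace the single charge with a per-user or per-query rate, but the underlying logic of tying the price to aspired impact stays the same.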

Value Measurement of Data Architecture and Governance

Most of the responsibilities of providing the data architecture center around central and efficient management through, for example, central allocation of licenses and instances or rationalization of systems. Wendy Batchelder, SVP and CDO of Salesforce, explains that these are often quick wins, and that the CFO becomes a great partner in driving the projects. By implementing a tiering scheme that regulates when a query is considered a high-, medium-, or low-velocity query, she was quickly able to realize significant cost savings.

Business impact of data architecture

  • Quantitative measurement: Improvement of the cost base through central management and rationalization of systems against a pre-implementation baseline

Future-readiness of data architecture and governance

  • Qualitative measurement: Progress on the systems roadmap (status quo versus target state)
  • Qualitative measurement: Progress on the data policies (status quo versus target state)
  • Qualitative measurement: Progress on the data catalog (status quo versus target state)

Value Measurement of Organizational Data-Readiness

Creating a data culture is often cited as one of the biggest challenges for CDOs. Leading CDOs work closely with internal partners such as HR and external partners such as universities to curate an educational program and to develop data-specific professional development journeys.

To understand the effectiveness of their educational efforts, CDOs should set up Net Promoter Score (NPS) surveys. One of the CDOs interviewed has garnered a world-class NPS of 80 to 90 for his middle-management training focusing on business value creation and ethical use of data.

Three measurements form the basis of performance tracking in the organizational sphere of influence:

  • Quantitative measurement: Completion rate of the data literacy employee training, academies, bootcamps, and e-learnings
  • Quantitative measurement: Net Promoter Score of the data literacy employee training, academies, bootcamps, and e-learnings
  • Qualitative measurement: Concept for the data personas and development journeys
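For reference, the NPS cited in this section is the share of promoters (scores 9–10) minus the share of detractors (scores 0–6) on a 0–10 recommendation scale. A minimal sketch with hypothetical survey responses:

```python
# Minimal Net Promoter Score calculation for a training survey.
# NPS = % promoters (9-10) minus % detractors (0-6) on a 0-10 scale.

def nps(scores):
    """Compute NPS from a list of 0-10 survey responses."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical responses from a data-literacy training survey:
responses = [10, 9, 9, 10, 8, 9, 10, 7, 9, 6]
print(nps(responses))  # 7 promoters, 1 detractor out of 10 -> NPS 60
```

An NPS of 80 to 90, as reported by one interviewed CDO, would require nearly every respondent to score the training a 9 or 10.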

. . .

Stepping out of the shadow as a CDO is much more than a career ambition. It heralds an important shift in how the organization thinks about data and ensures that data-related projects are funded or de-funded according to their value contribution. Given the vast opportunities associated with data, analytics, and AI, businesses can no longer afford uncoordinated data efforts without a clear link to value. While AI dominates the headlines, there is consensus on the following: data maturity necessarily comes before AI maturity.
