
The Data Strategy That Works — A 6-Step Framework (Part II)


In Part I, we talked about two things every organisation needs before data can work: a clear understanding of how the business creates value and a clear understanding of what to measure to know if that value is being delivered.

If you did that thinking honestly, you now have something real to work with.

But here is where most organisations make their next mistake. They take that clarity and immediately start building dashboards, pipelines, and reporting systems. They assume that because they now know what they want to measure, they are ready to measure it.

They are usually not.

There are two more questions to answer before you build anything. The first is about your organisation's capability. The second is about the reality of your data. Get these wrong, and you will invest in tools your organisation cannot use and surfaces that look impressive but never change what anyone does.


Step Three: Are You Actually Capable of Using Data?


This is the question most organisations skip entirely, and it is the one that causes the most expensive failures.

It is easy to describe an ambitious data future. Real-time dashboards. Predictive insights. AI-powered decision support. These things are genuinely possible. But the organisations that achieved them did not get there by buying the technology first. They got there by being honest about where they started.

The hard question is not 'Where do we want to be?' but 'Where are we actually today?'

And the even harder question: are we ready to use what we are about to build?

Here is what that looks like in practice. Walk through each area of your organisation and ask the following.


  • Do people actually use data to make decisions, or do they use it to confirm decisions already made?

There is a difference between an organisation that is data-informed and one that is data-decorated. In a data-decorated organisation, reports exist, dashboards are built, and numbers are presented in meetings, but the actual decisions happen the way they always did, on instinct and seniority, with the data brought in afterwards as justification.

Ask yourself honestly: when a difficult decision needs to be made, does anyone reach for the data first? Or does the data come out after the decision to support the argument? Is data trusted across the organisation, or do people quietly question its accuracy? Do your people know how to read what the data is telling them, or does it require one specialist to translate it for everyone else?


  • Are your processes documented, or do they live in people's heads?

This is one of the most revealing questions in any data conversation. If your key processes depend on specific individuals, if things break or slow down when those people are absent, then your organisation is more fragile than it appears, and your data strategy will be too.

Data work is only repeatable if the processes it supports are repeatable. You cannot automate what has never been written down. You cannot scale what only one person knows how to do.

Ask yourself: what breaks when someone is on leave? Are your core processes documented well enough that a new person could follow them? Where does knowledge live in people's heads rather than in systems? Where do errors happen most frequently, and is there a pattern?


  • Do your systems support decision-making, or just store information?

Technology is an enabler, not a solution. The question is not what tools you have; it is whether those tools actually support the decisions that matter. Many organisations have more technology than they can use effectively. Data lives in multiple places, moves manually between systems, and requires significant effort just to produce a basic report.

Ask yourself: where does your data actually live right now? How many systems are involved when someone tries to get a clear picture of the business? Are those systems connected, or is data being moved manually between them? How easy is it for the people who need data to access it reliably, without always needing help from someone technical?


  • Does anyone own your data, or does quality just degrade quietly over time?

When nobody is responsible for data quality, quality always deteriorates. It happens slowly, then suddenly: different teams using different definitions of the same metric, numbers that cannot be reconciled, reports that contradict each other. By the time it becomes a visible problem, the trust is already broken.

Ask yourself: who owns each important dataset in your organisation? Who decides what key terms mean and what counts as an active customer, a completed order, or a resolved issue? How is quality maintained, or is it not maintained at all? Are privacy and data ethics considered in how data is used, or is that conversation still to come?
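The definition problem is easy to see in miniature. The sketch below uses hypothetical customer records and two invented definitions of 'active customer' — one based on recent orders, one based on subscriptions — to show how two teams can report different numbers from exactly the same data:

```python
from datetime import date, timedelta

# Hypothetical records: (customer_id, last_order_date, has_subscription)
customers = [
    ("c1", date(2025, 1, 10), True),
    ("c2", date(2024, 6, 1), True),
    ("c3", date(2025, 2, 1), False),
    ("c4", date(2023, 12, 25), True),
]

today = date(2025, 2, 15)

# Team A's definition: "active" = placed an order in the last 90 days
active_a = {c for c, last_order, _ in customers
            if (today - last_order) <= timedelta(days=90)}

# Team B's definition: "active" = holds a subscription, regardless of orders
active_b = {c for c, _, has_sub in customers if has_sub}

print(len(active_a), len(active_b))  # 2 vs 3 "active customers" from the same records
```

Neither team is wrong; they simply never agreed on what the term means. That is exactly the kind of gap an owner for each key definition is meant to close.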


  • Does data actually influence your strategy, or does it just get reported?

This is the final and most important capability question. It is entirely possible to have data flowing, dashboards running, and metrics being reviewed in every meeting and still have an organisation where strategy is driven entirely by opinion.

Ask yourself: can you point to specific decisions that were made differently because of what the data showed? Are insights acted on or just noted? Does data show up in your planning conversations or only in your reporting conversations? When the numbers say one thing, and the instinct says another, which one wins?

The point of these questions is not to produce a score or a rating. It is to surface the honest answer to a single question: what are we actually capable of right now?

Because here is the principle that governs this entire conversation: you cannot build a data strategy that your organisation is not ready to execute. Designing something your people cannot use, on systems that cannot support it, inside a culture that does not trust it, is not a strategy. It is expensive theatre.

The organisations that get data right do not always start with the best technology. They start with the most honest self-assessment.


Step Four: Where Does Your Data Actually Break Down?


Even in organisations with genuine capability, there is a gap that almost no one openly discusses. It is the gap between the data that exists and the data that creates value.

Data does not create value by being collected. It creates value by moving from the moment it is captured, through the people and systems that process it, all the way to a decision that would not have been made without it. And somewhere in that journey, in almost every organisation, something breaks.

The question is not whether your data breaks down. It almost certainly does. The question is where, and whether you know where.

Here are the questions that reveal the answer.


  • Where does data enter your organisation, and what gets lost right at the start?

Every piece of data has a moment of creation: a customer interaction, a transaction, a form submission, or a process completion. That moment is where quality is either built in or lost. Inconsistent entry, missing fields, and manual workarounds at the point of capture: these problems travel downstream and compound.

Ask yourself: where is data first created in your key processes? Is that capture happening automatically or manually? Is it consistent, or does quality vary depending on who is doing it or when? What gets missed or lost at the point of capture, and has anyone ever audited that?
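What 'building quality in at the point of capture' can look like, in miniature: the sketch below is a hypothetical capture-time check (the field names and rules are invented for illustration) that flags missing fields, impossible values, and inconsistent manual entry before a record travels downstream.

```python
# Hypothetical required fields for an order record at the point of capture
REQUIRED = {"customer_id", "order_total", "channel"}

def validate_at_capture(record: dict) -> list[str]:
    """Return a list of quality issues found at the moment of capture."""
    issues = [f"missing field: {f}" for f in sorted(REQUIRED - record.keys())]
    if "order_total" in record and record["order_total"] < 0:
        issues.append("order_total is negative")
    # Inconsistent manual entry: "Web", "web ", "WEB" all mean the same channel
    if "channel" in record and record["channel"] != record["channel"].strip().lower():
        issues.append("channel not normalised")
    return issues

print(validate_at_capture({"customer_id": "c1", "order_total": -5, "channel": "Web "}))
# ['order_total is negative', 'channel not normalised']
```

The point is not the specific rules; it is that these checks run where the data is born, rather than being discovered months later in a report nobody trusts.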


  • Where does your data live, and does it live in one place or many?

After data is captured, it has to go somewhere. For many organisations, it goes to several places at once: a CRM, a spreadsheet, a shared drive, and someone's inbox. Multiple versions of the same data, stored differently, owned by nobody, accessed inconsistently.

Ask yourself: where does your data actually live right now, all of it, not just the official version? Are there multiple versions of the same data floating around? Who can access what, and how? Is there a single reliable source of truth, or are different teams working from different numbers?


  • How much time is spent preparing data versus using it?

This is the question that makes analysts uncomfortable, because the honest answer is usually 'far too much'. Raw data is rarely ready to use. It needs cleaning, combining, and standardising. In many organisations, the people who should be generating insights spend most of their time just getting the data into a usable state.

Ask yourself: what work is required to get data ready for analysis? How much of that is manual? Where do errors most commonly creep in during preparation? How long does it take from data capture to its availability for a decision, and is that fast enough to be useful?
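To make the 'preparation before use' cost tangible, here is a toy sketch with invented records: the same customer entered three different ways across two hypothetical sources has to be standardised and combined before anyone can even count spend correctly.

```python
# Two hypothetical sources holding overlapping, inconsistently entered data
crm = [
    {"name": "Ada Lovelace ", "spend": "120.50"},  # trailing space, spend as text
    {"name": "GRACE HOPPER", "spend": "80"},       # different casing
]
spreadsheet = [
    {"name": "ada lovelace", "spend": "30"},       # same person, entered differently
]

def standardise(record: dict) -> dict:
    """Normalise the name and convert spend from text to a number."""
    return {"name": record["name"].strip().title(), "spend": float(record["spend"])}

cleaned = [standardise(r) for r in crm + spreadsheet]

# Combine duplicates into one figure per customer
totals: dict[str, float] = {}
for r in cleaned:
    totals[r["name"]] = totals.get(r["name"], 0.0) + r["spend"]

print(totals)  # {'Ada Lovelace': 150.5, 'Grace Hopper': 80.0}
```

Multiply this by every field, every source, and every report, and it becomes clear why analysts can spend most of their time here instead of generating insight.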


  • When insights are generated, do the right people actually see them?

An analysis that nobody acts on is not insight. It is just effort. Yet in many organisations, the people generating analysis and the people who need to make decisions based on it are disconnected — by time, by format, by access, or simply by habit.

Ask yourself: how are insights shared in your organisation? Who sees them, and who does not? Are the people making decisions actually looking at the analysis being produced? Do the visualisations and reports being created make things clearer for the people who need them, or do they require interpretation?


  • When data reveals something important, does anyone actually do anything differently?

This is where data value is either created or destroyed. Everything before this moment – the capture, the storage, the preparation, the analysis, the visualisation – is only worth something if it changes what someone does.

Ask yourself: what decisions are actually being made because of your data? Who is accountable for acting on insights? Is that clear? How quickly does action follow an insight, or does it sit in a report for weeks before anyone responds? Is the response to data systematic or ad hoc and inconsistent?


  • When you act on data and something happens, do you learn from it?

The final and most neglected part of the data journey is the feedback loop. When a decision is made based on data, and an outcome follows, does the organisation capture what it learned? Does that learning feed back into better decisions next time? Or does each decision exist in isolation, with no connection to what came before?

Ask yourself: are outcomes reviewed systematically? Is there a process for capturing what worked and what did not? Do insights genuinely improve future decisions, or does the organisation repeat the same mistakes because nobody tracked the results?
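A feedback loop does not require sophisticated tooling. The sketch below (a hypothetical decision log, with invented insights and actions) shows the minimum structure: each data-driven decision records the insight and the action, and the outcome is attached later — which also makes it trivial to spot decisions that were never reviewed.

```python
# Hypothetical decision log: insight -> action -> (eventually) outcome
decision_log: list[dict] = []

def record_decision(insight: str, action: str) -> int:
    """Log a decision; return an id so the outcome can be attached later."""
    decision_log.append({"insight": insight, "action": action, "outcome": None})
    return len(decision_log) - 1

def record_outcome(decision_id: int, outcome: str) -> None:
    decision_log[decision_id]["outcome"] = outcome

def unreviewed() -> list[dict]:
    """Decisions acted on but never followed up — the broken feedback loops."""
    return [d for d in decision_log if d["outcome"] is None]

i = record_decision("churn spikes after price change", "delay increase for at-risk tier")
record_decision("demo requests up 40% from campaign", "double campaign budget")
record_outcome(i, "churn flat over next quarter")

print(len(unreviewed()))  # 1 — the second decision was never reviewed
```

The value is not in the code but in the habit it enforces: no decision is closed until its outcome is recorded, so the organisation stops repeating mistakes it never measured.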


What You Now Have


Between Part I and Part II, you have worked through four layers of honest questioning about your organisation and its relationship with data.

You understand how your business creates value. You know what you need to measure. You have a clearer picture of your genuine capability. And you can now see where data breaks down before it becomes a decision.

This is the foundation. Most organisations never build it, and that is exactly why their data strategies fail.

In Part III, we will look at what comes next: deciding what is actually worth building and articulating a clear direction for your data strategy.

Petgrave.io helps founders, teams, and organisations build data strategies grounded in business reality. If these questions surfaced something worth exploring in your organisation, we would be glad to talk.







© 2025 Petgrave.io
Data-Driven Transformation.
