The world’s stored data will exceed 400 exabytes by 2020, and 99% of it will be digital. Currently, less than 1% of it is net value-creating. We call that value-creating data intelligence. Digital intelligence differs from statistics, research, insight, and analytics on seven key dimensions.

1. We are now in a world where data is abundant. When data was scarce, we had to extract, track, and survey to get at it. Today we often find ourselves swimming in data, and it does not have to be coded before it can be used; it is natively digital. Our main challenge is making high volumes of noisy, unstructured data machine readable, and our traditional analysts do not always have the right tools or skills for the job.
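
To make that challenge concrete, here is a minimal sketch of turning messy free-text lines into machine-readable records. The input lines, field names, and pattern are invented purely for illustration; real pipelines are noisier and larger.

```python
import re

# Hypothetical raw, unstructured feedback lines as they might arrive from a web form.
raw_lines = [
    "2019-03-02 order #4471 -- 'arrived late AGAIN, really unhappy' (london)",
    "2019-03-03 order#4492 'great service, thanks!'  manchester",
]

# One pattern that pulls the pieces we care about into named fields.
pattern = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2}).*?#\s*(?P<order_id>\d+).*?'(?P<comment>[^']*)'"
)

records = []
for line in raw_lines:
    match = pattern.search(line)
    if match:
        records.append(match.groupdict())  # a structured, machine-readable dict per line

print(records)
```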

2. Scarce data is often periodic. Traditionally, data moved in batches, was stored in databases, and was analysed at rest. Now data is continuous, fluid, and distributed. Insight comes from the stream, which means orientation, decision, action, and reaction all happen at the millisecond level. If our systems, processes, governance, and strategies can catch up to this new baseline state, we gain the advantage; if not, we simply incur the system cost.
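
A minimal sketch of decision-at-the-stream, assuming an invented event feed, window size, and threshold. The point is structural: the rolling statistic is computed and acted on as each event arrives, not in a later batch.

```python
from collections import deque
from statistics import mean

def event_stream():
    """Stand-in for a real message bus: yields (timestamp_ms, value) pairs."""
    for t, v in enumerate([12, 14, 13, 15, 41, 44, 43, 12, 13]):
        yield t, v

window = deque(maxlen=3)            # short rolling window over the live stream
BASELINE, MULTIPLIER = 14.0, 2.0    # invented numbers, purely illustrative

for timestamp_ms, value in event_stream():
    window.append(value)
    if len(window) == window.maxlen and mean(window) > BASELINE * MULTIPLIER:
        # React while the event is in flight, not in next quarter's report.
        print(f"t={timestamp_ms}ms: rolling mean {mean(window):.1f} looks anomalous, act now")
```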

3. Customer data in the old world was mostly claimed or stated. Behavioural data trumps these sources every time. We have long known that what a customer says she will do correlates poorly with what she actually does, but we swept this inconvenient truth under the rug. Tracking outcomes in behaviour moves us from abstract reporting to concrete, actionable intelligence.

4. Yesterday’s closed channels are now wide open. Once, a poor rating was known to you, the survey operator, and maybe a few friends. Today that rating is indexable content. Our responses are more than just data; they are channels, and worthy of more sophisticated analysis. A rant’s emotional content is richer and more informative than a rating on a Likert scale. Understanding this means adopting the attitude of the operative, not the lab technician.
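
As a toy illustration of why the free-text channel carries more signal than the score alone, the sketch below scores a rant with an invented word lexicon; real analysis would use a trained sentiment or emotion model rather than a hand-rolled list.

```python
# A toy lexicon invented for this example.
WEIGHTS = {"late": -1, "unhappy": -2, "never": -1, "worst": -3, "rude": -2,
           "great": 2, "thanks": 1, "love": 2}

def score(text: str) -> int:
    words = [w.strip(".,!?") for w in text.lower().split()]
    return sum(WEIGHTS.get(w, 0) for w in words)

likert_rating = 2  # all the old channel gave us: "2 out of 5"
rant = "Worst delivery ever, driver was rude and the parcel was late. Never again."

print("Likert rating:", likert_rating)
print("Rant score:", score(rant))  # strength of feeling, not just direction
print("Triggers:", [w.strip(".,!?") for w in rant.lower().split() if w.strip(".,!?") in WEIGHTS])
```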

5. Data extraction often meant sampling; now we read a universe of data. Population-level data means we can do away with a whole tranche of frequentist statistics and get to grips with the mathematics of the real world: probability, geometry, and optimisation. These approaches demand a new mindset based not on cultivating ‘significance’ but on trading off uncertainties.
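
One way to make “trading off uncertainties” concrete is a bandit-style allocation instead of a fixed-horizon significance test. The sketch below uses Thompson sampling over two hypothetical offers; the names and counts are invented for illustration.

```python
import random

# Hypothetical running tallies for two competing offers.
successes = {"offer_a": 30, "offer_b": 42}
trials    = {"offer_a": 400, "offer_b": 410}

def choose_offer() -> str:
    """Draw a plausible conversion rate for each offer from its Beta posterior
    and back the larger draw; uncertainty is traded off on every decision
    rather than waiting for a significance test to 'pass'."""
    draws = {
        name: random.betavariate(successes[name] + 1, trials[name] - successes[name] + 1)
        for name in trials
    }
    return max(draws, key=draws.get)

picks = [choose_offer() for _ in range(1000)]
print({name: picks.count(name) for name in trials})  # traffic shifts toward the likelier winner
```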

6. Our records used to aspire to the fantasy of a ‘single source of truth’, but links have undermined that hierarchy too. A little digging can identify rich connections across multimodal data sets, overlay and transpose records, and establish context. While legislators fuss over privacy, the real progress is being made in intent, context, and behaviour. We don’t need to know who you are to be able to predict what you’re likely to do next. Our policy and governance frameworks need to catch up to an accelerating technical horizon.
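
A small sketch of that overlaying step, with invented data: two sources are linked by an anonymous session key rather than an identity, and the join alone is enough to predict a likely next action.

```python
from collections import Counter, defaultdict

# Two hypothetical data sets keyed by an anonymous session id, not by a person.
page_views = [
    ("s1", "pricing"), ("s1", "checkout"),
    ("s2", "pricing"), ("s2", "support"),
    ("s3", "pricing"), ("s3", "checkout"),
]
outcomes = {"s1": "bought", "s2": "churned", "s3": "bought"}

# Overlay the two sources: what did sessions that reached each page go on to do?
next_action = defaultdict(Counter)
for session, page in page_views:
    next_action[page][outcomes[session]] += 1

# No name, email, or identity was needed to make this prediction.
print(next_action["checkout"].most_common(1))  # [('bought', 2)]
```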

7. The real-time richness of digital intelligence increases signal density. The dominant mode of traditional analytics was forensic, but better signals delivered with less system latency increase our chances of staying ‘left of the bang’. This lets action shift from a reactive to a proactive mode.

Intelligence is fuel for digital change because it is central to turning systems and processes into software. Once channels, products, services, and industries become software, they are available for optimisation and for the fast feedback loops that drive non-linear progress.
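
As a purely illustrative piece of arithmetic on why those loops compound: a process that improves by 1% per weekly feedback cycle grows to roughly 1.01^52 ≈ 1.68, a 68% gain over the year, while the same 1% improvement applied once per annual planning cycle remains a 1% gain.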


This article is by Jason Juma-Ross, former Digital Intelligence Lead for PwC.

 
