How Old Metrics May Strand You Strategically

Ever stop to consider how the ever-present changes going on around you make your own transformation easier?

John Hagel's relatively recent blog post describes the opposite.

In a world of accelerating change, one of our greatest imperatives is to “unlearn” – to challenge and ultimately abandon some of our most basic beliefs about how the world works and what is required for success.

A few years ago, Accenture noticed that many different companies had shifted their approach to strategy. Perhaps the availability of cheap, powerful computing capacity and Big Data is driving these changes in strategy development, as more organizations using technology find it easier to build consideration of the future into their present planning. Hagel, a long-time fan of scenario planning, would applaud these efforts too.

With the rise of automated business processes, analytics are incorporated automatically to enhance decision making, which may simultaneously compromise management's capacity to internalize all of these changes or to understand the underlying dynamics that traditional measures mask. Several articles offering case studies from different industries provided the basis of our discussion of transformation (see the bottom of the post for specific article links).

Successful organizations rely on their strategy to put forward action plans and realize new ideas while averting risk. Statesmen and managers alike find themselves in precarious places when they assume a trend will continue without change. Many statistical methods, and decision-makers' use of data, remain unchanged since 1954, when Darrell Huff first published How to Lie with Statistics. His timeless book describes very simply the perils of improperly using methods that were designed to capture, explain, and contextualize the significance of individual observations. The current transformations enabled by technology have done more to alter behavior than organizations seem to recognize. That's the path our discussion took.

The capability for insight

Prospective vs. retrospective cohort analysis and data mining techniques are far from new. But the increasingly sophisticated tools, and the ease with which ever greater volumes of data can be processed at speed, may hinder digestion as much as help it. Sure, the time to test alternative scenarios may be shorter, but how do you choose the model?

Do you begin with the intended outcome? The scientific method, and numerous models from multiple disciplines, make it possible to isolate factors, determine their significance, and estimate how alternative scenarios change the impact.

Similarly, the cross-pollination of data modeling from one discipline into multiple industries and use cases continues to shift management beliefs regarding the importance of specific factors and interactions in their processes. The perennial blind spot denies many organizations and their leadership the insight necessary to transform both their internal strategic thinking process and their business operating models. Last month's discussion of McDonald's and Coca-Cola illustrated how easily leadership misinterpreted fluctuating performance as temporary issues rather than recognizing structural factors. It's one thing to balance efficiency and effectiveness, quality and satisfaction; it's another to maintain the awareness of change and the insights necessary for your continued survival.

What-else thinking

“…both the digital world and the physical one are indispensable parts of life and of business. The real transformation taking place today isn’t the replacement of the one by the other, it’s the marriage of the two into combinations that create wholly new sources of value.”

The sudden availability of online data tracking gave many organizations the capability to understand user behavior differently. A whole new industry arose to focus on interpretation, creating new measures while also introducing new thinking about effectiveness in sales, customer service, training, etc. Metrics, once created to prove out a strategy or an idea, now leave many organizations vulnerable until they build up the capacity to understand this new thinking, let alone make the corresponding operational changes necessary to sustain their business.

This is not the story of companies that fail to adapt, like Kodak, which invented the digital camera only to retain its focus on film; but maybe it is. Reporting dashboards summarize specific indicators or activity associated with managing a process or business-relevant factors. The time and reporting-cost savings that result from automatic generation and ready access to information reinforce managers' and executives' existing thinking and leave little room for understanding wider changes that may be affecting their business. It wasn't long ago that analysts, and teams of them, spent their entire day pulling data and then calculating critical statistics detailing the effectiveness and efficiency of organizational activities to create reports for senior management. These efforts also made them accountable: ensuring the data was clean, and verifying whether outliers were real or a sign that the model failed to capture wider dynamics. I was once one of those managers. Today, automated reporting has eliminated many of the people capable of deeper data exploration, the very people who chose which data, which statistics, and what context was necessary to understand the situation. The second problem is that data shared graphically or in tables never tells the whole story, though infographics do try.

A good analyst is taught to review the data and results, and to double-check whether the model or the calculated results make sense. Sure, managers and executives may be quick to detect aberrations and then raise questions, but how many of them have the time, patience, or skills to test their ideas or intuitions? I imagine very few, if any. Where are these resources available, and how widely known are they to questioning executives? How might the dashboard provide additional information to help frame the results executives see, as they too seek to make sense of them?

Outside-in thinking

Established data-flow processes and automated reporting do deliver great advantages, but they may also explain why outsiders find it easier than insiders to create new business models. Where's the out-of-the-box thinking? And how can different data help?

Sure, it's easy to blame regulatory requirements, or compensation structures incentivized to focus on effectiveness and efficiency, for leaving little latitude to notice opportunity. For example, in the airline industry, route fares were once set by regulation. The minimum fares were intended to cover airlines' operating expenses, ensuring both passenger safety and access to air travel in locations where market forces might lead airlines to cut corners. Deregulation may have given airlines additional freedom, but many still manage their business using the same metrics that they report to the Department of Transportation. Likewise in healthcare, the imposition of new regulatory requirements came with new metrics that forced hospitals to focus on patient outcomes, not just their costs.

When executives' bottom-line focus reduces their thinking to an exercise in how operational corrections might maximize that number, they overlook other contexts. Data quality issues should surface quickly in most organizations, but what if another factor created the data issue? A misplaced data point, or inconsistent treatment of the content of a data field, rarely explains all aberrations in the results. Weather, for example, is a ubiquitous, exogenous variable. It may be directly responsible for observable data fluctuations, or indirectly responsible by affecting other, more directly connected factors, as when a snowstorm changes people's activity plans. I'm not familiar with any automated reporting system that will automatically footnote a data point with the arrival of a snowstorm. The reviewer is forced to remember, or manually add, the footnote for others, if that is even possible.
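That footnoting gap can be closed mechanically. As a minimal sketch, assuming a hypothetical daily metric and a hand-maintained event log (none of these names or numbers come from a real reporting product), an automated report could flag statistically unusual points and attach any known exogenous event:

```python
from statistics import mean, stdev

# Hypothetical daily metric and a hand-maintained log of exogenous events.
daily_sales = {"2024-01-08": 120, "2024-01-09": 118, "2024-01-10": 45,
               "2024-01-11": 122, "2024-01-12": 119}
events = {"2024-01-10": "snowstorm closed most of the metro area"}

def annotate_anomalies(series, events, z_threshold=1.5):
    """Flag points far from the mean and attach any known event as a footnote."""
    values = list(series.values())
    mu, sigma = mean(values), stdev(values)
    annotated = {}
    for day, value in series.items():
        note = None
        if sigma and abs(value - mu) / sigma > z_threshold:
            note = events.get(day, "unexplained anomaly: investigate")
        annotated[day] = (value, note)
    return annotated

report = annotate_anomalies(daily_sales, events)
# report["2024-01-10"] pairs the low reading with the snowstorm footnote
```

The z-score threshold here is arbitrary; the point is that the event log, not the metric itself, supplies the context a reviewer would otherwise have to remember.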

Bigger transformations to come

Bain believes there are significant implications for every organization in this combination of digital and physical innovations, which they call Digical. It's not easy to keep up with the corresponding behavioral shifts that result from these rapidly changing technological capabilities.

Focusing exclusively on efficiency and cost data helped management measure impact in the old era; though still necessary today, those measures may no longer suffice. Do you know how the social behaviors of your customers impact your bottom line? The technologies that support your business, such as your website or your cash register, miss the social behaviors evident on sites like Facebook, Twitter, Yelp, or even a customer's bank. Mapping the ecosystem and aligning the digital tracking data can now be supplemented with sensor data that may be anonymous to specific customers but can still inform you about movement and actions relevant to your engagement.

Naturally, as mentioned earlier bias plays a role in our inability to notice the significance of new data. The more we automate and configure systems to measure what we always knew mattered, the less likely we are to be able to recognize new data and its significance. What should you the analyst and you the executive do to counteract these factors?


  • Monitor the activity of smaller companies as they experiment, to learn what's most relevant.
  • Don't make assumptions. Exercise strategic intention to become more open, receptive, and curious about anomalies, and be more creative and persistent in identifying the drivers or possible factors.
  • Historically, metrics were an output designed to assess the validity of your strategy: did it work and/or deliver value? Now it's time for strategic thinking to treat metrics as an input. Statistics-enabled analysis tools, partnered with business knowledge and acumen, must be part of communicating to higher levels in the business.
  • Often we measure the wrong things because the incentives are misaligned. Am I paid based on my proven ability to produce widgets at specific levels, or to produce effective, sustainable results for the business, not just my business unit?
  • Computers are useless; they can only give you answers. For strategy, validating the questions matters, but so does taking the time and effort to determine even better questions.


Alternative case examples

Bain’s study and understanding of the state of “digical” transformation:
Fast Food

BIG DATA: Big Deal or Just Big Business?

Technology evolves, and for those of us who spend our lives adapting and endeavoring to keep up with the advancements, it's hard not to notice a curious underlying dynamic. Data, and our ability to calculate or manipulate it for greater meaning, is a little like the chicken-and-egg paradox. More of one begs more of the other, and yet we continue to ask which came first as if that question were important. For many of us, the interest in closing the uncertainty gap translates into a wish for more data. We expect it will help minimize the error or noise, because the present picture of relationships remains a little too ambiguous. The constraint in this case is often our own experience and knowledge.

Professionally, my own work warns against this unconscious bias. I simply ask people to imagine three dots and to line them up. I then remind them that all three dots are coincident data points in time, and ask whether this new piece of information has changed their vision of the dots. I then ask them to place the dots on an axis of time, and tell them that the dots now represent demand, growth, or a performance measure like ROI. Has the way they visualize the dots changed again? I explain that the context I've added snaps into their own experience to create an image that poses a new puzzle: what they see fights with their expectation, and they need more data to explain it.

The Economist, revisiting the Growth Matrix in 2009, put it another way. Bruce Henderson, credited with originating the framework, reportedly believed that “while most people understand first-order effects, few deal well with second- and third-order effects. Unfortunately, virtually everything interesting in business lies in fourth-order effects and beyond”.

Big Data, with the volume, variety, and velocity of its availability, now has several partners: real-time processing power, plummeting data storage costs and, lest we forget, simple access and manipulation tools placing the data in an ever-increasing number of users' hands. It is the number of people who now want to use the power of analytics that lends Big Data its influence, or at least that's what several Chicago Booth alums who shared their thoughts last week recognized.

On May 18, 2012, Chicago was busy preparing for the arrival of NATO delegates and support staff. As a result, many businesses strongly encouraged their employees to work from home, leaving the monthly strategy discussion homeless. We took advantage of the opportunity to launch our first virtual discussion, combining a real-time interaction platform (GroupSystems ThinkTank) with a conference call. Interest in the topic proved overwhelming, prompting us to open up a second lunch session following our usual early-morning time. The comments that follow represent a condensed version of the conversation. Note, links to the discussion prep video and the articles we encouraged participants to review in advance can be found at the bottom of this post. Also, full transcripts of the automated output from ThinkTank are available to those interested; just drop me a note.

What’s the deal

It's a toss-up whether mobility or big data has captured the imagination of business media more. The duel isn't the point; both are driving forces, amplifying a need for critical thinking skills that were already in short supply. Data reduction may be an emerging competency. As the earlier reference to Henderson points out, the questions you are trying to answer don't get any easier just because you suddenly have access to more data. What to do with this new wealth of rich information is the bigger question and challenge, not merely for business but for consumers as well.

In the process of generating the following list of examples, the interaction on ThinkTank let participants also provide links, raise new questions, and add comments.

  • Twitter
  • Telephone call records
  • Smart Grid and real-time electricity meter data
  • Nike+
  • Scanner data
  • Comments from call centers
  • Providers' case and disease management notes
  • EMR records
  • Geospatial (Google Earth, Navistar, etc.)
  • Mobile and GPS
  • Government databases (big data in an unstructured/non-uniform sense)
  • The quantified self
  • Output or processed data from SAS and other enterprise databases
  • Loyalty programs
  • Amazon purchase history
  • Moneyball (big data in baseball)
  • The new NSA data warehouse in Utah
  • QR codes
  • The Internet of Things
  • The London Datastore, created by the Greater London Authority (GLA; Chicago has a similar initiative), offering citizens open access
  • Netflix movie recommendations
  • SAP's HANA usage (profiled in the report on their Sapphire Conference)

Twitter, for example, has evolved in ways that surprised its founders and has launched a number of new businesses with very unusual purposes. As one article pointed out, routing the data can be as important as tallying it, as illustrated by Procter & Gamble's practice of funneling social media conversations to the appropriate person's screen for monitoring and response. In other words, sometimes just knowing something happened is enough to be meaningful; the data doesn't necessarily have to be mathematically manipulated to derive value.

Persistent challenges remain in dealing with the enormous variety of formats in which data are presented; some sources are difficult to analyze, and their pages-long data dictionaries often include details about collection. Add to that the realization that data is not just numbers anymore. The automatic semantic annotation required to make sense of it has also entered a new era.

Facebook, LinkedIn, Google+, Pinterest, and similar social media sites fit in here as well. All offer a richness of information, much of it real-time, that can be monitored, mined, and used to drive decisions and actions. Customer-center conversations, or customer audio recordings, transcribed to text and then subjected to text analytics, are helping improve performance management practice, allowing for campaign conversion tracking, etc.
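As a rough sketch of how transcribed conversations become countable signals, the following uses hand-built keyword lexicons and invented transcripts; real call-center text analytics would rely on trained models rather than these hypothetical word lists:

```python
import re
from collections import Counter

# Hypothetical transcribed call-center snippets.
calls = [
    "the agent was helpful and resolved my billing issue quickly",
    "terrible wait time and the problem is still not fixed",
    "great service, though the website checkout kept failing",
]

# Hand-built tag lexicons; assumptions for illustration only.
tags = {
    "praise":    {"helpful", "great", "quickly"},
    "complaint": {"terrible", "failing", "fixed", "wait"},
    "billing":   {"billing", "checkout"},
}

def tag_calls(calls, tags):
    """Count how many transcripts mention each tag's keywords."""
    counts = Counter()
    for text in calls:
        words = set(re.findall(r"[a-z]+", text.lower()))
        for tag, lexicon in tags.items():
            if words & lexicon:
                counts[tag] += 1
    return counts

topic_counts = tag_calls(calls, tags)
```

Even this crude keyword matching turns free text into numbers a dashboard can track, which is the essence of the performance-management use described above.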

The convergence of these evolving technologies with all of this new data is as overwhelming as the scanner data that was historically available and stored when few had the resources, capability, or interest to mine it. That era has passed, and with it new questions and promises arise.

Can we better understand and use consumer sentiment? For instance, do retail customers use more electricity, phone service, etc., in a way that correlates with changes in the economy, or with the weather? Or consider SAP's HANA, which now makes cancer DNA genome analysis possible in minutes.

FB and Twitter may be drawing more attention, but those who bring together the different streams, and who mine old sources such as scanner data more deeply, are also causing quite a stir. Target found out that a teenage girl was pregnant before her dad did using methods such as these.

Add RFID, and imagine the benefit to stores of knowing the quantity of each SKU they currently have. Could they coordinate sharing of product between their brick-and-mortar stores?

Perhaps the focus on Twitter, etc., rather than scanner data comes from hoping that trending sentiment will precede purchases and can be used to predict them, rather than merely log them as scanner data does. One participant shared that vehicle search data on their company's website is predictive of sales.

It looks like QR codes might achieve the same benefits as RFID, now that smartphones are becoming ubiquitous.

Geospatial data, according to McKinsey's recent report on Big Data as the next big innovation, accounts for billions in quantified time savings just from helping consumers avert traffic.

Several other kinds of behavioral nudging based on real-time feedback are now possible; Nike is exploring and furthering this automatic feedback. Check this article for more brands using the quantified self, and for more information.


SAP's HANA, Google's BigQuery, Splunk, Hadoop and NoSQL databases, Tableau, Tibco Spotfire, Omniture, Pentaho (open source BI), Amazon's Web Services suite (more of a platform), cloud computing platforms, and data visualization tools make data preparation easier for most business users, allowing them to focus more on analysis and to develop insights in combination with machine learning tools (neural networks, support vector machines, natural language processing, etc.).

Decision making: can and does Big Data make more accuracy possible?

It offers higher granularity, like Target's ability to create coupon books customized to individual households. Or it integrates GPS-level data, e.g. texting coupons to customers while they're standing next to a certain store. Or second- and third-order analysis: correlating Target's sales with weather data; creating real-time personalized coupons; identifying trend-setters among the customer base to influence trend-followers' coupons.

The downside? Detecting or separating out spammers, or people paid to express a certain sentiment, from these data sets. Totally! Like those girls paid to say great things about clothes on Facebook: not necessarily analyzing big data, but using the platform.

Greater real-time evidence can reduce risk and confirm whether assumptions in product/service development and marketing are on or off target.

It's hard to appreciate the analytics without the qualitative context or understanding, but maybe some strange new ideas can come out of the data, like the "diapers next to beer" epiphany. Data needs drivers to make it meaningful, as in cause and effect, in part because qualitative data is harder to analyze.
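The "diapers next to beer" epiphany comes from market-basket analysis, which at its simplest is just counting co-occurrence. A sketch with made-up transactions (the support floor and the items are arbitrary assumptions):

```python
from itertools import combinations
from collections import Counter

# Hypothetical store transactions (the classic "diapers and beer" illustration).
transactions = [
    {"diapers", "beer", "chips"},
    {"diapers", "beer"},
    {"diapers", "wipes"},
    {"beer", "chips"},
    {"diapers", "beer", "wipes"},
]

def pair_rules(transactions, min_support=0.4):
    """Return support and confidence for item pairs above a support floor."""
    n = len(transactions)
    item_counts = Counter(item for t in transactions for item in t)
    pair_counts = Counter(frozenset(p) for t in transactions
                          for p in combinations(sorted(t), 2))
    rules = {}
    for pair, count in pair_counts.items():
        support = count / n                    # fraction of baskets with both
        if support >= min_support:
            a, b = sorted(pair)
            rules[(a, b)] = (support, count / item_counts[a])  # confidence of a -> b
    return rules

rules = pair_rules(transactions)
```

The numbers only say the items co-occur; whether that is a display opportunity or a coincidence is exactly the qualitative judgment the paragraph above calls for.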

Is quant vs. qual, or the social science approach to data collection, really that different from the scientific data analysis approach? Both approaches seek to explain cause and effect, or the relationship between a stimulus and a predicted response. The problem is that too many people will extend a model beyond its capacity. Claiming "the data said so" lets people off the hook and avoids responsibility for decision-making.

Suggested tips include: avoid extending a model beyond its capacity, and understand and differentiate descriptive and predictive statistics. Likewise, be wary of finding trends that don't exist (e.g. "data mining" or "straws that look like needles") and of confusing correlation with causality.
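"Extending a model beyond its capacity" can be made concrete: fit a straight line to the early, nearly linear portion of a saturating process, then extrapolate. The curve and the numbers below are invented for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def saturating(x):
    """The 'true' process: growth that levels off (a made-up curve)."""
    return 100 * x / (x + 5)

train_x = [1, 2, 3, 4]                      # only early, nearly linear data
train_y = [saturating(x) for x in train_x]
a, b = fit_line(train_x, train_y)

in_range_err = abs((a + b * 4) - saturating(4))        # interpolation: small
extrapolated_err = abs((a + b * 20) - saturating(20))  # extrapolation: large
```

Near the training data the line stays close; twenty periods out it misses badly, because a linear model never had the capacity to represent saturation. The "trend will continue" assumption fails the same way.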

Perhaps cross-validation by trained analysts can help avoid these. There is a danger in expecting tools to automatically extract value from large datasets; we still need to ensure good analysis, disciplined hypothesis generation, etc.

The data, even when analyzed, does not represent the decision.  This is true with small and big data.



  • Big data is here to stay – need to figure out how to use it effectively.
  • I liked the point that lots of data is around and that people just don’t know what to do with it.  The best BIG DATA process or engine in the world still won’t create the insights that are needed.
  • Corporate culture is a huge factor–the problem is not availability of data, but commitment and focus of corporate leaders to shape a culture that moves the organization in that direction.
  • Big data is here. It's a tool and, like any other, it's the latest and greatest on the block, with a bit too much hype. But it has definite value in providing a stronger qualitative base for identifying trends and activities.
  • My realization  is that, once again, the technology is interesting, but it is the corporate culture and will that will matter. The culture and vision lead; the strategy and models follow.
  • Sharing the questions with a wider audience confirms concerns, and clearly there are lots of assumptions that need to be played out. There is a large dark side that we still don't understand, but also positives and opportunities for real-time decisions.
  • Big Data! The piles get higher and  higher and wider and wider…to what purpose? That implies the need to “mine” the data, reduce it and subject it to analysis before it can be made useful.
  • Big data will revolutionize business but it is not strategy, potential for a lot of false positives .
  • The wise use of big data offers a huge opportunity for developing differentiating strategies and for finding new product/service needs.
  • “Big data” is the current term for things that have existed for a long time.  All types and sizes of organizations can benefit from big data if they recognize the importance of the human component (not just the data and software) and have specific objectives in mind before starting.
  • Much of the expertise about analytics developed over many decades still applies, and there are new dimensions to integrate and understand because of the availability of the technologies and data.
  • Everyone on the line has experience with Big data, so I don’t think it’s so scary.  Most people have business perspectives, wanting to teach the Analysts that their conclusions need to be driven by business needs.   My comment is that as leaders, and those trained with some behavioral awareness through business school, it is _OUR_ responsibility to try and massage the analysts towards an enthusiasm for our world view…;-)
  • The human side of utilizing the technology and expertise is just as challenging as ever. (Cognitive biases, communication skills, influencing skills) Garbage in – garbage out is a big risk without proper attention and skill in applying the technology and in communicating.  The data, even when analyzed, does not represent the decision.  This is true with small and big data.

In closing, let me return to my observations about the limitations of developing strategy rooted in expectations of the experience-curve relationship. The frame with which you approach the problem often has far more bearing than the data, your analysis, or the tools. In June we plan to look at some of the assumptions around growth as the ultimate strategy.

Please share your responses, or continue to post links for others, as Big Data continues to be newsworthy and its impact and influence continue to unfold.

Articles and links:

We suggest an optional short 5-minute video tutorial that EMC produced to explain what Big Data is.
The following are required advance reading.

1. IBM’s Institute for Business Value, in collaboration with MIT Sloan Management Review  2010 research findings
Analytics: The new path to value: How the smartest organizations are embedding analytics to transform insights into action

2. Strata keynote: a short 7-minute video by Google's Digital Marketing Evangelist Avinash Kaushik. A bit irreverent, bordering on over the top, but never a boring effort to help us understand the problems and the approach undertaken by Google.

Big Data Imperative 
March 2012

3. Tom Davenport’s Culture of Analytics

April 5, 2011 

4. Inside P&G’s digital revolution
McKinsey Quarterly November 2011
overseeing the large-scale application of digital technology and advanced analytics across every aspect of P&G’s operations and activities—from the way the consumer goods giant creates molecules in its R&D labs to how it maintains relationships with retailers, manufactures products, builds brands, and interacts with customers. The prize: better innovation, higher productivity, lower costs, and the promise of faster growth…

One more optional overview

The age of big data
NYTimes, Feb 2, 2012