BMA Media Group
Business, Bias, and How Not to Fall in Love with Your Metrics

June 13, 2016

Understanding your data is contingent on how well you know your business: your goals, your customers, your product and how it fits into your chosen target markets, your client demographics and segmentation, their buying cycles; the list goes on.
Knowing yourself and your customers is the hard work; looking at data and interpreting it based on often arbitrary, universally accepted variables and indicators is easy. Henry Ford once said that "Thinking is the hardest work of all, which is why so few people do it." So before you look at your data and attempt to gain insight from it, do the hard work and study your business. Analysis without understanding and strategy will yield conclusions of no real value or meaning.
Once you have studied your business, begun to understand how it works (or doesn't), and begun to formulate a strategy for reaching your business goals, only then are you ready to start understanding your data. Only when you understand yourself (your goals, your mission, your purpose, your culture) and therefore your customers, competitors, and potential partners can you begin analyzing your data with any real understanding or confidence.
Understanding your business is not the only prerequisite to understanding your data; you also need to remind yourself of basic statistics, and of your innate inability to practice them unaided.
In his book Thinking, Fast and Slow, the psychologist and Nobel laureate Daniel Kahneman discusses how most people, even statisticians, are not naturally good at statistics. It is not that people cannot be good statisticians; it is that unless we consciously shift into deliberate, analytical thinking (what Kahneman calls System 2), we practice poor statistical reasoning. Kahneman argues that we naturally rely on heuristics (the fast, intuitive mode of thinking he calls System 1) that work well in small-scale, everyday contexts, but that break down as the numbers grow and larger volumes of data strain our intuition.
With these two factors in mind (understanding your business, and acknowledging your tendency to default to System 1 rather than System 2), let's look at an example. Remember that your data cannot be considered in a vacuum: you must analyze it contextually, based on your target audience, the specific action you want that audience to take, and the business goals you want to achieve.
If you send a focused product email (Email Direct Marketing, or EDM) to a broad audience, you should expect a lower-than-average open rate. But the opens you do get are likely to be valuable, because the people who opened the email have shown interest and are more likely to be near-term sales opportunities. Your email metrics will not look impressive statistically (the open and click rates will be low), but that does not mean your email was unsuccessful, or that it won't help you sell.
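To make the arithmetic behind these rates concrete, here is a minimal sketch. All figures are hypothetical, chosen only to illustrate a broad send; they are not real campaign data or benchmarks.

```python
# Hypothetical EDM report numbers for a broad send -- illustrative only.

def open_rate(opens, delivered):
    """Opens as a fraction of emails delivered."""
    return opens / delivered

def click_through_rate(clicks, delivered):
    """Clicks as a fraction of emails delivered."""
    return clicks / delivered

# A focused product email sent to a broad list of 10,000 contacts.
delivered = 10_000
opens = 900    # assumed figure
clicks = 120   # assumed figure

print(f"Open rate:  {open_rate(opens, delivered):.1%}")            # 9.0%
print(f"Click rate: {click_through_rate(clicks, delivered):.1%}")  # 1.2%

# The rates look weak next to typical benchmarks, but the 120 clicks are
# self-selected, interested prospects -- the list a salesperson follows up on.
```

The point of the sketch is that the percentage can look poor while the absolute count of interested contacts is exactly the output that matters for follow-up.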
There is no way to judge the success of an email campaign from its report metrics alone. Success can only be determined after you have followed up with the people who showed interest: those who opened the email, clicked through, or subscribed.
Again, when analyzing your EDM metrics, keep in mind that an EDM's purpose is to cast a broad net over a vast space. Its purpose is not to close sales; closing sales is the salesperson's job. The EDM's job is to get the message out and build desire and traction for your product or service.
A common problem with EDMs is that the people sending them artificially make the report metrics look good by mailing super-curated lists they know will yield impressive numbers. While this may impress an unwitting client or manager, it is counterproductive: the results are artificially inflated, and they can quietly harm the growth of your business by hiding how your target audience actually responds.
This artificial inflation of metrics (outputs) through over-curation of lists (inputs), most often done to secure client retention and job security, is a classic example of Goodhart's Law.
Goodhart's Law, named after the economist Charles Goodhart, who originated it, states: "When a measure becomes a target, it ceases to be a good measure." Rory Sutherland, an advertising executive and writer, expounds on it: "The single-minded, direct pursuit of any numerical measure, though it may be valuable for a time, will lead to harmful effects. Goodhart's Law broadly states 'any metric which becomes a target loses its value as a metric.'" Sutherland also likes to rephrase the law as "any metric which becomes a target becomes a rubbish metric."
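The list-curation problem above can be sketched in a few lines. The numbers are entirely hypothetical, invented only to show how pruning the send list down to likely openers inflates the open-rate metric while shrinking the campaign's actual reach.

```python
# Illustrative only: how over-curating the send list (the input) inflates the
# open-rate metric (the output). All numbers are hypothetical.

def campaign(delivered, opens):
    """Summarize a send as delivered count, opens, and open rate."""
    return {"delivered": delivered, "opens": opens, "open_rate": opens / delivered}

broad = campaign(delivered=10_000, opens=900)   # honest broad send
curated = campaign(delivered=1_000, opens=400)  # list pruned to likely openers

print(f"Broad:   {broad['open_rate']:.0%} open rate, "
      f"{broad['opens']} interested contacts")    # 9% open rate, 900 contacts
print(f"Curated: {curated['open_rate']:.0%} open rate, "
      f"{curated['opens']} interested contacts")  # 40% open rate, 400 contacts

# The curated send "hits the target" with a 40% open rate, yet it surfaced
# fewer than half as many interested prospects -- the metric improved while
# the business outcome got worse.
```

Once the open rate becomes the target, the easiest way to improve it is to shrink the denominator, which is precisely the failure mode Goodhart's Law describes.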
When we look at EDM metrics, or any metric for that matter, in light of Goodhart's Law, it is clear that we must be careful when interpreting output metrics whose inputs are not entirely independent of our control. A prescription for good analysis would be to forget about our goals when developing our metrics, and to forget about our metrics when formulating our goals. Ideally, our analyses would be set up with a double-blind methodology, where the people creating the metrics would not know the targets, and the people who know the targets would not know how the metrics were formulated.
While it is possible to run a double-blind analysis as described above, most companies lack the resources, personnel, or time required to take these bias-inhibiting steps. But the one thing every company can do is stay aware of the tendencies and inherent biases that often lead us to make less-than-optimal decisions.