The Dangers of Data-Driven Marketing

Marketing has gone digital, and we can now measure our efforts like never before. As a result, marketers have fallen in love with data. Head over heels in love—to the point where we want data to drive our marketing, instead of people, like you and me. I think this has gone too far.

I'm a big proponent of data-driven marketing. In this article, Ezra Fishman uses semantics to argue against the term, but what he is really getting at is the need to go beyond just the data. As I wrote in Data + Insight = Action, data all by itself cannot create actionable outcomes.

Data-informed marketing
Instead of focusing on data alone, data-informed marketing considers data as just one factor in making decisions. We then combine relevant data, past experiences, intuition, and qualitative input to make the best decisions we can.
Instead of poring over data hoping to find answers, we develop a theory and a hypothesis first, then test it out. We force ourselves to make more gut calls, but we validate those choices with data wherever possible so that our gut gets smarter with time.

This is what I was trying to articulate in my article. Being an excellent data-driven marketing organization takes a little bit of "science" and a little bit of "art" to determine the best course of action. When a data scientist alone is driving your organization, years of business experience go unused that could help him understand even further what the data is saying.

Most times, when a data scientist is off on their own, it takes an inordinate amount of time to come up with a conclusion, mostly because they lack context: how the business generates the data, how the strategy shapes that data, and how a customer being underserved may even be an intentional outcome.

The ease of measurement trap
When we let data drive our marketing, we all too often optimize for things that are easy to measure, not necessarily what matters most.
Some results are very easy to measure. Others are significantly harder. Click-through rate on an email? Easy. Brand feelings evoked by a well-designed landing page? Hard. Conversion rate of visitors who touch your pricing page? Easy. Word-of-mouth generated from a delightful video campaign? Hard.

Right on!  Of course, the organizations that take the easy way out are ones I would not consider data-driven anyway.  KPIs are a great tool, but they can be deadly.  There are usually so many moving parts in the business and the data it generates that top-level KPIs can look fine, yet drilling down into performance from a customer perspective may reveal some very scary trends that should cause alarm.  A non-data-driven company, however, will continue with its strategy because the KPIs look good (hello RIM/BlackBerry).

The local optimization trap
The local optimization trap typically rears its head when we try to optimize a specific part of the marketing funnel. We face this challenge routinely at Wistia when we try to increase the conversion rate of new visitors. In isolation, improving the signup rate is a relatively straightforward optimization problem that can be "solved" with basic testing.
The problem is, we don't just want visitors to sign up for our Free Plan. We want them to sign up for our Free Plan, then use their account, then tell others how great Wistia is, then eventually purchase one of our paid plans (and along the way generate more and more positive feelings toward our brand).

This can be combined with the previous point.  When analytics is only seen from a high level, you hear simple statements like "increasing the number of signups, which will flow down the funnel at our current rates, will increase conversions."  Nothing could be further from the truth.  To increase anything, there needs to be an additional action, such as advertising to a different group of individuals or offering an incentive that boosts signups.  The issue with this thinking is that these aren't the same individuals who are converting in your current funnel.  The proper strategy is to profile the customers who convert and target prospects like them, which may actually decrease the size of the funnel if done right.
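A toy back-of-the-envelope sketch of this point, with made-up numbers: incentive-driven signups that convert downstream at a fraction of the organic rate inflate the top of the funnel without moving paid conversions much.

```python
# Toy funnel math (hypothetical numbers): adding incentive-driven signups
# that convert downstream at a much lower rate barely moves paid customers.

def paid_customers(segments):
    """segments: list of (signups, signup_to_paid_rate) tuples."""
    return sum(signups * rate for signups, rate in segments)

# Baseline: 1,000 organic signups converting to paid at 5%.
baseline = paid_customers([(1000, 0.05)])

# After an incentive: 500 extra signups, but they convert at only 0.5%.
with_incentive = paid_customers([(1000, 0.05), (500, 0.005)])

signup_growth = 500 / 1000
paid_growth = (with_incentive - baseline) / baseline
print(f"signups +{signup_growth:.0%}, paid customers +{paid_growth:.0%}")
# prints: signups +50%, paid customers +5%
```

Signups grew 50%, paid customers grew 5%: the headline funnel metric improved tenfold more than the metric that actually matters.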

The data quality trap
We are rarely as critical of our data as we ought to be. Consider, for example, A/B tests, which have become the gold standard for marketing experimentation. In theory, these tests should produce repeatable and accurate results, since website visitors are assigned randomly to each page variant.
In practice, however, there are lots of ways even the simplest A/B tests can produce misleading results. If your website traffic is anything like ours, visitors come from a variety of sources: organic, direct, referral, paid search, and beyond. If one of those sources converts at a much higher rate than others, it's easy to get skewed results by treating your traffic as a single, uniform audience.

One should rarely take the conversion or redemption results from an A/B test at face value without digging into the data.  Making sure all segments are driving the results is key.  Don't take for granted that the random assignment produced truly comparable groups.  Ensure there was proper representation from each segment of the business, and identify any further tests suggested by different behaviors within the segments.
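A small illustration of how this skew (a Simpson's-paradox-style mix effect) can play out; the counts below are invented for the example. Variant B wins in the aggregate only because it happened to receive more high-converting paid traffic, while variant A actually wins within every segment.

```python
# Hypothetical A/B test counts, broken down by traffic source.
from collections import defaultdict

results = {
    # (variant, source): (visitors, conversions)
    ("A", "organic"): (800, 40),   # 5.0%
    ("B", "organic"): (400, 18),   # 4.5%
    ("A", "paid"):    (200, 40),   # 20.0%
    ("B", "paid"):    (600, 114),  # 19.0%
}

# Aggregate rates: what a naive dashboard shows.
totals = defaultdict(lambda: [0, 0])
for (variant, _), (visitors, conversions) in results.items():
    totals[variant][0] += visitors
    totals[variant][1] += conversions

for variant, (v, c) in sorted(totals.items()):
    print(f"{variant} overall: {c / v:.1%}")   # A 8.0%, B 13.2%

# Per-segment rates: the check recommended above.
for (variant, source), (v, c) in sorted(results.items()):
    print(f"{variant} {source}: {c / v:.1%}")  # A beats B in both segments
```

The aggregate says ship B; the segment breakdown says A is the better page and B's number is an artifact of traffic mix. Only the drill-down catches it.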

Data vigilance
As marketers, we should continue to explore new and better ways to harness the power of data, but we also must remain vigilant about becoming overly reliant on data.
Data can be a tremendous source of insight. Harness that. But don't pretend it's something more. And definitely don't put it in charge of your marketing team.

This reminds me of when I was a product manager and we would receive RFPs asking us to prove we were the right company to supply our product.  Sometimes the requirements were such that we wondered if the company wanted humans to keep working for them at all.  I would jokingly refer to some of these as the "automated manager."  It seemed companies wanted to press a button and have a system do everything for them.  This is the trap Fishman is referring to.  Humans have great insight.  Humans are the "art" in the equation for actionable outcomes, and that art is every bit as important as the "science."

Source: http://wistia.com/blog/data-informed-marke...

To Benefit From Big Data, Resist The Three False Promises

From Forbes.com:

Gartner recently predicted that “through 2017, 60% of big data projects will fail to go beyond piloting and experimentation and will be abandoned.” This reflects the difficulty of generating value from existing customer, operational and service data, let alone the reams of unstructured internal and external data generated from social media, mobile devices and online activity.

Yet some leading users of big data have managed to create data-driven business models that win in the marketplace. Auto insurer Progressive, for instance, uses plug-in devices to track driver behavior. Progressive mines the data to micro-target its customer base and determine pricing in real time. Capital One, the financial services company, relies heavily on advanced analytics to shape its customer risk scoring and loyalty and offer optimization initiatives. It exploits multiple types of customer data, including advanced text and voice analytics.

I believe what most people miss when they hear these success stories is the amount of human capital thrown at these problems.  Hundreds of data scientists create thousands of models, of which very few are actually incorporated into final production.  The Gartner stats ring true because most companies don't have those kinds of resources to throw at the problem, and many wouldn't realize an ROI even if they did.

Promise 1: The technology will identify business opportunities all by itself.

This is the direction the technology is moving, but it is not there yet.  The technology enables a group of data scientists to identify the opportunities; it's not magic.

Promise 2: Harvesting more data will automatically generate more value. 

The temptation to acquire and mine new data sets has intensified, yet many large organizations are already drowning in data, much of it held in silos where it cannot easily be accessed, organized, linked or interrogated.

More data does not mean better ROI on your initiatives.  In fact, most companies don't take advantage of the data they already have to generate the maximum ROI.  I always use a rule of thumb when purchasing new technology: if your organization doesn't believe it is already using the technology it currently possesses to its fullest, then it's not time to move on to something better.  Your current technology should be the thing preventing you from innovating; if it's not, then you either have the wrong technology or the wrong people.

Promise 3: Good data scientists will find value for you. 

To profit consistently from big data, you need an operating model that deploys advanced analytics in a repeatable manner. And that involves many more people than data scientists.

Remember, data + insight = action.  Actionable data is a combination of art and science.  Data scientists provide the science; however, you need a team with the business acumen to provide the insight, and that is the art.  Data scientists will surface a lot of questions you never thought to ask of your data, but they cannot provide a solution in and of themselves.

Remember to walk before you run when it comes to data initiatives.  It's always good to have a goal of using "big data" to improve your business and create ROI where it didn't previously exist; however, the journey to "big data" is more important.  These "big data" success stories did not happen overnight.  They happened because advanced companies were butting up against the limits of their current technology and were ready to take the next step.

Source: http://www.forbes.com/sites/baininsights/2...

5 habits of effective data-driven organizations

Size doesn’t matter, but variety does. You would think that a data-driven organization has a lot of data, petabytes of data, exabytes of data. In some cases, this is true. But in general, size matters only to a point. For example, I encountered a large technology firm with petabytes of data but only three business analysts. What really matters is the variety of the data. Are people asking questions in different business functions? Are they measuring cost and quality of service, instrumenting marketing campaigns, or observing employee retention by team? Just getting a report at month end on profits? You’re probably not data driven.

As I have articulated previously, a data-driven organization is a culture; it is not about toolsets or data scientists.  It doesn't matter how much data you have; it matters that you have enough data to make an informed business decision.

Everyone has access to some data. Almost no one has access to all of it. There are very few cultures where everyone can see nearly everything. Data breach threats and privacy requirements are top of mind for most data teams. And while these regulations certainly stunt the ability of the company to make data available, most data-driven companies reach a stage where they have developed clear business processes to address these issues.

It comes down to what data is important for each business unit.  Most business units don't need credit card data or other PII about individual customers.  The key is understanding what data will drive better business decisions in each unit and getting those units that data in a consumable format.

Data is all over the place. One would think that the data is well organized and well maintained — as in a library, where every book is stored in one place. In fact, most data-driven cultures are exactly the opposite. Data is everywhere — on laptops, desktops, servers.

This can be dangerous.  Remember, there is nothing worse than fighting about the validity of data.  If operating units all have their own sets of data, it becomes a competition over whose data is right instead of a discussion of what decision to make based on the information at hand.

Companies prize insights over technology standards. Generally, the principal concern of people in data-driven businesses is the ability to get the insight quickly. This is a corollary of point #3. Generally, the need to answer a question trumps the discussion of how to best answer it. Expediency wins, and the person answering the question gets to use the tool of their choice. One top 10 bank reported using more than 100 business intelligence technologies.

I really like this, as long as you don't fall into the trap I discussed above.  Time spent forcing people to adjust to a technology instead of producing insight is lost time.  Getting a huge organization onto one platform is problematic at best and a disaster at worst.  If analysts can work in tools they have mastered, they will get to insights faster, and faster insight is a major competitive advantage.

Data flows up, down, and even side to side. In data-driven companies, data isn’t just a tool to inform decision makers. Data empowers more junior employees to make decisions, and leaders often use data to communicate the rationale behind their decisions and to motivate action. In one data-driven company, I observed a CEO present a 50-slide deck to his full team, and almost all of those slides were filled with charts and numbers. Most fundamentally, data empowers people to make decisions without having to consult managers three levels up — whether it’s showing churn rates to explain additional spend on customer services vs. marketing or showing revenues relative to competitors to explain increased spend on sales.

The old thinking was to create a business intelligence team that would provide the data for the whole organization.  Instead, each operating unit should be in charge of its own data analytics.  A centralized business intelligence team should still exist to provide checks and balances, but operating units are best positioned to answer their own questions; they know their business best.  Democratizing data throughout the organization is key to being a data-driven organization.

Source: http://venturebeat.com/2015/04/12/5-habits...

What to Do When People Draw Different Conclusions From the Same Data

Walter Frick writes for HBR:

That famous line from statistician William Edwards Deming has become a mantra for data-driven companies, because it points to the promise of finding objective answers. But in practice, as every analyst knows, interpreting data is a messy, subjective business. Ask two data scientists to look into the same question, and you’re liable to get two completely different answers, even if they’re both working with the same dataset.
So much for objectivity.
But several academics argue there is a better way. What if data analysis were crowdsourced, with multiple analysts working on the same problem and with the same data? Sure, the result might be a range of answers, rather than just one. But it would also mean more confidence that the results weren’t being influenced by any single analyst’s biases. Raphael Silberzahn of IESE Business School, Eric Luis Uhlmann of INSEAD, Dan Martin of the University of Virginia, and Brian Nosek of the University of Virginia and Center for Open Science are pursuing several research projects that explore this idea. And a paper released earlier this year gives an indication of how it might work.

I believe it is best practice to have multiple analysts at least devise their own methodologies for a given problem.  In fact, I always like to take a crack at it myself when the problem is particularly difficult, just so I have an idea of what the data looks like and how certain variables are influencing the results.

I think too many executives are unwilling to dig into the data and work a problem themselves.  It is very important to have a deep understanding of the data issues so that, as an executive, you can make better decisions on how to guide the team.  Many times the answer is not a deployable model but a data mining exercise that gleans some testable hypotheses.

Though most companies don’t have 60 analysts to throw at every problem, the same general approach to analysis could be used in smaller teams. For instance, rather than working together from the beginning of a project, two analysts could each propose a method or multiple methods, then compare notes. Then each one could go off and do her own analysis, and compare her results with her partner’s. In some cases, this could lead to the decision to trust one method over the other; in others, it could lead to the decision to average the results together when reporting back to the rest of the company.
“What this may help [to do] is to identify blind spots from management,” said Raphael Silberzahn, one of the initiators of the research. “By engaging in crowdsourcing inside the company we may balance the influence of different groups.”

I do believe in internal "crowdsourcing."  The minute tough problems start to be outsourced, the company loses the deep context its analysts and business owners bring to the data, context that outside analysts could never replicate.  I truly believe analytics is art and science, but too often the art is underappreciated.

Source: https://hbr.org/2015/03/what-to-do-when-pe...