Big Data: How Netflix Uses It to Drive Business Success

Bernard Marr writes how Netflix uses data to fuel their business:

Netflix is said to account for one third of peak-time internet traffic in the US. Last year it announced that it had signed up 50 million subscribers around the world. Data from all of them is collected and monitored in an attempt to understand our viewing habits. But its data isn’t just “big” in the literal sense. It is the combination of this data with cutting edge analytical techniques that makes Netflix a true Big Data company.

Netflix is a fascinating company.  They built a business model that put a giant industry, retail movie rentals, out of business, then pivoted to streaming before other companies could out-innovate them.  They are constantly ahead of the curve in recognizing the next new technology and digital strategy.  They recognized early that original content was also a key to success, so they are now pivoting to beat HBO at its own game.

More recently, Netflix has moved towards positioning itself as a content creator, not just a distribution method for movie studios and other networks. Its strategy here has also been firmly driven by its data – which showed that its subscribers had a voracious appetite for content directed by David Fincher and starring Kevin Spacey. After outbidding networks including HBO and ABC for the rights to House of Cards, it was so confident that it fitted its predictive model for the “perfect TV show” that it bucked convention of producing a pilot, and immediately commissioned two seasons comprising 26 episodes.

This is how data-driven organizations behave.  They look at their customers and use data to determine the optimal next move.  All their strategy and tactics are based on what they know about their customers and what those customers will do next.  Too often, organizations obsess over what other companies are doing, regardless of what their own data is telling them.  They copy their competitors for fear of missing out on opportunities.

The question I always ask is, "how do you know what the other guys are doing is working?"  What you see as a threat may actually be a disaster, because they haven't set up the correct means to measure performance or are looking at the wrong KPIs.  Worse yet, they may be attracting an entirely different customer than the one you are trying to target.

A data-driven organization looks at its data and reacts.  Netflix, I am assuming, saw that many of its users were binge-watching TV series as soon as they came out.  I'm sure this started with great content like Breaking Bad and Mad Men.  Since the majority of time spent on Netflix is binge-watching TV, they saw an opportunity to create this content on their own.  They looked at their own data and saw the opportunity to increase time on Netflix and add subscriptions by creating content.  But not just any old content.  They had the data showing what their customers loved watching and what resonated with them.  They could see which shows viewers abandoned halfway through a binge.  They saw what types of shows were most addictive.

The content creators gave their biggest competitor the keys to the kingdom: data.  Now Netflix is poised to put many of the content creators out of business because it knows far more about customers' behaviors than the content creators do.  Because Netflix controls the entire experience, from creation to delivery to analyzing the behavior, it can create superior content.  It is a brilliant model.  Netflix will continue to dominate, especially in an age when people are looking to become "cord-cutters".  I believe we will see even better content coming out of Netflix in the near future as they learn even more about what we like to watch.

Source: http://smartdatacollective.com/bernardmarr...

If Algorithms Know All, How Much Should Humans Help? - NYTimes.com

Steve Lohr writes for NYTimes.com:

Armies of the finest minds in computer science have dedicated themselves to improving the odds of making a sale. The Internet-era abundance of data and clever software has opened the door to tailored marketing, targeted advertising and personalized product recommendations.
Shake your head if you like, but that’s no small thing. Just look at the technology-driven shake-up in the advertising, media and retail industries.
This automated decision-making is designed to take the human out of the equation, but it is an all-too-human impulse to want someone looking over the result spewed out of the computer. Many data quants see marketing as a low-risk — and, yes, lucrative — petri dish in which to hone the tools of an emerging science. “What happens if my algorithm is wrong? Someone sees the wrong ad,” said Claudia Perlich, a data scientist who works for an ad-targeting start-up. “What’s the harm? It’s not a false positive for breast cancer.”

I have written here many times that analytics is a combination of "art" and "science".  Having both data and insight leads to the most action, yet some data scientists want to remove the "art" part of the equation.  The belief is that computers and algorithms can see more in the data and the behavior than a human ever could, and that once there is enough data about an individual's behavior, there is no "art" left: all the data points are accounted for, so the "science" is indisputable.

However, I have a hard time believing that "art", or human insight, will ever be replaceable.  There are still so many unknown variables, and a computer can't know all of them.  The "science" portion will always get better at explaining "what" happened, but it doesn't capture the business operations and strategy behind the decisions that were made.  I am a true believer in "big data" coming of age.  I believe it is fundamentally changing the way companies have to do business, but never forget the human side: the "art" of understanding "why" the data is telling you "what" is happening.

These questions are spurring a branch of academic study known as algorithmic accountability. Public interest and civil rights organizations are scrutinizing the implications of data science, both the pitfalls and the potential. In the foreword to a report last September, “Civil Rights, Big Data and Our Algorithmic Future,” Wade Henderson, president of The Leadership Conference on Civil and Human Rights, wrote, “Big data can and should bring greater safety, economic opportunity and convenience to all people.”
Take consumer lending, a market with several big data start-ups. Its methods amount to a digital-age twist on the most basic tenet of banking: Know your customer. By harvesting data sources like social network connections, or even by looking at how an applicant fills out online forms, the new data lenders say they can know borrowers as never before, and more accurately predict whether they will repay than they could have by simply looking at a person’s credit history.
The promise is more efficient loan underwriting and pricing, saving millions of people billions of dollars. But big data lending depends on software algorithms poring through mountains of data, learning as they go. It is a highly complex, automated system — and even enthusiasts have qualms.
“A decision is made about you, and you have no idea why it was done,” said Rajeev Date, an investor in data-science lenders and a former deputy director of Consumer Financial Protection Bureau. “That is disquieting.”
Black-box algorithms have always been troubling for most individuals, even the smartest executives trying to understand their business.  Humans need to see why.  There is a reason Decision Trees are the most popular of the data models, even though they inherently have less predictive power than counterparts like Neural Networks.

Decision Trees output a result that a human can interpret.  It is a road map to why the prediction was made.  This makes us humans feel comfortable.  We can tell a story around the data that explains what is happening.  With a black-box algorithm, we have to trust that what is going on inside is correct.  We do have the results to measure against, but as these algorithms become more commonplace, it will be imperative that humans can trust them.  In the bank loan example above, a person needs to understand why they are being denied and what actions they can take to secure the loan in the future.
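
To make this concrete, here is a minimal sketch of why rule-based models feel trustworthy: the prediction comes back with the exact path of conditions that produced it.  The features, thresholds, and function name are entirely hypothetical, not from any real underwriting model.

```python
def loan_decision(income, debt_ratio, years_employed):
    """Return (decision, reasons) so the applicant can see *why*."""
    reasons = []
    if debt_ratio > 0.4:
        reasons.append(f"debt ratio {debt_ratio:.2f} exceeds 0.40")
        return "deny", reasons
    reasons.append(f"debt ratio {debt_ratio:.2f} is at or below 0.40")
    if income < 30000:
        reasons.append(f"income {income} is below the 30000 minimum")
        return "deny", reasons
    reasons.append(f"income {income} meets the 30000 minimum")
    if years_employed < 1:
        reasons.append("less than 1 year of employment")
        return "deny", reasons
    reasons.append("employment history is sufficient")
    return "approve", reasons

# Every decision, approve or deny, carries its own explanation.
decision, why = loan_decision(income=45000, debt_ratio=0.35, years_employed=3)
```

A neural network scoring the same applicant might be more accurate, but it cannot hand back a `reasons` list like this, and that is exactly the gap the commentary is describing.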

This ties into creating superior customer experiences.  Companies that can harness "big data" and black-box algorithms and create simple narratives for customers to understand will have a significant competitive advantage.  Creating algorithms to maximize profits is a very businesslike approach, but what gets left out is the customer experience.  Over time, customers will dislike the lack of knowledge and communication, and they will not become future customers.  A bank may say this is good, they would have defaulted anyway.  But what happens in the future when too many people have bad customer experiences?  I don't believe that is a good long-term strategy.

In a sense, a math model is the equivalent of a metaphor, a descriptive simplification. It usefully distills, but it also somewhat distorts. So at times, a human helper can provide that dose of nuanced data that escapes the algorithmic automaton. “Often, the two can be way better than the algorithm alone,” Mr. King said.  

Businesses also need to focus on the human side.  When we forget there is an "art" to enhance all of these great algorithms, businesses become too focused on transaction efficiency instead of customer experiences, which in turn leads to lower sales.

Source: http://www.nytimes.com/2015/04/07/upshot/i...

7 Limitations Of Big Data In Marketing Analytics

Anum Basir writes:

As everyone knows, “big data” is all the rage in digital marketing nowadays. Marketing organizations across the globe are trying to find ways to collect and analyze user-level or touchpoint-level data in order to uncover insights about how marketing activity affects consumer purchase decisions and drives loyalty.
In fact, the buzz around big data in marketing has risen to the point where one could easily get the illusion that utilizing user-level data is synonymous with modern marketing.
This is far from the truth. Case in point, Gartner’s hype cycle as of last August placed “big data” for digital marketing near the apex of inflated expectations, about to descend into the trough of disillusionment.
It is important for marketers and marketing analysts to understand that user-level data is not the end-all be-all of marketing: as with any type of data, it is suitable for some applications and analyses but unsuitable for others.

There are a lot of companies looking to "big data" as their savior that just aren't ready to implement it.  This leads to disillusionment with user-level data.  It reminds me of the early days of Campaign Management (now Marketing Automation), when there were so many failed implementations.  The vendors were too inexperienced to determine how to successfully implement their products, the technology was too nascent, and the customers were just not culturally ready for the products.  This is "big data" in a nutshell.

1. User Data Is Fundamentally Biased
The user-level data that marketers have access to is only of individuals who have visited your owned digital properties or viewed your online ads, which is typically not representative of the total target consumer base.
Even within the pool of trackable cookies, the accuracy of the customer journey is dubious: many consumers now operate across devices, and it is impossible to tell for any given touchpoint sequence how fragmented the path actually is. Furthermore, those who operate across multiple devices are likely to be from a different demographic compared to those who only use a single device, and so on.
User-level data is far from being accurate or complete, which means that there is inherent danger in assuming that insights from user-level data apply to your consumer base at large.

I don't necessarily agree with this.  While there are some true statements here, having some data is better than none.  Would I change my entire digital strategy based on incomplete data?  Maybe if the data were very compelling, but more likely this data will lead to testable hypotheses that produce better customer experiences.  Never be afraid of not having all the data, and never search for all the data; that pearl is not worth the dive.

2. User-Level Execution Only Exists In Select Channels
Certain marketing channels are well suited for applying user-level data: website personalization, email automation, dynamic creatives, and RTB spring to mind.

Very true.  Be careful to apply it to the correct channels and don't make assumptions about everyone.  When there is enough data to make a decision, use that data.  If not, use the data you have been working with for all these years; it has worked until now.

3. User-Level Results Cannot Be Presented Directly
More accurately, it can be presented via a few visualizations such as a flow diagram, but these tend to be incomprehensible to all but domain experts. This means that user-level data needs to be aggregated up to a daily segment-level or property-level at the very least in order for the results to be consumable at large.

Many new segments can come from this rich data and then be aggregated.  It is fine to aggregate data for reporting purposes to executives; in fact, this is what they want to see.  Every once in a while, throw in a decision tree or a Naive Bayes output to show there is more analysis being done at a granular level.
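
The roll-up the excerpt describes can be sketched in a few lines: raw user-level events become a daily, segment-level summary that a report can actually consume.  The field names, segments, and numbers below are made up for illustration.

```python
from collections import defaultdict

# Raw user-level events: one row per user per day (illustrative data).
events = [
    {"date": "2015-04-01", "user": "u1", "segment": "new",       "minutes": 34},
    {"date": "2015-04-01", "user": "u2", "segment": "returning", "minutes": 12},
    {"date": "2015-04-01", "user": "u3", "segment": "new",       "minutes": 8},
    {"date": "2015-04-02", "user": "u1", "segment": "new",       "minutes": 50},
]

# Aggregate to (date, segment): distinct users and total minutes.
daily = defaultdict(lambda: {"users": set(), "minutes": 0})
for e in events:
    key = (e["date"], e["segment"])
    daily[key]["users"].add(e["user"])
    daily[key]["minutes"] += e["minutes"]

# Collapse the user sets to counts for the executive-facing report.
report = {k: {"users": len(v["users"]), "minutes": v["minutes"]}
          for k, v in daily.items()}
```

The individual journeys disappear in `report`, which is exactly the point: the aggregated view is the consumable one.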

4. User-Level Algorithms Have Difficulty Answering “Why”
Largely speaking, there are only two ways to analyze user-level data: one is to aggregate it into a “smaller” data set in some way and then apply statistical or heuristic analysis; the other is to analyze the data set directly using algorithmic methods.
Both can result in predictions and recommendations (e.g. move spend from campaign A to B), but algorithmic analyses tend to have difficulty answering “why” questions (e.g. why should we move spend) in a manner comprehensible to the average marketer. Certain types of algorithms such as neural networks are black boxes even to the data scientists who designed them. Which leads to the next limitation:

This is where the "art" comes into play when applying analytics to any dataset.  Too many unknown variables go into a human being's purchase decision to predict an outcome with absolute certainty, so there should never be a decision to move all spending in some direction or change an entire strategy based on any one data model.  Instead, test the new data models against the old way of doing business and see if they perform better.  If they do, great, you have a winner.  If they don't, use that new data to create models that may produce better results than the current one.  Marketing tactics and campaigns are living, breathing entities; they need to be cared for and changed constantly.
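
The test-before-you-switch approach above is essentially a champion/challenger evaluation: score both the current model and the new one on the same holdout data, and only promote the challenger if it actually wins.  The two "models" and the holdout rows below are trivial stand-ins, not real campaign data.

```python
def accuracy(model, holdout):
    """Fraction of holdout rows the model labels correctly."""
    correct = sum(1 for features, label in holdout if model(features) == label)
    return correct / len(holdout)

# Champion: the old rule of thumb. Challenger: a richer (hypothetical) model.
champion   = lambda x: x["visits"] > 3
challenger = lambda x: x["visits"] + x["emails_opened"] > 5

# Holdout data the models were NOT built on: (features, did_convert).
holdout = [
    ({"visits": 5, "emails_opened": 2}, True),
    ({"visits": 1, "emails_opened": 0}, False),
    ({"visits": 4, "emails_opened": 4}, True),
    ({"visits": 2, "emails_opened": 1}, False),
    ({"visits": 3, "emails_opened": 4}, True),
]

champ_acc = accuracy(champion, holdout)
chall_acc = accuracy(challenger, holdout)
# Promote the challenger only on a clear win; ties keep the incumbent.
winner = "challenger" if chall_acc > champ_acc else "champion"
```

In practice the comparison would use a proper business metric and a much larger holdout, but the discipline is the same: the new model earns its way into production.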

5. User Data Is Not Suited For Producing Learnings
This will probably strike you as counter-intuitive. Big data = big insights = big learnings, right?
Actionable learnings that require user-level data – for instance, applying a look-alike model to discover previously untapped customer segments – are relatively few and far between, and require tons of effort to uncover. Boring, ol’ small data remains far more efficient at producing practical real-world learnings that you can apply to execution today.

In some cases yes, but don't discount the learnings that can come from this data.  Running this data through multiple modeling techniques may not lead to production-ready models that impact revenue streams overnight.  That rarely happens; even large teams of data scientists might see only around 3% of their models make it into production.  However, running data through data mining techniques can give you unique insights into your data that regular analytics could never produce.  These are true learnings that create testable hypotheses that can be used to enhance the customer experience.

6. User-Level Data Is Subject To More Noise
If you have analyzed regular daily time series data, you know that a single outlier can completely throw off analysis results. The situation is similar with user-level data, but worse.

This is very true.  There is so much noise in the data that most of the time spent on data modeling goes to cleaning the data.  This noise is why it is so hard to predict anything using this data.  The pearl may not be worth the dive for predictive analytics, but for data mining it is certainly worth the effort.
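
As a minimal sketch of the cleaning step that eats most of that modeling time, here is a standard interquartile-range (IQR) fence dropping an extreme outlier before any model sees the data.  The numbers are illustrative; a real pipeline would also have to decide whether an outlier is a tracking glitch or a genuine heavy user.

```python
import statistics

# Daily minutes watched for one user; 600 is a (hypothetical) tracking glitch.
daily_minutes = [42, 38, 51, 45, 40, 47, 39, 44, 600]

# Standard 1.5 * IQR fence around the middle 50% of the data.
q1, _, q3 = statistics.quantiles(daily_minutes, n=4)
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr

cleaned = [x for x in daily_minutes if low <= x <= high]
```

One bad point like this can drag a mean (and any model fit to it) far from reality, which is exactly why a single outlier "can completely throw off analysis results."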

7. User Data Is Not Easily Accessible Or Transferable

Oh so true.  Take manageable chunks when starting to dive into these user-level data waters.

This level of data is much harder to work with than traditional data.  In fact, executives usually don't appreciate the time and effort it takes to glean insights from large datasets.  Clear expectations should be set at the start of the user-level data journey.  Under-promise and over-deliver for a successful implementation.

Source: http://analyticsweek.com/7-limitations-of-...

What to Do When People Draw Different Conclusions From the Same Data

Walter Frick writes for HBR:

That famous line from statistician William Edwards Deming has become a mantra for data-driven companies, because it points to the promise of finding objective answers. But in practice, as every analyst knows, interpreting data is a messy, subjective business. Ask two data scientists to look into the same question, and you’re liable to get two completely different answers, even if they’re both working with the same dataset.
So much for objectivity.
But several academics argue there is a better way. What if data analysis were crowdsourced, with multiple analysts working on the same problem and with the same data? Sure, the result might be a range of answers, rather than just one. But it would also mean more confidence that the results weren’t being influenced by any single analyst’s biases. Raphael Silberzahn of IESE Business School, Eric Luis Uhlmann of INSEAD, Dan Martin of the University of Virginia, and Brian Nosek of the University of Virginia and Center for Open Science are pursuing several research projects that explore this idea. And a paper released earlier this year gives an indication of how it might work.

I believe it is best practice to have multiple analysts look at a problem, at least to devise what their methodologies would be.  In fact, I always like to take a crack at a particularly difficult problem myself, just so I have an idea of what the data looks like and how certain variables are influencing the results.

I think too many executives are unwilling to dig into the data and work with a problem.  I believe it is very important to have a deep understanding of the data issues so that, as an executive, you can make better decisions on how to guide the team.  Many times the answer is not a deployable model, but a data mining exercise that will glean some testable hypotheses.

Though most companies don’t have 60 analysts to throw at every problem, the same general approach to analysis could be used in smaller teams. For instance, rather than working together from the beginning of a project, two analysts could each propose a method or multiple methods, then compare notes. Then each one could go off and do her own analysis, and compare her results with her partner’s. In some cases, this could lead to the decision to trust one method over the other; in others, it could lead to the decision to average the results together when reporting back to the rest of the company.
“What this may help [to do] is to identify blind spots from management,” said Raphael Silberzahn, one of the initiators of the research. “By engaging in crowdsourcing inside the company we may balance the influence of different groups.”

I do believe in internal "crowdsourcing".  The minute tough problems start to be outsourced, the company loses the great insight its analysts and business owners can bring to the data, insight that analysts outside the company could never match.  I truly believe analytics is art and science, but too many times the art is underappreciated.

Source: https://hbr.org/2015/03/what-to-do-when-pe...

Big data: are we making a big mistake? by Anum Basir

Anum Basir writes for Analytics Weekly:

“Big data” has arrived, but big insights have not. The challenge now is to solve new problems and gain new answers – without making the same old statistical mistakes on a grander scale than ever.

This is an article every executive should read about "big data".  I believe it fits right into my narrative that even with data, companies need art along with the science to have true insight, as I wrote here.  The article is a long read, but it details the promises of "big data" along with the pitfalls of trusting results without the proper insights and testing.

As with so many buzzwords, “big data” is a vague term, often thrown around by people with something to sell.

I believe in more data, not in the term "big data".  When people are trying to sell "big data" to corporations, are they really helping?

But the “big data” that interests many companies is what we might call “found data”, the digital exhaust of web searches, credit card payments and mobiles pinging the nearest phone mast.

In some circumstances they might be, but what Basir discusses in this article is the idea of "found data".  This is data that already exists inside the company, or data that simply is not being tracked and analyzed.  As I wrote here, companies are sitting on a treasure trove of data that they aren't using optimally as it is.  Adding "big data" may send the corporation down a path it is not ready for.  Always search for the next best data that will answer the questions that need answering.

Cheerleaders for big data have made four exciting claims, each one reflected in the success of Google Flu Trends: that data analysis produces uncannily accurate results; that every single data point can be captured, making old statistical sampling techniques obsolete; that it is passé to fret about what causes what, because statistical correlation tells us what we need to know; and that scientific or statistical models aren’t needed because, to quote “The End of Theory”, a provocative essay published in Wired in 2008, “with enough data, the numbers speak for themselves”.
Unfortunately, these four articles of faith are at best optimistic oversimplifications. At worst, according to David Spiegelhalter, Winton Professor of the Public Understanding of Risk at Cambridge university, they can be “complete bollocks. Absolute nonsense.”

Basir goes on in the article to punch holes in these four claims.  "Big data" is very promising, but it is a destination for most companies.  When something is a destination, there is a path that needs to be taken to get there.  The path may change and there are detours along the way, but most companies can't just jump all in on "big data" or "found data".  Companies must build an analytics culture, live in their data, and use that data to make decisions with the business acumen they have built up over many years.

As Basir points out in the article, the problems with data do not go away with more of it; they just get bigger.

Four years after the original Nature paper was published, Nature News had sad tidings to convey: the latest flu outbreak had claimed an unexpected victim: Google Flu Trends. After reliably providing a swift and accurate account of flu outbreaks for several winters, the theory-free, data-rich model had lost its nose for where flu was going. Google’s model pointed to a severe outbreak but when the slow-and-steady data from the CDC arrived, they showed that Google’s estimates of the spread of flu-like illnesses were overstated by almost a factor of two.
The problem was that Google did not know – could not begin to know – what linked the search terms with the spread of flu. Google’s engineers weren’t trying to figure out what caused what. They were merely finding statistical patterns in the data. They cared about ­correlation rather than causation. This is common in big data analysis. Figuring out what causes what is hard (impossible, some say). Figuring out what is correlated with what is much cheaper and easier. That is why, according to Viktor Mayer-Schönberger and Kenneth Cukier’s book, Big Data, “causality won’t be discarded, but it is being knocked off its pedestal as the primary fountain of meaning”.
But a theory-free analysis of mere correlations is inevitably fragile. If you have no idea what is behind a correlation, you have no idea what might cause that correlation to break down.

Correlation-without-causation problems do not go away with "big data".  Having insight to inform the results is key to successful analytics.  We are all familiar with the story that ice cream sales and shark bites are strongly correlated.  Does selling more ice cream cause shark bites?  That's just silly, and obviously so: we all know that ice cream sales and ocean swimming both increase in the summertime.

But the example brings up a crucial point: do not trust the output of the data without using your knowledge of the subject as a barometer.  Anyone can see that shark bites and ice cream sales have nothing to do with each other, but spurious findings in big data can be much trickier to detect.  What looks like a reasonable explanation of the data from a model may actually ruin a business because the data scientist who built it had no knowledge of the subject matter.  When relying solely on data, all the great human knowledge about the business is thrown away.  This is art and science.  Treat it as such.  Get the artists into the room with the scientists and find the best answer, not the cheapest and easiest one.  Actionable analytics is hard; don't underestimate the complexity of the problem.
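
The ice cream and shark bite story is easy to simulate: drive two unrelated series from the same hidden cause (temperature) and they correlate strongly with each other.  All numbers and coefficients below are made up purely to demonstrate the effect.

```python
import random

random.seed(7)

# A shared hidden driver: monthly temperature (illustrative values).
temps = [5, 8, 14, 19, 25, 30, 32, 31, 24, 17, 10, 6]

# Two series that each depend on temperature, plus independent noise.
# Neither causes the other; both respond to summer heat.
ice_cream = [10 * t + random.gauss(0, 5) for t in temps]
shark_bites = [0.5 * t + random.gauss(0, 2) for t in temps]

def pearson(xs, ys):
    """Pearson correlation coefficient computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Strong correlation, entirely via the confounder.
r = pearson(ice_cream, shark_bites)
```

A theory-free model would happily report `r` as a finding; only subject-matter knowledge tells you temperature is the real driver.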

Source: http://analyticsweek.com/big-data-are-we-m...

Big data as a driver of organizational change

Analytics can “open up many doors for healthcare organizations” of all types, including life sciences companies aiming to get new medications to patients faster or to “provide regulatory bodies with evidence of drug safety.”  Or, for payers, analytics can “answer questions about future growth, profitability and sustainability,” or help them to detect and prevent fraud.

It still amazes me that big organizations in so many industries are still talking about what analytics can do for them.  Of course, this headline is a little deceiving as there really is no "big data" to be found.  You get a lot of clicks when you have "big data" in the headline though.

Source: http://www.datamashup.info/big-data-as-a-d...

Data + Insight = Action and Back Again

Adobe Summit brought with it a great nirvana of a near-future where marketers are able to deliver relevant content to customers, creating great experiences.  The words that permeated the conference were content, experiences, and data.  The big stars of the show were content and customer experiences, which I believe are extremely important, as I have written before.

However, there is no right content delivered at the perfect moment to create wonderful customer experiences without data.  Data is the key to making this all work, and not just any data.  No, I'm not talking about "big data", I'm talking about actionable data.  

Most companies are already sitting on a treasure trove of data.  Even without purchasing third-party data or tracking every click, companies have data that can transform their business.  The issue is in interpreting the data and making it actionable.  Actionable data isn't a product, it's a culture.

Actionable data is the combination of art and science.  The path to actionable data isn't necessarily going out and hiring a bunch of talented data scientists, though it doesn't hurt to have these people on your team.  The path to actionable data is marrying the data with the business acumen.  It's not enough to have data telling you something happened, there has to be an understanding of the business as to why it happened.

Once there is an understanding of what happened (science) and why it happened (art), you have actionable data.  Now you can create optimal tactics to deliver relevant content to create targeted experiences in the digital age.  The great thing about this process is it's circular.  Once a company creates great targeted experiences for their customers, customer behaviors will change and the entire process starts all over again.  There are always puzzles to solve and amazing content and experiences to create.  

Why Netflix walked away from personalization | ThoughtGadgets

In 2006 Netflix offered a $1 million prize for anyone who could improve its movie preference recommendations by 10%. Netflix, at the time, made most of its money sending DVDs in the mail to users’ homes.

Mathematicians went wild. The competition was lauded by business pundits as an example of crowdsourcing genius. Because this was damned hard math, the project took years. And then in 2009, a team of mathematicians called “BellKor’s Pragmatic Chaos” actually cracked the code, achieved a 10% lift, and Netflix gave them the $1 million.

And then … Netflix never implemented the winning algorithm. Because personalization at that point no longer mattered.

Personalization has been such a buzzword for so many years, and Netflix was one of the poster children for it.  It's interesting to read articles like this and realize they really don't utilize it the way, say, Amazon does.

In fact, this article is very critical of Amazon, and I'd have to agree.  Amazon has decent recommendations, but it seems to be a fairly basic market basket model that shows what others who bought similar items also purchased.  That may be the best way to offer items to customers.  Amazon has all the money in the world for R&D; in fact, they flaunt how much money they put back into the business.  If they are using this model, the fancier personalization models for predicting other types of products must not bring in as much as the market basket.
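
The "customers who bought X also bought Y" logic described above can be sketched as a simple co-occurrence count over order baskets.  The orders and item names below are invented; a real market basket analysis would also weigh support and lift, not raw counts.

```python
from collections import Counter

# Hypothetical order baskets (each order is a set of items).
orders = [
    {"hdmi cable", "tv", "wall mount"},
    {"tv", "soundbar"},
    {"hdmi cable", "tv"},
    {"hdmi cable", "laptop"},
]

def also_bought(item, orders):
    """Count items co-purchased with `item`, most frequent first."""
    companions = Counter()
    for basket in orders:
        if item in basket:
            companions.update(basket - {item})
    return companions.most_common()

# The top companion item becomes the recommendation.
top_item, top_count = also_bought("tv", orders)[0]
```

Crude as it is, this kind of model is interpretable and cheap to compute at scale, which may be part of why it wins out over fancier personalization.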

Source: http://www.thoughtgadgets.com/why-netflix-...

How to Get More Value Out of Your Data Analysts

Organizations succeed with analytics only when good data and insightful models are put to regular and productive use by business people in their decisions and their work.

Actionable data is a buzzword I have used and heard for many, many years.  In practice, however, it is much harder to produce actionable insight for business users.  C-level executives are always trying to dissect old information, which can be very useful, but only when the answer leads to some action.  If the answers to questions are interesting but don't provide any action, it creates a time-consuming chase to make something out of the data.

Organizations have to educate themselves about what their data is capable of.  It's no use asking questions that the data cannot turn into actionable insight.  Organizations should spend time on questions that can be solved with current capabilities, in other words the low-hanging fruit.  Once there is no more low-hanging fruit, enhance the analytics with new tools and modeling capabilities to find the next round.

If you want to put analytics to work and build a more analytical organization, you need two cadres of employees:

  • Analytics professionals to mine and prepare data, perform statistical operations, build models, and program the surrounding business applications.
  • Analytical business people who are ready, able, and eager to use better information and analyses in their work, as well as to work with the professionals on analytics projects.

Very true.  An organization can hire all the analysts it can handle, but if the analysts have no business acumen and the business has no data acumen, there will always be a disconnect.

Organizations need to have the business and the analysts working together to find the best answers.  Data scientists can surface very interesting findings, but the business side may point out that a finding is already well known.  Conversely, the business may be working very hard on a problem the data scientist can solve, taking intuition out of play and using data in place of trial and error.

In my experience, I have always liked business analysts and data analysts reporting to the same group.  Many organizations don't like this structure because it can lead to a group "grading its own paper."  However, the tight integration of the teams produces results more efficiently: analysts don't waste time chasing problems that don't exist, and business people can bounce ideas off analysts for quick insight before making decisions.

Source: http://blogs.hbr.org/2013/12/how-to-get-mo...

The Revolutionary Way Marketers Read Your Financial Footprints

Laube, 43, Cardlytics’ president and COO, and Grimes, 51, its CEO, have since helped pioneer a data-driven advertising niche called merchant-funded rewards. It targets people based on what they buy, not who they are. “If you know where and how someone is spending money, you know lots of things about them without having to know their personally identifying information,” Laube says.

I have found that customers' transactions are the most important predictor of future behavior in all the data I have studied.  While the demographics, geographics, and psychographics of customers are very interesting data, the way to maximize revenue from known customers is to really get to know their transactions.
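Getting to know customers through their transactions often starts with something as simple as recency/frequency/monetary (RFM) features.  A minimal sketch with a hypothetical transaction log (customer IDs, dates, and amounts are all made up for illustration):

```python
from datetime import date

# Hypothetical transaction log: (customer_id, purchase_date, amount)
transactions = [
    ("c1", date(2024, 1, 5), 40.0),
    ("c1", date(2024, 3, 2), 25.0),
    ("c2", date(2023, 11, 20), 300.0),
]

def rfm(customer_id, today=date(2024, 4, 1)):
    """Recency (days since last purchase), frequency, and monetary total."""
    rows = [t for t in transactions if t[0] == customer_id]
    last = max(d for _, d, _ in rows)
    return {
        "recency_days": (today - last).days,
        "frequency": len(rows),
        "monetary": sum(a for _, _, a in rows),
    }

print(rfm("c1"))
```

Even these three numbers, derived purely from purchase behavior, tend to out-predict demographic profiles when modeling what a known customer will do next.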

While I don't know how good these algorithms are, the theory is solid.  I happen to be a Bank of America customer, and the deals I receive don't seem any better than, say, my Rapid Rewards dining offers, which don't seem to know my eating habits whatsoever.  If these algorithms can be perfected, I think it's something that would get me to use my card more often instead of my AMEX.  I'll be watching, because this is very intriguing.

Source: http://www.forbes.com/sites/adamtanner/201...

The Three Tribes of Social Shopping

Do people share what they plan to buy, or buy what they share?

That’s the question my colleagues and I had to ask after discovering that a significant number of Pinterest users go on to buy the items they have pinned.  As we reported in Harvard Business Review, 21% of Pinterest users say they have purchased an item in-store after pinning, re-pinning, or liking it on Pinterest, and 36% of users under 35 said they had done so. Look beyond in-store purchases, and the numbers are even bigger:

29% of Pinterest users have purchased something (in-store or online) after sharing or favoriting it on Pinterest

22% of Twitter users have purchased something after tweeting, retweeting, or favoriting it on Twitter

38% of Facebook users have purchased something after liking, sharing, or commenting on it on Facebook

Very interesting article.  What they still didn't answer: is the act of sharing itself a driver?  Does the customer share simply because sharing is available, or does the share drive the purchase?  And their takeaway is to monitor and survey the people who share your products on social media channels.  Sounds pretty pricey.

For marketers, the opportunity of social lies in the ability to reach and inspire the “thinkers” and “leapers” — to find and drive sales from people who might never otherwise move from interest to commitment.

First you have to identify these people.  The key is predicting who is not going to purchase and nudging those potential buyers; that is what optimizes revenue.  If you nudge everyone with discounts or more ads, you might have just spent money on people who were already going to purchase anyway.
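The targeting logic above, spend only on customers unlikely to buy on their own, can be sketched in a few lines.  The propensity scores here are stand-ins for whatever predictive model you would actually fit; the names and threshold are hypothetical:

```python
# Hypothetical propensity-to-purchase scores from some upstream model.
propensity = {"ann": 0.92, "ben": 0.35, "cal": 0.10, "dee": 0.88}

NUDGE_BELOW = 0.5  # only spend marketing dollars below this threshold

def pick_nudge_targets(scores, threshold=NUDGE_BELOW):
    """Customers unlikely to purchase on their own.  Likely buyers are
    skipped so discounts aren't wasted on revenue we'd capture anyway."""
    return sorted(c for c, p in scores.items() if p < threshold)

print(pick_nudge_targets(propensity))  # ben and cal get the nudge
```

The design choice is the whole point of the commentary: the high scorers are deliberately excluded, because a discount to someone already buying is pure margin given away.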

Source: http://blogs.hbr.org/2013/09/the-three-tri...

Keep Up with Your Quants - Harvard Business Review

Very good article about using data to make decisions and communicate results.  

having big data—and even people who can manipulate it successfully—is not enough. Companies need general managers who can partner effectively with “quants” to ensure that their work yields better strategic and tactical decisions.

Oftentimes, analysts struggle to communicate their findings in a way the organization understands.  Finding common ground makes for a great combination.

We all know how easily “figures lie and liars figure.” Analytics consumers should never pressure their producers with comments like “See if you can find some evidence in the data to support my idea.” Instead, your explicit goal should be to find the truth.

How many times have I heard this in my career?  Quite a few.  As data consumers, we can't be afraid of being wrong or of showing that a decision we made lost money.  That happens.  Always strive for the truth.  It is much better to improve results than to take a hit to your ego.

The rest of the article is a great read on how to be a better consumer of analytics.  Being someone who can take analytics and turn data into money is what separates the men from the boys.

Source: http://hbr.org/2013/07/keep-up-with-your-q...

FiveThirtyEight's Nate Silver Explains Why We Suck At Predictions (And How To Improve) | Fast Company

When human judgment and big data intersect there are some funny things that happen. On the one hand, we get access to more and more information that ought to help us make better decisions. On the other hand, the more information you have, the more selective you can be in which information you pick out to tell the narrative that might not be the true or accurate, or the one that helps your business, but the one that makes you feel good or that your friends agree with.

This is a great article on using data and predictions.  I just bought the book, as a good friend of mine suggested it is a great read.  I always hear, "You can make numbers tell whatever story you want."  Ain't that the truth?  So many times colleagues of mine have held on to the part of the data that tells the story they want to tell, and soon it becomes truth; however, this only helps them look good instead of moving the business forward.

Source: http://www.fastcompany.com/3001794/fivethi...