The Dangers of Data-Driven Marketing

Marketing has gone digital, and we can now measure our efforts like never before. As a result, marketers have fallen in love with data. Head over heels in love—to the point where we want data to drive our marketing, instead of people, like you and me. I think this has gone too far.

I'm a big proponent of data-driven marketing. In this article, Ezra Fishman uses semantics to argue that it is bad, but what he is really getting at is the need to go beyond just the data. As I wrote in Data + Insight = Action, data all by itself cannot create actionable outcomes.

Data-informed marketing
Instead of focusing on data alone, data-informed marketing considers data as just one factor in making decisions. We then combine relevant data, past experiences, intuition, and qualitative input to make the best decisions we can.
Instead of poring over data hoping to find answers, we develop a theory and a hypothesis first, then test it out. We force ourselves to make more gut calls, but we validate those choices with data wherever possible so that our gut gets smarter with time.

This is what I was trying to articulate in my article. Being an excellent data-driven marketing organization takes a little bit of "science" and a little bit of "art" to determine the best course of action. When a data scientist alone is driving your organization, years of business experience go unused that could help them understand even further what the data is saying.

Most times when data scientists are off on their own, it takes an inordinate amount of time to come up with a conclusion, mostly because they lack the context of how the business generates the data, how the strategy shapes the data, and how a customer being underserved may be an intentional outcome.

The ease of measurement trap
When we let data drive our marketing, we all too often optimize for things that are easy to measure, not necessarily what matters most.
Some results are very easy to measure. Others are significantly harder. Click-through rate on an email? Easy. Brand feelings evoked by a well-designed landing page? Hard. Conversion rate of visitors who touch your pricing page? Easy. Word-of-mouth generated from a delightful video campaign? Hard.

Right on! Of course, the organizations that take the easy way out are ones I would not consider to be data-driven. KPIs are a great tool, but they can be deadly. There are usually so many moving parts making up the business and the data being generated that business KPIs can look fine, yet drilling down into performance from a customer perspective may reveal some very scary trends that should cause alarm. However, a company that isn't data-driven will continue with its strategy because of the KPIs (hello RIM/BlackBerry).

The local optimization trap
The local optimization trap typically rears its head when we try to optimize a specific part of the marketing funnel. We face this challenge routinely at Wistia when we try to increase the conversion rate of new visitors. In isolation, improving the signup rate is a relatively straightforward optimization problem that can be "solved" with basic testing.
The problem is, we don't just want visitors to sign up for our Free Plan. We want them to sign up for our Free Plan, then use their account, then tell others how great Wistia is, then eventually purchase one of our paid plans (and along the way generate more and more positive feelings toward our brand).

This trap can be combined with the previous one. When analytics is only seen from a high level, you get simple statements like "if we increase the number of signups, they will flow down the funnel at the same rate we see today, and conversions will increase." Nothing could be further from the truth. To increase anything, there needs to be an additional action, such as advertising to a different group of individuals or offering an incentive that boosts signups. The issue with this thinking is that these aren't the same individuals who are converting in your current funnel. The proper strategy is to figure out who the converters are and target customers like them, which may actually shrink the top of the funnel if done right. A quick sketch of this is below.
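
To make that concrete, here is a minimal sketch in Python (with hypothetical columns and made-up numbers, not Wistia's actual data) that looks at end-to-end conversion by acquisition source instead of the raw signup rate:

# Minimal sketch, hypothetical data: end-to-end funnel conversion by
# acquisition source, not just top-of-funnel signup rate.
import pandas as pd

# Each row is one visitor; the columns are assumptions for illustration.
visitors = pd.DataFrame({
    "source":    ["organic", "organic", "paid", "paid", "paid", "referral"],
    "signed_up": [True, False, True, True, False, True],
    "paid_plan": [True, False, False, False, False, True],
})

funnel = visitors.groupby("source").agg(
    visitors=("signed_up", "size"),
    signups=("signed_up", "sum"),
    paid_plans=("paid_plan", "sum"),
)
funnel["signup_rate"] = funnel["signups"] / funnel["visitors"]
funnel["paid_rate_of_signups"] = funnel["paid_plans"] / funnel["signups"]
print(funnel)

A source that inflates signups but never produces paid plans would look great on the signup_rate column and terrible on the paid_rate_of_signups column, which is exactly the local optimization trap.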

The data quality trap
We are rarely as critical of our data as we ought to be. Consider, for example, A/B tests, which have become the gold standard for marketing experimentation. In theory, these tests should produce repeatable and accurate results, since website visitors are assigned randomly to each page variant.
In practice, however, there are lots of ways even the simplest A/B tests can produce misleading results. If your website traffic is anything like ours, visitors come from a variety of sources: organic, direct, referral, paid search, and beyond. If one of those sources converts at a much higher rate than others, it's easy to get skewed results by treating your traffic as a single, uniform audience.

One should rarely take the conversion or redemption results from an A/B test without digging into the data. Making sure all segments are driving the results is key. Don't take for granted that the customers randomly selected for each group ended up being truly random. Ensure there was proper representation from each segment of the business, and identify any other changes that could be tested based on different behaviors within the segments, along the lines of the sketch below.
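
As an illustration, here is a minimal sketch in Python (hypothetical traffic numbers; scipy is assumed to be available) that checks whether an A/B result holds within each traffic source before trusting the overall lift:

# Minimal sketch, hypothetical numbers: compare variants within each
# traffic source so a skewed mix doesn't masquerade as a real lift.
import pandas as pd
from scipy.stats import chi2_contingency

tests = pd.DataFrame({
    "variant":     ["A", "B", "A", "B", "A", "B"],
    "source":      ["paid", "paid", "organic", "organic", "referral", "referral"],
    "visitors":    [4000, 2500, 6000, 7500, 1000, 1000],
    "conversions": [200, 150, 180, 210, 40, 45],
})

for source, grp in tests.groupby("source"):
    counts = pd.DataFrame({
        "converted": grp["conversions"].values,
        "not_converted": (grp["visitors"] - grp["conversions"]).values,
    })
    chi2, p_value, _, _ = chi2_contingency(counts)
    rates = (grp["conversions"] / grp["visitors"]).round(4).tolist()
    print(f"{source}: conversion rate by variant {rates}, p-value {p_value:.3f}")

If the per-source results disagree with the pooled result, the difference in traffic mix between the two variants is usually the first place to look.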

Data vigilance
As marketers, we should continue to explore new and better ways to harness the power of data, but we also must remain vigilant about becoming overly reliant on data.
Data can be a tremendous source of insight. Harness that. But don't pretend it's something more. And definitely don't put it in charge of your marketing team.

This reminds me of when I was a product manager and we would receive RFPs to determine whether we were the right company to supply a product. Sometimes the requirements were such that we wondered if the company wanted humans to continue working for them at all. I would comically refer to some of these as the "automated manager." It seemed companies wanted to press a button and have a system do everything for them. This is the trap Fishman is referring to. Humans have great insight. Humans are the "art" in the equation for actionable outcomes, and that is just as important as the "science."

Source: http://wistia.com/blog/data-informed-marke...

Business Intelligence vs Analytics vs Big Data vs Data Mining

To help you navigate the terrain of business data concepts, we’re going to give you a basic summary of what some of the most common terms refer to and how they relate to each other.  

This is a very good article on definitions in the data space.  So many times I hear executives talk about topics such as "big data", but they are really referring to analytics or data collection.  

Source: http://blog.apterainc.com/business-intelli...

Big Data: How Netflix Uses It to Drive Business Success

Bernard Marr writes how Netflix uses data to fuel their business:

Netflix is said to account for one third of peak-time internet traffic in the US. Last year it announced that it had signed up 50 million subscribers around the world. Data from all of them is collected and monitored in an attempt to understand our viewing habits. But its data isn’t just “big” in the literal sense. It is the combination of this data with cutting edge analytical techniques that makes Netflix a true Big Data company.

Netflix is a fascinating company. They built a business model that put a giant industry, retail movie rentals, out of business, and then pivoted to streaming before being out-innovated by other companies. They are constantly ahead of the curve when it comes to recognizing the next new technology and digital strategy. They recognized early that original content was also a key to success, so they are pivoting to beat HBO at its own game.

More recently, Netflix has moved towards positioning itself as a content creator, not just a distribution method for movie studios and other networks. Its strategy here has also been firmly driven by its data – which showed that its subscribers had a voracious appetite for content directed by David Fincher and starring Kevin Spacey. After outbidding networks including HBO and ABC for the rights to House of Cards, it was so confident that it fitted its predictive model for the “perfect TV show” that it bucked convention of producing a pilot, and immediately commissioned two seasons comprising 26 episodes.

This is how data-driven organizations behave. They look at their customers and use data to determine the optimal next move. All of their strategy and tactics are based on what they know about their customers and what those customers will do. So many times, organizations are obsessed with what other companies are doing, regardless of what their own data is telling them. They copy their competitors for fear of missing out on opportunities.

The question I always ask is, "How do you know what the other guys are doing is working?" What you see as a threat may be a disaster because they haven't set up the correct means to measure performance or are looking at the wrong KPIs. Worse yet, they may be attracting an entirely different customer than the one you are trying to target.

A data-driven organization looks at their data and reacts. Netflix, I am assuming, saw that many of their users were binge-watching TV series as soon as they came out. I'm sure this started with great content like Breaking Bad and Mad Men. They saw an opportunity to create this content on their own, as the majority of time spent on Netflix is binge-watching TV. They looked at their own data and saw the opportunity to increase time on Netflix and add subscriptions by creating content. But not just any old content. They had the data showing what their customers loved watching and what resonated with them. They could see which shows were dropped halfway through a binge. They saw what types of shows were most addictive.

The content creators gave their biggest competitor the keys to the kingdom: data. Now Netflix is poised to put a lot of content creators out of business because they know far more about their customers' behaviors than the content creators do. Because Netflix controls the entire experience, from creation to delivery to analyzing the behavior, they can create superior content. It is a brilliant model. Netflix will continue to dominate, especially in an age where people are looking to become "cord-cutters." I believe we will see even better content coming out of Netflix in the near future as they learn even more about what we like to watch.

Source: http://smartdatacollective.com/bernardmarr...

To Benefit From Big Data, Resist The Three False Promises

From Forbes.com:

Gartner recently predicted that “through 2017, 60% of big data projects will fail to go beyond piloting and experimentation and will be abandoned.” This reflects the difficulty of generating value from existing customer, operational and service data, let alone the reams of unstructured internal and external data generated from social media, mobile devices and online activity.

Yet some leading users of big data have managed to create data-driven business models that win in the marketplace. Auto insurer Progressive, for instance, uses plug-in devices to track driver behavior. Progressive mines the data to micro-target its customer base and determine pricing in real time. Capital One, the financial services company, relies heavily on advanced analytics to shape its customer risk scoring and loyalty and offer optimization initiatives. It exploits multiple types of customer data, including advanced text and voice analytics.

I believe what most people miss when they hear these success stories is the amount of human capital that gets thrown at these problems. Hundreds of data scientists create thousands of models, of which very few are actually incorporated into final production. The reason the Gartner stat rings true is that most companies don't have those kinds of resources to throw at the problem, and most wouldn't realize an ROI even if they did.

Promise 1: The technology will identify business opportunities all by itself.

This is the direction the technology is moving, but it is not there yet. The technology enables a group of data scientists to identify the opportunities; it's not magic.

Promise 2: Harvesting more data will automatically generate more value. 

The temptation to acquire and mine new data sets has intensified, yet many large organizations are already drowning in data, much of it held in silos where it cannot easily be accessed, organized, linked or interrogated.

More data does not mean better ROI on your initiatives. In fact, most companies don't take advantage of the data they already have to generate the maximum ROI. I always use a rule of thumb when purchasing new technology: if, as an organization, you don't believe you are already using the technology you currently possess to its fullest, then it's not time to move on to something better. Your current technology should be what is preventing you from innovating; if it's not, then you either have the wrong technology or the wrong people.

Promise 3: Good data scientists will find value for you. 

To profit consistently from big data, you need an operating model that deploys advanced analytics in a repeatable manner. And that involves many more people than data scientists.

Remember, data + insight = action. Actionable data is a combination of art and science. Data scientists provide the science; however, you need a team with the business acumen to provide the insight, and this is the art. Data scientists will surface a lot of questions you never thought to ask of your data, but they cannot provide a solution in and of themselves.

Remember to walk before you run when it comes to data initiatives. It's always good to have a goal of using "big data" to improve your business and create ROI where it didn't previously exist; however, the journey to "big data" is more important. These examples of success with "big data" did not happen overnight. They happened because advanced companies were butting up against the limits of their current technology and were ready to take the next step.

Source: http://www.forbes.com/sites/baininsights/2...

7 Limitations Of Big Data In Marketing Analytics

Anum Basir writes:

As everyone knows, “big data” is all the rage in digital marketing nowadays. Marketing organizations across the globe are trying to find ways to collect and analyze user-level or touchpoint-level data in order to uncover insights about how marketing activity affects consumer purchase decisions and drives loyalty.
In fact, the buzz around big data in marketing has risen to the point where one could easily get the illusion that utilizing user-level data is synonymous with modern marketing.
This is far from the truth. Case in point, Gartner’s hype cycle as of last August placed “big data” for digital marketing near the apex of inflated expectations, about to descend into the trough of disillusionment.
It is important for marketers and marketing analysts to understand that user-level data is not the end-all be-all of marketing: as with any type of data, it is suitable for some applications and analyses but unsuitable for others.

There are a lot of companies looking toward "big data" as their savior that just aren't ready to implement it. This leads to disillusionment with lower-level data. It reminds me of the early days of Campaign Management (now Marketing Automation), when there were so many failed implementations. The vendors were too inexperienced to determine how to successfully implement their products, the technology was too nascent, and the customers were just not culturally ready to handle the products. This is "big data" in a nutshell.

1. User Data Is Fundamentally Biased
The user-level data that marketers have access to is only of individuals who have visited your owned digital properties or viewed your online ads, which is typically not representative of the total target consumer base.
Even within the pool of trackable cookies, the accuracy of the customer journey is dubious: many consumers now operate across devices, and it is impossible to tell for any given touchpoint sequence how fragmented the path actually is. Furthermore, those who operate across multiple devices are likely to be from a different demographic compared to those who only use a single device, and so on.
User-level data is far from being accurate or complete, which means that there is inherent danger in assuming that insights from user-level data applies to your consumer base at large.

I don't necessarily agree with this. While these are true statements, having some data is better than none. Would I change my entire digital strategy on incomplete data? Maybe, if the data was very compelling, but more often this data will lead to testable hypotheses that result in better customer experiences. Never be afraid of not having all the data, and never search for all the data; that pearl is not worth the dive.

2. User-Level Execution Only Exists In Select Channels
Certain marketing channels are well suited for applying user-level data: website personalization, email automation, dynamic creatives, and RTB spring to mind.

Very true. Be careful to apply it to the correct channels, and don't make assumptions about everyone. When there is enough data to make a decision, use that data. If not, use the data you have been working with for all these years; it has worked until now.

3. User-Level Results Cannot Be Presented Directly
More accurately, it can be presented via a few visualizations such as a flow diagram, but these tend to be incomprehensible to all but domain experts. This means that user-level data needs to be aggregated up to a daily segment-level or property-level at the very least in order for the results to be consumable at large.

Many new segments can come from this rich data once it is aggregated. It is fine to aggregate data for reporting purposes to executives; in fact, this is what they want to see. Every once in a while, throw in a decision tree or a naive Bayes output to show there is more analysis being done at a more granular level, as in the sketch below.
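
For example, here is a minimal sketch in Python (with a hypothetical event log; the column names are assumptions) that rolls user-level events up to a daily, segment-level summary that is easier to consume:

# Minimal sketch, hypothetical event log: aggregate user-level events
# into a daily, segment-level view for reporting.
import pandas as pd

events = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2015-03-01 09:12", "2015-03-01 10:45",
        "2015-03-02 14:03", "2015-03-02 16:30",
    ]),
    "user_id": [101, 102, 101, 103],
    "segment": ["trial", "paid", "trial", "paid"],
    "revenue": [0.0, 49.0, 0.0, 99.0],
})

daily = (
    events
    .assign(date=events["timestamp"].dt.date)
    .groupby(["date", "segment"])
    .agg(active_users=("user_id", "nunique"), revenue=("revenue", "sum"))
    .reset_index()
)
print(daily)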

4. User-Level Algorithms Have Difficulty Answering “Why”
Largely speaking, there are only two ways to analyze user-level data: one is to aggregate it into a “smaller” data set in some way and then apply statistical or heuristic analysis; the other is to analyze the data set directly using algorithmic methods.
Both can result in predictions and recommendations (e.g. move spend from campaign A to B), but algorithmic analyses tend to have difficulty answering “why” questions (e.g. why should we move spend) in a manner comprehensible to the average marketer. Certain types of algorithms such as neural networks are black boxes even to the data scientists who designed it. Which leads to the next limitation:

This is where the "art" comes into play when applying analytics to any dataset. There are too many unknown variables that go into a human being's purchase decision to predict an outcome with absolute certainty, so there should never be a decision to move all spending in some direction or change an entire strategy based on any one data model. What should be done is to test the new models against the old way of doing business and see if they perform better. If they do, great, you have a winner. If they don't, use that new data to create models that may produce better results than the current one. Marketing tactics and campaigns are living, breathing entities; they need to be cared for and changed constantly. A simple champion/challenger comparison is sketched below.
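
Here is a minimal sketch in Python (synthetic data; the feature names and the "existing rule" are illustrative assumptions) of a champion/challenger comparison on a holdout set before any budget gets moved:

# Minimal sketch, synthetic data: compare the existing rule of thumb
# (champion) against a new model (challenger) on held-out customers.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 4))  # e.g., recency, frequency, spend, tenure
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=5000) > 0).astype(int)

X_train, X_holdout, y_train, y_holdout = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Champion: the current way of doing business, scoring on one feature.
champion_scores = X_holdout[:, 0]

# Challenger: a simple model trained on all available features.
challenger = LogisticRegression().fit(X_train, y_train)
challenger_scores = challenger.predict_proba(X_holdout)[:, 1]

print("champion AUC:  ", round(roc_auc_score(y_holdout, champion_scores), 3))
print("challenger AUC:", round(roc_auc_score(y_holdout, challenger_scores), 3))

If the challenger does not clearly beat the champion on held-out data, nothing changes; if it does, it can be rolled out to a small slice of traffic before replacing the old approach.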

5. User Data Is Not Suited For Producing Learnings
This will probably strike you as counter-intuitive. Big data = big insights = big learnings, right?
Actionable learnings that require user-level data – for instance, applying a look-alike model to discover previously untapped customer segments – are relatively few and far between, and require tons of effort to uncover. Boring, ol’ small data remains far more efficient at producing practical real-world learnings that you can apply to execution today.

In some cases, yes, but don't discount the learnings that can come from this data. Running it through multiple modeling techniques may not lead to production-ready models that impact revenue streams overnight. Those rarely happen, and they take hundreds of data scientists, with maybe 3% of the models making it into production. However, running data through data mining techniques can give you unique insights that regular analytics could never produce. These are true learnings that create testable hypotheses that can be used to enhance the customer experience, as sketched below.
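
As one example, here is a minimal sketch in Python (synthetic data; the behavioral feature names are assumptions) of a basic data-mining pass: cluster customers on a few behaviors and read the segments for testable hypotheses rather than production predictions:

# Minimal sketch, synthetic data: cluster customers on behavior and
# inspect the segments; each segment profile is a hypothesis to test.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
customers = pd.DataFrame({
    "sessions_per_week": rng.gamma(2.0, 2.0, 1000),
    "avg_watch_minutes": rng.gamma(3.0, 10.0, 1000),
    "days_since_signup": rng.integers(1, 720, 1000),
})

scaled = StandardScaler().fit_transform(customers)
customers["cluster"] = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(scaled)

# Per-cluster averages become candidate hypotheses for campaigns.
print(customers.groupby("cluster").mean().round(1))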

6. User-Level Data Is Subject To More Noise
If you have analyzed regular daily time series data, you know that a single outlier can completely throw off analysis results. The situation is similar with user-level data, but worse.

This is very true. There is so much noise in the data; that is why most of the time spent on data modeling involves cleaning the data. This noise is also why it is so hard to predict anything using this data. The pearl may not be worth the dive for predictive analytics, but for data mining it is certainly worth the effort. The toy example below shows how little it takes for noise to mislead.
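
To illustrate, here is a minimal sketch in Python (toy numbers) showing how a single bad data point dominates the mean of a daily metric, and how a robust summary or a simple IQR filter limits the damage:

# Minimal sketch, toy numbers: one tracking glitch wrecks the mean;
# the median and an IQR filter are far less sensitive.
import numpy as np

daily_signups = np.array([52, 48, 55, 50, 47, 53, 900])  # 900 = glitch

q1, q3 = np.percentile(daily_signups, [25, 75])
iqr = q3 - q1
mask = (daily_signups >= q1 - 1.5 * iqr) & (daily_signups <= q3 + 1.5 * iqr)

print("mean with outlier:    ", round(daily_signups.mean(), 1))
print("median with outlier:  ", np.median(daily_signups))
print("mean after IQR filter:", round(daily_signups[mask].mean(), 1))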

7. User Data Is Not Easily Accessible Or Transferable

Oh so true. Take manageable chunks when starting to dive into these user-level data waters.

This level of data is much harder to work with than traditional data. In fact, executives usually don't appreciate the time and effort it takes to glean insights from large datasets. Clear expectations should be set to ensure there are no overinflated expectations at the start of the user-level data journey. Under-promise and over-deliver for a successful implementation.

Source: http://analyticsweek.com/7-limitations-of-...

What to Do When People Draw Different Conclusions From the Same Data

Walter Frick writes for HBR:

“In God we trust; all others must bring data.” That famous line from statistician William Edwards Deming has become a mantra for data-driven companies, because it points to the promise of finding objective answers. But in practice, as every analyst knows, interpreting data is a messy, subjective business. Ask two data scientists to look into the same question, and you’re liable to get two completely different answers, even if they’re both working with the same dataset.
So much for objectivity.
But several academics argue there is a better way. What if data analysis were crowdsourced, with multiple analysts working on the same problem and with the same data? Sure, the result might be a range of answers, rather than just one. But it would also mean more confidence that the results weren’t being influenced by any single analyst’s biases. Raphael Silberzahn of IESE Business School, Eric Luis Uhlmann of INSEAD, Dan Martin of the University of Virginia, and Brian Nosek of the University of Virginia and Center for Open Science are pursuing several research projects that explore this idea. And a paper released earlier this year gives an indication of how it might work.

I believe it is best practice to have multiple analysts look at a problem and at least devise what their methodology would be. In fact, I always like to take a crack at it myself when the problem is particularly difficult, just so I have an idea of what the data looks like and how certain variables are influencing the results.

I think too many executives are unwilling to dig into the data and work a problem. It is very important to have a deep understanding of the data issues so that, as an executive, you can make better decisions about how to guide the team. Many times the answer is not a deployable model but a data mining exercise that will glean some testable hypotheses.

Though most companies don’t have 60 analysts to throw at every problem, the same general approach to analysis could be used in smaller teams. For instance, rather than working together from the beginning of a project, two analysts could each propose a method or multiple methods, then compare notes. Then each one could go off and do her own analysis, and compare her results with her partner’s. In some cases, this could lead to the decision to trust one method over the other; in others, it could lead to the decision to average the results together when reporting back to the rest of the company.
“What this may help [to do] is to identify blind spots from management,” said Raphael Silberzahn, one of the initiators of the research. “By engaging in crowdsourcing inside the company we may balance the influence of different groups.”

I do believe in internal "crowdsourcing." The minute tough problems start to be outsourced, the company loses the great insight its analysts and business owners can bring to the data, insight that analysts outside the company could never replicate. I truly believe analytics is art and science, but too many times the art is underappreciated.

Source: https://hbr.org/2015/03/what-to-do-when-pe...

When it Comes to Data, Small is the New Big

A great example from Anthony Smith of taking advantage of the data you already have:

Customer data is a commonly unrecognized superpower. Companies that provide software as a service (SaaS) are especially likely to have massive amounts of uncategorized, unmanaged customer data, creating a well of potential that largely goes unused. This data, when analyzed properly, can provide companies with unlimited opportunities to improve product functionality, increase customer satisfaction and stimulate business growth.

As I wrote in Data + Insight = Action and Back Again, big data is not necessarily the next key driver for a business. The next key driver may be harnessing the data you currently have and using it to glean more insight. Once you have this insight, you can develop new strategies and tactics to deliver targeted content and create great customer experiences, which in turn lead to increased revenue.

Anthony talks about how he did this with his company, and I believe so many companies can do this too. Make sure you have exhausted all current data possibilities before leaping into "big data." The data you have may be the next key driver for your business.

Source: http://www.information-management.com/news...

How to Make Big Data Work for You

This article is an example of the problem with Big Data. Everyone wants to skip so many steps on the way to true 1-to-1 marketing using data as the cornerstone. Great marketing is always an evolution. One step forward using data brings results, and different behavior is gleaned from that data. Then different questions are asked of the data based on the results from the previous data set. This is how marketing problems are solved using data.

Marketers who are already struggling to make the most of a fairly large dataset can't simply be handed a much larger one and told to "go make magic." Marketing with data is a disciplined venture. As a marketer, make sure you are making the most out of the data you already have before worrying about what keystrokes the customer is making or the "Internet of Things."

Always make sure the next step in the data is one that will bring you value today.  Have a long term understanding of where the data can take you, but be disciplined in getting there or you just might miss a lot of insight on the way. 

Source: http://www.dmnews.com/how-to-make-big-data...

How to Get More Value Out of Your Data Analysts

Organizations succeed with analytics only when good data and insightful models are put to regular and productive use by business people in their decisions and their work.

Actionable data is a buzzword that I have used and heard for many, many years. In practice, however, it is much harder to produce actionable insight for business users. C-level executives are always trying to dissect old information, which can be very useful, but only when the answer leads to some actionable insight. If the answers to a question are interesting but don't provide any action, it creates a time-consuming chase to make something out of the data.

Organizations have to educate themselves on what their data is capable of. It's no use asking questions that the data will not be able to turn into actionable insight. Organizations should spend time on questions that can be solved with current capabilities, in other words the low-hanging fruit. Once there is no longer any low-hanging fruit, enhance the analytics with new tools and modeling capabilities to find the next round of low-hanging fruit.

If you want to put analytics to work and build a more analytical organization, you need two cadres of employees:

  • Analytics professionals to mine and prepare data, perform statistical operations, build models, and program the surrounding business applications.
  • Analytical business people who are ready, able, and eager to use better information and analyses in their work, as well as to work with the professionals on analytics projects.

Very true.  An organization can hire all the analysts they can handle, but if the analysts have no business acumen and the business has no data acumen, there will always be a disconnect.  

Organizations need to have the business and the analysts working together to find the best answers. Data scientists can find very interesting things, but the business side may point out that a data scientist's discovery is already a known insight. Conversely, the business may be working very hard to solve a problem that the data scientist can solve, taking intuition out of play and using data in place of trial and error.

In my experience, I have always liked having business and data analysts report to the same group. Many organizations don't like this structure because it can lead to a group "grading its own paper." However, the tight integration of the teams produces results more efficiently. Analysts are not wasting time chasing problems that don't exist, and business people can bounce ideas off analysts for quick insight before making decisions.

Source: http://blogs.hbr.org/2013/12/how-to-get-mo...

Can You See the Opportunities Staring You in the Face?

I’ve come to believe that less than 1% of the data is truly useful.

Exactly! Most businesses are very simple if you look for the key metrics. So many times people want to show their worth by overthinking the problem: "If I can come up with some new, innovative way to look at this, I'll be a superstar." But more times than not it isn't a complex problem. Humans are fairly simple to predict. Most humans fall into patterns and want very straightforward things. New data doesn't need to be introduced until you have gotten everything out of the data you already have.

Big-data initiatives are proliferating, and the information is getting more complex all the time.

There’s a lot of potential benefit for both retailers and customers.

But only if the data is well managed and well understood. Statistics literacy isn’t very high in most businesses. A few educational institutions have realized this and are making a push to turn out business graduates who know their way around a regression analysis. But for the most part, businesspeople aren’t familiar enough with statistics to use them as the basis for good decisions. If you don’t understand the numbers, you can go a long way down a bad road very quickly. That’s why every team charged with making decisions about customers should include a trusted individual who understands statistics. If that understanding isn’t between your own two ears, make sure you bring a person with that skill set onto your team.

Being able to understand what the data is telling you is more important than having a degree in statistics. Interpreting data is really where the opportunities present themselves, not in figuring out the most optimal model. I suggest having someone who is proficient in building statistical models and asking a lot of questions about the output. Start to understand what the models' answers are telling you and simplify the results into something that can be used in the future. A model may tell you that people who buy a particular item are likely to be loyal, but is it the item that drives the loyalty, or is this just a coincidence? The better you understand your data, the better decisions you will make, and you don't have to be a data scientist to do that. The sketch below shows what that kind of questioning looks like in practice.
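
For instance, here is a minimal sketch in Python (synthetic data; variable names like bought_item_x and tenure_years are illustrative assumptions) of reading a model's output as association rather than proof of causation:

# Minimal sketch, synthetic data: the coefficient on bought_item_x shows
# an association with loyalty, not that the item causes loyalty. Here,
# loyalty is driven mostly by tenure; item X is only weakly related.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 2000
bought_item_x = rng.integers(0, 2, n)
tenure_years = rng.uniform(0, 5, n)
loyal = (0.8 * tenure_years + 0.3 * bought_item_x + rng.normal(0, 1, n) > 2).astype(int)

X = sm.add_constant(pd.DataFrame({
    "bought_item_x": bought_item_x,
    "tenure_years": tenure_years,
}))
model = sm.Logit(loyal, X).fit(disp=False)
print(model.params.round(2))
print(model.pvalues.round(3))

A follow-up test, for example offering item X to a random subset of customers and watching retention, is what separates "drives loyalty" from "coincides with loyalty."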

Source: http://blogs.hbr.org/2013/11/can-you-see-t...

FiveThirtyEight's Nate Silver Explains Why We Suck At Predictions (And How To Improve) | Fast Company

When human judgment and big data intersect there are some funny things that happen. On the one hand, we get access to more and more information that ought to help us make better decisions. On the other hand, the more information you have, the more selective you can be in which information you pick out to tell the narrative that might not be the true or accurate one, or the one that helps your business, but the one that makes you feel good or that your friends agree with.

This is a great article on using data and predictions. I just bought this book after a good friend of mine suggested it as a great read. I always hear, "You can make numbers tell whatever story you want." Ain't that the truth? So many times, colleagues of mine hold on to a certain slice of the data that tells the story they want to tell, and soon it becomes "truth." However, this only helps them look good instead of moving the business forward.

Source: http://www.fastcompany.com/3001794/fivethi...

Big Data's Human Component - Jim Stikeleather - Harvard Business Review

Machines don't make the essential and important connections among data and they don't create information. Humans do. Tools have the power to make work easier and solve problems. A tool is an enabler, facilitator, accelerator and magnifier of human capability, not its replacement or surrogate ... That's what the software architect Grady Booch had in mind when he uttered that famous phrase: "A fool with a tool is still a fool."

In my last post, I talked about humans being the ones who make data actionable. Understanding the data that goes into a model is more important than understanding the math behind the algorithms. Algorithms can find patterns humans can't; however, algorithms can't determine whether the answers are relevant.

We forget that it is not about the data; it is about our customers having a deep, engaging, insightful, meaningful conversation with us

Exactly.

Understand that expertise is more important than the tool.  Otherwise the tool will be used incorrectly and generate nonsense (logical, properly processed nonsense, but nonsense nonetheless).

The answers will be fancy, but they will not help make frontline or CRM decisions more effective.

When we over-automate big-data tools, we get Target's faux pas of sending baby coupons to a teenager who hadn't yet told her parents she was pregnant, or the Flash Crash on Thursday May 6, 2010, in which the Dow Jones Industrial Average plunged about 1000 points — or about nine percent.

Humans should always be paying attention to the outcomes and putting parameters around the use of automated answers. Answers should be used in conjunction with other factors to make the best decision.

Although data does give rise to information and insight, they are not the same. Data's value to business relies on human intelligence, on how well managers and leaders formulate questions and interpret results. More data doesn't mean you will get "proportionately" more information. In fact, the more data you have, the less information you gain as a proportion of the data (concepts of marginal utility, signal to noise and diminishing returns). Understanding how to use the data we already have is what's going to matter most.

Source: http://blogs.hbr.org/cs/2012/09/big_datas_...

Can Big Data Smoke Out the Silent Majority?

Of course sophisticated analytics are being used to find pockets of money, just as they have long been used by financial service companies.

The private sector has been using these tactics for years now. It doesn't surprise me that sophisticated political campaigns are also using them. Data mining is very powerful; everyone should be using it.