How do you create a data-driven culture in your marketing team?

Becoming a data-driven organisation doesn’t just rely on the right technology, structure and processes. The human element is essential, and without the right skills, qualities and roles, any effort to be successful at data-driven marketing is destined to struggle.
And the kinds of skills that support a data-driven philosophy are rich and varied.

This is very true. The "art" and "science" that actionable data requires lean more toward the art side in most marketing departments. The biggest change is learning to respect what data can bring to the equation. Many marketers don't respect data; they respect their gut and soft metrics like awareness. While data doesn't solve every problem, it helps inform direction. It shows what is happening with the customers you are trying to target, plus the ones you aren't targeting and whether you should.

“The data-driven marketing team is knowledgeable enough to converse freely with technical and statistical resources while staying laser-focused on getting the right message to the right person at the right time. But the most important quality needed in a modern marketing team is curiosity. Without that, I may as well outsource all of my data related work to a third party. Curiosity stimulates creativity and conversation, and aids decision-making.” 

I like the statement about staying laser-focused on getting the right message to the right customer at the right time. Too many times teams lose their focus and start drifting toward answers that are either easy or disconnected from the most relevant questions. Understanding what the data is saying is more valuable than having the person who can put the report together. Money is made by providing the insight behind the data; that is what I look for in my team. But don't forget to have the person who can pull all the data together.

Quintero adds: “Building a data-driven culture is not an overnight process. It takes time. To me, a data-driven culture means building a safe environment where experimentation is encouraged and mistakes are tolerated. It’s less about having all the right tools in place – although that’s a critical part of the process – and definitely more about cultivating excitement around discovery and objectivity. Being data-driven is exciting and people should be encouraged to enjoy the process as much as making things happen.”

Changing a culture is a journey. Teaching the team why decisions are made from the data, and what thought process led to those conclusions, is critical to building a data culture. No matter how smart people are, if they don't understand the thinking behind decisions they will never be able to take leaps with the data. Once they understand what to look for in the data, they begin asking the right questions and delivering recommendations along with their questions.

Source: http://www.mycustomer.com/feature/marketin...

The State and Drivers of Data Marketing

What matters most is the optimization of the customer experience, relevance and (perceived) customer value as a driver of business value. Data-driven marketing certainly is not (just) about advertising and programmatic ad buying as some believe. Nor is it just about campaigns. On the contrary: if done well, data-driven marketing is part of digital marketing transformations whereby connecting around the customer across the customer life cycle is key.

This is a very succinct vision of what data-driven marketing is: it's all about the customer experience. The advent of "big data" was, at its core, gathering extra data about customers. Gathering data is only the first step of the process, albeit a time-consuming one. The good news is that after the hard work of gathering the data has been completed, the harder part starts. Once you have data, making sense of it and creating actionable outcomes that enhance the customer experience becomes the goal. This is very hard work. It takes plenty of analysis and insight to reach this goal. But the companies that do this best will be the ones that succeed in the digital age.

Among the key takeaways of the data-driven marketing report by the GlobalDMA:
  • 77% of marketers are confident in the data-driven approach and 74% expect to increase data marketing budgets this year.
  • Data efforts by far focus on offers, messages and content (marketing) first (69% of respondents). Second ranks a data-driven strategy or data-driven product development. Customer experience optimization unfortunately only ranks third with 49% of respondents.
  • Among the key drivers of increased data marketing: first of all a need to be more customer-centric (reported by 53% of respondents). Maximizing efficiency and return ranks second followed by gaining more knowledge of customers and prospects.

I believe the first step in the process is understanding where the puck is going to be and skating in that direction. Marketers understand this data revolution is coming, and they are saying the right things in surveys. The real question is how to get there. It's easy to identify problems; it's hard to implement solutions. The marketers who prove adept at change will thrive in this new paradigm.

Customer analytics is something I have focused on my entire career. In the casino industry we have had the optimal opt-in mechanism for many years and have collected amazing amounts of data about our customers' behavior. We have used this to create targeted marketing campaigns for our customers, so I believe in the direction the entire industry is taking. Always start with the customer. It will lead to better experiences and more profitable results.

 
Source: http://www.i-scoop.eu/infographics/data-dr...

The Dangers of Data-Driven Marketing

Marketing has gone digital, and we can now measure our efforts like never before. As a result, marketers have fallen in love with data. Head over heels in love—to the point where we want data to drive our marketing, instead of people, like you and me. I think this has gone too far.

I'm a big proponent of data-driven marketing. In this article Ezra Fishman uses semantics to argue it is bad, but what he is really getting at is the need to go beyond just the data. As I wrote in Data + Insight = Action, data all by itself cannot create actionable outcomes.

Data-informed marketing
Instead of focusing on data alone, data-informed marketing considers data as just one factor in making decisions. We then combine relevant data, past experiences, intuition, and qualitative input to make the best decisions we can.
Instead of poring over data hoping to find answers, we develop a theory and a hypothesis first, then test it out. We force ourselves to make more gut calls, but we validate those choices with data wherever possible so that our gut gets smarter with time.

This is what I was trying to articulate in my article. Being an excellent data-driven marketing organization takes a little bit of "science" and a little bit of "art" to determine the best course of action. When a data scientist alone is driving your organization, years of business experience that could help interpret what the data is saying go unused.

Most times, when data scientists are off on their own, it takes an inordinate amount of time to come to a conclusion, mostly because they lack context: how the business generates the data, how the strategy shapes it, and how a customer being underserved may be an intentional outcome.
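
Below is a minimal sketch of the hypothesis-first, validate-with-data loop Fishman describes: state the gut call as a hypothesis, then check it with a two-proportion test. The page names, numbers, and column layout are hypothetical, and the statsmodels call is just one way to run the check.

```python
# Hypothetical example: the gut call says the new landing page converts better.
# State the hypothesis first, then validate it against the data.
from statsmodels.stats.proportion import proportions_ztest

conversions = [310, 262]    # new page, old page (made-up numbers)
visitors    = [5000, 5000]

# H1: conversion rate of the new page is larger than the old page
stat, p_value = proportions_ztest(count=conversions, nobs=visitors,
                                  alternative="larger")
print(f"new: {conversions[0] / visitors[0]:.1%}  "
      f"old: {conversions[1] / visitors[1]:.1%}  p-value: {p_value:.3f}")

# A small p-value backs the gut call; a large one teaches the gut that the
# observed difference was within noise.
```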

The ease of measurement trap
When we let data drive our marketing, we all too often optimize for things that are easy to measure, not necessarily what matters most.
Some results are very easy to measure. Others are significantly harder. Click-through rate on an email? Easy. Brand feelings evoked by a well-designed landing page? Hard. Conversion rate of visitors who touch your pricing page? Easy. Word-of-mouth generated from a delightful video campaign? Hard.

Right on! Of course, the organizations that take the easy way out are not ones I would consider data-driven. KPIs are great, but they can be deadly. There are usually so many moving parts behind the business and the data it generates that business KPIs can look fine while drilling down into performance from a customer perspective reveals very scary trends that should cause alarm. A company that isn't data-driven will continue with its strategy because of the KPIs (hello RIM/BlackBerry).

The local optimization trap
The local optimization trap typically rears its head when we try to optimize a specific part of the marketing funnel. We face this challenge routinely at Wistia when we try to increase the conversion rate of new visitors. In isolation, improving the signup rate is a relatively straightforward optimization problem that can be "solved" with basic testing.
The problem is, we don't just want visitors to sign up for our Free Plan. We want them to sign up for our Free Plan, then use their account, then tell others how great Wistia is, then eventually purchase one of our paid plans (and along the way generate more and more positive feelings toward our brand).

This can be combined with the previous point. When analytics is only seen from a high level, you get simple statements like "increasing the number of signups, which will flow down the funnel at our current rates, will increase conversions." Nothing could be further from the truth. To increase anything there needs to be an additional action, such as advertising to a different group of individuals or offering an incentive that boosts signups. The issue is that these aren't the same individuals who are converting in your current funnel. The proper strategy is to profile the converters and try to target customers like them, which may actually shrink the top of the funnel if done right.
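
As a rough illustration of the trap described above, the sketch below judges a signup-boosting tactic by end-to-end conversion per acquisition segment rather than by signup volume alone. The segment names and numbers are invented for the example.

```python
# Hypothetical funnel counts: a promo inflates signups, but those signups
# convert to paid at a much lower rate than the existing organic audience.
import pandas as pd

funnel = pd.DataFrame({
    "segment": ["organic", "organic", "incentive_promo", "incentive_promo"],
    "stage":   ["signup", "paid", "signup", "paid"],
    "users":   [2000, 300, 3500, 105],
})

wide = funnel.pivot(index="segment", columns="stage", values="users")
wide["signup_to_paid"] = wide["paid"] / wide["signup"]
print(wide)
# organic converts at 15%, the promo segment at 3%: more signups, but the
# end-to-end funnel barely moves.
```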

The data quality trap
We are rarely as critical of our data as we ought to be. Consider, for example, A/B tests, which have become the gold standard for marketing experimentation. In theory, these tests should produce repeatable and accurate results, since website visitors are assigned randomly to each page variant.
In practice, however, there are lots of ways even the simplest A/B tests can produce misleading results. If your website traffic is anything like ours, visitors come from a variety of sources: organic, direct, referral, paid search, and beyond. If one of those sources converts at a much higher rate than others, it's easy to get skewed results by treating your traffic as a single, uniform audience.

One should rarely take the conversion or redemption results from an A/B test without digging into the data. Making sure all segments are driving the results is key. Don't take for granted that the customers randomly assigned to each group ended up being truly random. Ensure there was proper representation from each segment of the business, and identify any other changes worth testing based on different behaviors within the segments.
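
A small sketch of that digging, assuming an exported results file with variant, source, and converted columns (hypothetical names): break the A/B result down by traffic source and check the mix of sources inside each arm.

```python
# Hypothetical A/B check: never stop at the top-line rate -- look at each
# traffic source and at how the sources are mixed inside each variant.
import pandas as pd

ab = pd.read_csv("ab_test_results.csv")   # assumed columns: variant, source, converted (0/1)

# Top-line conversion by variant
print(ab.groupby("variant")["converted"].mean())

# Conversion and volume by variant and source
print(ab.groupby(["variant", "source"])["converted"]
        .agg(visitors="size", conv_rate="mean"))

# Source mix inside each variant: a lopsided mix (say, paid search
# over-represented in one arm) can skew the top-line comparison.
mix = (ab.groupby(["variant", "source"]).size()
         .div(ab.groupby("variant").size(), level="variant"))
print(mix.rename("share_of_variant"))
```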

Data vigilance
As marketers, we should continue to explore new and better ways to harness the power of data, but we also must remain vigilant about becoming overly reliant on data.
Data can be a tremendous source of insight. Harness that. But don't pretend it's something more. And definitely don't put it in charge of your marketing team.

This reminds me of when I was a product manager and we would receive RFPs asking us to prove we were the right company to supply a product. Sometimes the requirements were such that we wondered whether the company wanted humans to keep working for them at all. I would jokingly refer to some of these as the "automated manager": it seemed companies wanted to press a button and have a system do everything for them. This is the trap Fishman is referring to. Humans have great insight. Humans are the "art" in the equation that leads to actionable outcomes, and that is every bit as important as the "science".

Source: http://wistia.com/blog/data-informed-marke...

Five Ways to Win with Data-Driven Marketing

Data-driven marketing has come to the forefront for companies that want to better engage their customers and prospects. With data-driven marketing, firms are able to gather, integrate, and assess data from a variety of internal and external sources to help enhance value.

Marketing automation starts with data. In fact, in the digital age, almost all marketing initiatives start with data. Companies that are data-driven have a distinct advantage over their competitors: they focus on their strengths, shore up their weaknesses, and don't obsess over the competition. They have the data to understand how they can improve.

1. Determine what really makes customers tick. According to the DMA, data-driven marketing is about discerning what customers want and need and engineering the company to provide it: “The more firms can use data to develop a 360-degree, multi-channel view of what customers think and want, the more the customer will truly be king.” Through the use of both internal and external data, companies are learning how to “crown” their customers — truly understand what makes them tick, and then develop campaigns that engage them in the most effective manner possible.

This all comes from data analytics. Understanding what drives your customers' behavior is step one in developing campaigns and offers. Without an understanding of what your customers want, there is no efficient way to decide what to offer them.

2. Set baselines for campaign effectiveness. Data-driven marketing has effectively replaced the traditional “hit-or-miss” test component of the typical direct marketing campaign.

Baselines are a very important piece of analyzing campaigns. They are the beginning of the journey to understanding the effectiveness of any changes that are made. If an organization cannot answer what a particular program is bringing it, it should test the campaigns without the program and determine what, if anything, the program contributes.
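
For example, a holdout group gives the baseline the paragraph above is asking for. The sketch below assumes a hypothetical file with a group column ('mailed' or 'holdout') and a revenue column; the incremental value of the program is the difference against the holdout.

```python
# Hypothetical baseline check: compare mailed customers to a randomly held-out
# group to see what the campaign actually adds.
import pandas as pd

audience = pd.read_csv("campaign_audience.csv")   # assumed columns: group, revenue

summary = audience.groupby("group")["revenue"].agg(["count", "mean"])
print(summary)

lift = summary.loc["mailed", "mean"] - summary.loc["holdout", "mean"]
print(f"Incremental revenue per mailed customer: {lift:.2f}")
# Program value ~= lift * customers mailed, less the cost of the campaign.
```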

3. Block out the “noise” and focus on what’s relevant. When assessing data over multi-year periods — and across different marketing channels — it’s not unusual for things to be extremely “busy” at the outset. There’s a lot of static and responses are all over the place. However, by using proven data-driven marketing techniques, you can start to pull out the relevant information, analyze it over time, pick up on traffic patterns, and drill down to specific marketing touch points (i.e., number of website hits that come in when a specific direct-response show airs).

This is a lot harder than it sounds. Marketers are the kings of taking a piece of data and selling their story with it, even when it is just noise or a small sample of customers. This is where the "art and science" approach is necessary. Being able to combine data mining techniques with business acumen is the key to focusing on the relevant data.

4. Determine exactly how customers are responding.

Again, this is important for multi-channel marketing. The ability to reach your customers on the right channel at the right time is only possible through data.

5. Reach extremely targeted customer bases.

The promise of 1-to-1 marketing is arriving. Be careful about shooting for this level of personalization, because it is very expensive and, for the majority of your database, the pearl is not worth the dive. However, being able to target your best customers in a very personal manner can help grow the business exponentially. It takes extreme focus.

 

Source: http://adage.com/article/digitalnext/pract...

Building credibility for your analytics team—and why it matters

If you work with data regularly, chances are you trust it. You know how it's collected and stored. You know the caveats and the roadblocks you face when analyzing it. But, when you bring your findings to those further removed, you're asking them to take a leap of faith and trust in data they may know very little about.

Multiple times in my career I have come into organizations, taken over teams that were not trusted, and helped build them into the trusted source of data accuracy and insight. This journey is never easy. It takes patience and a lot of persistence to change an organization's perception of a department. But these points are good advice as a roadmap for doing it.

Start Small

When trying to get people to believe in your team, it can be tempting to chase the biggest problems first. These problems often take a long time to answer, and can take several tries to get right. It's often better to first establish trust by picking early projects that you know you can win, and win quickly. Try starting with basic arithmetic to answer crucial business and product questions. For startups, some example questions might be:

  • What are the most engaging features of your product?

  • What is the company's core demographic? What do they like about the product?

Often, people don’t judge the answers to these questions on technical rigor, they judge them on business impact. Starting small can open doors to the big questions that you may have wanted to start with; if you've earned credibility along the way, you'll have more time, flexibility—and maybe resources—to tackle them.

I always find it helpful to start by answering questions that are not currently being answered. As Derek Steer points out in the article, don't start by trying to solve the world's problems. If your team tries to tackle the tough problems first, there will be a much more critical eye on the work and the data produced. Allow your team to get some wins under its belt. Remember, this is a journey, not a sprint. Trust comes with wins, not home runs.

Know your audience

Keep your audience in mind as you begin to craft the story from your data. Add in the appropriate amount of detail your audience needs to focus on decisions rather than methods. What context might they need? Spending a little time thinking about what your audience cares about most also helps you anticipate possible questions and prepare answers in advance. Few things can help establish credibility faster than fielding a question during a presentation and immediately flipping to a slide that answers it.

This point is critical for garnering trust. When presenting data, make sure there is a story being told along with it. Guide the audience to the answers you have found; don't make them figure it out themselves. Be sure to explain what they are seeing and why it matters. Make it simple, quick and insightful.

Don’t be a House

House, a brilliant albeit fictional doctor, routinely diagnosed rare diseases but had abysmal bedside manner. The thing was, House didn’t have to win his patients over—they were so desperate to survive that they would listen to his every word.

It's subtle, but consider your findings a conversation starter. Understand that the non-analysts have valid points too: they have experiences you don't have and they likely know something you don't. These discussions aren't about winning an argument, but making the right decision for the business.

Never use data for evil. This is fairly common in finance departments, but it is important not to attack decisions; instead, try to initiate conversations that lead to the best business decision. Once there is a tone of accusation in the analysis, your team will lose the trust of the department you are creating it for. Those departments made their decisions without data to guide them, so treat them as partners, not as people who need to be rescued.

Be Transparent

Analytics can feel like a black box to many people—making that leap of faith appear even larger. By showing even just the basics of your process, you can help others believe in it. To increase transparency try:

  • Making your work simple and understandable. Monica Rogati would urge you to try division before doing anything harder. As your audience becomes more comfortable, up the game to simple regression models—it's not usually difficult for folks to understand the direction and magnitude of coefficients.

  • Finding simple ways to convey advanced concepts. For example, confidence intervals and p-values can be confusing for many people, but charts with error bars make these concepts easy to understand.

  • Using stories. If you're presenting information about feature usage, or events with technical backend names, paint a picture of how a user would see these features, or put events in plain-English names.

Numbers are difficult to interpret at times; taking the complex and telling a story with it is an art. Most analysts are great at finding data and creating insights if they have domain knowledge, but they can be terrible at communicating their findings. Always have the methodology behind the answers available, even if you believe it is a waste of time. The skeptics in the organization will demand it, and it also humanizes the process for the non-analytical audience members you want on your side.
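
The error-bar suggestion from the article is easy to put into practice. A minimal sketch, with made-up numbers, of showing conversion rates with rough 95% intervals instead of quoting p-values:

```python
# Made-up conversion data: show rates with 95% intervals so the audience can
# see overlap (uncertainty) without needing to interpret p-values.
import numpy as np
import matplotlib.pyplot as plt

variants    = ["Control", "Variant A", "Variant B"]
conversions = np.array([120, 150, 138])
visitors    = np.array([4000, 4000, 4000])

rates = conversions / visitors
err = 1.96 * np.sqrt(rates * (1 - rates) / visitors)   # normal approximation

plt.bar(variants, rates, yerr=err, capsize=6)
plt.ylabel("Conversion rate")
plt.title("Conversion by variant (error bars = 95% CI)")
plt.show()
```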

The most important part of the journey is to persevere. The beginning is the hardest part. In my last position, the department I took over was the laughing stock of the organization. It took them weeks to come up with an answer, and no one believed what they were saying because they were acting as report monkeys instead of providing insight. By the time we got going, we were the de facto data source for the organization. We created analysis for parts of the organization we had nothing to do with, but when the organization needed something done right, it came through our team. That was because we had many wins along the journey.

Source: http://www.datasciencecentral.com/profiles...

To Benefit From Big Data, Resist The Three False Promises

From Forbes.com:

Gartner recently predicted that “through 2017, 60% of big data projects will fail to go beyond piloting and experimentation and will be abandoned.” This reflects the difficulty of generating value from existing customer, operational and service data, let alone the reams of unstructured internal and external data generated from social media, mobile devices and online activity.

Yet some leading users of big data have managed to create data-driven business models that win in the marketplace. Auto insurer Progressive, for instance, uses plug-in devices to track driver behavior. Progressive mines the data to micro-target its customer base and determine pricing in real time. Capital One, the financial services company, relies heavily on advanced analytics to shape its customer risk scoring and loyalty and offer optimization initiatives. It exploits multiple types of customer data, including advanced text and voice analytics.

I believe what most people miss when they hear these success stories is the amount of human capital thrown at these problems. Hundreds of data scientists create thousands of models, of which very few make it into final production. The Gartner stat rings true because most companies don't have that kind of resources to throw at the problem, and most wouldn't realize an ROI even if they could.

Promise 1: The technology will identify business opportunities all by itself.

This is the direction the technology is moving, but it is not there yet. The technology enables a group of data scientists to identify the opportunities; it's not magic.

Promise 2: Harvesting more data will automatically generate more value. 

The temptation to acquire and mine new data sets has intensified, yet many large organizations are already drowning in data, much of it held in silos where it cannot easily be accessed, organized, linked or interrogated.

More data does not mean better ROI on your initiatives. In fact, most companies don't take full advantage of the data they already have. I always use a rule of thumb when purchasing new technology: if, as an organization, you don't believe you are already using the technology you possess to its fullest, then it's not time to move on to something better. Your current technology should be what is preventing you from innovating; if it's not, then you either have the wrong technology or the wrong people.

Promise 3: Good data scientists will find value for you. 

To profit consistently from big data, you need an operating model that deploys advanced analytics in a repeatable manner. And that involves many more people than data scientists.

Remember, data + insight = action. Actionable data is a combination of art and science. Data scientists provide the science, but you need a team with business acumen to provide the insight; that is the art. Data scientists will raise a lot of questions you never thought to ask of your data, but they cannot provide a solution in and of themselves.

Remember to walk before you run when it comes to data initiatives. It's always good to have a goal of using "big data" to improve your business and create ROI where it didn't previously exist, but the journey to "big data" matters more. These success stories did not happen overnight. They happened because advanced companies were butting up against the limits of their current technology and were ready to take the next step.

Source: http://www.forbes.com/sites/baininsights/2...

If Algorithms Know All, How Much Should Humans Help? - NYTimes.com

Steve Lohr writes for NYTimes.com:

Armies of the finest minds in computer science have dedicated themselves to improving the odds of making a sale. The Internet-era abundance of data and clever software has opened the door to tailored marketing, targeted advertising and personalized product recommendations.
Shake your head if you like, but that’s no small thing. Just look at the technology-driven shake-up in the advertising, media and retail industries.
This automated decision-making is designed to take the human out of the equation, but it is an all-too-human impulse to want someone looking over the result spewed out of the computer. Many data quants see marketing as a low-risk — and, yes, lucrative — petri dish in which to hone the tools of an emerging science. “What happens if my algorithm is wrong? Someone sees the wrong ad,” said Claudia Perlich, a data scientist who works for an ad-targeting start-up. “What’s the harm? It’s not a false positive for breast cancer.”

I have written here many times that analytics is a combination of "art" and "science". Having data and insight leads to the most action, yet some data scientists want to remove the "art" from the equation. The belief is that computers and algorithms can see more in the data and the behavior than a human ever could, and that once there is enough data about an individual's behavior there is no "art" left: all the data points are accounted for, so the "science" is indisputable.

However, I have a hard time believing that the "art", the human insight, will ever be replaceable. There are so many variables still unknown, and a computer can't know all of them. The "science" will always get better at explaining what happened, but it doesn't understand the business operations and strategy behind the decisions that were made. I am a true believer in "big data" coming of age. I believe it is fundamentally changing the way companies have to do business, but never forget the human side: the "art" of understanding why the data is telling you what is happening.

These questions are spurring a branch of academic study known as algorithmic accountability. Public interest and civil rights organizations are scrutinizing the implications of data science, both the pitfalls and the potential. In the foreword to a report last September, “Civil Rights, Big Data and Our Algorithmic Future,” Wade Henderson, president of The Leadership Conference on Civil and Human Rights, wrote, “Big data can and should bring greater safety, economic opportunity and convenience to all people.”
Take consumer lending, a market with several big data start-ups. Its methods amount to a digital-age twist on the most basic tenet of banking: Know your customer. By harvesting data sources like social network connections, or even by looking at how an applicant fills out online forms, the new data lenders say they can know borrowers as never before, and more accurately predict whether they will repay than they could have by simply looking at a person’s credit history.
The promise is more efficient loan underwriting and pricing, saving millions of people billions of dollars. But big data lending depends on software algorithms poring through mountains of data, learning as they go. It is a highly complex, automated system — and even enthusiasts have qualms.
“A decision is made about you, and you have no idea why it was done,” said Rajeev Date, an investor in data-science lenders and a former deputy director of the Consumer Financial Protection Bureau. “That is disquieting.”

Black-box algorithms have always been troubling for the majority of individuals, even the smartest executives trying to understand their business. Humans need to see why. There is a reason Decision Trees are the most popular of the data models, even though they inherently have less predictive power than counterparts like Neural Networks.

Decision Trees output a result that a human can interpret: a road map to the reason the prediction was made. This makes us humans feel comfortable. We can tell a story around the data that explains what is happening. With a black-box algorithm, we have to trust that what is going on inside is correct. We do have results to measure against, but as these algorithms become more commonplace, it will be imperative that humans can trust them. In the bank loan example above, a person needs to understand why they are being denied and what actions they can take to secure the loan in the future.
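
As a sketch of that interpretability, the snippet below fits a shallow decision tree on hypothetical loan data and prints the rules as plain if/else text; the file and feature names are assumptions for the example, not a real scoring model.

```python
# Hypothetical loan data: a shallow tree can be printed as a road map of
# every decision it will make, which is exactly what black boxes lack.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

loans = pd.read_csv("loan_history.csv")   # assumed columns below
features = ["income", "debt_to_income", "years_at_job"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(loans[features], loans["defaulted"])

print(export_text(tree, feature_names=features))   # human-readable rules
```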

This ties into creating superior customer experiences. Companies that can harness "big data" and black-box algorithms while creating simple narratives for customers to understand will have a significant competitive advantage. Creating algorithms to maximize profits is a very businesslike approach, but what gets left out is the customer experience. Over time, customers will resent the lack of knowledge and communication and will not come back. A bank may say that's fine, they would have defaulted anyway. But what happens when too many people have bad customer experiences? I don't believe that is a good long-term strategy.

In a sense, a math model is the equivalent of a metaphor, a descriptive simplification. It usefully distills, but it also somewhat distorts. So at times, a human helper can provide that dose of nuanced data that escapes the algorithmic automaton. “Often, the two can be way better than the algorithm alone,” Mr. King said.  

Businesses need to also focus on the human side. When we forget there is also an "art" to enhance all of these great algorithms, businesses become too focused on transactional efficiency instead of customer experience, which in turn leads to lower sales.

Source: http://www.nytimes.com/2015/04/07/upshot/i...

Business Intelligence for the Other 80 Percent

Ted Cuzzillo writes for Information Management:

We give business people everything. They’ve got data, and often it’s clean. They’ve got tools, and many are easy to use. They’ve got visualizations, and many of them speed things up. They’ve got domain knowledge, at least most do. Tell me: Why hasn’t business intelligence penetrated more than about 20 percent of business users?

This is a great question. So many organizations have executive leadership that says it wants dashboards and real-time information, yet when it is provided, it goes unread. How does this happen? The answer is that what most executives want is a story. They want someone to interpret the analytics and tell them what to look at. The dashboards act as content for speaking points: executives want the most important numbers at their fingertips so they can rattle them off at a moment's notice.

What executives want is for the data to be fed to them as a narrative: here is the data, here is what we believe it says, and here is what we are going to do about it. It coincides with my article Data + Insight = Action.

What executives need is all of these parts (data, insight and action) in one analysis. They need to see the data, with visualizations that make it easier to read. They need the insight of the business experts in the form of commentary, succinct and to the point. Then they need to know what action the business is going to take with this newfound knowledge. Armed with all of this, an executive can understand the situation and decide what to do.

To reach "The Other 80 Percent," let’s turn away from the “data scientist” and to the acting coach. “A lot has to do with intangible skills,” said Farmer. A lot also has to do with traditional story structure, which appeals to “a deep grammar that’s very persuasive and memorable.”
Storytelling isn’t a feature, it’s a practice. One practicing storyteller, with the title “transmedia storyteller,” is Bree Baich, on the team of Summit regular Jill Dyché, SAS vice president, best practices. While others talk about stories, she said, most people seem to start and end with data and leave out the storytelling art. They fail to connect data with any underlying passion. “What we need are translators, people who understand data but can tell the human story from which it arose.”

There is always an assumption on the analyst's part that a visualization or a table of data is plainly understandable. A good rule of thumb is to assume the audience doesn't see what the analyst is seeing. If analysts start with that assumption, they can then tell the story of why the data is fascinating. An analysis without text explaining why the data is interesting will fall on deaf ears. Once the analysis reaches a higher level, executives will not have time to create the "insight" portion themselves, and they will either send the analysis back or ignore it completely. Always remember to include the data, the insight as a story, and the action that is going to be taken. With this formula analysts become more than report generators.

Source: http://www.information-management.com/news...

7 Limitations Of Big Data In Marketing Analytics

Anum Basir writes:

As everyone knows, “big data” is all the rage in digital marketing nowadays. Marketing organizations across the globe are trying to find ways to collect and analyze user-level or touchpoint-level data in order to uncover insights about how marketing activity affects consumer purchase decisions and drives loyalty.
In fact, the buzz around big data in marketing has risen to the point where one could easily get the illusion that utilizing user-level data is synonymous with modern marketing.
This is far from the truth. Case in point, Gartner’s hype cycle as of last August placed “big data” for digital marketing near the apex of inflated expectations, about to descend into the trough of disillusionment.
It is important for marketers and marketing analysts to understand that user-level data is not the end-all be-all of marketing: as with any type of data, it is suitable for some applications and analyses but unsuitable for others.

There are a lot of companies looking to "big data" as their savior that just aren't ready to implement it. This leads to disillusionment with user-level data. It reminds me of the early days of Campaign Management (now Marketing Automation), when there were so many failed implementations: the vendors were too inexperienced to implement their products successfully, the technology was too nascent, and the customers were culturally unready to handle it. This is "big data" in a nutshell.

1. User Data Is Fundamentally Biased
The user-level data that marketers have access to is only of individuals who have visited your owned digital properties or viewed your online ads, which is typically not representative of the total target consumer base.
Even within the pool of trackable cookies, the accuracy of the customer journey is dubious: many consumers now operate across devices, and it is impossible to tell for any given touchpoint sequence how fragmented the path actually is. Furthermore, those who operate across multiple devices are likely to be from a different demographic compared to those who only use a single device, and so on.
User-level data is far from being accurate or complete, which means that there is inherent danger in assuming that insights from user-level data applies to your consumer base at large.

I don't necessarily agree with this. While these statements are true, having some data is better than none. Would I change my entire digital strategy on incomplete data? Maybe, if the data was very compelling; more often this data leads to testable hypotheses that lead to better customer experiences. Never be afraid of not having all the data, and never search for all the data: that pearl is not worth the dive.

2. User-Level Execution Only Exists In Select Channels
Certain marketing channels are well suited for applying user-level data: website personalization, email automation, dynamic creatives, and RTB spring to mind.

Very true. Be careful to apply user-level data only to the channels where it fits, and don't make assumptions about everyone. When there is enough data to make a decision, use it. If not, use the data you have been working with all these years; it has worked until now.

3. User-Level Results Cannot Be Presented Directly
More accurately, it can be presented via a few visualizations such as a flow diagram, but these tend to be incomprehensible to all but domain experts. This means that user-level data needs to be aggregated up to a daily segment-level or property-level at the very least in order for the results to be consumable at large.

Many new segments can come out of this rich data once it is aggregated. It is fine to aggregate data for reporting to executives; in fact, that is what they want to see. Every once in a while, throw in a decision tree or a naive Bayes output to show there is more analysis being done at a more granular level.

4. User-Level Algorithms Have Difficulty Answering “Why”
Largely speaking, there are only two ways to analyze user-level data: one is to aggregate it into a “smaller” data set in some way and then apply statistical or heuristic analysis; the other is to analyze the data set directly using algorithmic methods.
Both can result in predictions and recommendations (e.g. move spend from campaign A to B), but algorithmic analyses tend to have difficulty answering “why” questions (e.g. why should we move spend) in a manner comprehensible to the average marketer. Certain types of algorithms such as neural networks are black boxes even to the data scientists who designed it. Which leads to the next limitation:

This is where the "art" comes into play when applying analytics to any dataset. There are too many unknown variables behind a human being's purchase decision to predict an outcome with absolute certainty, so there should never be a decision to move all spending in one direction or change an entire strategy based on any single data model. What should be done is to test the new data models against the old way of doing business and see if they perform better. If they do, great: you have a winner. If they don't, use what you learned to build models that may beat the current approach. Marketing tactics and campaigns are living, breathing entities; they need to be cared for and changed constantly.
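
A bare-bones sketch of that champion/challenger habit, with hypothetical file and column names: target comparable audiences with the current rules and with the new model, then compare results before moving any spend.

```python
# Hypothetical test readout: 'current' = existing selection rules,
# 'new_model' = audience picked by the new data model.
import pandas as pd

results = pd.read_csv("targeting_test.csv")   # assumed columns: strategy, converted (0/1)

print(results.groupby("strategy")["converted"]
             .agg(customers="size", conv_rate="mean"))

# Promote the challenger only if it wins by a meaningful, repeatable margin;
# otherwise feed what you learned into the next iteration of the model.
```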

5. User Data Is Not Suited For Producing Learnings
This will probably strike you as counter-intuitive. Big data = big insights = big learnings, right?
Actionable learnings that require user-level data – for instance, applying a look-alike model to discover previously untapped customer segments – are relatively few and far in between, and require tons of effort to uncover. Boring, ol’ small data remains far more efficient at producing practical real-world learnings that you can apply to execution today.

In some cases yes, but don't discount the learnings that can come from this data. Running it through multiple modeling techniques may not lead to production-ready models that impact revenue streams overnight; that rarely happens, and it takes hundreds of data scientists with perhaps 3% of their models making it into production. However, running the data through data mining techniques can give you unique insights that regular analytics could never produce. These are true learnings that create testable hypotheses for enhancing the customer experience.
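
One concrete version of those data mining learnings is exploratory clustering: group customers on a few behavioral features and read the segment profiles for hypotheses worth testing. The feature and file names below are hypothetical.

```python
# Hypothetical exploratory segmentation: the output is not a production model,
# it is a set of segments that suggest campaigns to test.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

behavior = pd.read_csv("customer_behavior.csv")   # assumed columns below
features = ["visits_per_month", "avg_order_value", "days_since_last_purchase"]

X = StandardScaler().fit_transform(behavior[features])
behavior["segment"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Segment profiles: a high-value, fast-lapsing group, for example, is a
# testable hypothesis for a win-back campaign.
print(behavior.groupby("segment")[features].mean())
```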

6. User-Level Data Is Subject To More Noise
If you have analyzed regular daily time series data, you know that a single outlier can completely throw off analysis results. The situation is similar with user-level data, but worse.

This is very true. There is so much noise in the data that most of the time spent modeling goes into cleaning it. This noise is why it is so hard to predict anything with this data. The pearl may not be worth the dive for predictive analytics, but for data mining it is certainly worth the effort.
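
A tiny sketch of that cleaning work, assuming a hypothetical table of session durations: screen obvious outliers with a simple interquartile-range rule before any modeling.

```python
# Hypothetical user-level data: drop sessions far beyond the interquartile
# range before they distort averages and models downstream.
import pandas as pd

sessions = pd.read_csv("user_sessions.csv")   # assumed column: session_seconds

q1, q3 = sessions["session_seconds"].quantile([0.25, 0.75])
upper = q3 + 1.5 * (q3 - q1)

clean = sessions[sessions["session_seconds"] <= upper]
dropped = len(sessions) - len(clean)
print(f"Dropped {dropped} outlier sessions ({dropped / len(sessions):.1%} of the data)")
```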

7. User Data Is Not Easily Accessible Or Transferable

Oh so true. Take manageable chunks when starting to dive into these user-level data waters.

This level of data is much harder to work with than traditional data. In fact, executives usually don't appreciate the time and effort it takes to glean insights from large datasets. Clear expectations should be set so there are no overinflated expectations at the start of the user-level data journey. Under-promise and over-deliver for a successful implementation.

Source: http://analyticsweek.com/7-limitations-of-...

What to Do When People Draw Different Conclusions From the Same Data

Walter Frick writes for HBR:

That famous line from statistician William Edwards Deming has become a mantra for data-driven companies, because it points to the promise of finding objective answers. But in practice, as every analyst knows, interpreting data is a messy, subjective business. Ask two data scientists to look into the same question, and you’re liable to get two completely different answers, even if they’re both working with the same dataset.
So much for objectivity.
But several academics argue there is a better way. What if data analysis were crowdsourced, with multiple analysts working on the same problem and with the same data? Sure, the result might be a range of answers, rather than just one. But it would also mean more confidence that the results weren’t being influenced by any single analyst’s biases. Raphael Silberzahn of IESE Business School, Eric Luis Uhlmann of INSEAD, Dan Martin of the University of Virginia, and Brian Nosek of the University of Virginia and Center for Open Science are pursuing several research projects that explore this idea. And a paper released earlier this year gives an indication of how it might work.

I believe it is best practice to have multiple analysts look at a problem, at least to devise what their methodology would be. In fact, when the problem is particularly difficult I always like to take a crack at it myself, just so I have an idea of what the data looks like and how certain variables are influencing the results.

I think too many executives are unwilling to dig into the data and work a problem. It is very important to have a deep understanding of the data issues so that, as an executive, you can make better decisions about how to guide the team. Many times the answer is not a deployable model but a data mining exercise that gleans some testable hypotheses.

Though most companies don’t have 60 analysts to throw at every problem, the same general approach to analysis could be used in smaller teams. For instance, rather than working together from the beginning of a project, two analysts could each propose a method or multiple methods, then compare notes. Then each one could go off and do her own analysis, and compare her results with her partner’s. In some cases, this could lead to the decision to trust one method over the other; in others, it could lead to the decision to average the results together when reporting back to the rest of the company.
“What this may help [to do] is to identify blind spots from management,” said Raphael Silberzahn, one of the initiators of the research. “By engaging in crowdsourcing inside the company we may balance the influence of different groups.”

I do believe in internal "crowdsourcing". The minute tough problems start to be outsourced, the company loses the great insight its analysts and business owners can bring to the data, insight that analysts outside the company could never match. I truly believe analytics is art and science, but too many times the art is underappreciated.
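
The smaller-team version described in the excerpt is easy to sketch: two analysts estimate the same campaign lift with different methods and compare notes before reporting. The columns and methods below are illustrative assumptions.

```python
# Hypothetical comparison: analyst A uses a difference in means, analyst B a
# regression that adjusts for prior spend; agreement builds confidence.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("campaign_test.csv")   # assumed columns: treated (0/1), revenue, prior_spend

lift_a = (df.loc[df.treated == 1, "revenue"].mean()
          - df.loc[df.treated == 0, "revenue"].mean())
lift_b = smf.ols("revenue ~ treated + prior_spend", data=df).fit().params["treated"]

print(f"Analyst A (difference in means): {lift_a:.2f}")
print(f"Analyst B (regression-adjusted): {lift_b:.2f}")
# If the estimates agree, report either; if they diverge, the divergence
# itself is the finding worth digging into.
```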

Source: https://hbr.org/2015/03/what-to-do-when-pe...

Data + Insight = Action and Back Again

Adobe Summit brought with it the vision of a near-future nirvana where marketers deliver relevant content to customers, creating great experiences. The words that permeated the conference were content, experiences and data. The big stars of the show were content and customer experiences, which I believe are extremely important, as I have written before.

However, there is no right content delivered at the perfect moment to create wonderful customer experiences without data.  Data is the key to making this all work, and not just any data.  No, I'm not talking about "big data", I'm talking about actionable data.  

Most companies are already sitting on a treasure trove of data. Without purchasing third-party data, just by understanding every click their customers make, companies have data that can transform their business. The issue is interpreting the data and making it actionable. Actionable data isn't a product; it's a culture.

Actionable data is the combination of art and science. The path to actionable data isn't necessarily going out and hiring a bunch of talented data scientists, though it doesn't hurt to have these people on your team. The path to actionable data is marrying the data with business acumen. It's not enough to have data telling you something happened; there has to be an understanding of the business as to why it happened.

Once there is an understanding of what happened (science) and why it happened (art), you have actionable data.  Now you can create optimal tactics to deliver relevant content to create targeted experiences in the digital age.  The great thing about this process is it's circular.  Once a company creates great targeted experiences for their customers, customer behaviors will change and the entire process starts all over again.  There are always puzzles to solve and amazing content and experiences to create.  

How to Get More Value Out of Your Data Analysts

Organizations succeed with analytics only when good data and insightful models are put to regular and productive use by business people in their decisions and their work.

Actionable data is a buzzword I have used and heard for many, many years. In practice, however, it is much harder to produce actionable insight for business users. C-level executives are always trying to dissect old information, which can be very useful, but only when the answer leads to some action. If the answers to a question are interesting but don't provide any action, it creates a time-consuming chase to make something out of the data.

Organizations have to educate themselves about what their data is capable of. There is no use asking questions the data cannot turn into actionable insight. Organizations should spend time on questions that can be solved with current capabilities, in other words the low-hanging fruit. Once there is no more low-hanging fruit, enhance the analytics with new tools and modeling capabilities to find the next round.

If you want to put analytics to work and build a more analytical organization, you need two cadres of employees:

  • Analytics professionals to mine and prepare data, perform statistical operations, build models, and program the surrounding business applications.
  • Analytical business people who are ready, able, and eager to use better information and analyses in their work, as well as to work with the professionals on analytics projects.

Very true. An organization can hire all the analysts it can handle, but if the analysts have no business acumen and the business has no data acumen, there will always be a disconnect.

Organizations need the business and the analysts working together to find the best answers. Data scientists can find items that are very interesting to them, but the business side may point out that the finding is already well known. Meanwhile, the business may be working very hard to solve a problem the data scientist could solve, taking intuition out of play and using data in place of trial and error.

In my history I have always liked business and data analysts reporting to the same group. Many organizations don't like this structure because it can look like a group "grading its own paper". However, the tight integration of the teams produces results more efficiently. Analysts don't waste time chasing problems that don't exist, and business people can bounce ideas off analysts for quick insight before making decisions.

 

Source: http://blogs.hbr.org/2013/12/how-to-get-mo...

Will Data Science Become the New Bottleneck? - Forbes

many have posited that recalcitrant IT departments, hidebound by a history of rigid organization, have been a bottleneck to the adoption of new technologies and, by extension, the ability to distill business value from data.

We’ve also examined the potential and difficulties of analyzing big data, arguing that a new class of analyst, the data scientist, is on the rise in many organizations.

That may be part of the problem, rather than the solution.

I think this will be a major problem as big data moves into the mainstream. So many organizations already struggle with IT to get data that is readily available; wait until 20 groups are pinging two data scientists, who by nature work slowly and go down unnecessary rabbit holes to find the truth.

In reality, most businesses don't need all this data. They need to perfect using their current data to drive actionable results. Until they do that, there is no need to bring big data into the organization.

Source: http://www.forbes.com/sites/danwoods/2012/...