What to Do When People Draw Different Conclusions From the Same Data

Walter Frick writes for HBR:

That famous line from statistician William Edwards Deming has become a mantra for data-driven companies, because it points to the promise of finding objective answers. But in practice, as every analyst knows, interpreting data is a messy, subjective business. Ask two data scientists to look into the same question, and you’re liable to get two completely different answers, even if they’re both working with the same dataset.
So much for objectivity.
But several academics argue there is a better way. What if data analysis were crowdsourced, with multiple analysts working on the same problem and with the same data? Sure, the result might be a range of answers, rather than just one. But it would also mean more confidence that the results weren’t being influenced by any single analyst’s biases. Raphael Silberzahn of IESE Business School, Eric Luis Uhlmann of INSEAD, Dan Martin of the University of Virginia, and Brian Nosek of the University of Virginia and Center for Open Science are pursuing several research projects that explore this idea. And a paper released earlier this year gives an indication of how it might work.

I believe it is best practice to have multiple analysts look at a problem, or at the very least have each one devise their own methodology independently. In fact, I always like to take a crack at the analysis myself when the problem is particularly difficult, just so I have an idea of what the data looks like and how certain variables are influencing the results.

I think too many executives are unwilling to dig into the data and work through a problem. It is very important to have a deep understanding of the data issues so that, as an executive, you can make better decisions about how to guide the team. Many times the answer is not a deployable model but a data mining exercise that yields some testable hypotheses.

Though most companies don’t have 60 analysts to throw at every problem, the same general approach to analysis could be used in smaller teams. For instance, rather than working together from the beginning of a project, two analysts could each propose a method or multiple methods, then compare notes. Then each one could go off and do her own analysis, and compare her results with her partner’s. In some cases, this could lead to the decision to trust one method over the other; in others, it could lead to the decision to average the results together when reporting back to the rest of the company.
“What this may help [to do] is to identify blind spots from management,” said Raphael Silberzahn, one of the initiators of the research. “By engaging in crowdsourcing inside the company we may balance the influence of different groups.”
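The small-team workflow described above can be sketched in a few lines. This is a minimal illustration, not anything from the article; the function name, the "trust" flag, and the numbers are all hypothetical:

```python
# Hypothetical sketch of the two-analyst workflow: each analyst runs an
# independent analysis, then the team either trusts one method outright
# or averages the two results before reporting.

def combine_estimates(estimate_a, estimate_b, trust=None):
    """Return a single number to report from two analysts' estimates.

    trust="a" or trust="b" picks one analyst's result; with no trust
    decision, fall back to a simple average of the two.
    """
    if trust == "a":
        return estimate_a
    if trust == "b":
        return estimate_b
    return (estimate_a + estimate_b) / 2

# Two analysts independently estimate the same quantity (illustrative values).
report = combine_estimates(2.0, 4.0)          # no method preferred: average
report_a = combine_estimates(2.0, 4.0, "a")   # team decides to trust analyst A
```

In practice the "estimates" could just as well be model predictions or effect sizes, and a weighted average could replace the simple one; the point is only that the combination step is made explicit rather than hidden inside one analyst's head.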

I do believe in internal "crowdsourcing". The minute tough problems start to be outsourced, the company loses the deep familiarity its analysts and business owners bring to the data, familiarity that outside analysts could never match. I truly believe analytics is both art and science, but too often the art is underappreciated.

Source: https://hbr.org/2015/03/what-to-do-when-pe...