While an analyst could actively influence
the conclusions of a complex modeling project,
in many cases, an analyst could also simply believe
the results because it is in his or her best interest
to promote a particular narrative. In fact,
as suggested in the behavioral economics
literature, it is in our very nature to either
skew results to confirm a long-held belief
(confirmation bias) or disbelieve facts that
threaten our career interests (illusion of skill).
Specific challenges that could lead to less-than-optimal decision-making include:
Rote memorization. The simplest, but often the most
effective, strategy is rote memorization. The ability to memorize
information and distill it into a simplified story is often the best approach
for articulating a business story involving a complex system. It gives
the appearance of mastery and distinguishes those people who have
learned the art of storytelling with large data sets, where numerous
pieces of information can be woven into a broader story
or a succinct answer to a business question. The problem is that rote
memorization often does not reflect a deep understanding of the
underlying system or the potential variability in an observed result.
Cherry-picked data. An alternative, and nearly as simple, approach is “cherry-picking” the data to present an incomplete, biased
view of the information. This tried-and-true method has been
used for as long as data has been available, but the opportunities
are now even more plentiful because of the large increase in the
volume of data and the ease with which individuals can manipulate
it to highlight their preferred narrative.
Technically sophisticated, but biased predictions. While rote
memorization and cherry-picking are available to most people, the
strategies available to the more technically minded include much
more sophisticated modeling approaches. The list is long, but one
of the most widely used practices is to calibrate a regression model
with handpicked explanatory variables chosen to achieve the desired
statistical conclusion. The strategies can differ, but
they suffer from many of the same problems associated
with cherry-picking: a skilled technical analyst can shape
the analysis to highlight his or her preferred narrative.
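The variable-selection practice described above can be made concrete with a small simulation (a hypothetical sketch, not taken from the article; the function name and parameters are invented for illustration). If an analyst screens enough candidate explanatory variables, some will appear statistically significant even when the outcome is pure noise:

```python
import numpy as np

def fishing_expedition(n_obs=100, n_candidates=50, seed=0):
    """Test a pure-noise outcome against many candidate variables and
    count how many look 'statistically significant' at the 5% level.
    Every variable is independent noise, so any significance is spurious."""
    rng = np.random.default_rng(seed)
    y = rng.normal(size=n_obs)                  # outcome: pure noise
    X = rng.normal(size=(n_obs, n_candidates))  # candidates: pure noise
    # Pearson correlation of y with each candidate column
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    r = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum()))
    # t-statistic for H0: zero correlation, with n_obs - 2 degrees of freedom
    t = r * np.sqrt(n_obs - 2) / np.sqrt(1 - r ** 2)
    # two-sided 5% critical value for t with 98 df is about 1.984
    return int((np.abs(t) > 1.984).sum())

# Fraction of 200 independent searches that find at least one
# "significant" predictor, despite there being nothing to find
hits = [fishing_expedition(seed=s) for s in range(200)]
print(sum(h > 0 for h in hits) / 200)
```

With 50 independent noise candidates each tested at the 5% level, roughly 1 - 0.95**50, or about 92%, of such searches turn up at least one “significant” variable, which is why handpicking explanatory variables after the fact can manufacture almost any desired conclusion.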
Complexity promoters. With the increase in the availability of
data, many people have furthered their careers and advanced their
consulting opportunities by advocating for complex, analytically
driven solutions. In many cases, these solutions add precious little
predictive power, but instead allow individuals to advance their
career interests by becoming the only person with the knowledge
to adequately understand a complex model.
Observation overload without consideration for organizational management. With the increase in the volume of
data, analysts can increase the number of observations and suggest a wide range of potential solutions or improvements. While
these suggestions can be helpful, in many cases this approach
leads an organization to lose focus as it attempts to address every
opportunity rather than the most impactful initiatives.
Taken together, the competition to use
large data sets and sophisticated computing power has led, in many cases, to biased analysis and ultimately less-effective
decision-making. The structural challenge
is that most quantitative work in complex
systems can be conducted with a variety of
techniques such that individuals can differentiate themselves based on the conclusions of
their models or their analytical techniques, rather
than the quality of the decision-making process.
The Solution to Analytic Competition:
Analytic Fundamentals and Teams
The deluge of information and computing power, combined with
analytic competition, has blinded many leaders to the important
qualitative factors in decision-making and
distracted them from the true task at hand: developing
well-reasoned decisions that incorporate all the available data.
The attention put on computationally sophisticated models
and Big Data has shifted the focus from the most fundamental
aspects of analytic decision-making: structuring relevant
business questions, understanding the source and collection
process of the data, determining when a model is useful and
when additional complexity is worth the cost, understanding
the incentive structure of those providing data, and holistically
analyzing risk and uncertainty using qualitative factors beyond
additional data and modeling.
In addition to these fundamental aspects of analytical decisions,
the power of technical teams to check individual incentives is often
not explicitly considered as a strategy to avoid biased analysis.
When a work product comes from a group, it is much more
likely to represent an honest conclusion rather than one shaped
by the individual incentives of any one person.
The steps necessary to ensure effective decision-making in a
team environment are no different from the basics of good management: active listening, ensuring an environment of collaboration,
constant questioning, guarding against groupthink and excessive
courtesy, and allowing sufficient time to complete a project.
For many organizations where conclusions must be reached
on complex systems, in the absence of sound analytic fundamentals and strong, well-reasoned management, the promise of Big
Data has led to poor decision-making and an enormous waste
of resources as individuals use data to advance their careers at the
expense of improving the entire organization.
In many respects, organizations must begin to make the changes
that many people are attempting in their personal lives: learning
how to use information to improve their situation without falling
into the many pitfalls of too much information.
KURT J. WROBEL is chief financial officer and chief actuary at
Geisinger Health Plan.