on and analyzing “what” is happening. We are enthusiastically
building capabilities to predict “what is going to happen” with
the support of predictive analytics, but we often ignore the “why
is this happening” analysis.
As an insurer informed me recently, “When we figure out
what is happening, we tend to quickly move on to try to predict
what is going to happen if we change some variables.” But
without a comprehensive analysis of why the outcomes
are what they are, predictive analysis easily becomes less of
a science and more of an art that is based on assumptions,
gambles, and crystal-ball gazing. And art can bring us both good
and bad: Some of it is successful, pushes our thinking forward,
and is appreciated by many. But some art is ahead of its time,
off-base, and does not seem to bring value to anybody except,
possibly, its creator.
Don’t Skip a Step
Now why would insurers skip the important “why is this happening” question? That’s easy. Answering this question requires
timely, accessible, clean, and complete data, but that’s not an
easy requirement to meet, especially for insurers with many
lines of business, several geographies, and old legacy systems.
Insurers increasingly seem to appreciate the need for a solid
data infrastructure, though. The previously mentioned SMA
study on 2016 initiatives and priorities found that 82 percent
of the 116 insurer respondents plan to invest in enterprise data
and analytics this year. (Only the customer experience area
scored higher.) And these insurers got it right. Data infrastructure tools and solutions, a/k/a data plumbing, have come a long
way in recent years. It seems to be the right moment to tackle
the ugly data problem.
Classical ETL providers have added analytics components to
their offerings. Data cleansing tools have become much easier
to use and incorporate into existing infrastructure. Visualization tools help identify where data cleansing is most critical,
which shortens the time needed for these projects and reduces
resources and expense. Core systems providers are adding analytical
capabilities to their offerings, and some BI providers offer solid
data warehousing and data mart infrastructure.
There are plenty of relatively new technologies on the market
that can help insurers increase their analytical capabilities
around the “why is this happening” question by creating an
environment that supports it, augmented with easy-to-use
analytical and visualization tools.
Increased Understanding of Risk
In addition, new as well as more conventional external data sets
can greatly support our understanding of “why” and provide a
bridge to predictions. Insurers have historically increased their
understanding of customers and risks by augmenting internal
data with location information, credit scores, insurance scores,
property information, driver and car information, financials,
demographics, and socio-graphics.
Other new data sources have recently become available, and
we are enthusiastically researching and investing in the analysis
of social-media data, drones, sensors, and other Internet of
Things data sources. We see plenty of examples of how these
data sources are used to learn more about fire risks at home;
personal health and fitness; materials tracking and inventory
management; and waste management and pipeline maintenance
for utilities or oil and gas companies. Augmenting internal
information about customers and risks with these new external
data sources might help us to better understand why things
are happening and set the stage for predicting what is going to
happen under different scenarios.
Ready, Willing, and Able
Again, technology is ready and available to help with this. Big
data solutions can incorporate many different third-party data
sources in analytical platforms. Huge progress in diverse areas
such as graph databases, visual computing and graphics cards,
and machine learning and data lakes enables us to analyze text,
sound, and pictures as never before.
We are starting to see the more innovative insurers deploying these technologies. For instance, a large insurer is piloting
machine analyses of hundreds of thousands of pictures in car
claims files. The purpose of this machine learning exercise is to
flag cases suspected of inflated claims amounts, such as those
that include prior damage to the car, for further investigation.
Workers’ compensation carriers analyze social media postings to
detect potential fraud. And, on a more positive note, carriers
reward customers with free smart devices for their homes, cars,
and bodies in order to collect data and learn from those data
points to create better insurance products and services for them.
The ultimate goal would be to use machine learning to
quickly identify the small number of cases, files, and events
that require a human eye and brain. When we reach that point
in analytics, we can have machines handle the majority of
repetitive tasks while we focus solely on what we do best:
applying judgment and interpretation to facts.
In an old episode of the TV series Star Trek: The Next Gen-
eration, the character Data complained to Commander Deanna
Troi that his programming might be inadequate to the task. She
responded, “We are all more than the sum of our parts, Data.
You’ll have to be more than the sum of your programming.”
If we move along the BI spectrum in an orderly fashion,
supported by the right technology, that future is not far off. It
will be exciting to see the field of analytics become much more
than the sum of its programming. ITA
Monique Hesseling, a partner at Strategy Meets Action, is
focused on developing effective roadmaps and helping
companies expand their business opportunities. She can be
reached via email at email@example.com.