Risky Business: How Data Analytics Can Help
Kathleen Long is using a combination of behavioral analytics, Bayesian engineering and big data to help companies better determine and mitigate business risk.
A socio-cybernetician and behavioral scientist, Kathleen Long battles operational risk. As CEO of Montage Analytics, a Mountain View, Calif., consultancy offering risk assessment software services and analytic reports, Long has combined her training with the experience of Montage Analytics’ CTO, Doug Campbell, in Bayesian network design. The company helps organizations better understand and mitigate everything from risky business practices and employee fraud to the big, unwieldy, nearly undetectable risks referred to as “black swans.”
This isn’t an easy undertaking. Part of the problem, according to Long, is that not everyone knows how to define operational risk (if you can’t define it, you can’t guard against it). At the same time, the risk landscape is changing so fast that what happened yesterday is no longer a marker for what might happen tomorrow.
“We are living in unprecedented times. You can’t say the past is a reliable predictor of the future,” says Long. “Too many events happen that are black swans that come out of the blue — Bernie Madoff, 9/11. Things have changed dramatically. And the fact that we are living in an increasingly globalized and networked world means that even small events happening somewhere on the other side of the world can quickly cascade around the globe and affect us in places we didn’t expect.”
In a conversation with Renee Boucher Ferguson, a contributing editor at MIT Sloan Management Review, Long discussed the changing risk landscape and how big data and behavioral science can help.
What are the different types of risk organizations need to think about, from operational risk to the so-called black swans?
A lot of people don’t understand what operational risk is, but it has to do with the people, processes and systems in place to produce the company’s product or service. That said, there are many different kinds of operational risk. For example: internal fraud; external fraud; clients, products and business practices; damage to physical assets; and business disruption and system failures. Then there are execution, delivery and process management risks. Did I mention employment practices and workplace safety? Those are all kinds of operational risk.
What is a black swan risk?
“Black swan” is actually a metaphor, based on a theory developed by Nassim Nicholas Taleb, a professor of risk engineering at Polytechnic Institute of New York University. A black swan is an extreme outlier: a high-impact, hard-to-predict, rare event that people generally don’t expect. The term also covers the psychological biases that leave people blind to this kind of uncertainty and unaware of the huge role that rare events can play in history.
What are some examples of current black swan events?
The whole Bernie Madoff thing was a black swan. It wasn’t a black swan to him, but it definitely caught the world by surprise. And the mortgage derivatives were really a sequence of operational failures that brought much of the economy to a standstill. That whole derivatives scandal exposed operational failures by mortgage brokers and mortgage bundlers, credit agencies, asset managers, regulatory agencies, investors — in other words, it created this trickle-down effect, or network-spreading effect, that began to cascade across a whole financial sector, ultimately affecting individual homeowners who certainly didn’t expect it.
How has the nature of risk changed in recent years? It seems like it really has.
First of all, there’s a greater focus on operational risk than there has been in the past, because of various major events such as 9/11, Bernie Madoff, AIG and Lehman Brothers and other similarly catastrophic events that have happened.
Two things — globalization and its cascading network effect, from events half a world away to somebody’s front door — have changed the nature of risk, shifting it from “it happens out there to somebody else” to “oh, my God, it could happen to me.” So awareness that this is a riskier world has increased.
And because of that, an awareness of how to prevent risk, or mitigate risk, has come to the fore, and operational risk has become a more important topic. I would say that one of the ways companies are getting it right is to realize that operational risk is a really important problem, and in fact it represents about 30% of business loss. That’s a huge number. And so they’re getting it right by working toward improving their operational risk management strategies and tactics.
What capabilities do companies need to have, or to cultivate, in order to use behavioral analytics to mitigate risk?
First of all, companies need to understand more fully the impact of operational risk, and how you best identify and measure it. The field of operational risk management is really still in its infancy.
Part of what they need to do is cultivate a risk-conscious culture. They need to have change management ability if something requires a change, and probably most importantly, an awareness that most risk arises from human frailty, like, people make mistakes, they’re inexperienced, they’re incompetent, they’re overloaded, they’re greedy, they misjudge things. These are all human things that give rise to operational risk. Beyond that, we have developed a simple online “point and click” assessment questionnaire that delivers a score indicating the extent of risk, and a detailed triage report to help companies take targeted action to mitigate their risk.
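To make the idea of a scored questionnaire concrete, here is a minimal sketch of how answers about risk controls might roll up into a score and a triage band. The questions, weights, and thresholds below are invented for illustration; this is not Montage Analytics’ actual model.

```python
# Hypothetical sketch: rolling questionnaire answers up into a risk score.
# Questions, weights, and triage thresholds are invented for illustration only.

QUESTIONS = {
    "employees_vetted": 0.25,       # weight each missing control adds to risk
    "supervision_in_place": 0.20,
    "whistleblower_channel": 0.30,
    "workload_manageable": 0.25,
}

def risk_score(answers):
    """answers maps question id -> True if the control is present.
    Each missing control contributes its full weight; score is 0..1."""
    return sum(w for q, w in QUESTIONS.items() if not answers.get(q, False))

def triage(score):
    """Bucket the score into a simple triage band."""
    if score >= 0.5:
        return "high"
    if score >= 0.25:
        return "medium"
    return "low"

answers = {
    "employees_vetted": True,
    "supervision_in_place": True,
    "whistleblower_channel": False,
    "workload_manageable": False,
}
score = risk_score(answers)  # sums the weights of the missing controls (≈ 0.55)
print(round(score, 2), triage(score))
```

A real assessment would weight and combine answers far less naively, but the shape — answers in, score and targeted triage report out — is the same.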
Who is focused on risk within the organization, and are they the right folks?
Larger companies often have a risk officer, like a CRO, a chief risk officer. However, smaller companies don’t have that luxury, so often it’s done sort of ad hoc. Often it’s IT, because IT deals with the computer systems and also management reports and such; and also because most risk analytics is data-driven, it would tend to come from somebody in IT, either directly or indirectly.
So, if it’s not someone from IT, should it be a CEO who is looking at risk?
Oh, absolutely, the CEO ought to be looking at risk; the board of directors ought to be looking at risk. In fact, much of the risk that has occurred can be laid at the feet of a board of directors that somehow missed the boat.
Where does data analytics play a role in determining or detecting black swan events, or are these just outside of the realm of detection?
Black swans by their very definition are outside the realm of detection — by quantitative, traditional analytics. However, Montage Analytics has developed a tool that can predict areas of vulnerability in organizations to black swans and other kinds of risk.
Let’s talk about the tool you’ve developed, and what other types of risks you are able to detect.
The tool we developed is based on the need we saw to provide a missing piece to risk assessment and risk management. Generally, big data relies on historical statistics, which, because they’re historical, are like trying to drive your car looking through the rear-view mirror and not at the road ahead. You don’t have enough information to keep from hitting something.
We found that what was missing was the effect of human behavior on operational risk, and we developed a process we call “data farming,” which is sort of the flipside of data mining. Using this method, we have found a way to identify and assess various levels of operational risks and provide a roadmap for implementing solutions to mitigate the risk.
How is it that you are able to pull in this human perspective?
I’m a behavioral scientist, and our chief technical officer is a Bayesian engineer. Years and years ago I taught a series of courses on risk assessment and risk management in cases of child abuse. So in my background, risk and people went together very easily; I saw that. And when I began doing corporate work in various organizations, I realized that what was missing was the human element, that risk was a lot about what happened in the past and not too much about what would happen in the future; whereas the work that I did while teaching looked also at prognostic factors, which is, “what do you think is going to happen in the future?”
That informed my understanding of how data-based analytics have limitations in terms of their utility in understanding risk that involves humans. And the truly ironic thing is that operational risk is all about people. It’s caused by people. It’s people who make mistakes, who get greedy, who are inexperienced. And it’s people who design the systems that sometimes fail, and the processes that sometimes don’t work. So at the core, it’s about people. What we found missing in other technologies was the human element — people’s behavior and motivations, and their effect on risk.
So without divulging any trade secrets, how is it that you are able to quantify human behavior and human emotion with analytics?
What we look for are behavioral antecedents to various risk scenarios that are identified as a result of an ethnographic study. So we use ethnography — an insider’s view of a business type, like banking or accounting or clinic management or employers. We take a homogeneous group and we get an insider’s perspective on what it’s like to be in that environment. And we look at the risks that can occur, based on these behavioral antecedents.
So for a simple example, one behavioral antecedent to fraud might be the perpetrator’s — often an employee’s — need for money. For example, studies have found that in difficult economic times, internal fraud goes up. Also, how thoroughly did an employer vet their potential employees? How are they supervised? What incentives are there — unintentionally of course — to commit such an act, and are there any disincentives that would help prevent such an event? So, we look at the combination and the interaction among lots of variables, and we do this via an online questionnaire that we have developed, that gathers holistic information about the caliber of business processes that affect risk. In other words, we’re looking between the cracks in a system, to see where there’s vulnerability.
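The interaction among antecedents like these is exactly what a Bayesian network expresses. As a toy illustration of the idea — the variables and every probability below are invented, and this is not the company’s actual model — here is the conditional-probability arithmetic for a two-antecedent fraud node:

```python
# Toy Bayesian-style calculation: P(fraud) given two behavioral antecedents.
# All probabilities are invented for illustration; not a real risk model.

p_need_money = 0.3    # P(employee is under financial pressure)
p_weak_vetting = 0.2  # P(employer vetted employees poorly)

# Conditional probability table: P(fraud | need_money, weak_vetting).
# Note the interaction: both antecedents together raise risk sharply.
p_fraud_given = {
    (True, True): 0.20,
    (True, False): 0.05,
    (False, True): 0.02,
    (False, False): 0.005,
}

def marginal_fraud_prob():
    """Marginalize over both antecedents:
    sum over x, y of P(fraud | x, y) * P(x) * P(y)."""
    total = 0.0
    for need in (True, False):
        for weak in (True, False):
            p_joint = ((p_need_money if need else 1 - p_need_money)
                       * (p_weak_vetting if weak else 1 - p_weak_vetting))
            total += p_fraud_given[(need, weak)] * p_joint
    return total

print(round(marginal_fraud_prob(), 4))  # ≈ 0.0296
```

A production model would have many more variables and learned (or expert-elicited) probabilities, but the mechanism — combining behavioral antecedents through conditional probabilities to surface vulnerability — is the same.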
Let’s take a recent major catastrophe — the BP oil spill for example — would you have been able to predict that incident using behavioral analytics?
I believe so, and here’s why: We would be looking at that scenario from the bottom up. We don’t just look at the top down; we look at the bottom up, the people who actually are on the site, managing their part of the operation, which would be the people who were at the site of the oil spill. There have been reports that on-site employees were worried about such an event occurring, but they had no channel by which to communicate it safely.
Also, we would look at the working conditions, the level of experience, the sets of rewards and incentives that could affect something like this. When product reliability and risk is involved, Montage also works with strategic partners who apply both the physics of failure and the science of reliability to discover failure modes in advance of a disaster. Montage Analytics can combine this data with other data and information to formulate a holistic operational risk assessment. For example, in the case of BP Oil, they didn’t monitor the systems as closely as they should have, because it was less profitable for them to spend the energy or the resources to do that. So, they cut corners.
And really, oftentimes what these risks come down to is a disincentive to mitigate risk because of the cost. They’re making the decision that a catastrophe is probably unlikely: I don’t think it’s going to happen; we could use the money elsewhere; our numbers were down last quarter; I’ve got to impress my boss — whatever, whatever. And so they make a decision to forgo the risk assessment and perhaps the system maintenance in favor of something that looks better on the bottom line. And that’s when they pay the price for it.
To me, that type of information — determining risk based on, say, cutting corners — seems to be one of those dirty little corporate secrets that would be kept quiet. So how much does honesty play a role in how well you can develop risk assessments?
It plays a huge role. And the way we deploy our assessments is that there is a shield between our risk output and our risk engine and the company itself, so that anyone who answers questions would be anonymous.
The other thing is that one of our questions is, do you have a widely communicated procedure for alerting about risks or whistle-blowing? And if they don’t, then that’s also information. It might be a question — I’m not quoting an actual question here, just imagining how one might work — like: if you saw a situation that could have a future adverse effect, would you feel comfortable bringing it up to senior management? And oftentimes, as you said, it’s the dirty little secret in companies. And that’s why we get in and do ethnographic research on what it’s really like to work in various industries: the external pressures, the internal pressures, competing priorities. All of those things together go into how we analyze risk.
What do you think the future is for behavioral analytics?
I think the next generation of risk management technology rests on a foundation of both behavioral science and knowledge engineering that provides the business risk puzzle’s missing piece. And that missing piece is the multidimensional behavioral analysis that incorporates comprehensive internal or behavioral elements, and external and statistical analytics, to come together to produce intelligent risk analytics.
Our intelligent risk analytics, you could say as a metaphor, think like a team of seasoned, highly skilled experts, from the way they ask questions to how the responses are analyzed. It’s absolutely an emerging trend.