Improving Customer Service and Security With Data Analytics

The advantages of analytics for customer service are well established. The question now becomes: How can analytics be used to improve security?



Organizations are collecting more and more data. While rich data enables personalized service, detailed data about real people (rightly) raises concerns. Just as this data is increasingly valuable to organizations, it is valuable to criminals as well, leading to an ever-escalating series of data breaches. Data analytics sharpens the trade-off between security and service: because much of marketing analytics tries to learn as much as possible about potential customers, analytical processing of data can, at a minimum, raise privacy concerns for individuals. These analytics processes are also becoming increasingly powerful at de-anonymizing people from their trace data.

However, these same de-anonymization techniques illustrate how analytics can offer at least a partial solution to the problems it has exacerbated.
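To make the idea concrete, the sketch below shows how even a handful of behavioral trace features can be matched against known customer profiles to recognize an individual. It is illustrative only; the feature names, profiles, and similarity threshold are assumptions, not any real organization's method.

```python
# Illustrative sketch: re-identifying a user from trace data by comparing
# simple behavioral features against known customer profiles.
# Features, profiles, and the threshold are invented for this example.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def best_match(trace_features, customer_profiles, threshold=0.95):
    """Return the customer whose stored profile best matches the observed
    trace, if the match clears the (assumed) confidence threshold."""
    best_id, best_score = None, 0.0
    for customer_id, profile in customer_profiles.items():
        score = cosine_similarity(trace_features, profile)
        if score > best_score:
            best_id, best_score = customer_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical profiles: [avg. session minutes, typical hour of activity, pages per visit]
profiles = {
    "cust_001": [12.0, 9.0, 5.0],
    "cust_002": [3.0, 22.0, 2.0],
}
observed = [11.5, 9.5, 5.2]  # anonymous trace from a new session
print(best_match(observed, profiles))  # -> ('cust_001', ~0.999)
```

The same matching logic that raises privacy concerns when used to profile anonymous visitors could, in principle, help recognize a returning customer without a single challenge question.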

Consider, for example, placing a call to your bank for help after losing your debit card. The core problem is that, before providing customer service, the bank must authenticate that you are who you say you are. This authentication process must begin with the assumption that the caller is a malefactor impersonating the real customer — guilty until proven innocent. The bank will help the caller only after being convinced of the caller’s identity.

While this process is annoying when we're customers seeking help, we actually want and need this level of security. It is in our best interest that the bank verify we are who we say we are before continuing to assist us. After all, we don't want the bank to hand out our money (or our new debit card) willy-nilly to just anyone.

Historically, this telephone authentication process has involved answering a set of questions. What is your account number? What is your personal identification number (PIN)? What is your Social Security number? Can you verify the last three transactions in the account? What is your prior address? The process continues, potentially escalating to security challenge questions based on shared secrets, until the bank is convinced of our identity.
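A rough sketch of this kind of knowledge-based authentication appears below. The specific questions, weights, and confidence threshold are invented for illustration and are not drawn from any real bank's procedure.

```python
# Illustrative sketch of knowledge-based authentication: the caller answers
# questions until an (assumed) confidence threshold is reached.
from dataclasses import dataclass

@dataclass
class Challenge:
    prompt: str      # question read to the caller
    expected: str    # value on file (in practice, stored hashed, not in plaintext)
    weight: float    # how much a correct answer raises confidence

CHALLENGES = [
    Challenge("Account number?", "12345678", 0.3),
    Challenge("PIN?", "4321", 0.3),
    Challenge("Most recent transaction amount?", "42.17", 0.25),
    Challenge("Prior street address?", "12 Elm St", 0.25),
]

def authenticate(answers, threshold=0.7):
    """Ask challenges in order; stop as soon as accumulated confidence clears
    the threshold. Returns (authenticated, confidence)."""
    confidence = 0.0
    for challenge, answer in zip(CHALLENGES, answers):
        if answer.strip().lower() == challenge.expected.lower():
            confidence += challenge.weight
        if confidence >= threshold:
            return True, confidence
    return False, confidence

# Example: the caller answers the first three questions correctly.
print(authenticate(["12345678", "4321", "42.17"]))  # (True, 0.85)
```

In practice, answers on file would be hashed and the questioning would escalate rather than stop at a fixed list, but the gate-until-convinced structure is the same one described above.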

This process is adversarial by design. Even the name "security challenge question" evokes a combative stance. The caller is not trusted until he or she has passed through a gauntlet of questions. For banks, it is unfortunate that so many initial interactions with customers are adversarial in nature.




Comments (2)
Chandra Pandey
Interesting article. In assessing how these cross-cutting concerns interact in the user experience, it is important to understand that risk and security are not the same. Naturally, when the stakes are high, the business instinct is to lock everything down. The solution lies somewhere in federated security services backed by integrated risk modeling that classifies transactions at run time and applies add-on, multifactor security only where needed. Unfortunately, industry and standards bodies have not invested enough time, effort, and funding in risk modeling as a science to develop an ecosystem for automated yet intelligent transaction processing. The security business is preoccupied with new defenses and breach reports, and not enough is spent engineering the value chain as an industry ecosystem. Because transactions are distributed by nature, "divided we fall" is the likely outcome, witnessed in the next breach headline. In that respect, granular risk modeling across all interacting components is highly undervalued as both an investment and an approach.

Technologies such as AI and behavioral techniques provide necessary aids but are not ends in themselves. In a digital world where interaction points keep multiplying, the traditional security perimeter of organizations must keep evolving to remain relevant. GRC in the modern era is not just about reinforcing best practices; it also requires rethinking security tools, techniques, and ecosystems to achieve a balanced trade-off between security and user experience.

Disclaimer: The views and opinions expressed are personal in nature and do not reflect the official policy or position of any organization.
Munyaradzi Mushato
Excellent article; it brings to the fore the core issues relating to ethics and big data. Background AI to complement the human interface is indeed a solution where security checks interfere, or are seen to interfere, with customer service.

On another note, and from another perspective on the matter, it MAY be a worrying and serious security issue if an institution like a bank does not verify my identity before rendering a service. In other instances, the customer may want assurance that a security check has indeed been carried out before the service is provided. This might call for a visible AI process.

Given these not-so-complementary perspectives on AI, big data, and security, I would suggest further discourse on how AI and big data can be developed to address the possibility of opposing security requirements and preferences within the client population.