The need for digital transformation in companies is obvious and urgent. But many businesses, especially those burdened by legacy systems, still struggle to transform their operations to cater to the increasingly empowered digital customer. By the time companies overhaul their IT and operational infrastructures, technological developments have already moved ahead.
Dinesh Venugopal, president of the digital unit of Mphasis, an IT services company headquartered in India, has a solution that doesn’t require a complete revamp of legacy infrastructure. He calls it ‘front-to-back’ transformation, a game-changing approach that lets legacy companies quickly provide hyper-personalized products and services to their customers. Venugopal spoke to Knowledge at Wharton to explain why digital transformation doesn’t have to cost a fortune or take years to implement.
An edited transcript of the conversation follows.
Knowledge at Wharton: How are companies trying to go digital today? What’s different about the way that they are approaching this issue now compared to how they might have done it in the past?
Dinesh Venugopal: Digital disruption is hitting all industries. Maybe less so in the financial services industry because of regulatory reasons, but it is affecting all industries and … different industries are coping differently. If you ask the CXOs what is keeping them awake at night, they will tell you that it’s not so much Amazon, Google or somebody like that coming in and taking over their industry; they’re more concerned about the fact that customers are demanding from them the same kind of services that they’re getting from Amazon or Google. That means companies have to start delivering the same level of personalized, or hyper-personalized, services to these customers.
Now, if you ask a bank or a financial services institution why they’re not able to do that [now], they will tell you that they are sitting on top of a lot of legacy systems. For example, take an insurance company that may have four or five different policy administration systems built 10 or 15 years ago. What are these systems good at? They’re very good at scale. They’re very good at providing customers with the right data. And they’re also good at what is known as mass personalization. That is, they can target segments of the market.
Digital disruption is hitting all industries.
What they are not is flexible. They are not systems that can be easily changed, and they are not systems that can be hyper-personalized, simply because of the way they have been built. So what options do enterprises have in front of them? There are two. One is to take those old core systems and modernize them completely. Some companies are taking that approach. But the problem is that it’s not easy. It takes two, three, four years to completely modernize all of your systems. And by the time these modernization projects are done, the industry has moved on. Newer products have come along.
So what do we do? There is another approach. It is all about starting with your end stakeholder in mind: looking at the specific use cases that make sense for that customer, asking how we can add value for them, and working from there. You do that by building an intelligent middle layer, which talks to your core systems, pulls out the data and services, and provides them back to the customer through your engagement layer.
This is a different approach simply because it gets customers to their transformation sooner. Not the entire transformation, but they deliver big chunks of value to their customers sooner than ever before. That approach is what we call front-to-back transformation. It’s a key way in which you can start providing hyper-personalized services to customers.
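To make the idea concrete, here is a minimal sketch, in Python, of a front-to-back setup under stated assumptions: the two legacy cores, their interfaces and the sample data are all invented for illustration and are not Mphasis’s actual architecture.

```python
# Hypothetical sketch of a "front-to-back" intelligent middle layer.
# PolicyAdminSystem and CRMSystem stand in for rigid legacy cores;
# their interfaces here are illustrative, not any vendor's real API.

class PolicyAdminSystem:
    """Legacy core: good at scale, hard to change."""
    def fetch_policy(self, customer_id: str) -> dict:
        return {"customer_id": customer_id, "policy": "P-100", "premium": 420.0}

class CRMSystem:
    """Another legacy core holding contact and preference data."""
    def fetch_profile(self, customer_id: str) -> dict:
        return {"customer_id": customer_id, "preferred_channel": "mobile"}

class IntelligentLayer:
    """Middle layer: composes core-system data into use-case views
    without modifying the cores themselves."""
    def __init__(self, policy_core: PolicyAdminSystem, crm_core: CRMSystem):
        self.policy_core = policy_core
        self.crm_core = crm_core

    def personalized_view(self, customer_id: str) -> dict:
        # Pull from each core, then enrich for one specific use case.
        policy = self.policy_core.fetch_policy(customer_id)
        profile = self.crm_core.fetch_profile(customer_id)
        return {
            "customer_id": customer_id,
            "policy": policy["policy"],
            "channel": profile["preferred_channel"],
            "offer": "loyalty-discount" if policy["premium"] > 400 else None,
        }

# The engagement layer (app, web, chatbot) talks only to the middle layer:
view = IntelligentLayer(PolicyAdminSystem(), CRMSystem()).personalized_view("C-42")
print(view)
```

The point of the pattern is that the engagement layer never calls the cores directly, so each new use case is added in the middle layer rather than by changing the legacy systems.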
Knowledge at Wharton: Personalization per se has been around from the earliest days of the internet. But from what you are saying, it sounds like there is a different degree of expectation on the part of customers for what hyper-personalization is all about. How can banks and other financial institutions deliver on that kind of granularity in terms of customer expectations?
Venugopal: I’ll give you two examples of how a typical, traditional organization would have looked at a service it provides, and how a fully digital company would look at the same service in order to provide a hyper-personalized one. Here’s a simple example of a credit card transaction. You may be in a mall and swipe your credit card. A traditional service would do this job really well: record your transaction, look inside and see if you have the right balance, do all the checks, and make sure the payment goes through. It does this in a very short amount of time, and it’s optimized for that. It operates at scale, and it is mass personalized because it is personalized for a merchant or a particular type of transaction.
First and foremost, start with the end customer in mind.
Now, if I were a digital company, I would look at not just the transaction itself but the context in which the transaction has happened. The company knows that you swiped the card in a mall … and it might also know that the same credit card company offers a discount or a coupon at a different store close by, so it is immediately able to offer that person a 20% discount. … The chances of that person walking into the next store and making a purchase are very, very high. That is called contextualized service. It’s an example of how hyper-personalization can be used in the context of a simple credit card transaction.
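The contextual decision Venugopal describes can be reduced to a few lines. Here is a minimal sketch, assuming a hypothetical offers catalog keyed by location; in a real deployment the offers and names would come from a live service rather than a constant.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Transaction:
    card_id: str
    merchant: str
    location: str  # e.g. a mall identifier

# Hypothetical offers keyed by location; invented for the sketch.
NEARBY_OFFERS = {
    "mall-17": {"store": "Outfitters", "discount_pct": 20},
}

def contextual_offer(txn: Transaction) -> Optional[str]:
    """Act on the transaction plus its context at the moment of the
    swipe ("data in motion"), rather than after an overnight batch."""
    offer = NEARBY_OFFERS.get(txn.location)
    if offer is None:
        return None
    return f"{offer['discount_pct']}% off at {offer['store']} near you"

print(contextual_offer(Transaction("card-42", "FoodCourt", "mall-17")))
# -> 20% off at Outfitters near you
```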
I’ll give you another example, of a travel insurance company. When you buy travel insurance, your insurance company collects all kinds of data about you. It knows your flight information and what cities you [will be traveling] in, and then it issues you a policy. After that, in most cases, the insurance company goes silent until it’s time for you to submit a claim.
But imagine that you are in Disneyland and there is a measles outbreak. The insurance company probably has information about whether you have had vaccinations; especially in the case of international travel, it would have that information. It can issue an immediate alert saying, ‘Look, there is something going on here that you need to watch out for,’ to prevent you from falling sick and maybe even prevent a claim from happening. That’s loss prevention. It’s not easy to do, because you now have to marry your internal information, which is sitting in your systems, with external information, all the situational and contextual data, and provide that insight back to the customer.
Knowledge at Wharton: It seems that the ability to marry different kinds of data is critical in order to make hyper-personalization possible. What are some of the challenges that companies face in trying to combine data in a different way? And specifically, what is the role of big data analytics in helping to make those kinds of connections?
Venugopal: There are three specific issues here. One is that your data in a traditional enterprise, whether it is a financial services organization or not, is scattered across multiple systems or pockets. The data is not in one system; it’s in many different systems. Second is making the contextual information real time. It’s important to provide the information of [a discounted offer nearby] to the customer who just swiped a credit card. In most cases, the information goes back into a data store and gets reconciled overnight. By the time [it gets to the consumer,] the information arrives too late. If it’s not delivered in real time, it’s of little use.
[Another] problem is what I call the data-dialect problem. That is, different systems speak in different data forms. A customer in one system is very different from a customer in another system: even though you may be a bank customer, your mortgage information could sit in a separate data store, so a customer in that store looks different from a customer in a retail banking store. That’s where you have the issue of the data dialect: how are these different data stores going to communicate?
The best way to solve these problems is one that we call the ‘net new data’ solution. You don’t need all the information that you’ve stored in your systems for many, many years to provide this contextual information. You just need the transaction information at the time the transaction is happening, plus the contextual information that is right there with you when you’re actually swiping the card. You can use just that data and act on it right there, without having to go through multiple systems to obtain it. We call this the usage of net new data.
If I were a digital company, I would look at not just the transaction itself but the context in which the transaction has happened.
We believe it’s important to build what we call ‘knowledge models,’ instead of having data stored in multiple systems and pulled into one big data lake. If you have good knowledge models, you can actually solve the data-dialect problem. These are two simple but important examples of how you can take advantage of what we call data in motion and solve certain contextual use cases without having to completely overhaul your entire data program.
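One way to picture a knowledge model is as a single canonical record shape that every system’s dialect is translated into. The sketch below is an illustration under assumptions; the field names of the two source systems are invented.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Customer:
    """The canonical 'knowledge model' every record maps into."""
    customer_id: str
    full_name: str
    products: List[str]

def from_mortgage_store(rec: dict) -> Customer:
    # The mortgage store speaks in "borrower" terms.
    return Customer(rec["borrower_no"], rec["borrower_name"], ["mortgage"])

def from_retail_store(rec: dict) -> Customer:
    # The retail banking store speaks in "account holder" terms.
    return Customer(rec["cust_id"], rec["holder_name"], rec["accounts"])

# Two dialects, one customer:
a = from_mortgage_store({"borrower_no": "C-42", "borrower_name": "A. Rao"})
b = from_retail_store({"cust_id": "C-42", "holder_name": "A. Rao",
                       "accounts": ["checking", "credit-card"]})
assert a.customer_id == b.customer_id  # now directly comparable
```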
Traditionally, what a company would have done is say, ‘I need to get some good use case out of my data; I want to provide some hyper-personalized service. How do I do it? Let me do this massive data project.’ … What we [recommend] is to start with customer use cases. In this case, you are trying to provide the next best offer at the time that the merchant swipes [the consumer’s] card. Take that as the use case and say, ‘If I were to solve it, what kind of data would I need?’ That approach is what we call the front-to-back approach to hyper-personalization.
Knowledge at Wharton: In the past few years, there has been a tremendous proliferation of cloud and cognitive computing, and also AI and machine learning have been developing very rapidly. I wonder if the emergence of some of these technologies makes it easier for different types of data to be married together, and makes the hyper-personalization process easier. Do you think that’s the case?
Venugopal: Absolutely. We look at it as a spectrum. At one end, you have … your Excel sheet-type macros. Then you add things like robotic process automation, which actually helps us speed up the process of either collecting data or providing a service. Then you have semi-autonomous computing, and then you have … full-fledged artificial intelligence. Each of these technologies and solutions absolutely could be used, depending on the situation, to solve very specific hyper-personalization issues. We have several examples today of how we have used everything from simple code-level automation to artificial intelligence to solve some of these problems.
Knowledge at Wharton: Can you give me some examples?
Venugopal: I’ll give you the example of how we solved the KYC problem in a bank (KYC is Know Your Customer). In a B2B bank, enterprise to enterprise, when you do KYC you might end up [discovering that a] company is owned by another company, which is owned by another company, which is owned by yet another company. There are a lot of nested structures in company ownership. You have to get to the root of the company and find out who owns it before you allow a transaction.
If you look at the anti-money laundering rules and regulations around that, you need to be sure that the two entities entering the transaction are the right ones, that they are authorized to enter the transaction … and that they are not on any politically exposed persons list, for example.
You have to think big but implement small.
We found that there are certain kinds of patterns, and the data is stored in various data sources. We designed a machine learning algorithm that would find these nested structures and get to the root very quickly, without any manual intervention. That’s an example of a simple use of machine learning to solve a very complex but important KYC case.
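The machine learning piece is not reproduced here, but the nested-ownership problem itself is easy to sketch. The ownership data and the plain chain-walk below are invented stand-ins for what Venugopal describes finding across multiple data sources.

```python
# Toy ownership graph: each company maps to its owning company
# (None marks the root). The names are fictitious.
OWNERSHIP = {
    "AcmeRetail": "AcmeHoldings",
    "AcmeHoldings": "GlobalParent",
    "GlobalParent": None,
}

def ultimate_owner(company: str) -> str:
    """Walk owner links upward until reaching the root entity."""
    seen = set()
    while OWNERSHIP.get(company):
        if company in seen:  # guard against circular ownership records
            raise ValueError(f"ownership loop at {company}")
        seen.add(company)
        company = OWNERSHIP[company]
    return company

assert ultimate_owner("AcmeRetail") == "GlobalParent"
# The root owner can then be screened against sanctions and
# politically exposed persons (PEP) lists before a trade is allowed.
```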
Knowledge at Wharton: Any other examples, say from the insurance industry?
Venugopal: I’ll give you another example, from an insurance company that we worked with recently. It’s … a company that serves the SMB [small and medium-sized business] property and casualty market. Brokers work with the small businesses. If a small business wants to get [an insurance] quote, it goes to the broker, the broker goes to the underwriter, the underwriter [responds] to the broker, and the broker goes back to the customer with a quote. That process would take two days, something that would maybe take a few minutes in a purely digital company.
What we did is go back and look at what was causing this delay. And we identified it. Some of it was due to process issues, but most of it was because the required documents that the small business was sending back to the underwriter were not being ingested correctly by the systems. That’s where we used [our] AI-based document ingestion system, which looked at these documents, figured out what was required, pulled out the relevant data and gave it to the underwriter, who could quickly make a decision.
Second, the underwriter had to look at multiple systems to get the information to determine … the quote. She was actually trying to calculate the risk there. So we designed an underwriter’s workstation that used AI and machine learning, along with an element of a digital assistant, which walked her through the process of producing the quote very quickly. We were able to bring [the time it takes to get a quote] down from days to minutes across the entire process. This is a good example of how we’ve used these technologies in the insurance industry to reduce the interaction time between a small business and an underwriter.
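The ingestion step Venugopal mentions can be sketched in miniature. The real system used AI-based extraction; the fixed patterns and field names below are assumptions chosen only to show the shape of the pipeline: document in, underwriter-ready fields out.

```python
import re

# Hypothetical required fields and the patterns that locate them.
REQUIRED_FIELDS = {
    "business_name": r"Business Name:\s*(.+)",
    "annual_revenue": r"Annual Revenue:\s*\$?([\d,]+)",
}

def ingest(document_text: str) -> dict:
    """Extract the fields an underwriter needs from a submission."""
    extracted = {}
    for field_name, pattern in REQUIRED_FIELDS.items():
        match = re.search(pattern, document_text)
        extracted[field_name] = match.group(1).strip() if match else None
    return extracted  # handed to the underwriter's workstation

doc = "Business Name: Rao Bakery\nAnnual Revenue: $850,000"
print(ingest(doc))
# -> {'business_name': 'Rao Bakery', 'annual_revenue': '850,000'}
```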
Knowledge at Wharton: What does it take to implement a hyper-personalized sales and marketing strategy? Is it very expensive and time consuming? Could you give us a sense of the scale and scope of what it takes to do something like this for a company?
Venugopal: First and foremost, you have to start with the end customer in mind. Now, if you are doing a marketing use case, start with the use case. What is it that you’re trying to do? Once you start there, then you go back and look at designing a system that can get you the outcome in the shortest amount of time without actually looking at a massive IT systems modernization project.
The way you do this is to look at your current system and what it can do, and then pull out the capabilities you need for this new use case into what we call the intelligent layer. The intelligent layer is where you start infusing the latest and greatest technology; you mentioned cloud and cognitive. Infuse the intelligent layer with cloud and cognitive, and then build from this core back out to your end user. It need not be very expensive if you have a clear idea of what you want to achieve and in what time frame.
Now, what typically happens is that in an organization, you start designing all of the use cases that you need, and then you go back to your IT department or technology department and say, “Here are the 95 use cases I need to do. Build me a system.” They go off and process all these requirements for three or six months, come back and say they need four years to build it. This negotiation and this dance goes on for a year, and you don’t end up with anything in that year.
What we [recommend] is to start with the use case and build out the reference architecture, which means that you’re not building this for one-time use. You’re truly building it based on what your future looks like. Once you build out the reference architecture, your marginal cost for the second use case will be quite low. … We have found a tremendous amount of success with this approach, which we call front-to-back transformation.
Knowledge at Wharton: Having spoken to people in the finance functions of different companies over the years, I know that one of the issues they often struggle with is that the chief technology officer or the CIO will very often come in with a fairly large budget request for investment in technology. But one thing they end up struggling with is how to justify the ROI of that technology expense in business terms and in terms of the company’s strategic business objectives. So when it comes to hyper-personalization, what do you think is the right framework for thinking about the ROI of investing in technology that could lead to hyper-personalization? Do you face these issues when you deal with companies? How do you deal with them?
Venugopal: Any chief information officer, and often the chief technology officer, should always, if they’re not already doing so, be looking at how to optimize the current set of services and create a budget for digital initiatives. That’s a whole different topic, which we call service transformation. How do you look at the existing set of services that the organization is offering? How can you optimize them and free up some dollars for new digital projects?
By the time modernization projects are done, the industry has moved on.
Most of these projects, if you are able to tie them back to specific customer benefits, may be impactful but not large to begin with. You put together a plan [to, say,] provide the next best offer [to the digital shopper in a mall, and] start with that. … It is not a huge three-year project. You can start getting results in six, nine or 12 months at most, end to end. And we have seen results come out in an even shorter time, as little as three to six months. We have also seen organizations go down the path of picking a use case, building it out, and going on to the next one. As each one gets rolled out, you learn a lot about what’s working and what’s not working, and you build on it.
The methodology that we recommend is, ‘Don’t go big bang; go for chunks of value.’ You have to think big but implement small. You start thinking, to some extent, about what your future state is going to look like, so that you’re building on top of a reference architecture that is solid and future-proof. But what you’re building, one at a time, is an end-to-end use case.
Knowledge at Wharton: When you think big but implement small, can you share some of the results you have seen in terms of impact?
Venugopal: In some cases, the results in as early as six months were a 30% savings from process improvement. And in one [earlier example] I told you about, the interaction time for a quote went down from two days to 60 minutes. These big impacts can happen in a short burst of time; typically within six to nine months we should be able to see definitive results of this sort.
Knowledge at Wharton: For financial institutions that want to start down the road to hyper-personalization of their services, where do you think they should start, and what first step should they take in their journey?
Venugopal: The first step, in my opinion, is to have a clear understanding of your customer and to clearly identify your business context. Try to understand the specific areas you want to focus on as your first set of opportunities in hyper-personalization. If you’re a credit card line of business (LoB), it’s about identifying three or four definitive use cases in which you think your customers would see value. Once you identify the set of use cases, you start thinking about which ones will have the highest impact. Then you chart them, with impact on one axis and, on the other, the effort to execute based on the current environment. After that, pick the ones that will have the highest impact and the least amount of effort to build, at least in round one.
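The two-axis exercise amounts to a simple ranking. Here is a minimal sketch, with invented use-case names and scores, of how round-one candidates might be picked.

```python
# Hypothetical candidates scored on the two axes Venugopal names:
# impact (higher is better) and effort to execute (lower is better).
use_cases = [
    {"name": "next-best-offer", "impact": 9, "effort": 3},
    {"name": "real-time-fraud-alerts", "impact": 8, "effort": 7},
    {"name": "statement-redesign", "impact": 4, "effort": 2},
]

# Rank by highest impact first, then lowest effort.
ranked = sorted(use_cases, key=lambda u: (-u["impact"], u["effort"]))
print([u["name"] for u in ranked])
# -> ['next-best-offer', 'real-time-fraud-alerts', 'statement-redesign']
```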
Then start working with your organization to build, taking this front-to-back approach. Figure out what services are available in the core system. Start building out your intelligent layer. Figure out what technology innovations are required, and then build back your engagement layer and decide how you want to interact with the customer. This entire process, in my opinion, is not a very long one if you have the right people who understand the system. You could get to production in three to nine months; we have seen it take as little as three months and, in some cases, as long as nine. Within that window, you should be able to see your first set of hyper-personalized services come out.