Me, Myself, and AI Episode 408

The Beauty of AI: Estée Lauder’s Sowmya Gottipati

Artificial Intelligence and Business Strategy

The Artificial Intelligence and Business Strategy initiative explores the growing use of artificial intelligence in the business landscape. The exploration looks specifically at how AI is affecting the development and execution of strategy in organizations.

It might seem like cosmetics and perfume are products shoppers need to try out in person before buying, but artificial intelligence is opening up new avenues for reaching and understanding consumers — as well as new ways to manage supply chains.

In this episode of the Me, Myself, and AI podcast, we learn how Estée Lauder’s Sowmya Gottipati leveraged her earlier technology leadership experience in telecommunications and broadcast media to deploy brand technology projects for a portfolio of cosmetics, fragrances, and skin and hair care product brands. She talks about AI’s role in product development, a virtual try-on tool for lipsticks and foundations, and a fragrance recommendation engine, as well as an application for supply and demand planning. Sowmya also explains why, despite AI’s power, she believes human-machine interaction will always be necessary.

Subscribe to Me, Myself, and AI on Apple Podcasts, Spotify, or Google Podcasts.


Sam Ransbotham: Skin care products are inherently physical, not virtual. How can companies use AI to make choosing skin care products possible online? Find out on today’s episode.

Sowmya Gottipati: I am Sowmya Gottipati from Estée Lauder, and you’re listening to Me, Myself, and AI.

Sam Ransbotham: Welcome to Me, Myself, and AI, a podcast on artificial intelligence in business. Each episode, we introduce you to someone innovating with AI. I’m Sam Ransbotham, professor of analytics at Boston College. I’m also the AI and business strategy guest editor at MIT Sloan Management Review.

Shervin Khodabandeh: And I’m Shervin Khodabandeh, senior partner with BCG, and I colead BCG’s AI practice in North America. Together, MIT SMR and BCG have been researching and publishing on AI for six years, interviewing hundreds of practitioners and surveying thousands of companies on what it takes to build, deploy, and scale AI capabilities and really transform the way organizations operate.

Sam Ransbotham: Shervin and I are excited today to be talking with Sowmya Gottipati, head of global supply chain technology for Estée Lauder. Sowmya, thanks for taking the time to talk with us. Welcome.

Sowmya Gottipati: Glad to be here. I’m really excited to be here talking about AI, one of my favorite topics.

Shervin Khodabandeh: Great to have you.

Sam Ransbotham: Let’s start with your current role at Estée Lauder. What do you do now?

Sowmya Gottipati: I’ve been with Estée for about two and a half years. We are a prestige and luxury beauty brand company, and we have around 30 brands under our umbrella, Estée Lauder being the flagship brand, but we also have Clinique, MAC, La Mer, and several others. We are a global company, with our products across various regions, from Asia to [North America] and Latin America, Europe, and everywhere.

As head of global supply chain technology, I’m responsible for the technology that powers the entire supply chain globally. That includes inventory, supply and demand planning, manufacturing, distribution centers, fulfillment, and transportation: all the technology capabilities that support the end-to-end supply chain.

Sam Ransbotham: All right; I didn’t hear anything about artificial intelligence in that, though. So how does artificial intelligence connect with that supply chain?

Sowmya Gottipati: I took the supply chain role just about four months ago. Prior to that, I was responsible for brand technology for the Estée Lauder brand. That was really where I was directly involved in a lot of AI applications: how we use AI at Estée Lauder, starting with the consumer experience; how we are enhancing our consumer experience with the AI technology and providing real-life applications, which I can talk [about] a little bit, with the virtual try-on [tool] and so on and so forth. And then personalization is a very important area, and the application of AI is very significant.

And then AI is also being used to create new products, such as skin care and fragrance. This is where we can use data to inform ourselves about what kinds of ingredients and what types of products people like, so we can inform our product planning using AI, along with what I call the agile enterprise. So there are a number of areas where AI is applied to run an efficient organization, such as supply chain and R&D. Those are the various areas where we are using AI.

Shervin Khodabandeh: Sounds like it’s quite prevalent across the whole value chain. I did read up on something — I think a talk you’d given — on how AI’s being used to help personalize fragrance. Is that right?

Sowmya Gottipati: That’s right.

Shervin Khodabandeh: Can you comment on that a bit?

Sowmya Gottipati: Absolutely. That’s definitely one of my most exciting projects that I worked on. It’s an industry breakthrough, so I feel very proud about it. It’s a fragrance recommendation engine, but it takes advantage of neuroscience and AI and olfactory science. We are bringing all three sciences together to make that happen. You know, the human brain has approximately 400 olfactory receptors, and we are working with a company that can actually replicate those receptors in a lab environment — so if you take a particular fragrance, we can actually tell which of the olfactory receptors in your brain are activated by that fragrance.

Shervin Khodabandeh: So are these like neuromorphic chips, or are these silicon-based software?

Sowmya Gottipati: It’s not software. These are actually biosensor tests.

Shervin Khodabandeh: That’s pretty cool.

Sowmya Gottipati: Yeah, it’s really cool. So we would be able to tell [that] receptors 67, 92, and 86 are triggered by this particular fragrance. And let’s say that fragrance is predominantly lavender-based. By the way, your brain can’t really tell the difference between lavender and woody, so I might be able to bring a woody fragrance to you and the same receptors might get triggered because they’re evoking the same emotion in your brain.

So, because the same receptors are being triggered, we can tell, “Oh, by the way, just because you love lavender, you might like this other fragrance that may be woody,” which smells totally different, but they have the same effect on your brain, or they trigger the same emotional reaction in your brain.
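[Purely as an illustration, and not Estée Lauder’s actual system: the receptor-matching idea Sowmya describes can be sketched as a similarity search over receptor-activation vectors, where a fragrance from a different scent family can still rank first if it activates a similar set of receptors. The fragrance names, receptor indices, and activation values below are invented.]

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two receptor-activation vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def recommend(liked, catalog, top_n=1):
    """Rank catalog fragrances by receptor-profile similarity to a liked one."""
    ranked = sorted(catalog.items(),
                    key=lambda kv: cosine_similarity(liked, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_n]]

# Toy data: activation levels for three receptors (say, 67, 92, and 86).
lavender = [0.9, 0.8, 0.7]
catalog = {
    "woody":  [0.85, 0.75, 0.8],   # different scent family, similar profile
    "citrus": [0.1, 0.9, 0.05],
}
print(recommend(lavender, catalog))  # the woody fragrance ranks first
```

[Here the woody fragrance wins because its receptor profile is nearly parallel to the lavender one, even though the ingredients differ, which is the effect described above.]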

Shervin Khodabandeh: That’s very cool. And that’s live?

Sowmya Gottipati: That’s live, yes. We are piloting that in China right now, and we’re trying to expand it to other areas as well. The way we implemented it is interesting, because we started off online, because selling fragrances online is very difficult; you can’t … how do you smell [it], right? At least … we don’t have that technology yet; maybe 10 years from now. But this is why we came up with this technology: to see [if] maybe we can use facial recognition, which can identify the emotion you’re feeling based on very subtle changes in your face when you smell something. Based on that, we can recognize how you are reacting to each of those fragrances. You get a score, and based on that, we could tell whether you liked it, didn’t like it, or liked it moderately, on a scale of 1 to 10, and we use that data.

Shervin Khodabandeh: So as a customer, you’re looking at my facial recognition?

Sowmya Gottipati: Correct.

Shervin Khodabandeh: And then deciding what’s the right fragrance for me.

Sowmya Gottipati: Correct.

Shervin Khodabandeh: One of the things that Sam and I have been probing into over the past several years is the collaboration between human and AI: how that collaboration is so much more accretive than the pure tech or the pure human alone, and how they complement each other.

And it seems to me that fragrance and makeup and these things are so personal. And I have to imagine that in the AI solutions you talked about, there must be — or should be — a fair amount of human intervention or collaboration. Can you comment on any of that? I get the recommender system and how it works and the receptors and all that, but is there a human side to this as well — that maybe the experts and maybe the customers are interacting with the recommendations of AI and adapting it?

Sowmya Gottipati: Absolutely. Historically, when you try lipstick, how many can you try? Maybe three, four, or five. You can’t do more than that because after a while, the skin starts drying up and it’s uncomfortable. But now, with the virtual try-on capability, you could try 30 shades of lipstick in 30 seconds. Same thing with foundation. We have 56 shades of foundation, which differ only slightly.

We take pride in providing high-touch service, and in each of our stores, we have beauty advisers. Their job is to work with the customer and recommend various foundations, lipsticks, etc. How do you try so many different foundations? You can’t, whereas AI can narrow it down for you. These virtual try-on applications can narrow it down to two or three, and from there, a beauty adviser can actually work with the customer. So a beauty adviser is there to help [a customer] choose what actually looks better and have that conversation, and also explain why [they would] recommend this for your skin based on the results that you want to achieve, whether it be [addressing] acne or dryness, etc. So we don’t see that going away, that human-machine interaction; it will always be there.
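[A hypothetical sketch of how a try-on tool might narrow dozens of foundation shades to the shortlist a beauty adviser works from: measure each shade’s distance to the customer’s estimated skin tone and keep the closest few. The shade names, RGB values, and matching rule are all invented for illustration.]

```python
def shortlist_shades(skin_tone, shades, top_n=3):
    """Return the top_n shade names closest to skin_tone in RGB space."""
    def dist(rgb):
        # Squared Euclidean distance; fine for ranking, no sqrt needed.
        return sum((a - b) ** 2 for a, b in zip(skin_tone, rgb))
    return sorted(shades, key=lambda name: dist(shades[name]))[:top_n]

# Invented catalog of shades as RGB triples.
shades = {
    "porcelain":  (240, 220, 205),
    "warm_ivory": (230, 205, 185),
    "sand":       (215, 185, 160),
    "honey":      (195, 160, 130),
    "espresso":   (110, 80, 60),
}
print(shortlist_shades((220, 190, 165), shades))
```

[The adviser then takes the two or three closest matches and has the human conversation about finish, coverage, and skin concerns that the algorithm can’t.]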

Shervin Khodabandeh: Is there a feedback loop whereby the machine gets smarter? For example, the beauty adviser says this, or the customer — now you’ve narrowed it down from 60 shades to three — but based on the final choice they make, I assume the algorithms are getting smarter from that interaction as well.

Sowmya Gottipati: Yes, absolutely. There are two ways it happens. One is, we have a consumer data platform that has information about what you previously bought, what you liked, what your situation is, etc., so it feeds into that so that next time when you come into the store, or when you interact with us, we can say, “Hey, by the way, last time you bought this, so I can reserve that for you or I could recommend something else.” And the second thing is, when we rolled out virtual try-on applications, we started off with a million faces for the data modeling. Now it has a hundred million faces. So that algorithm and the engine are constantly improved over a period of time.

Shervin Khodabandeh: These are faces of actual customers, right?

Sowmya Gottipati: Actual customers. That’s right.
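[The purchase-feedback loop Shervin asks about could be illustrated, under an invented update rule and invented shade names, like this: the shade the customer finally buys is rewarded, and the rejected shortlist shades are slightly penalized for similar future recommendations.]

```python
def update_scores(scores, shortlist, purchased, lr=0.1):
    """Reward the purchased shade; slightly decay the rejected shortlist shades."""
    new = dict(scores)
    for shade in shortlist:
        if shade == purchased:
            new[shade] += lr * (1.0 - new[shade])   # move toward 1.0
        else:
            new[shade] -= lr * new[shade]           # decay toward 0.0
    return new

scores = {"warm_ivory": 0.5, "sand": 0.5, "honey": 0.5}
scores = update_scores(scores, shortlist=["warm_ivory", "sand"], purchased="sand")
print(scores)  # sand rises, warm_ivory falls, honey (not shown) is untouched
```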

Sam Ransbotham: How does that beauty adviser work with the platform to get that feedback back into the system? I guess you can see what they actually ordered or what they chose or what they preferred. How do they get that input back in?

Shervin Khodabandeh: I can tell, Sam, you’re intrigued by the beauty adviser concept.

Sowmya Gottipati: We have very strict privacy laws, so in the store, when people are buying in the store, a lot of times we actually do not gather their personal information, whereas when they’re buying stuff from us online or [via] social platforms, where there is a login and that kind of mechanism, then you have that information, so we know exactly what they bought, and that information gets passed on.

Shervin Khodabandeh: One of the things we’re seeing — maybe it’s a teaser of our new work to come out — is that the ability to understand and explain why an algorithm or an AI solution makes a recommendation or surfaces a particular insight or action, rather than treating it as a black box, helps organizations get a lot more adoption.

Sowmya Gottipati: I can speak to our supply chain world. In the last year, we rolled out an AI application to do our supply planning and demand planning. Before, it was spreadsheets and those kinds of things. The moment we started using the AI application, we saw … a 30% increase in our forecasting accuracy.

Shervin Khodabandeh: Exactly. Some of my clients deliberately will settle for a less precise or less accurate recommendation so that they get the adoption going. Maybe they trade a little precision for explainability, or for the ability to override, so that people will begin to trust it more. I don’t know whether you do something like that.

Sowmya Gottipati: I have not come across that, but that’s a very interesting point.

Sam Ransbotham: There’s an angle there, too, that trades short term and long term, Shervin. Let’s say short term, they take a compromise solution that isn’t quite as good. And then they can come back in three months and say, “Hey, you overrode this and it didn’t turn out as good as you thought, sunshine.” It’s a longer game. It’s not just each one-off decision — that short-term optimal.

Shervin Khodabandeh: I did just that with my son, actually. He was going to a school dance, and he was outside, and all he had on was a T-shirt, and I said, “Wear a jacket; wear something.” He says, “No, I’ll be fine.” And I’m like, “OK. You’re going to get sick.” And he got sick and … hopefully he learned, and he is like, “Dad, you were right.” [Laughs.] And I’m going to make him listen to this podcast so that he knows now, I’ve said this to the whole world.

Sam Ransbotham: Well, Shervin, if you can drag your kids into this, I have to tell the anecdote that I … this will shock you, Shervin, but I track the time that my kids’ bus arrives every day. So I’ve got seven years’ worth of data now of what time that bus arrives. And so my next step is, I’m predicting that. I’m trying to say, “OK, what do we think today? Is today going to be an early day? Is today going to be a late day?” And then we can leave the house at the right time, but maybe we miss it someday, so I’m not sure —

Shervin Khodabandeh: How’s that working? How’s that working out for you?

Sam Ransbotham: I’ll have to come back on a later episode and see how that plays out. But at least it’s real time, you know — trying to use the dog food of the things that we talk about on the show.
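[Sam’s bus-arrival project can be sketched as a toy predictor: use the historical mean arrival time for each weekday as the prediction. The times below (minutes past 7:00 a.m.) are invented, not Sam’s actual data.]

```python
from statistics import mean

# Invented history: arrival times in minutes past 7:00 a.m., by weekday.
history = {
    "Mon": [12, 14, 13, 15],
    "Tue": [9, 10, 8, 11],
}

def predict_arrival(weekday):
    """Predict today's arrival as the mean of that weekday's past arrivals."""
    return mean(history[weekday])

print(predict_arrival("Mon"))  # 13.5
```

[A per-weekday mean is the simplest possible baseline; anything fancier, like seasonality or weather, would need more than a spreadsheet of timestamps.]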

Speaking of things we talk about on the show — how’s that for a segue? — Sowmya, we have a segment where we ask our guests a series of rapid-fire questions. And so the idea is you just hear this question and you give the first response that comes to your mind. Shervin, are you doing these today, or am I doing them today?

Shervin Khodabandeh: No, I’m not, because I don’t have it in front of me.

Sam Ransbotham: All right. So, Sowmya, what’s been your proudest AI moment?

Sowmya Gottipati: I think I already spoke about this: The fragrance application we built last year in my current role, that is really my proudest moment. But before that, in my previous job, we cracked the code on combining computer vision with natural language processing to break down video processing. Those were really the early days of AI, so being able to build something like that was really cool.

This is one thing I think is the coolest thing about technology: Technology transcends industries. It almost doesn’t matter what industry it is. Technology is so pervasive. So I feel so happy that we are able to apply the same technology for totally different applications, and that’s the beauty of it.

Sam Ransbotham: If you’d come up with something that topped the fragrance example, I was going to be super impressed. That was already a pretty proud one. So, what worries you the most about AI?

Sowmya Gottipati: What worries me the most? I think it’s the data privacy and the bias. Those, definitely, and the tracking. [On the] one hand, when I use Google Maps, I like this functionality; I like what it does. But at the same time, I know Google knows exactly where I am at every second of the day. I don’t like that. So the privacy and the data-tracking piece, absolutely, [are] a problem.

Sam Ransbotham: What’s your favorite activity that involves no technology?

Sowmya Gottipati: Reading a book.

Sam Ransbotham: Do you have any recommendations for us? You can extend your answer with a book recommendation.

Sowmya Gottipati: Recently, it’s a very short book that I read: The Night Diary. It’s about a Pakistani girl during the partition between India and Pakistan, and this little girl who didn’t talk but wrote a diary every day. It was a really moving and interesting book. I really liked it.

Sam Ransbotham: So, what was the first career that you wanted when you were a child?

Sowmya Gottipati: Oh. [Laughs.] I wanted to be a pilot, actually.

Shervin Khodabandeh: You are one, right?

Sowmya Gottipati: Yes, I am. [Laughs.]

Sam Ransbotham: OK, so you get a check mark next to that one.

Shervin Khodabandeh: You got that one.

Sowmya Gottipati: Well, I’m more of a recreational pilot, but I actually wanted to be a professional pilot. But that’s OK; I’ll settle for recreational.

Sam Ransbotham: I don’t know. You’ve gone from AT&T to NBC to Estée Lauder, so there’s a next step. So, what’s your greatest wish for AI in the future? What are you hoping for?

Sowmya Gottipati: This is more of an answer from my personal side of things, which is, I just hope we use AI for environmental causes more — you know, better crops with better yields, and water conservation. And I hope there are a lot more advances on that side of things as opposed to shopping or personalized experiences.

Sam Ransbotham: That’s particularly interesting, coming from someone who’s so interested in both of those aspects, that you think that these other aspects might be even more promising.

Sowmya, great talking with you, and these are fascinating applications. I think that most people who listen to this are going to remember the smell example. I mean, I think there’s something very visceral about that that I think will connect with lots of people and spur some thinking. So thanks for taking the time to talk with us. We really enjoyed it. Thank you.

Sowmya Gottipati: Oh, thank you. This is so much fun.

Shervin Khodabandeh: Thank you so much. And we can have some beauty advisers work on Sam while we’re talking.

Sam Ransbotham: This is a podcast! Nobody knows that I’ve got a face for radio.

Shervin Khodabandeh: Tell them to bring all the shades and foundations, and we’ll see what they could do.

Sowmya Gottipati: You should go to and try the 30 shades of lipstick in 30 seconds —

Shervin Khodabandeh: I’ll go with you.

Sowmya Gottipati: — and the foundation. See how it looks on you. [Laughs.]

Shervin Khodabandeh: I’ll go with you.

Sam Ransbotham: I just Googled something — “shades of gray” — and I get something …

Sowmya Gottipati: Not that. [Laughs.]

Sam Ransbotham: We’ve come to the end of Season 4 of Me, Myself, and AI. We’ll be back on Aug. 2 with new episodes. In the meantime, we hope you’ll listen to our back episodes and join our LinkedIn community, AI for Leaders, to keep the discussion going. Thanks for listening.

Allison Ryder: Thanks for listening to Me, Myself, and AI. We believe, like you, that the conversation about AI implementation doesn’t start and stop with this podcast. That’s why we’ve created a group on LinkedIn specifically for leaders like you. It’s called AI for Leaders, and if you join us, you can chat with show creators and hosts, ask your own questions, share your insights, and gain access to valuable resources about AI implementation from MIT SMR and BCG. You can access it by visiting We’ll put that link in the show notes, and we hope to see you there.
