The charm of AI may have been kept alive by both the science community and Hollywood, but Ash Dhupar knows its real-world application can have a huge impact on the bottom line. As Chief Analytics Officer at Publishers Clearing House, he and his team use 180 algorithms to predict lifetime value to within 2-3% of actuals. By adding the laser focus of CLV as an objective, machine learning and AI techniques point to the most powerful marketing channels through attribution, better optimizing marketing dollars.
Interested in learning more about AI and CLV? Ash Dhupar will be at the Customer Centricity Conference, May 17-18, Wharton School San Francisco.
Key Concepts: Customer Lifetime Value, Artificial Intelligence, Machine Learning, Deep Learning, Marketing
Who Should Listen: CAOs, CMOs, Digital Marketers, Business Analysts, C-suite professionals, Entrepreneurs, Ecommerce professionals, Data Scientists, Analysts
Allison Hartsoe - 00:02 - This is the Customer Equity Accelerator, a weekly show for marketing executives who need to accelerate customer-centric thinking and digital maturity. I'm your host, Allison Hartsoe of Ambition Data. This show features innovative guests who share quick wins on how to improve your bottom line while creating happier, more valuable customers. Ready to accelerate? Let's go!
Welcome, everyone. Today's show is about the intersection between customer lifetime value and AI, specifically machine learning and deep learning within the category of AI. And to help me discuss this topic today is Ash Dhupar. Ash is the chief analytics officer at Publishers Clearing House and, hands down, one of the most knowledgeable people I've run across on this topic. Ash, welcome to the show.
Ash Dhupar - 01:01 - Hello Allison, and thank you for inviting me to your podcast.
Allison Hartsoe - 01:05 - Now, I see you have a background in market research and competitive research, and I have to say I attended a conference that will remain nameless, I think it's called Customer Intelligence, for CI professionals, and they knew nothing about big data or any kind of data sources. How on earth did you get from market research into the complicated field of AI and machine learning?
Ash Dhupar - 01:32 - Sure. I think I've been on a journey for the last 20 years in the analytics field. My first startup was a market research firm. Even though we would say primary research, and some may argue that it's not analytics, it is analytics; you are doing simulations even if you're doing primary research. Anyway, I've been through startups, built the analytics field of work at NPD, which is one of the largest research organizations for non-CPG goods, and then here at PCH.
Allison Hartsoe - 02:12 - Oh, that's a nice path. I want to pick up what you said about primary research because I want to make sure everybody understands what that is. Sometimes we get lost in big data and all the big data terms. Can you elaborate just for a minute about what primary research actually includes?
Ash Dhupar - 02:28 - Sure. In the old-school world before big data, if you wanted to know anything about your customers, primary research was about going and talking to them, whether you did surveys by telephone or, more recently, online qualitative research and all that fun stuff. When you were in that field, statistics was part and parcel of our lives, whether you were understanding customers' preferences and so on.
Allison Hartsoe - 03:02 - Do you think that's gone away a little bit? Do you still use that at PCH?
Ash Dhupar - 03:06 - It is still used at PCH, but the abundance of data that is available now, the sources from which you can derive information about your customers, and how fast you can work with that data, that has grown tremendously. If you think about it, the laws of statistics actually haven't changed in the last hundred years. What has really changed is the availability of the data and the computing power that is available to us.
Allison Hartsoe - 03:41 - For better or for worse. Now, chief analytics officer is a big spread. Tell me a little bit more about the various aspects of what your team covers, and also help me understand where you report into the company, because that sometimes controls what you can and can't do with an analytics team.
Ash Dhupar - 04:03 - Sure, so at PCH the analytics department has three different teams. One is our insights team, which includes primary research. The insights team works hand in hand with the business owners on a day-to-day basis. There are different P&L owners across the company, and each P&L owner has been assigned a set of analysts who help them in an unbiased way with as much customer information as possible, with a clear mindset of what information we are providing these business owners to make progress in their business: how do we increase customer engagement or loyalty, or ultimately the bottom line of the P&L owner. So that's the insights piece.
Ash Dhupar - 04:59 - Then the next team is my BI reporting team. When I started at the company, we had 3,000 reports, some used and some not. Anyone who has inherited legacy systems understands that reporting has hundreds and hundreds of reports that some people look at and some have never looked at after they were created. So the key change that we brought into the reporting team was the word KPI. We work with business owners to say, what is it that you're looking for as a KPI in your business? You cannot have 100 KPIs; if you go beyond six or seven KPIs, you'll probably be lost in the forest. So let's work on the key areas we should focus on, so that you can keep an eye on the business, whether you're doing good, bad or ugly, and it gives you some direction: if you are making changes, are those key KPIs changing or not?
Ash Dhupar - 06:07 - So that's the reporting team. In terms of structure, we actually used to have the reporting team separate, but we realized the reporting team works best paired with the insights team, and that is a perfect combination. And then the last team is the algorithms and sciences team, which is building algorithms. We at PCH use a lot of algorithms to make our decisions, to change our customer experience, to optimize what we're doing, and we will probably talk about that later in the podcast.
Allison Hartsoe - 06:46 - Got it, got it. And then where do you report in the organization?
Ash Dhupar - 06:51 - So I've been lucky. I've been part of the C-suite, and my CEO, hats off to him, recognized that analytics is a key part of the organization. Having a seat at the table definitely helps. It helps you work with your peers, even the senior peers, so that you can actually influence and make things happen in a much faster way.
Allison Hartsoe - 07:18 - That's how you get stuff done. And I love the balance of your team because, from a marketing analytics perspective, you have the traditional areas of insights and reporting, but then you also have the area of algorithms and sciences. Let's talk a little bit about that, because I suspect that side of the business, the AI side, is where a lot of new interest and new insights are coming through. So for people who aren't that familiar with AI, or perhaps don't have a really crisp definition of it, why should they care about AI, particularly within a marketing context? Is it just hype like big data was, or does it actually have a real impact? I don't want to go into impact just yet, but help me understand a little bit about what it is for marketing.
Ash Dhupar - 08:16 - Sure. You know, I would say the fantasy-fiction charm and the promise of AI have kept the science community pursuing AI for the last 70 years. If you look at the history, there have been hot and cold times since its inception in 1950. So AI is not new, as all of us would know. We went through a cold period in the eighties and nineties where AI was more of a hype, and that hype was actually sustained, I would say thanks to Hollywood, where a lot of movies kept the fantasy and fiction of AI moving along.
Allison Hartsoe - 09:01 - I remember that movie, right? A.I., with the little boy.
Ash Dhupar - 09:05 - There you go. There were quite a few of those movies out there that kept us going, even till now. However, what's important for us in marketing especially is: what is AI? In simple terms, I would say it's making computers learn the way humans can think and solve problems. In technical terms, it has two fields, machine learning and deep learning. These fields of AI are actually affecting our everyday lives and will continue to affect them in major ways, in my mind, in the years to come. To give you a common example of machine learning: if you go and buy something on Amazon, you see at the bottom, people who bought this also bought this, right? Those are your typical recommendation engines, and those recommendations are actually being powered by machine learning algorithms. Then there's the work coming out of self-driving cars.
Ash Dhupar - 10:07 - That is a field highly influenced by deep learning, because there is a multitude of data that needs to be taken in and analyzed in a split second to make those decisions, and that's where all the deep learning work is happening. And it's not just self-driving cars; some of the customer service areas, I would also say, are related to deep learning.
Allison Hartsoe - 10:31 - Would you say one is easier or harder than the other?
Ash Dhupar - 10:34 - I would say machine learning, for now, is easier for two reasons. First, machine learning primarily deals with structured data in most cases; especially in the marketing areas, you're dealing with structured data. Second, finding talent: machine learning technically is not all that new, the algorithms behind it have been in use for a while, so there is talent out there who knows and understands the machine learning part of it. It's easier to do that.
Allison Hartsoe - 11:09 - I just want to go back on what you're saying for a second. Structured data is your common row-and-column data; an Excel spreadsheet is structured data. So historically, examples of structured data might be a customer record in a CRM system?
Ash Dhupar - 11:29 - Yes. All of the data that sits in rows and columns, essentially, is what I would call structured data.
Allison Hartsoe - 11:35 - And I would also argue that this is probably smaller volumes of data. Would you agree?
Ash Dhupar - 11:41 - No. I mean, if you were talking about this 20 years ago, then smaller volumes of data would kind of be the norm, but today the volume of data doesn't matter because of the sheer computing power that is available to us.
Allison Hartsoe - 11:59 - Okay. Gotcha.
Ash Dhupar - 12:01 - The computing power truly has changed. If you really want to go back to the time NASA launched the first Apollo missions, your iPhone today has more computing power than what NASA had at that time.
Allison Hartsoe - 12:21 - Isn't that amazing?
Ash Dhupar - 12:23 - It is amazing. The volume of data doesn't matter these days, whether you have a terabyte or 50 terabytes, because you have the computing power that can sift through that data pretty fast.
Allison Hartsoe - 12:36 - Okay, and would you also argue that because we have more computing power, AI has become hot again, whereas maybe in the 80s and 90s it was suffering from hype because the computing power wasn't there?
Ash Dhupar - 12:49 - Absolutely. Absolutely. I think the introduction of big data, as archaic as that sounds right now, back in, I think, the 2008-2010 timeframe is where things started changing, because you were able to put a bunch of servers together to take the data and run it through algorithms without any major problems.
Allison Hartsoe - 13:16 - So how does that relate to deep learning then?
Ash Dhupar - 13:19 - Okay, so deep learning, I would say, primarily works better when you have unstructured data. So let's talk about unstructured data. What you and I are doing right now is unstructured data, right? I'm talking to you, you're talking to me; it cannot be put in rows and columns. So if a machine has to interpret what I'm saying right now, imagine the algorithms behind it: you have to make a computer learn what intent really is, what it means, positive, negative and so on. The customer service systems are utilizing these deep learning algorithms because a lot of teaching of those algorithms has been happening. That's where your deep learning algorithms are more powerful: they take in a lot of unstructured data, learn through that process, and make either predictive decisions or live decisions at that particular moment.
Allison Hartsoe - 14:29 - Got it.
Ash Dhupar - 14:32 - Yes, and deep learning will also be able to combine your structured and unstructured data together to get to whatever business problem you're trying to solve.
Allison Hartsoe - 14:44 - Okay, so that makes sense. Thank you for that definition. That helps us really understand the difference between the two, because people don't always think about them with that structure; you hear AI, machine learning, and deep learning used almost interchangeably, so it's very helpful to have that laid out. So what kind of AI techniques apply in marketing?
Ash Dhupar - 15:07 - I think that's a pretty broad question. I would say that you can apply it in different ways; in different situations, different techniques can be applied. But I think the best would be if I gave you examples of what we've been doing here at PCH. Would that make sense?
Allison Hartsoe - 15:24 - That sounds great.
Ash Dhupar - 15:25 - Okay. The use of AI ultimately comes down to what business problem you're trying to solve. In marketing, as practitioners, we are not doing academic research; ultimately our jobs are to help the organization give a better experience to our customers or contribute to the bottom line. So in each of those cases, you are basically trying to solve a business problem. In our case, we looked at attribution as one of the problems. Now, are you aware of attribution, or should I give you a little more detail on that?
Allison Hartsoe - 16:01 - I think most people understand attribution, but briefly, when we talk about attribution, we're talking about what channel drove the desired effect you're looking for, whether it's conversion or a certain type of experience.
Ash Dhupar - 16:15 - Perfect. So I won't go into too many details about attribution. Attribution was a problem for us. We spend about $40 million on paid media acquisition and about another $40 million on TV advertising, so that's about $80 million that you're spending on a yearly basis. The problem of attribution is important to understand from two perspectives. First, within each of those spends, on the digital paid media side, are we optimizing what we are spending, and similarly on the TV side? And in an ideal world, if you are the CEO sitting out there and saying, okay, I'm spending $80 million, where should the money go for the next year, or the next two or three years? Should I start shifting more into digital media because that is becoming more effective and TV is not? Or should I grow both, and so on. Those are the areas that you would want to know and get behind.
Allison Hartsoe - 17:22 - So Ash just to clarify for a minute, I understand what Publishers Clearinghouse is, but what were you advertising?
Ash Dhupar - 17:28 - So ultimately we are an interactive media company. Our customers engage with us so that they have a chance to win with whatever we are doing as a company. We offer digital entertainment, which is free-to-play games, and we offer shopping as entertainment, because most of the purchases that are made on PCH are primarily impulse-driven. No one gets up in the morning and says, hey, I need to buy that kitchen item, let me go and buy it on pch.com. These are all impulse-driven items, right?
Allison Hartsoe - 18:04 - Got it.
Ash Dhupar - 18:05 - That's what PCH does. Now, what you also know about PCH is that we are a sweepstakes company; all of this happens under the umbrella of sweepstakes. So six times a year we promote our big events. An event is $7,000 a week for life, or forever, or a $10 million prize, and so on. That's what the sweepstakes promote, and when we are promoting those sweepstakes, we are either advertising on TV, hey, that event is on, or we are going on the digital paid media sites so that we can get customers from the digital side as well.
Allison Hartsoe - 18:44 - Got it. Sounds good. Sign me up, $7,000 a week for life.
Ash Dhupar - 18:50 - So with that in mind, we wanted to understand how to start looking at the digital paid media side and say, what do we want to do? How do we want to optimize that $40 million in spending? A typical attribution solution will take you through approaches like last-touch, multi-touch and so on; there are different techniques for how you can assign attribution. So what we did was think this through, and we asked, what is a conversion? Which in our terms would be opting into an email program for PCH, or opting into the sweepstakes. Is that a conversion event? And we said, well, you know what, it doesn't make sense if we just did it at the conversion level, because if you spend $40 million, someone comes in and converts on your site, and that is your measure of success, I think we're doing something wrong out here.
Ash Dhupar - 19:47 - Now, this is where we brought in the concept of lifetime value. In any business, what happens after conversion? There's an 80/20 rule: eighty percent of your customers are only going to make 20 percent of your revenue. So if you take that into account and start thinking a little more granular, conversion is not the end; conversion into lifetime value is. That's how we started tackling the problem of attribution. So what we did was build lifetime value models for a customer who just landed on our site and just registered, on what we call day zero. Can I predict the lifetime value of a customer who just came and registered on pch.com?
Allison Hartsoe - 20:34 - I just want to point that out for a second, because most people don't think about that as a conversion event since it's not a purchase. But what you're saying, which is so fascinating and so interesting, is that you're able to take the engagement they have demonstrated and tie it to LTV. I think that's incredible.
Ash Dhupar - 20:51 - Yes, and that's exactly what we're trying to do here. The business implication of this, before I go into some of the modeling parts, is this: let's say the paid media analysts just spent $500,000 on a campaign on Facebook. They want to know what the conversion rate was, but they don't stop there; they also want to know the lifetime value of a customer who's coming from Facebook. You may get a very high conversion rate from Facebook, but are you getting high lifetime value from that source or not? Facebook is just an example; it could be any source, because the campaigns are run on many different sources. When you're spending that much money, this is critical information to have in their hands. They can say, you know what, I can spend more today on this source because the lifetime value of the customer coming from this specific source is very high, and I should spend less on that source because the LTVs of people coming from it are pretty low.
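The trade-off Ash describes, a source with a high conversion rate but low lifetime value versus the reverse, can be sketched in a few lines of Python. The source names, spend figures, and field names below are invented for illustration; PCH's actual models are far richer than this.

```python
# Toy sketch: comparing acquisition sources on conversion rate vs. predicted LTV.
# All numbers and names are illustrative, not PCH's actual data.

campaigns = [
    # (source, impressions, conversions, total_predicted_ltv_usd)
    ("source_a", 1_000_000, 12_000, 90_000.0),   # high conversion, low LTV
    ("source_b", 1_000_000,  4_000, 120_000.0),  # low conversion, high LTV
]

def summarize(source, impressions, conversions, total_ltv):
    conv_rate = conversions / impressions
    ltv_per_convert = total_ltv / conversions
    return {"source": source, "conv_rate": conv_rate, "ltv_per_convert": ltv_per_convert}

summaries = [summarize(*c) for c in campaigns]

# Rank sources by value per converted customer, not by raw conversion rate.
best = max(summaries, key=lambda s: s["ltv_per_convert"])
for s in summaries:
    print(s)
print("spend more on:", best["source"])
```

Ranked by conversion rate alone, source_a looks better; ranked by LTV per converted customer, the budget should shift toward source_b.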
Allison Hartsoe - 21:54 - Now is it a self-fulfilling prophecy? In other words, if I market really heavily on a certain channel, am I going to get higher LTV?
Ash Dhupar - 22:03 - Is your question about advertising heavily?
Allison Hartsoe - 22:06 - Yes, sorry. If I'm advertising really heavily on a specific channel, am I naturally going to get higher LTV?
Ash Dhupar - 22:12 - No, you may get higher conversions. You may get a higher volume of people landing on your site, because advertising is designed to do three or four things in a typical scenario. What matters is this: besides building your brand, advertising will draw a customer to your website. The second event, which is important, is that the person actually converts, or registers with us in our case. And the third piece is how they engage with us from there on for a long time. Those are the three different events. Now, in typical advertising, you did a campaign, you had a million impressions and, guess what, you drove 10,000 people. You got a pretty good number of people to come to your site; you drew them in. But if a person comes to your site and does nothing, that money is basically down the drain, right? So you take that same concept all the way to the lifetime value of the customer.
Ash Dhupar - 23:11 - And what we have seen is, if you advertise heavily in one area, does that actually correlate to a high lifetime value? No, it doesn't happen. So how does this whole thing help, and where does machine learning come into play? We built models which take into account every single activity of the person that we know from the time they come on the site, and not only that, but also, for example, what email address they register with. An .edu email address versus a Gmail address has influence on the predictability of what your lifetime value could be.
Allison Hartsoe - 23:57 - Wow.
Ash Dhupar - 23:58 - Yes. So we take that into account. There are about 180 separate models that run; these are what I call the micro models. They run on a nightly basis, which ultimately covers every single person who's on the site. And when I say nightly basis, it's because these are machine learning algorithms: every day, the activity feeds into the lifetime value. So if we have false positives up front, saying, hey, this guy looks like a pretty high-LTV person, but two days from now his activity actually looks like someone with a pretty crappy LTV, that kicks in, and the self-learning part of the algorithms takes it in and says, hey, Ash looks pretty crappy; whatever LTV you attributed to him is changing.
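The nightly re-scoring Ash outlines can be sketched as a simple blending update: yesterday's estimate is revised as new activity arrives. The scoring function, weight, and numbers below are made-up stand-ins, not PCH's actual 180 micro models.

```python
# Sketch of nightly re-scoring: a customer's predicted LTV is revised each
# night as new activity arrives. The update rule here is an illustrative
# exponential blend, not the real algorithm.

def nightly_rescore(prior_ltv, daily_activity_score, weight=0.3):
    """Blend yesterday's LTV estimate with today's activity-based signal.

    weight controls how fast new behavior overrides the day-zero prediction.
    """
    return (1 - weight) * prior_ltv + weight * daily_activity_score

# Day zero: registration features suggest a high-value customer...
ltv = 100.0
# ...but the next few nights of activity look like a low-value customer.
for activity_signal in [20.0, 10.0, 5.0]:
    ltv = nightly_rescore(ltv, activity_signal)

print(round(ltv, 2))  # the estimate has drifted well below the day-zero 100
```

The false-positive correction Ash describes falls out naturally: a few nights of weak activity pull an optimistic day-zero score back down.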
Allison Hartsoe - 24:52 - Got it. Can the marketers respond that fast? Let's say that I've got people flowing into different LTV classes. LTV would actually be a statistical propensity, that's usually how it surfaces, but many people tend to classify it into high value, low value and other buckets. So, given this dynamic nature flowing around, and there's even another question behind the one I'm asking, about the sheer volume of what you're able to compute in a 24-hour period, can I as a marketer make changes fast enough to respond to how fast people are moving around? How do you use that information?
Ash Dhupar - 25:36 - Oh, that's a good question. The paid media acquisition team is all over the data. Obviously, you have to aggregate all of this data together. So the paid media analysts have a budget to work with, that $40 million budget, and they say, I have to spend this $500,000. What they do is say, let me spend $50,000 on this source, and look at the lifetime value of the customers from this source on day one, day two, over the next three days, five days, seven days. At any point in time they can cut the data and say, am I getting some craziness on the $50,000 that I spent on this? Is my LTV still holding up or not? Then at day seven you say, hey, you know what, the spend you did on this $50,000 is not producing really high LTV but pretty crappy LTV. They know that next time, I'm not spending that $50,000; I'm going to cut it down to a level where I can optimize the spend with that source.
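The daily check Ash walks through, a $50,000 test spend evaluated against cumulative cohort LTV by day seven, might look roughly like this. The thresholds, daily figures, and decision labels are invented for illustration.

```python
# Sketch of the analyst's daily check: cumulative predicted LTV from a test
# spend on one source, cut at day 7 to decide next period's budget.

test_spend = 50_000.0
# predicted LTV (in $) generated by the cohort acquired from this source, by day
daily_cohort_ltv = [4_000, 3_000, 2_500, 2_000, 1_500, 1_200, 1_000]

cum_ltv = sum(daily_cohort_ltv[:7])
payback_ratio = cum_ltv / test_spend

if payback_ratio < 0.5:        # well short of the spend: cut the source back
    decision = "reduce spend"
elif payback_ratio < 1.0:      # trending toward payback: hold and re-check
    decision = "hold"
else:
    decision = "scale up"

print(cum_ltv, round(payback_ratio, 3), decision)
```

Here the day-seven cut shows the cohort tracking at roughly 30% of the spend, so next period's allocation to this source would be cut back.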
Allison Hartsoe - 26:53 - So are they kind of looking at the trend? Is it common to see an initial pop of maybe good quality and then a dropoff, or could you see any kind of trend line on any source?
Ash Dhupar - 27:05 - You can see any kind of trend line on any source, but you home in on them: there are going to be sources that are pretty stable and pretty good, and there are sources where you know, right, I shouldn't be spending a lot on those. So the data is being utilized. That's the beauty of it: they have a reporting tool on top of this algorithm, on the lifetime values that are being scored on a nightly basis, and they can look at the data in any cut that they want.
Allison Hartsoe - 27:35 - Nice. So you've really armed them to make good decisions, and you've also helped them not rely on the source itself that they're advertising with. It always bothers me when you have the source reporting the quality of what you're getting. It's like the fox watching the hen house, right? Like the ad agency that makes the buys then telling you how well you're performing.
Ash Dhupar - 27:58 - That is true. And you know, one of the lessons I've learned, in analytics and in general in life, is that you don't have to tell the other person what to do; arm them with the right information and let them do the right thing for what they have been hired to do, right? A paid media analyst's job is to maximize the spend that he or she has an allocation for; that's how he or she is measured on a daily basis. So I could actually take those algorithms and be more prescriptive about what they should be doing, but I think we leave it in the hands of the analysts to make the right decision. There is still an element of the unknown in these models, a human aspect of it. You cannot lose touch with that part. Someone knows and says, hey, I did that campaign and it drove low LTV, but there was a reason for it, and the reason was x, y, z, so I'm still going to do it because I'm still going after that piece.
Allison Hartsoe - 29:04 - And is that where the primary research starts to come back in, when you try to keep in sync with the human part? How do you stay in touch with that? How do you see what's not in the data?
Ash Dhupar - 29:15 - Primary research comes back into play when the data you see is telling you some trend and you don't understand why. An example of that would be, let's say a person comes to the website after seeing an ad, either on TV or through paid media acquisition. They come on the website, start filling in the form, and then just say, ah, not for me, let me just get out, I don't want to do anything with it. So your funnel report will say 10% of the people dropped off while filling in the form, but why did they fall off? You don't know. That's where you want to reach out to them and say, hey, what happened? Did we scare you when you were on the site filling in the form, or was it the experience? Or the funnel can continue on: someone engages with you once or twice, and you want to understand from these people, what is it about your services or their experience that they didn't like? That's where you want to do the primary research.
Allison Hartsoe - 30:23 - That makes sense. Let's come back to the LTV for a second. In some companies, when they've rolled out LTV, the models are maybe not as tight as they could be in the beginning, and they end up with a little bit of backtracking and adjustment time. Did you have that with the LTV models that you rolled out, or how did you make sure that they were as tight as possible?
Ash Dhupar - 30:48 - That is a loaded question, but I'll answer to the best of my ability. What happens is, at a high level, you're looking at the lifetime value of a customer. When you aggregate at the highest level, we come out within a two to three percent range. The models at the highest level, we can vouch for that.
Allison Hartsoe - 31:12 - Now, just to clarify for listeners, when you say that it comes out within a two to three percent range, what you're saying is that when you project forward and you test backward, the match between those two lines is within two to three percent accuracy. Is that right?
Ash Dhupar - 31:29 - Yeah. The predicted versus actual is within a close range. Now, the way you use the data is where you may see the variances being high. You want to understand the LTV not just of a customer overall, but when you are looking at the customer from a certain source, because that's how the data is being used, right? I don't care if, at the aggregate level, I do an awesome job; when I am sitting at my desk and I have $50,000 to spend on a certain source, is my data accurate at the source level or not? Because that will drive how much I have the power to spend.
Allison Hartsoe - 32:18 - That's tricky, isn't it? Because I've added a slice to the data that depends on so many identification pieces coming through. It has so many dependencies, really, to make that call.
Ash Dhupar - 32:33 - Yes. So it's not only one cut but thousands of cuts of the data. You have thousands of sources, so this is where it gets a little tricky. But when we have done it by source, the models that we have built are pretty intense, and on average they are within about 12 to 13% from predicted to actual.
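The gap between the 2-3% aggregate accuracy and the 12-13% per-source accuracy can be illustrated with a quick percent-error calculation. All source names and figures below are made up; the point is only that per-source errors can be large while the aggregate stays small.

```python
# Sketch of the accuracy check: percent error of predicted vs. actual LTV,
# computed per source and in aggregate. Numbers are invented.

predicted = {"source_a": 100_000.0, "source_b": 40_000.0, "source_c": 10_000.0}
actual    = {"source_a": 112_000.0, "source_b": 35_000.0, "source_c": 10_500.0}

def pct_error(pred, act):
    return abs(pred - act) / act * 100

# Per-source errors can be sizable...
per_source = {s: pct_error(predicted[s], actual[s]) for s in predicted}

# ...while the aggregate error stays small, because over- and under-
# predictions across sources partially cancel out.
aggregate = pct_error(sum(predicted.values()), sum(actual.values()))

print({s: round(e, 1) for s, e in per_source.items()})
print(round(aggregate, 1))
```

With these toy numbers the worst source is off by about 14% while the aggregate is under 5%, which mirrors the pattern Ash reports.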
Allison Hartsoe - 33:03 - better than which most people were at.
Ash Dhupar - 33:07 - Yes. And in some cases you have a lot of stable sources. What happens is the algorithm is ultimately bound by what data you have. If you just started on a source today, I have no idea how people coming from that source are going to interact, because there's no previous history, and because there's no previous history, the predictions are not going to be as good, plain and simple. So this becomes user education, and we tell our users, the analysts, hey, when you're starting a new source, don't go and rely on that LTV on day zero, day one, day two. We allow about 30 days for it to build. Let the algorithms learn what's going on, and then you start spending more or less after you get about 30 days of activity in there, because I'll be more confident in the models that are coming out at that level.
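The 30-day cold-start rule can be sketched as a simple gate on when a source's LTV scores are treated as actionable. The cutoff constant and record layout are assumptions for illustration.

```python
# Sketch of the 30-day cold-start rule: don't act on a new source's LTV
# predictions until the algorithms have enough history.

from datetime import date, timedelta

MIN_HISTORY_DAYS = 30

def ltv_is_trustworthy(source_first_seen, today):
    """A source's LTV scores are actionable only after ~30 days of activity."""
    return (today - source_first_seen).days >= MIN_HISTORY_DAYS

today = date(2018, 5, 1)
new_source_start = today - timedelta(days=2)      # launched two days ago
stable_source_start = today - timedelta(days=400)

print(ltv_is_trustworthy(new_source_start, today))     # False: keep learning
print(ltv_is_trustworthy(stable_source_start, today))  # True: act on the LTV
```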
Allison Hartsoe - 34:04 - Got it.
Ash Dhupar - 34:04 - On the other side of the spectrum, if you are looking at a standard source with whom we have advertised, you know, every day, because they generally provide good LTV, the prediction there is as close as two percent, one percent, because you can do a pretty good job predicting those sources.
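[Editor's note: the per-source accuracy check and 30-day cold-start rule Ash describes can be sketched in a few lines. This is a purely illustrative example, not Publishers Clearing House's actual system; the source names, LTV figures, and the 30-day threshold are stand-ins taken from the conversation.]

```python
# Compare predicted vs. actual LTV per acquisition source, and flag
# sources with fewer than 30 days of history as "cold start" --
# the model has too little data to be trusted yet.

COLD_START_DAYS = 30  # hypothetical threshold, per the discussion

sources = [
    # (name, days of history, predicted LTV, actual LTV)
    ("stable_partner", 400, 102.0, 100.0),  # mature source: ~2% error
    ("new_affiliate",    5,  60.0,  95.0),  # brand-new source
    ("display_network", 90,  45.0,  40.0),
]

def evaluate(sources):
    """Return a per-source verdict: prediction error, or a cold-start flag."""
    results = {}
    for name, days, predicted, actual in sources:
        if days < COLD_START_DAYS:
            results[name] = "cold start: wait for more history"
        else:
            error_pct = abs(predicted - actual) / actual * 100
            results[name] = f"error {error_pct:.1f}%"
    return results

for name, verdict in evaluate(sources).items():
    print(name, "->", verdict)
```

The point of the gate is exactly what Ash tells his analysts: don't spend against an LTV number the model hasn't had time to learn.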
Allison Hartsoe - 34:26 - Nice. Now, am I right, thinking back to our definitions earlier, that this modeling is a combination of machine learning and deep learning, or is it all deep learning to run these models?
Ash Dhupar - 34:40 - It is all machine learning, actually, not even deep learning. We don't use as much deep learning here because, you know, we're dealing with mostly structured data.
Allison Hartsoe - 34:48 - Okay. But isn't digital analytics behavior naturally unstructured, or are you forcing it into a structure?
Ash Dhupar - 34:55 - No, it is structured. What do you mean it is not structured?
Allison Hartsoe - 34:59 - I'm thinking of the Adobe feed and, you know, what a mess that is coming out when you've got hit-level data. Oh my god.
Ash Dhupar - 35:06 - Yeah, but it may not all be clean. There's a process that you go through to clean the data, but it is still structured enough that you can make sense of it.
Allison Hartsoe - 35:16 - Got it.
Ash Dhupar - 35:17 - When someone writes something good, bad, or ugly on a blog or on Facebook, now that is natural language, and you have to do NLP, which is natural language processing. NLP uses a lot of deep learning algorithms. You can use some machine learning too, but, you know, I think deep learning is where NLP really comes into play.
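[Editor's note: as Ash says, deep learning dominates modern NLP, but simpler machine learning approaches can handle text too. Below is a toy bag-of-words sentiment scorer to make the "good, bad, or ugly on Facebook" idea concrete; the word lists are invented for illustration, and a real system would use a trained model rather than hand-picked keywords.]

```python
# A minimal keyword-based sentiment sketch: count positive and negative
# words in a post and report the overall lean. Illustrative only.

POSITIVE = {"good", "great", "love", "excellent"}
NEGATIVE = {"bad", "ugly", "hate", "terrible"}

def sentiment(text):
    """Classify text as positive, negative, or neutral by word counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this great product"))  # positive
print(sentiment("good bad ugly"))              # negative
```

Where this toy approach breaks down (negation, sarcasm, context) is precisely where the deep learning methods Ash mentions earn their keep.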
Allison Hartsoe - 35:39 - Got it. Okay. So let's say, and this has been a fantastic example, thank you for going into so much detail here, but let's say that I'm convinced and I'm super excited about AI and particularly machine learning. I want to take action. How would I start?
Ash Dhupar - 35:57 - Well, I would say the simplest way to start is that it's not about using AI, or machine learning or deep learning within AI. Start by asking the question: what business problem am I solving? That will determine what kind of machine learning or deep learning algorithms you will be able to use, so that's the starting point. For example, if you are trying to build a product recommendation engine, or if you're trying to predict customer behavior, understanding patterns or preferences so that you can tailor experiences for them, use machine learning. If you are trying to do anything with unstructured data, like building a customer service application where there's a lot of voice recognition and voice processing, or if you're looking at social media habits, trying to predict, you know, purchases on your site from there, I would say use deep learning algorithms. And on top of that, what talent you already have in-house will determine that as well.
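[Editor's note: Ash's decision rule, let the business problem and its data type pick the technique, can be written down as a simple lookup. The problem categories below are illustrative examples drawn from his answer, not an exhaustive taxonomy.]

```python
# Map a business problem to a suggested technique, following the
# structured-data-vs-unstructured-data heuristic from the conversation.

STRUCTURED_PROBLEMS = {
    "product recommendations",
    "customer behavior prediction",
    "ltv prediction",
    "churn prediction",
}
UNSTRUCTURED_PROBLEMS = {
    "customer service chatbot",
    "voice recognition",
    "social media text analysis",
}

def suggest_technique(problem):
    """Suggest ML vs. deep learning based on the kind of data involved."""
    if problem in STRUCTURED_PROBLEMS:
        return "machine learning (tabular/structured data)"
    if problem in UNSTRUCTURED_PROBLEMS:
        return "deep learning (text, voice, images)"
    return "start with the business question, then match the data type"

print(suggest_technique("ltv prediction"))
print(suggest_technique("voice recognition"))
```

The fallback branch mirrors Ash's starting point: if the problem isn't clearly categorized, go back to the business question before picking an algorithm.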
Allison Hartsoe - 37:09 - That's great. And for our audience, I'm going to link to an article that we surfaced in the Signal newsletter, from Rob May at talla.com, where he talks about a framework for nontechnical business owners to figure out how to use AI in their business. It's framed by predict, automate, classify, and then you list the rows of your business. So we'll include that; it can help you think about where it makes sense to use these techniques. Thank you for those suggestions, and also, you're going to be speaking at our conference in a couple of weeks, the Customer Centricity Conference. Thank you for that.
Ash Dhupar - 37:46 - Thank you. I'm excited for that as well.
Allison Hartsoe - 37:48 - Now if people want to get in touch with you, what's the best way for them to reach you?
Ash Dhupar - 37:52 - The best is to reach me through LinkedIn, I would say, because that's the one place all of us can be reached, and then we can take it from there.
Allison Hartsoe - 38:00 - That sounds good. Now, let me do a quick summary here, and at the end you can tell me if I've missed anything major. We talked about why you should care about the relationship between CLV and AI, and in particular we talked about the definitions as well, which was super helpful. But it comes back to this: when your focus for these calculations is not conversion but customer lifetime value, then these techniques, machine learning in the example you used, can point you to powerful areas of impact. And those areas of impact are also subject to the ongoing nature of LTV, so you can see day one, day two, day three, day five, which I think is fantastic. It almost creates a self-correcting cycle, because the LTV is always updating and always being refined. So in paid media, which we all feel is an area that's probably ripe for a correction, in that there's a lot of spend, there are probably a lot of areas for optimization.
Allison Hartsoe - 39:03 - This application of machine learning and LTV is incredibly powerful, as you've pointed out. And then finally, we talked about how to bring this into your own business. That's a difficult question, but in many cases, everyone I talk to comes back to use cases. You have to understand the business use cases, but I love that you called out specific examples of when to use which tool. So Ash, did I miss anything there? Anything else you want to call out?
Ash Dhupar - 39:32 - No, I think you summarized it perfectly. Thank you.
Allison Hartsoe - 39:36 - Wonderful. Well, as always, links to everything we discuss are at ambitiondata.com/podcast. Ash, thank you so much for joining us today. It's just always a pleasure chatting with you.
Ash Dhupar - 39:46 - Same here. Thank you and thank you for inviting me to your podcast, Allison.
Allison Hartsoe - 39:51 - Remember everyone, when you can use your data effectively, you really can build customer equity over time. It's not magic. It's just a very specific journey that you can follow to get results. Thank you for joining today's show. This is Allison. Just a few things before you head out. Every Friday I put together a short bulleted list of three to five things I've seen that represent customer equity signal, not noise, and believe me, there's a lot of noise out there. I actually call this email The Signal. Things I include could be smart tools I've run across, articles I've shared, cool statistics, or people and companies I think are doing amazing work building customer equity. If you'd like to receive this nugget of goodness each week, you can sign up at ambitiondata.com, and you'll get the very next one. I hope you enjoy The Signal. See you next week on the Customer Equity Accelerator.