Customer Equity Accelerator Podcast

Ep. 22 | Where Academics Meet Reality with Joe Megibow

What does academia have to do with marketing and CLV? In this episode, Joe Megibow shares that academia is five to 10 years ahead of practical application. If you want to know where we are heading, start here. Joe talks about his history in the early days of data-driven insights, where he worked with companies including American Eagle and Expedia to dig into customer data and develop better attribution models. He talks about the challenge of getting an accurate picture of the customer when the number of touch points is exploding. And he shares his model for what companies need to get started with CLV: an actionable hypothesis, technically achievable goals, and a purpose that is strategic to the company. Reach out to Joe on Twitter at @megibow and on LinkedIn.



Key concepts: attribution models, marketing science, academia, CLV, data-driven insights, customer insights, customer centricity, data mapping

Who should listen: marketing science, CAO, analytics, e-commerce, CMO, marketing analytics, customer experience managers, customer retention

 


 

Show Transcript

Allison Hartsoe - 00:02 - This is the Customer Equity Accelerator, a weekly show for marketing executives who need to accelerate customer-centric thinking and digital maturity. I'm your host, Allison Hartsoe of Ambition Data. This show features innovative guests who share quick wins on how to improve your bottom line while creating happier, more valuable customers. Ready to accelerate? Let's go!

Welcome, everyone. Today's show is about where academics, specifically customer lifetime value academics, meet reality. To help me discuss this topic is Joe Megibow. Joe is an advisor to Advent International and also sits on four advisory boards plus one public board. He has tremendous experience extracting value from digital data. In fact, I remember the very first time I saw Joe up onstage at eMetrics, where he outlined what was cutting edge at the time: the use of analytics data alongside what you might see in a network operations center, which was just a groundbreaking unification of data streams. So Joe, thank you so much. Welcome to the show.

Joe Megibow - 01:19 - Thank you, and thanks for the kind words.

Allison Hartsoe - 01:21 - Now I've also heard that you were one of the originals at TeaLeaf. Is that true? That you were there back in the day and did you actually name all of TeaLeaf's products?

Joe Megibow - 01:32 - There's some infamy around that, as they got very cutesy. Yeah, I actually was looking at TeaLeaf from an investment standpoint at my prior company before it incorporated, and ended up realizing that it wasn't hitting the profile of our investment strategy, but it fit the profile of my career strategy. So I joined as the thing launched and was there from the very beginning.

Allison Hartsoe - 01:57 - Wow. Nice. Nice. So was it the TeaLeaf aspect that brought you to this topic, or was there more in your background that kind of drew you into the use of data and CLV?

Joe Megibow - 02:09 - That's a fair question. I was an engineer originally, an electrical engineer and then a software developer, so I had a grounding in the technology, and I graduated post-internet but pre-web. So a lot of the web stuff happened as my career was kicking off, and I ended up going back to business school and took some classes that really just rocked my world on data-driven marketing. If you think about it, even before the web, you were starting to have the loyalty programs with the grocery stores and airlines, and credit card companies were getting very, very sophisticated on segmentation and targeting based on behavioral data and what you could do with that. So there was starting to be a renaissance of getting out of the old-school, sort of Mad Men, finger-in-the-wind side of marketing and realizing you could get very sophisticated. And this was before the web even happened. So I just fell in love with that concept, that you could actually take data and use it for good in marketing circles.

Joe Megibow - 03:10 - And then the web started to happen, and it was like, holy cow, now we've got not only transactional data but behavioral data. So yeah, when I ran into TeaLeaf, I had just been in a management consulting role, this is the late 90's here, so early days on web development and building out platforms and some of the early commerce plays. When I saw what TeaLeaf did in the lab, which was really at the time just a debugging tool, it spun out of SAP to solve some of their own problems, the potential I saw to have these incredible data-driven insights on where and why consumers were or weren't able to get through processes on websites, where friction was and how to optimize around it, that just blew me away. I'd never seen anything quite like it.

Allison Hartsoe - 03:59 - You know, it's funny. We use those terms so much today. It's hard to remember that that concept of friction and optimization was actually in play in the late 1990s.

Joe Megibow - 04:08 - I mean we could get sort of philosophical here, but the reality of those concepts was around long before we named them. Consumers were struggling all over the place, and in the early days it was bad. I mean it was really hard to get through most transactions on websites. Part of the challenge is most of the back-ends were built as these monolithic, closed systems where you owned the entirety of the engagement.

Allison Hartsoe - 04:32 - Well, not totally. We were all on dial-up.

Joe Megibow - 04:35 - Well sure, you had bandwidth issues, but with the web, what's assembled into the page, and we take this for granted now, is fleeting. It's a momentary collection of content and data that's coming from a multitude of systems and sources that all somehow come together in real time, just for you. And this is where all the ideas around personalization started to really blossom. I don't think we ever really got there. For 20 years we've been talking about one-to-one marketing and personalization, and, perhaps generalizing a little, the best we've come up with is people who bought this also bought that. We've solved some pretty sophisticated math problems, but I think the idea of this true personalization is still a long way out.

Allison Hartsoe - 05:24 - So Joe, after TeaLeaf and then you had a little bit of management consulting, then what happened?

Joe Megibow - 05:31 - I'll summarize quickly, but after TeaLeaf I ended up going to one of my customers, which was hotels.com, running the data side, the analytics and optimization, and as part of that I took on all of the digital marketing there. We took a very quantitative, very consumer-centric view of what the opportunities were and really just rebuilt everything. Those were probably two of the most prolific years in my career. Hotels.com was owned by Expedia Inc., and I ended up going up to expedia.com doing some very similar work, ultimately heading up the US business, so I was the general manager of expedia.com. And then I did a crazy flip into retail and ended up going into American Eagle Outfitters as their chief digital officer. The CEO at the time realized that travel was probably about ten years ahead of retail in its disruption and evolution and was trying to instill some of that same digital-minded customer centricity into the retail world.

Joe Megibow - 06:31 - So I ended up figuring out how do we actually bring an older-school brick-and-mortar retailer into the digital age, and got really well educated at that. Briefly after that I was president of a little retailer called Joyus, and then that got me to where we started, which is working with some of these private equity firms and advisory boards.

Allison Hartsoe - 06:52 - Very nice. Well, I remember back when you had hotels.com, because that was actually when I saw you at eMetrics, and I remember calling back to your example a number of times and saying, wow, you know, this is how somebody does it. This is how they start with wins in one aspect, and then they move up in the organization, and then they get their team behind them, and all these great things keep happening. You were the shining star example for a long period of time. Still are in this space.

Joe Megibow - 07:22 - Thank you. It's good to get in early, I guess. Awesome. Wash, rinse and repeat.

Allison Hartsoe - 07:28 - For sure. So let's talk a little bit about the academic side. You were inspired in B-school by your marketing analytics classes. Clearly you were exposed to the academic information. Did you want to just come out and apply it all immediately, and if so, what happened in that connection between what you found in academia versus what you found in reality?

Joe Megibow - 07:50 - Yeah, and it hit me a few times in my career, because interestingly, not only did my idealistic academic view collide with the challenges of the real world, but also when I left TeaLeaf I went to the other side of the table and stopped selling software, which had some academic similarities in that you're selling the idealistic potential of what can be done with the software, not so much the reality of what can be done. Most software companies are constantly frustrated with why most customers only use a fraction of the capabilities. So flipping to the other side, even coming out of TeaLeaf and thinking, wow, I'm going to change the world with all this stuff I know we have at our fingertips, it turned out it wasn't quite so simple in the real world. And so I've actually enjoyed that real-world side more, because, maybe it's the engineer in me, but you've got all this stuff that's possible.

Joe Megibow - 08:46 - You know, but how do you figure out in the real world how to take all that possibility and actually create something that works, create that magic box that does something? That turns out to be the real battle. It's not knowing what to do, it's can you actually execute and get it done, and how do you get that done, and can you actually then get value out the other side? That's where I've had most of my fun in my career. It's not as clean and simple as the academic side would say.

Allison Hartsoe - 09:15 - Okay, so is there a rule of thumb about the distance between academia and practical application?

Joe Megibow - 09:21 - There are hype curves for sure, and then on top of the academia you get what gets picked up in the press. What's hot in the press often doesn't get realized in, you know, call it a majority or a significant number of businesses until probably five years after the PR buzz dies down and they're already talking about the next thing. In fact, that can be a challenge inside of companies, because you're trying to invest in initiatives that feel like, oh, we've been talking about that for years, let's actually invest in something new and fresh, and you're like, yeah, but we never actually delivered or solved the stuff we were talking about five years ago. So I think it depends on the concept, but five to 10 years tends to be when things start building up some momentum on the academic side before you really start seeing some meaningful application in business. That's probably a safe rule of thumb.

Allison Hartsoe - 10:09 - Okay. So I'm sure there are dozens and dozens of different things that come out of academia about what you should do, and then dozens and dozens of ways you can try to apply that. How do you frame the problem that you want to solve in picking what you want from academia and deciding whether it makes sense for practical application?

Joe Megibow - 10:30 - Maybe just to back up a little and color some of the challenges there are in mapping some of these academic things, because I think that sets the stage for how you start to think about prioritization. I mean, there are lots of different areas to try to solve problems in, and since this call is more about customer lifetime value and around the customer, let's start there. I'd say there are two big pieces of information that are often assumed but turn out to be really hard to get to. One, which I know is a common theme on your show, is attribution. I know what I spent, and I know what I made, but can I actually match that dollar of spend to that dollar of revenue in as clean a way as possible? And can I do that at an incremental level? If I spend one extra dollar, do I know that that dollar is accretive? Because it gets so averaged out, it's hard to understand whether it's dilutive or still accretive, and how do you balance those things together?

Joe Megibow - 11:32 - And it turns out it's really, really hard to do that, because tracking the customer across channels and understanding the behaviors is very, very difficult. Actually, on that one, I'd love to get back to it: when I was at Expedia years ago, we did a phenomenal academic exercise to try to solve that, which I'd be happy to talk about. The second really tough problem to solve is can I really identify who my customer is? Because in order for me to do anything around customer lifetime value, it assumes I have a customer over a period of time and I can track them, whether it's cohorts or just a generalized lifetime value. How do I really look at them? And it turned out that's also a really, really hard thing to do.

Allison Hartsoe - 12:15 - Do you think it's easy now though or is it still so hard?

Joe Megibow - 12:16 - Believe it or not, I actually think it's getting harder, not easier, and it's because the number of, call it channels, the number of endpoints that a consumer can engage with a brand through is exploding. Think about the multitude of devices that you have: your phones and your tablets and your laptops, and you may have one or two of each, and you may have your work ones, and you may have your home ones. I see people who have a home laptop, their work laptop, their work phone, their home phone, a tablet or two, and then you start putting in set-top boxes and voice and all these other mechanisms. It's growing, and so is the difficulty of tracking across them. There are very few brands where you're in a logged-in, authenticated state every single time you engage with the brand. That's a luxury. It's very atypical. And then on top of that, the data collection across these is very different.

Joe Megibow - 13:14 - So as an example, years ago when I was at American Eagle, when we were starting to try to build a single customer repository, one of the things we recognized is we had lots and lots of customer data marts, I think it was almost a dozen. We had our loyalty program; we had our other loyalty program, which was our co-brand credit card or private-label credit card, which were similar but not necessarily aligned. We had our account information if you signed up online. We had our email file, which may or may not be connected to the account. We had affinity programs. We had our mobile app, which was a different thing and a different set of tracking. There was the POS; there were credit card receipts. So all these things came together, and the primary key on these wasn't necessarily the same either. On the loyalty program it was phone number; on the credit card it was address; online it tended to be email. And when we started this process, something like a third of my loyalty members didn't even have an email address on file.

Allison Hartsoe - 14:20 - So you think there's a key, but you don't have a key, or the key that you might have isn't even necessarily one-to-one.

Joe Megibow - 14:26 - That's right. And more recently I was working with an athletic company, and we had highly loyal customers and very high engagement with the loyalty program, but still 30 to 40 percent of transactions at the register didn't have a loyalty account associated with them. And in some early security decisions they decided not to save anything about these transactions, not even a hashed version of the credit card, which means every single transaction that wasn't tied to the loyalty program was completely independent and anonymous, and there was no way to track anything over time about customer behavior in the store for a third to 40 percent of your customers. And the store represented something like 80 percent of all transactions versus the web. So you have these huge blind spots that turn out to be really critical, especially because you tend to index on the best data around your best customers, and while certainly protecting them and leaning into them is important, understanding how you get everyone else to become your best customers is rich territory, and if you don't have the data, what do you do?
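The key-mismatch problem Joe describes, loyalty keyed on phone, credit card on address, online on email, is classically attacked with record linkage: cluster any records that share an identifier. Here is a minimal sketch using union-find; the field names and sample records are hypothetical, not from any of the companies discussed.

```python
from collections import defaultdict

def cluster_records(records, key_fields):
    """Union-find: merge customer records that share any identifier value."""
    parent = list(range(len(records)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    # Index records by each (field, value) pair, then merge collisions.
    seen = defaultdict(list)
    for idx, rec in enumerate(records):
        for field in key_fields:
            value = rec.get(field)
            if value:
                seen[(field, value)].append(idx)
    for indices in seen.values():
        for other in indices[1:]:
            union(indices[0], other)

    clusters = defaultdict(list)
    for idx in range(len(records)):
        clusters[find(idx)].append(idx)
    return list(clusters.values())

# Hypothetical records from three systems with different primary keys.
records = [
    {"email": "a@x.com", "phone": None,       "address": "1 Main St"},  # online account
    {"email": None,      "phone": "555-0100", "address": "1 Main St"},  # credit card
    {"email": "a@x.com", "phone": "555-0100", "address": None},         # loyalty
    {"email": "b@y.com", "phone": None,       "address": "9 Oak Ave"},  # unrelated
]

print(cluster_records(records, ["email", "phone", "address"]))
```

The first three records chain together through shared email, address, and phone even though no single key spans all three systems, which is exactly why a single "primary key" view undercounts the customer. Real deployments add fuzzy matching (typos, formatting) on top of this exact-match skeleton.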

Allison Hartsoe - 15:33 - Basically you have a giant hole. It almost sounds like you're making a case for a Napoleonic slash-and-burn policy: just start over from scratch, because it would be faster and more convenient than trying to work with the legacy systems.

Joe Megibow - 15:48 - There are a lot of approaches you can take. As a marketer you could say, we're going to do it so it's easier for me, and perhaps that's an option, but it doesn't necessarily mean you're going to end up in a better place. It takes a concerted, long-term view of the loyalty program. What it may mean is you're going to say, hey, over time we're going to bleed out the bad records, the people you can't actually tie to the program. You pick a time where you say, okay, starting nine months from now we require an email on file, and you start a process to engage with those customers, and you reward them for adding an email so it's in their best interest. Some of them you can't reach and you never get, which means they're probably not particularly active, loyal customers anyway. And over time you convert as many as you can. You start by stopping the bleeding, which means all net-new customers are signing up under whatever new standards the program has, and then by the time you get to the date, you may have three or four or five percent of noise.

Joe Megibow - 16:46 - That's, again, probably not particularly tragic, because you've been unable to connect with or reach them, and then you just drop them and move forward. But that may be a year-long process. So if you're saying, hey, right now I want to do a meaningful analysis, going back and saying, sure, I'll get that for you in 13 or 14 months doesn't fly so well. But that's part of it. It really gets down to what do I have now that I can act on, what am I going to do later, and how do I assess that journey? Because it's not to say you can't do things today. I think there are all sorts of things you can do where you can be less wrong, which means you're directionally correct. It may not be the right answer, but it's still going to be a better answer than doing nothing, or a better answer than what you were doing before.

Allison Hartsoe - 17:33 - Let's go back to what you were talking about a few minutes ago at Expedia and dig in a little bit to that particular example. Is there a way that you worked with the data at Expedia so that the academic elements weren't lost, so they were actually pulled together around the customer?

Joe Megibow - 17:50 - Yes, and again, what I'm going to describe is a little more commonplace now, and there are some very interesting businesses that have even come and gone around these topics, but I bring it up because we did this at a time when it was very cutting edge, and it was driven entirely from trying to apply some academic concepts. Expedia was fewer channels: mobile was very small then, and most of our business was online. We didn't have a physical store, so we had a pretty clean set of customer data in general, just the benefit of being a pure online play. But there were multiple channels that we spent into, and being a huge travel marketplace, we spent a ton of money, hundreds of millions of dollars, marketing across nearly every channel that we could actually use efficiently. At the time we were doing what everyone was doing, which was last-touch attribution, and everyone knew last-touch attribution was wrong.

Joe Megibow - 18:47 - We knew it was better than not having an attribution model, but as we were trying to squeeze more and more profitability out of the business and competition was growing, we were trying to assess where we were leaving money on the table. Where could we lean in? What could we do to get out of last touch? For example, things like email always indexed very high; it's a highly efficient channel at relatively low cost. But if it's just the last touch in the consideration set, what is it stealing from? What really drove the sale? If they searched for Las Vegas and came in on paid search with Google, and then we re-targeted them with an email and they came back and bought, if it weren't for that initial click we wouldn't have got the sale, but it was getting zero percent of the credit. There weren't really any models to do this; there were some weighted-average and fairly simplistic models, and in fact another person you and I know, who out of respect I won't name, who runs a very, very successful analytics blog, actually told me it was lunacy to even try to attack this.

Joe Megibow - 19:50 - I just felt that was wrong, so we did. One of my data scientists was doing some research on the academic side and found a proven model that had been used primarily in healthcare. It was a survival model, or technically a proportional hazards model, and the specific one we used is what's called the Cox regression. What it was: let's take a look over a long period of time at a disparate set of events, and, staying medical here, in a drug trial, how do I understand the efficacy of the drug over the years, and at the end, what were the factors that actually drove the outcome? Was the patient cured or not cured? Did they live? Did they die? And so forth. And as she started to examine the model, she realized, oh, this works perfectly for how we've been trying to think about solving this whole marketing touch attribution problem. And this was probably, I don't know, around 2010.

Joe Megibow - 20:50 - So maybe eight, nine years ago, which was pretty early on in some of the attribution work. We were using one of the big data science packages, and on the pharma side of their business they had already built an application of this model. We took that model and used it for marketing and ran our data through the statistics package, and it produced results. Obviously you train the model and you test it against another data set and so on, and we were able to get to a place where we had high confidence that this was working. And it kind of proved out exactly what we thought: some of the channels that got a lot of credit were being overstated, and some of the channels that turned out to be really critical were getting under-invested. We were able to academically and accurately describe where we needed to shift our marketing spend. And it was rigorous enough that we could get buy-in from the head of finance and the people we needed, to say, hey, despite the fact that the models we've been using are saying this money we're spending over here is losing money for us, we have better data now that shows it's actually not.

Joe Megibow - 21:58 - And we were able to get going. Now, at the time, we couldn't figure out how to productionalize it. It was more of a moment-in-time evaluation of what had historically worked, and from that we were able to build some more static factors into our more traditional models, kind of a weighted last-touch approach. And again, it's gotten much more sophisticated since. But to me that was just a great example of where we had a real problem and were able to apply some more academic rigor to actually answer something we could then act on. That's not something that happens all the time, and it was driven by an enormity of need, because we were spending so much money and we needed to find a way to get better at this. It was just a great example, and one that I was really proud of the team for.
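For readers curious about the mechanics, the survival-model framing Joe describes can be sketched in miniature. This is an illustrative reimplementation, not Expedia's model: each journey is a row with days until purchase, an event flag (1 = purchased, 0 = never did), and per-channel touch counts as covariates; the data and channel names are invented. A positive fitted coefficient means touches in that channel "hasten" conversion, which is the multi-touch credit last-touch models miss.

```python
import math

# Toy journeys: (days_observed, purchased, touch_counts) -- fabricated data.
CHANNELS = ["search", "email"]
journeys = [
    (2, 1, [3, 1]), (3, 1, [2, 2]), (4, 1, [3, 0]), (5, 1, [1, 1]),
    (6, 0, [1, 2]), (7, 0, [0, 1]), (8, 1, [2, 1]), (9, 0, [0, 0]),
]

def cox_fit(data, lr=0.05, steps=500):
    """Maximize the Cox partial likelihood by plain gradient ascent."""
    k = len(data[0][2])
    beta = [0.0] * k
    for _ in range(steps):
        grad = [0.0] * k
        for t_i, event, x_i in data:
            if not event:
                continue  # censored journeys only appear in risk sets
            # Risk set: everyone who had not yet converted or dropped out at t_i.
            risk = [x for t_j, _, x in data if t_j >= t_i]
            weights = [math.exp(sum(b * v for b, v in zip(beta, x))) for x in risk]
            total = sum(weights)
            for d in range(k):
                expected = sum(w * x[d] for w, x in zip(weights, risk)) / total
                grad[d] += x_i[d] - expected  # observed minus risk-set expectation
        beta = [b + lr * g for b, g in zip(beta, grad)]
    return dict(zip(CHANNELS, beta))

print(cox_fit(journeys))
```

In practice one would use a vetted library (e.g. lifelines' `CoxPHFitter`) rather than hand-rolled gradient ascent, but the partial-likelihood idea, comparing each converter's touches against everyone still "at risk" at that moment, is the same.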

Allison Hartsoe - 22:46 - Well, I think it's fascinating that you found a model in a different industry altogether, that you didn't go out looking at more marketing academic research. You actually went and looked for what other situation or industry has a similar problem, one that matches the same type of problem we're trying to solve here. I often think those are areas of undervalued resources.

Joe Megibow - 23:14 - Yeah, I agree. In academia you see a little more of this, where you have a hypothesis that what works in this discipline might work elsewhere, but certainly not in business. You really have to step out of your area of expertise and knowledge to say, hey, I'm in a business selling automobiles, and I'm going to now go look at a business that is selling cosmetics and see if there's anything I can learn there about selling automobiles. That's a big leap. I think taking that academic mindset allowed us to connect the dots that way.

Allison Hartsoe - 23:49 - Now I have to ask, of course, in this model, did you include customer lifetime value as you tried to understand the value of the different conversion points?

Joe Megibow - 24:01 - Oh, you're getting greedy. In this particular analysis, we did not. We did others; I did some really interesting CLV modeling on mobile, trying to assess the value of a download in the early days of the app store, which was an example of where we directly used CLV. But this model was really just: could I understand, for any individual user, the impact of multiple touches at multiple spend levels against the final transaction or not?

Allison Hartsoe - 24:30 - Got it, got it. Well, it still makes sense. It's still a great example. Do you have any other examples you care to share?

Joe Megibow - 24:37 - On the CLV side?

Allison Hartsoe - 24:39 - Well, CLV is ideal, but it doesn't have to be CLV. It's really about where academics are maybe a little short-sighted, or where we can find a good intersection between academia and reality.

Joe Megibow - 24:52 - Yeah. When I think of the more academic approach, it really is often around my data science team. There are many examples of where we've been able to apply sort of the best thinking in data science. As an example, as retail started to decline and we were looking at American Eagle store closures, my data science team was able to build a pretty sophisticated model using CLV, understanding historical behavior and predicting CLV if you close a store. Could we model out, based on geolocation of nearby stores as well as customers' engagement with online channels, what the actual loss of business would be, how much business we'd recoup through digital channels, how much we'd recoup from the nearby stores, what those radiuses were, and what the profiles were that drove online? And it was a model that we directly acted on.

Joe Megibow - 25:50 - I mean, it became a model that helped us very deliberately prioritize which stores we closed, and also build our forecasts and operating plan around the expectations, not just wiping those sales off the books, but understanding more practically what we'd really see happen when we closed these stores. So that's just another example of a different way it's been used.
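A heavily simplified sketch of that kind of closure model may make the shape concrete. This is not American Eagle's actual model; the retention rates, distance cutoff, and customer data below are invented for illustration. Each customer of the closing store carries a predicted CLV, and the model estimates how much of it would be recouped via a nearby store or online:

```python
def retained_value(customers, transfer_radius_km=10.0,
                   nearby_rate=0.6, online_rate=0.8, base_rate=0.1):
    """Estimate how much predicted CLV survives a store closure.

    customers: dicts with predicted 'clv', distance to the nearest
    surviving store, and an online-engagement flag. The rates are
    illustrative assumptions, not fitted values.
    """
    kept = 0.0
    for c in customers:
        if c["online_engaged"]:
            rate = online_rate          # likely to migrate to digital channels
        elif c["nearest_store_km"] <= transfer_radius_km:
            rate = nearby_rate          # likely to transfer to the nearby store
        else:
            rate = base_rate            # mostly lost
        kept += rate * c["clv"]
    return kept

# Hypothetical customers of a store being evaluated for closure.
customers = [
    {"clv": 500.0, "nearest_store_km": 3.0,  "online_engaged": False},
    {"clv": 300.0, "nearest_store_km": 25.0, "online_engaged": True},
    {"clv": 200.0, "nearest_store_km": 40.0, "online_engaged": False},
]

total_clv = sum(c["clv"] for c in customers)
print(retained_value(customers), "of", total_clv)  # roughly 560 of 1000 retained
```

A real version would fit the transfer and migration rates from historical closures rather than assume them, and rank candidate stores by the CLV genuinely destroyed (total minus retained) rather than by raw store sales, which is exactly the prioritization Joe describes.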

Allison Hartsoe - 26:11 - You know, I love what you said about prioritization, because I've oftentimes felt that CLV in general gives us a little more signal amid all the noise, all the choices that we have, whether it's attribution or store closures or how to execute a digital transformation. It's something much bigger. Do you think that's true, that CLV is really one of the leading metrics that people can rely on in order to prioritize what they need to do?

Joe Megibow - 26:41 - Absolutely. I mean, I think it kind of gets back to the broader question of why you do any of this. I can walk through the framework I think about when approaching any of these big, rigorous analyses: what must be true before you even kick them off. Would it be okay if I walked through that?

Allison Hartsoe - 27:03 - Yeah. I think that's a really nice application. If you wanted to take the next steps, what would you do?

Joe Megibow - 27:09 - Yeah, so what I think about, and again prioritization may be one piece, but I say you start with: do you have an actionable hypothesis? There are a lot of analyses that are done, I think, just because you can. Like, wow, if we could pull this data together, put it into a box, shake it up, we're going to know more about our customers. And there's sort of this presumption that the data will tell us what to do, that clusters will appear, and white space and gaps will suddenly have the light shine on them, and action will result. My experience is sometimes you get lucky and that happens, but more often than not it doesn't, which means either the data is clear but no one is set up to look at it, or it's just never quite as clear as you'd like it to be.

Joe Megibow - 28:02 - So I tend to start with: unless you have an actionable hypothesis, something where I believe the data is going to show me X, Y, Z, and based on that I am going to go do something, pull a lever, make a change, affect my prioritization as you were just asking about, there's nothing I can act on. If I can't define that hypothesis, let's not even bother. That's gate one of three. Gate two to me is: okay, let's assume we've got an actionable hypothesis, we do the analysis, and we get the results we want from the data side. Is it something that could be acted on? That could mean it would require technical capabilities that we don't have. Say, to act on it we'd need to change how the associate engages with the customer at the store and signal them to do that, but our point of sale doesn't have any way to do that, and we're not replacing the POS system until at least a year from now.

Joe Megibow - 28:59 - So even if we had the data, we couldn't act on it because of technical limitations. Or it would require resources: we would now need a content team or a production team or a development team to go act on it, and we don't have the resources to do that. So gate two to me is: even if you could prove it's actionable, can I, through resource allocation or technical capability, actually do it? It's easy to overlook that and just assume those things would be true or would happen, but they may not, so you need to vet that up front. And then the third one, which is the really tricky one, is: is it strategic? Does this align with my strategy? Because it turns out even if you can prove it, and even if you have the resources to execute on it, you still as a company may not choose to do so, because you have a finite number of resources and other things may be a higher priority.

Joe Megibow - 29:54 - If I do a deep analysis, and this example could, for instance, be a bit heretical for this conversation, but if I do a big analysis showing how I could improve my retention rates through CLV, and I really can build this out, but right now, because of where I am as a company, I'm putting all of my energy into growing top line through acquisition, I'm not going to divert resources to a big retention initiative if that's not aligned with what my business goals are at this moment in time. Maybe six months from now, maybe a year from now, but right now we need to be really laser focused on something else. So again, I think you have to understand your business and your capabilities before you get into this. So is it useful for prioritization? Sure. But that's still only one of the three gates to set this up for success.
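[Editor's note] Joe's three gates form a simple go/no-go checklist, and they can be sketched in code as a thinking aid. This is purely illustrative; all the names and fields below are hypothetical, not anything from the conversation.

```python
# Illustrative sketch of the three-gate checklist for a proposed
# analytics or CLV initiative, as discussed above. The field names
# and function are hypothetical -- a thinking tool, not a real API.
from dataclasses import dataclass

@dataclass
class Initiative:
    hypothesis: str               # "data will show X, and we'll pull lever Y"
    actionable: bool              # Gate 1: a concrete action is tied to the result
    technically_feasible: bool    # Gate 2a: our systems can support the action
    resourced: bool               # Gate 2b: we have the teams to execute
    strategically_aligned: bool   # Gate 3: it matches current company priorities

def passes_gates(i: Initiative) -> tuple[bool, str]:
    """Return a (go/no-go, reason) pair for an initiative."""
    if not i.actionable:
        return False, "Gate 1: no actionable hypothesis -- don't start the analysis"
    if not (i.technically_feasible and i.resourced):
        return False, "Gate 2: can't execute (technical or resource limits)"
    if not i.strategically_aligned:
        return False, "Gate 3: not aligned with current strategy -- revisit later"
    return True, "All three gates passed"
```

For example, a well-proven retention initiative still fails gate 3 if the company is focused entirely on acquisition this year, which is exactly the case Joe describes.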

Allison Hartsoe - 30:43 - Yeah, that makes perfect sense, Joe. Thank you. This has been a really informative conversation regarding the framework and the actions and the examples. It's been really nice. Thank you. If people want to get in touch with you, is it okay for them to reach out, and how would they reach you?

Joe Megibow - 31:01 - Yeah, absolutely. I'm on nearly every system. My, uh, my id is just my last name, Megibow and you can find me on LinkedIn. That's my, uh, my shortcut. Twitter, you name it. So just find me @megibow and feel free to reach out.

Allison Hartsoe - 31:16 - Excellent. I'm going to do a quick summary, but please feel free to step in if there's anything you want to add or if I haven't captured something right. What I was particularly interested in at the very beginning, after we got past all the background and the TeaLeaf fun, was how academia tends to be maybe five to 10 years ahead of practical application, and we talked a lot about the reasons why that is, but particularly the mapping issue. Having lots of data makes it inherently hard to get to the pieces that have to connect to each other. So in the attribution model, it's: can I match it to the person? Can I tell if each new dollar is incremental? And then it could also be in identifying the customer themselves, which, as we talked about, is actually getting harder, not easier.

Allison Hartsoe - 32:10 - And frankly that, that surprised me a little bit. I thought it was really getting easier and identifications, but I completely by your point about the endpoints exploding because it's no longer even mobile desktop and the number of devices, but it's your car, your refrigerator, all these different IOT devices that factor into.

Joe Megibow - 32:32 - Yeah. Unfortunately, data science is getting better and technology's getting better, so there's sort of a race going on between how smart we are and how much data we have. But yeah, it's not obvious which side is going to win.

Allison Hartsoe - 32:46 - Yeah. And then we talked about the kind of impact that you can get, and you gave us two examples. One about attribution at Expedia, and even though you didn't use CLV, that's okay, it was early on. What was so interesting in that particular model was the ability to pull the model from pharma into marketing, which I think speaks highly not just of your data scientists who did that, but also of the need to look beyond where we might think the solutions are and to think about other situations in business, other environments where the same protocols happen, where we can simply pick that model up and move it over. And then you also talked about American Eagle, and this was, I think, so fascinating. I think about it as a connection between online and offline, but maybe it was really just about the offline store closures. I was wondering if you actually factored in not just the geolocation and where they would go offline, but whether there would be an effect online as well?

Joe Megibow - 33:52 - Oh, absolutely, that was part of the model. And you know, as an interesting side point on that, physical stores drive online sales really well. I mean, if I'm in a region that doesn't have physical stores, the minute I open physical stores, my online sales will grow dramatically, and it's sort of counter to a lot of the populist discussion of online replacing offline. But the reality is, and by the way, I should say this was specific to apparel, I can't say this would be true of every business, but I do see this happening with Warby Parker opening stores all over, and it works, and the stores are individually profitable. It's a great way to discover the brand and then buy online, or buy there, or either. And part of what we were looking at is, once the store is there and you've built that base of customers who want to engage, and then you take it away, how much business do we actually then increase online because they're willing to shift that business online? So that was absolutely part of the model.

Allison Hartsoe - 34:57 - Because they're loyal. That's fascinating. I love it. And then finally we talked about the framework and the three gates. Is there an actionable hypothesis? I love this because so many times, and I have literally run into this at least twice in the last two weeks, folks we've talked to have made the assumption that because they've landed everything in a data lake, suddenly bells will ring and angels will sing and all the answers will come pouring out of the lake, and it is just not that way. The actionable hypothesis is incredibly important so you don't get stuck in that general profiling exercise. And then the second gate, can we act on it, is it technically feasible, I think is also important. I've seen many analyses run through, and then there's no way to execute on them. What a waste of time, and how disappointing for the folks who spent their effort on that. This is totally preventable upfront, and it also, I think, helps you get buy-in from the people who would be taking action, so it's an incredibly important step.

Allison Hartsoe - 35:59 - And then the last part about is it strategic, is it aligned with the company goals? Now I agree with this, but at the same time, I also feel like can analysis show that there are places where the company has maybe a little bit of a blind spot, and maybe it should be a priority even if it can't be a priority immediately?

Joe Megibow - 36:20 - No, for sure, and I think, though, it still has to be worked simultaneously, bottoms up and top down. Part of my point is, how do you not have just a demoralizing failure where really smart people did great work and it results in nothing? Which is how you start losing really smart people. If you go do something that is maybe profoundly interesting and is a strategic opportunity, and then just go, surprise, hey, we just spent the last three or four months doing this analysis, we worked it out, and here you go. Nothing moves that quickly in business. You do strategic planning once a year, and so I think you have to have at least enough leadership representation that you can sort of pre-vet it and say, hey, FYI, we're working on something that I think is really novel. I don't know where it's gonna go, but if it does, this could be a huge opportunity for the business. You start to pre-vet that, and maybe it gets shut down. Maybe the answer is, even if you could come back to me with something twice as good as what you think is possible, Joe, here are the reasons why we're not going to do it.

Joe Megibow - 37:26 - I'd rather know that even in my bones if I know, I'm right, it's just not gonna happen there because I don't have buy-in from either who's the ultimate leader or those who hold the purse strings or whatever. Um, Versus, you know what? That's interesting. That's not exactly where we're going, but if you can truly demonstrate that, let's talk, which isn't a guarantee, but at least means you've got your day at the table.

Allison Hartsoe - 37:51 - Incredibly important. I love that approach. So very, very grateful that you've shared that framework with us. As always, everyone, links to everything we discussed are at ambitiondata.com/podcast, along with any particular callouts, and the transcription of this podcast will also appear there if you want to copy and paste some of the frameworks. Joe, thank you for joining us today. It's been an absolute pleasure to have you.

Joe Megibow - 38:15 - Thank you so much. It's been fun.

Allison Hartsoe - 38:17 - And remember everyone, when you use your data effectively, you can build customer equity. It's not magic. It's just a very specific journey that you can follow to get results. Thank you for joining today's show. This is Allison. Just a few things before you head out. Every Friday I put together a short bulleted list of three to five things I've seen that represent customer equity signal, not noise, and believe me, there's a lot of noise out there. I actually call this email the signal things I include could be smart tools. I've run across articles, I've shared cool statistics or people and companies I think are doing amazing work, building customer equity. If you'd like to receive this nugget of goodness each week, you can sign up at ambitiondata.com, and you'll get the very next one. I hope you enjoy The Signal. See you next week on the Customer Equity Accelerator.

 
