Customer Equity Accelerator Podcast

Ep. 73 | The Evolution of Marketing Analytics

 

"As the web analytics industry grew up, we discovered people who had been doing marketing analysis for generations." Jim Sterne

 
This week Jim Sterne, founder of the Marketing Analytics Summit and author of Artificial Intelligence for Marketing: Practical Applications, joins Allison Hartsoe in the Accelerator. Jim discusses how today's marketing analytics evolved from web analytics and the implications that has for the introduction of machine learning. The challenges remain the same (the data is terrible, there are too many ad hoc questions, and it can be tough to get the message through), but marketing analysis is evolving quickly. Learn how analysis today is dramatically improved by machine learning, but ultimately not replaced.

Please help us spread the word about building your business' customer equity through effective customer analytics. Rate and review the podcast on Apple Podcasts, Stitcher, Google Play, Alexa's TuneIn, iHeartRadio or Spotify. And do tell us what you think by writing Allison at info@ambitiondata.com or ambitiondata.com. Thanks for listening! Tell a friend! See the full transcript. View all episodes.

 

If you want clearer insights from your data, check out some of our services:

Predicted CLV Calculation

Customer-Centric Reports

Campaign Tracking

Behavioral Segmentation

 

 

Show Transcript

Allison Hartsoe: 00:01 This is the Customer Equity Accelerator. If you are a marketing executive who wants to deliver bottom-line impact by identifying and connecting with revenue-generating customers, then this is the show for you. I'm your host, Allison Hartsoe, CEO of Ambition Data. Each week I bring you the leaders behind the customer-centric revolution who share their expert advice. Are you ready to accelerate? Then let's go! Welcome everyone. Today's show is about the evolution of marketing analytics, and to help me discuss this topic is Jim Sterne. Jim is the founder of the Marketing Analytics Summit, which was the eMetrics Summit ages ago for you industry old-timers, and he is the co-founder and director emeritus of the Digital Analytics Association. Now, if you haven't run across Jim on the conference circuit, you've certainly seen his books. He's got about a dozen books out there on online marketing and analytics, including the latest one, which is Artificial Intelligence for Marketing: Practical Applications. Jim, welcome to the show.

Jim Sterne: 01:11 Thanks very much. This is a real treat for me because I really enjoy your podcast, and I am delighted to be part of it.

Allison Hartsoe: 01:17 Oh, thank you so much. That's nice to hear. So where did you start and how did you get into this space originally and start developing such a passion for it?

Jim Sterne: 01:27 Well, I always had strong opinions about how the Internet sucked and expressed my opinions to everybody about why their website was bad and could be improved. And then one day I realized: we can measure this stuff. It's no longer my opinion. We can actually find out how bad your website is through analytics. And that was back in 1997 or '98, when WebTrends was around. And I started presenting all over the world at Internet World conferences. And there was one presenter that I always went out of my way to see, a guy named Matt Cutler, who was with a company called NetGenesis. He had founded that right out of MIT, initially to do log file analysis, and then they figured out, oh, we can do JavaScript tagging and really get some interesting data. But he was just a wonderful speaker. So I always went to his presentations, and he always went to mine.

Jim Sterne: 02:19 And finally one day we sat down and had dinner and said, look, we should figure out how to work together. And he said, well, come give a presentation at our user group meeting and meet some of our clients, and hey, let's write a white paper. So we interviewed 25 companies, and they all said, we're overwhelmed with data, we have no idea what to do with it, and we're just swamped. So we put out a white paper, which was then called e-metrics, and it was five pages of survey results and 60 pages of what you could be doing if only you would. Then a year later, I interviewed another 25 companies that did actually have answers, and that became my book Web Metrics, which came out in 2002, which was the year that, since Internet World had dried up and blown away, I decided I would start my own conference. So 2002 was when the eMetrics Summit, now the Marketing Analytics Summit, was founded. And then in 2004, the audience created the then Web Analytics Association, which got renamed to the current Digital Analytics Association. And that's my story, and I'm sticking to it.

Allison Hartsoe: 03:24 That's an incredible story. I didn't realize that you started with the white paper and those interviews, discovering that all these companies were overwhelmed with data. I almost would have said that you could pick that up and run that exact same scenario today; people are still overwhelmed with data.

Jim Sterne: 03:42 Yeah, well, that white paper's on my website at targeting.com, so enjoy a blast from the past.

Allison Hartsoe: 03:50 Certainly. Well, excellent. So people who have looked at marketing analytics over time may understand a bit about the evolution, where we started with "what should I measure? There's lots of data." But if you're just coming into the space today, is it likely that you might be thinking that data has evolved to the point where AI should just be telling us what to do, and I might not need marketing analytics or digital analytics as much as I used to? Have we gotten to that point?

Jim Sterne: 04:24 Oh, absolutely. Especially because we're in a place where robots clean my house and do all of my grocery shopping and figure out what I should buy my wife for her birthday. Yeah, it's all taken care of. I'm just going to go sit on the beach. I think we're a long way from there. I believe it's important to throw in a little bit of history. We had marketing mix modeling, which says: I'm spending this percentage of my advertising dollars in these channels, and here's the result; what happens if I change the amount of money I'm spending on these different channels? That's the absolute top-down view. Then there's market research: let's ask people their opinions and do psychographics and all of that stuff ad nauseam. Two very different realms of data collection. Then there is the customer loyalty, customer lifetime value side that says, let's actually measure what our customers are actually doing, business-wise, with us.

Jim Sterne: 05:15 And then along came the internet, and suddenly I had this behavioral data of what they clicked on and how often they came back and how often they hover and how long they read and all of those wonderful bits. So the disconnect at the very start of the industry was that the people who were handed the job of web analytics were not schooled in modeling, and they weren't schooled in market research. They were just told, here's a bunch of data, go see if it means anything. So there was a whole lot of invention that didn't have to happen. If the webmaster of old who was responsible for the log files had actually had a degree in statistics, this would have gone a very different way. But the web analytics industry invented itself by going to conferences and joining online discussion groups; the Yahoo! web analytics forum that Eric Peterson put together is where we learned how to do our jobs.

Jim Sterne: 06:12 And then as we grew up, we bumped up against business intelligence, and we discovered that there were people who'd been doing marketing analysis for generations. And okay, now it's time to be a little more sophisticated. As for the question of whether artificial intelligence will take our jobs: someday I'm going to get tired of saying no, that's not true, but not yet. I'm still willing to explain yet again that the human is necessary for three crucial things: figuring out what problem to solve, figuring out what data to give to the machine to solve the problem, and deciding whether or not the answer that comes out the other end makes any sense. That's stuff that the machine is not going to be able to do for a couple of generations. I'm not worried about my job, but the thing to keep in mind is that it's just another tool. So if you have a hammer and a screwdriver and a wrench and you add a ratchet to it, well, wow, that's going to solve all my problems? No, no, no, it's not. It does a different thing. It solves a different kind of problem. Your job as a marketing professional is to understand what different kinds of problems can be solved with this magical machine learning stuff.

Allison Hartsoe: 07:18 Well, that's an interesting angle. I've been reading the book Prediction Machines, and they talk a lot about judgment. It seems like all three of those pieces (what problems should I solve, what data should I get, what answer makes sense) are related to judgment, human judgment. And I think that's because something we might express simply, like "show me a valuable customer" or "show me something that's good or bad," is not inherently a binary element. It requires all these pieces of judgment, notably human judgment, to say whether that is indeed the outcome we want.

Jim Sterne: 07:56 Which is where supervised learning comes in. It's: here is a good customer. I've decided to label this a good customer. Let me label a hundred good customers. Now look at my other hundred thousand customers and tell me which ones look like the ones I just gave you. Instead of me saying they're between these ages, with this income, in that zip code, and their recency-frequency-monetary value is X, I say: I declare these as good, find more people like that, and here is a wide variety of data to consider in deciding whether or not the other people look like this hundred-person labeled group. That is supervised learning. I've told the machine what I want and told it to find more of them like that.
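Jim's recipe (label a hundred good customers, then ask the machine to score the other hundred thousand for similarity) can be sketched in a few lines. This is a deliberately simplified illustration, not anything from the episode: the features are invented, and nearest-centroid scoring stands in for the real classifier a production system would train.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented features per customer: age, income, orders/year, recency (days).
good = rng.normal(loc=1.0, size=(100, 4))          # 100 hand-labeled "good" customers
everyone = rng.normal(loc=0.0, size=(100_000, 4))  # the other hundred thousand

# Nearest-centroid "lookalike" scoring: the closer an unlabeled customer
# sits to the centroid of the labeled good customers, the more similar.
centroid = good.mean(axis=0)
distance = np.linalg.norm(everyone - centroid, axis=1)

# Rank everyone by similarity and keep the 1,000 closest lookalikes.
top_1000 = np.argsort(distance)[:1000]
```

In practice you would also supply negative examples ("bad" customers) so the model learns a decision boundary rather than simple proximity.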

Allison Hartsoe: 08:39 Which I think is an interesting avenue, but I have also seen some applications where it almost seems like certain AI tools can get to causality, not just correlation. And I'm not saying that machine learning is necessarily correlation, because I wouldn't say I'm such an expert in supervised learning that I could tell whether it was definitely doing correlation versus causation, but it seems like there are some more advanced models behind what we might typically start with in machine learning.

Jim Sterne: 09:10 The causation problem is a tough one to solve. Machine learning is definitely a correlation machine, full stop. Are data scientists trying to figure out how to code causation into its knowledge set, into its self-created rules? They're exploring that, but understand that marketing is one of the messiest subjects we have to play with. It is not chemistry. It's not physics. It is dealing with the myriad human conditions at any point in time, which are going to be different for a wide variety of reasons that are constantly changing. Hey, that's an easy problem to solve! So if you want to nail down causality in order to come up with attribution that says next time you should spend more on outdoor billboards on the highway and that will improve conversions, we're a ways away from that. Now, we can hook up a lot of the digital side.

Jim Sterne: 10:05 We can get: what did you click on? What emails did you open? What have you posted on Twitter? Whom do you like on Facebook? And we can bring all that data together, put it in a big pile, and say, find me a pattern. Tell me something I don't know. That is unsupervised learning. Take a look at this data, take these people and put them in categories, and see if we can segment them in some way that makes sense to a human. So either I'm going to train it (yes, this is a dog; no, that's not a cat; this is a good customer; this is a bad customer), or we're going to just throw a bunch of data at it and say, do you see a pattern in there? Tell me what you see that's interesting. Well, this correlates with that. Yeah, I know that sales go up when the weather's bad. That's true, but it's not useful. Tell me something else. Well, when somebody has looked at these pictures and then opened that email, they are more inclined to make a purchase if you give them a discount code, and out of the hundred thousand prospects you're going after, there are 13,000 that we have 80% confidence will use that coupon. Oh, that's useful. Thank you.
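The "throw a bunch of data at it and ask for segments" idea is classic clustering. A minimal k-means sketch, with invented behavioral features, might look like this (a real pipeline would scale the features and choose k carefully):

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented behavioral features: clicks, email opens, visits, purchases.
# Three synthetic customer populations stacked into one unlabeled pile.
X = np.vstack([
    rng.normal(loc=[5, 1, 2, 0], size=(300, 4)),   # heavy browsers
    rng.normal(loc=[1, 8, 1, 0], size=(300, 4)),   # email readers
    rng.normal(loc=[3, 3, 6, 2], size=(300, 4)),   # frequent buyers
])

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: assign points to the nearest center,
    then move each center to the mean of its points, and repeat."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = np.argmin(dists, axis=1)
        centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
    return labels, centers

labels, centers = kmeans(X, k=3)
```

The machine only returns the groupings; as Jim says, a human still has to judge whether the resulting segments mean anything.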

Allison Hartsoe: 11:12 So did I just hear you say that in unsupervised learning you can basically peel back the layers of correlation? It tells you, okay, the first one is something obvious: I get more sales when I have more salespeople. I'm going to throw that out. And then it will automatically roll down to the next layer that says, okay, if I take that out of the picture, now I've got another mix of things. What is causing the outcome now?

Jim Sterne: 11:38 No, that's what analysts do. The machine will look at the data and say, here's a bunch of correlations, and some of them will be completely useless, some of them will be obvious, some of them will be ridiculous. And that is where that third thing comes in. First, what problem; second, what data; and third, does the answer make sense? So is there a correlation between drowning while swimming and ice cream sales? Well, yes, there is, but ice cream is not causing the drowning. It's the temperature going up: when it's hotter, more ice cream is eaten and more people go swimming. So is there a correlation? Yes. Is it one that I would have tried to tease out as a human? No, I wouldn't have thought to cross-correlate those two things. But the machine found it and offered it up on a silver platter and said, how about this one? Is that useful? And it takes judgment to say, oh yes, I can adjust my spend based on that insight.
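The ice cream and drowning example is easy to reproduce with a tiny simulation: give both series a shared driver (temperature) and no causal link at all, and a strong correlation still shows up. Everything here is synthetic, purely to illustrate the confounder.

```python
import numpy as np

rng = np.random.default_rng(2)

# Temperature is the hidden common cause for a year of daily data.
temperature = rng.normal(25, 5, size=365)

# Each series depends on temperature plus its own noise; neither
# depends on the other in any way.
ice_cream_sales = 100 + 10 * temperature + rng.normal(0, 20, size=365)
drownings = 0.5 * temperature + rng.normal(0, 2, size=365)

# The correlation is strongly positive despite zero causal connection;
# only judgment (or a controlled experiment) can rule causation out.
r = np.corrcoef(ice_cream_sales, drownings)[0, 1]
```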

Allison Hartsoe: 12:31 That makes sense. Well, let's talk about some examples. I always think it's interesting when the highest-paid person's opinion comes in and they say, hey, we should be doing machine learning, we should be doing AI. We subscribe to a maturity curve that has very specific things at very specific times, and it's largely driven by connecting to the customer and a very specific view of customer lifetime value and how you ladder up to that. Is there a similar curve for the application of AI and machine learning? In other words, when would you be ready for certain use cases, and when should you not even be thinking about that?

Jim Sterne: 13:11 Well, the first rule is, I have found myself quoting Stéphane Hamel a lot, and in a recent tweet he said, if you're not doing analytics well, you are not ready for artificial intelligence. So if you don't get statistics, if you don't understand modeling, then this is a tool that is too sophisticated for you. Now, you don't have to be a statistician, just like you don't have to be a programmer in order to advertise on Facebook, but if you understand how Facebook works, you will be better at your job. You are not going to take machine learning and just open the box and use it right away if you don't really know analytics, you don't have enough data, and you don't have a specific problem you're trying to solve. Machine learning solves a particular problem. It is very narrow. General artificial intelligence is a wonderful thing for science fiction writers, and it is not something marketers are going to be using for a long, long time. Instead, like you do with a statistician, you have a problem you're trying to solve, you have a data set, and you use machine learning to solve a specific functional problem: I want more email opens, I want more click-throughs on my ads, I want higher conversions when people hit one of these two landing pages. It's that specific. It's a horizontal tool, like a spreadsheet. Hey, I can just use a spreadsheet and that'll solve all my problems? Well, no, it'll solve one problem at a time.

Allison Hartsoe: 14:42 Do you have examples of companies that are going after the right use cases, versus maybe a company that didn't go after the right use case?

Jim Sterne: 14:53 So yes and no. The companies that shouldn't have done it don't tend to talk about that.

Jim Sterne: 15:00 But there are stories out there, and the fun place they're coming from is startups, because they're out there to tell their clients' stories. For instance, there's a startup called Albert, and they love telling the story of the Harley-Davidson dealer in Manhattan that was having a tough time: lots of data, lots of potential customers, lots of people who were interested in their product, lots of data from social media. They took hundreds of photographs and dozens of headlines, had the machine generate display ads and emails, and then started testing them against micro-segments: people who look like this tend to respond better to pictures of that. Now, you can do A/B testing your whole life and test a hundred things and you'll get some lift, but the machine can test a hundred things in five seconds. So that's a lot more powerful. And the results were a 500% increase in visits to the website and a 20% increase in sales month over month for three months. Just amazing results, up to a point. And then that particular model, using that specific dataset, had maxed out. It had learned what it could learn, and unless you change the inputs or the desired output, it's just going to give you the same answer. It has solved the problem, but it can't continue to improve. You know, there's no such thing as a hockey stick with no end.
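Albert's actual method isn't public, but "test a hundred things at machine speed" is commonly framed as a multi-armed bandit problem. A hypothetical epsilon-greedy sketch over 100 ad variants, with made-up click-through rates, shows the shape of it:

```python
import numpy as np

rng = np.random.default_rng(3)

# 100 ad variants, each with an unknown true click-through rate.
true_ctr = rng.uniform(0.01, 0.05, size=100)
clicks = np.zeros(100)
shows = np.zeros(100)

for t in range(50_000):          # 50,000 simulated impressions
    if rng.random() < 0.1:       # explore: try a random variant 10% of the time
        arm = rng.integers(100)
    else:                        # exploit: show the best-performing variant so far
        arm = np.argmax(clicks / np.maximum(shows, 1))
    shows[arm] += 1
    clicks[arm] += rng.random() < true_ctr[arm]  # simulate a click

best_found = int(np.argmax(clicks / np.maximum(shows, 1)))
```

The same plateau Jim describes appears here too: once the observed rates converge, the algorithm keeps serving the same winner and no further lift is possible without new variants or new data.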

Allison Hartsoe: 16:26 So it's almost like you're saying that the typical use case for machine learning is more of an 80% solution. It's figuring out ways to do heavy lifting, or to find areas that perhaps you hadn't optimized before, or hadn't optimized well.

Jim Sterne: 16:41 And it's not that you haven't done it; it's that it was not humanly possible to spend the time necessary. If you had an infinite number of monkeys with an infinite number of Excel spreadsheets, they could solve it, and that's what machine learning is. But even with a roomful of interns, it's not worth crunching the numbers; it will not be cost-effective. With machine learning, you can crunch all of that data to tease out the better message in front of the better person at a better time. Now, machine learning does top out. When you start using it, it gives you terrible results, because it's learning. It's like a toddler trying to walk. It takes a step and falls over, and then it takes two steps and falls over, and then it runs out the front door and is in traffic and, wait, come back! So there's this dip at first, and then it figures it out, and there's this huge climb, and then it plateaus. It has done the heavy lifting, and now, yeah, there is no final answer. There's no final solution. There is just: we got you an amazing amount of lift and we've increased the value, the return on investment, of your advertising, but there's no infinite climb. So you now have to find a different problem to solve.

Allison Hartsoe: 17:55 Or find new data to bring in. So I guess it's a bit like a U-shaped curve: not doing anything right at the beginning, then the huge climb, and then it plateaus. But it almost reminds me of a product S-curve when you say plateau, because I always think, well, if you change the variables and add more to it, then you can probably get it to lift again. But isn't there a fallacy in that too? I mean, if I just keep adding more data, maybe it does something better, but I can also get a lot of spurious correlations, right?

Jim Sterne: 18:26 Well, sure. So now we're at the point of how much agency you have given the machine. Is it going to just kick out recommendations that humans will decide on, or do you give it control over your email system so that it can constantly reconsider what message it's sending to which people? Either way, it is going to reach its maximum optimization. Then it needs a different problem to solve, or different data; not just any data, but data that is predictive, useful, valuable.

Allison Hartsoe: 18:55 So I'm going to guess that you probably have an example of the human factor, where people have given a machine learning system control to execute something, perhaps without the best consequences on the other side.

Jim Sterne: 19:08 Again, the people who run into those mistakes don't tend to talk about them in public. So imagine that you've asked the machine to improve email opens, and you give it agency and lots of data, and it starts to try things. It sends out emails, and these got opened and those didn't, and then it changes the frequency of email and discovers that if you send out a hundred emails to somebody within an hour, they will open one. Success! And a human looks at that and goes, no, no, that does not pass the smell test. Stop that.

Allison Hartsoe: 19:42 They're opening it to unsubscribe.

Jim Sterne: 19:46 Yes, or to see who the heck they can file a lawsuit against. So the human is necessary to ride herd and determine: did I describe the problem clearly enough that the results are going to fit within the realm of common sense? And so it becomes a combination of, I'm going to use machine learning and I'm going to start creating some rules that say we really don't want to send anybody more than two emails a week. Just make that a rule and constrain the machine to stay within it, because common sense says otherwise we're going to upset people.
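That kind of guardrail is typically enforced outside the model, as a hard rule the optimizer cannot override. A minimal sketch, with invented names, of the "no more than two emails a week" constraint:

```python
from datetime import datetime, timedelta

MAX_PER_WINDOW = 2          # common-sense cap, decided by a human
WINDOW = timedelta(days=7)  # trailing one-week window

def may_send(send_log: list[datetime], now: datetime) -> bool:
    """Return True only if fewer than MAX_PER_WINDOW emails went out
    to this recipient within the trailing window."""
    recent = [t for t in send_log if now - t < WINDOW]
    return len(recent) < MAX_PER_WINDOW

# Two sends earlier this week blocks a third; a week later they age out.
log = [datetime(2019, 6, 3), datetime(2019, 6, 5)]
may_send(log, datetime(2019, 6, 6))   # False: cap reached this week
may_send(log, datetime(2019, 6, 12))  # True: earlier sends aged out
```

The point is that the rule sits above whatever the model recommends: the optimizer proposes, the guardrail disposes.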

Allison Hartsoe: 20:18 Yeah, I think that makes sense. But I'm also sensing that when we look at companies becoming more and more mature with data over time, it seems like the number one thing that holds them back is the human factor. It's people not accepting the results of the data. It's people not able to create content fast enough. It's people not willing to yield on my-turf-versus-your-turf issues, an organizational problem. And so oftentimes when we think about machine learning, the first applications tend to be in optimizing a dataset. Can you also optimize human factors?

Jim Sterne: 20:58 Well, this is a tool that can be used for a wide variety of purposes. Microsoft has a system for internal communications that's a little bit like Slack, plus a little bit of calendaring, plus a little bit of all these good things put together. And oh, by the way, since it's Microsoft, if you're using Microsoft 365, it's reading all of your emails and scheduling all of your meetings. So it is a system that can say: oh, if you're going to have a meeting and invite these five people to talk about this topic, here are two other people who are the go-to people on email, the ones everybody is always asking questions about this topic. You really should invite them to the meeting as well. Is that solving political issues? No, but is it an interesting aid? Yes, it is. But remember, when you say machine learning, it's like saying word processing or a typewriter. Can you use this for writing copy? Yes. Could you write a novel on it? Yes, you could. Will it help me write a novel? No, it's a typewriter.

Allison Hartsoe: 22:03 That's a great analogy. That makes a lot of sense. Now, I imagine that at the conference you run, people have a lot of discussion about machine learning. Is that one of the hot topics, and maybe there are other hot topics along those lines, at the Marketing Analytics Summit?

Jim Sterne: 22:20 Because of our history and because of the constituents, the attendees who show up, it's a really interesting combination of what's over the horizon. So yes, we all need to know about machine learning. We all need to know how to communicate with our customers online through email, with search optimization, through a chatbot. Machine learning is the new thing. But you know what the other half of the conference is? I'm trying to do this actual work here today, and I don't have all those fancy tools. The thought leadership is interesting, but that's not where I live. Where I live is: I'm having a horrible time with Google Tag Manager, can anybody help me? So it's this amazing combination of tactical solutions and strategic thinking. That is why this conference is in its 18th year.

Allison Hartsoe: 23:08 Yeah. And I remember many of those tactical solutions being quite helpful to me personally when I first started out in this space and was trying to understand more or less what was possible, what was I not thinking about? So I can see where many people benefit from that. And so along the lines of, you know, this evolution of marketing analytics, is there a certain order of operations that you see or perhaps that you recommend people subscribe to? If I have my foundational tracking in place, what should I do second, third, how should I start moving forward into marketing, modern marketing analytics? And then how does the conference play in? At what point should I be thinking about the conference?

Jim Sterne: 23:53 Well, in terms of a maturity model, there's a whole lot of things that have to happen in parallel: organizational change management, the politics. If you don't have a senior sponsor, and if you don't have curious, capable, low-level technical capability, none of this can happen. You have to have enthusiasm at the top and the bottom so that those of us in the middle can make it all work and make decisions about how to spend the budget. But data cleansing: when you say you have your foundation of data collection, that's huge, and oh, by the way, it's ongoing. That's not a tick-the-box-and-move-on item. That's a forever problem. So you will always be doing data collection and data cleansing and tearing your hair out, because it just breaks all the time. Then it's time to do some basic counting so you can get a baseline.

Jim Sterne: 24:42 Where am I now? Then you get into: okay, where do we believe we can have the biggest impact if we optimize some stage of the customer relationship? Is it more important to get more attention and let people know about us, or to engage them interactively on a consultative basis in order to convince them that they should buy? Or is it more important to focus on the shopping cart, or on turning people into advocates on social media? Pick one, because you can't do everything. Which one is going to give you the biggest step forward? That's where you put your investment. It's political: who is the most willing to experiment with you? And then eventually you can move up to automation. When you move into automation and you're building out your marketing tech stack, once that's in place, you can bring machine learning in underneath it and provide it as the cognitive part of the automation.

Jim Sterne: 25:36 I mean, way back when, it was a pretty straightforward thing to create an email tree. And by that I mean: we're going to send out emails to our customers and ask them a question. If they answer A, we'll follow up with this email, and if they answer B, we'll follow up with that email and offer them a white paper to download. And if they do download the white paper, then in three and a half days we'll send that follow-up email. You can set all of that up in advance to be automated in the dark. But somebody had to decide that these were the right steps to take. You can bring in machine learning to help reveal which of those steps is better than the others.
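The email tree Jim describes is just pre-decided branching, which is why it automates so easily. A toy version, with illustrative step names and delays (nothing here comes from a real system):

```python
# Each step maps a subscriber event to (next email, delay in days),
# decided in advance by a human, exactly as in Jim's example.
EMAIL_TREE = {
    "ask_question": {
        "answered_a": ("followup_a", 0),
        "answered_b": ("followup_b", 0),
    },
    "followup_b": {
        "downloaded_whitepaper": ("whitepaper_followup", 3.5),
    },
}

def next_email(current_step: str, event: str):
    """Look up the next email and its delay for a subscriber event,
    or None if the tree has no branch for that event."""
    return EMAIL_TREE.get(current_step, {}).get(event)

next_email("ask_question", "answered_b")           # ("followup_b", 0)
next_email("followup_b", "downloaded_whitepaper")  # ("whitepaper_followup", 3.5)
```

Machine learning then slots in as the evaluator: instead of a human guessing which branch performs better, the system measures outcomes per branch and reshapes the tree.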

Allison Hartsoe: 26:13 I think that makes sense, but I also think one of the elements that circles back to what you were saying in the very beginning about human judgment is how you define the biggest impact. There's a hidden element there: if I'm successful, other people may come to me (a good problem to have) and start filling up my queue of requests for analysis, things they want me to do. At that point, once you start to get a little bit of traction, it becomes very strategic which projects you take, because it's more and more expensive. You need more people, more tools, more budget, and if you don't show the bottom-line dollar impact of your analysis organization, it can be hard to get that budget. So if you want to keep that upward spiral after the initial projects, I think you have to be very selective about what you attack, and treat the other elements as something you can support on the side, as shorter-term projects, as opposed to the main focus of your team.

Allison Hartsoe: 27:23 Would you agree with that?

Jim Sterne: 27:24 I would, if we're talking about an analytics department of one: you are the one who has to manage all of yourself as a resource. If you have a team, then this is the problem that management is being paid to solve. Here are the resources, here are the needs; prioritize. And instantly it starts with, well, let's map out the ROI. If we do project A, we'll get a bigger return. If we do project B, the vice president of whatever-they-are will give us more budget. So guess what? Let's do B instead. And that's the human condition, I'm afraid.

Allison Hartsoe: 27:57 It is. And I like what you were saying about the email tree example before. I constantly find that there are many things that were documented, either through direct mail marketing or through old systems that somebody figured out, that we're just now circling back to and saying, oh yeah, we should be doing this at scale.

Jim Sterne: 28:19 Yeah, exactly, and that's what machine learning is all about. It is scaling the art of counting. It is scaling the art of statistical analysis. I start with just writing a program with specific rules, and I move on to a spreadsheet that lets me do some iteration, some really rudimentary modeling. Then I hire a statistician to do some real predictive modeling, to get heavy into Bayesian analysis, and they create a model and test it and tweak it and test it and tweak it and finally have something that's pretty good, and oh look, I got 2% lift. That was worth the money. Well, with machine learning, I am putting a lot of statisticians into the box and turning the crank really fast.

Allison Hartsoe: 28:59 That makes sense. With the conference coming up, what's the biggest challenge you expect to hear people talking about?

Jim Sterne: 29:05 Whooo. So it's time once again for me to revisit an article I wrote about 10 years ago, maybe 12, on "what's your biggest problem?" Five years later I brought it out again, and half of the problems were still there. So I expect people to continue to talk about how their data is terrible. I will quote Ronny Kohavi from Microsoft, who says that data scientists spend 80% of their time cleaning data, 5% of their time doing deep analysis and insight generation, and 15% of their time complaining about the 80% of the time they spend cleaning. So that will never go away. That's part one. They're going to complain that there are too many ad hoc questions, random what-is-the-price-of-tea-in-China questions: what if we correlated that against last Thursday? Tell me what you're trying to solve and I'll help you. But if you just ask random questions,

Jim Sterne: 29:57 you destroy my throughput. They're going to complain that people just don't listen to them. And this is a solvable problem, because if I'm an analyst, I love this stuff. I am a problem solver. I'm a detective. I really love digging around and finding the aha moment, and that's exciting, and I want to share that excitement: look at how hard it was for me to clean the data, and I tried this, and I tried that, and then I came up with this golden nugget. And by that time everybody in the audience is asleep, because all they care about is the golden nugget. Tell me what the nugget is. Well, you should send out your emails on Thursday morning between 9:30 and 10. Great, thanks. Bye.

Allison Hartsoe: 30:37 That's so true. I literally have an episode coming up with an executive at KeyBank, and she talks about how she trains her team to do exactly that: to flip it and think, here's the nugget, tell it in one page, and then everything else.

Jim Sterne: 30:53 Yeah. I want a three-slide PowerPoint with 27 slides in the addendum that are for you, not for the audience.

Allison Hartsoe: 31:02 Or for that stray person in the audience who really wants to crack open the cover and understand how you got the answer. Everybody else just doesn't care.

Jim Sterne: 31:11 If you find that person who says, I want to have a follow-up meeting so I can understand exactly how you did that: buy them lunch, buy them dinner, be their best friend, because you just have a new convert who is fascinated by the data. Instead of, oh no, here come the million questions, this is an opportunity to bring somebody onto your team.

Allison Hartsoe: 31:31 A fantastic opportunity. And I find that because this is a new space, many people who are coming into it have aptitudes in other areas, and they're kind of gravitating in. Lots of times we see people with technical backgrounds more or less relabeling themselves, but some of the most interesting hires actually come from economics, from psychology, from very different fields. Then they can think about the problem they're solving in a very creative way, and the data is secondary to it.

Jim Sterne: 32:02 The difference between machine learning and humans is that humans can take all of our experience and apply it to all of our problems, and machines can't do that. They can take this data and apply it to that problem. But humans, you can go to the movies and see Wonder Woman, and suddenly you'll think, oh, I just figured out how to solve that problem I was dealing with at work. They're totally unrelated, but there was a spark that happened in your head. Now if you take that head and you put it in a room with nine other heads who all have different experiences, different views of the world, different ways of expressing themselves, you have a supercomputer like we cannot even imagine. But if you only hire people who look like you and only people who think and talk like you, you might as well be alone.

Allison Hartsoe: 32:49 Well and on that fantastic note, I imagine that there is probably a great way for people to get in touch with you if they want to find out more about the conference or if they want to talk to you more about targeting. What's the best way for people to connect with you, Jim?

Jim Sterne: 33:05 So Twitter @jimsterne, I have had the same email address for 30 years, jsterne@targeting.com, and yeah, you can find me on Amazon if you want to.

Allison Hartsoe: 33:18 Is there a particular offer code that you'd like to extend to the listeners if they want to come to the conference?

Jim Sterne: 33:24 I would love to invite everybody to register right away for the Marketing Analytics Summit, which is in the middle of June in Las Vegas, and the 15% discount code is STERNE15. Oh, that's all caps, by the way.

Allison Hartsoe: 33:41 Is it? Okay! Is it case sensitive? Really!

Jim Sterne: 33:44 It is. It's a promo code. You bet it's case sensitive.

Allison Hartsoe: 33:47 Nice, and yeah, there can't be any better place to do a conference than Las Vegas; that place is just a total kick. I know the older conferences used to kind of gravitate up and down the west coast, but Vegas is unlike any other place; you just never find a shortage of things to do there.

Jim Sterne: 34:04 And it's easier to get to than almost anywhere else. And we'll be at Caesar's Palace, which is pretty nice.

Allison Hartsoe: 34:10 Are you still doing the dinner with friends?

Jim Sterne: 34:12 Dinner with strangers. Yeah, so at the end of the first day, people sign up and are randomly assigned to groups of 10, and then we make reservations at restaurants around the casino and around town. You end up meeting nine other people that you had never met before, you go Dutch treat, and please come back in time for the morning session.

Allison Hartsoe: 34:37 Now, I can tell you firsthand, and I'm sure you've had other stories, but I have made some amazing friends through that dinner with strangers function, and that is unique. I've never seen that at any other conference.

Jim Sterne: 34:49 It's fun. It works.

Allison Hartsoe: 34:50 Good. Well, Jim, thank you for everything: for your insights about machine learning and the way that marketing analytics is evolving, and also for just continuing to put the conferences on. I know firsthand that's not an easy thing to do, and to stay on it and keep after it year over year as the industry changes and people change is a hell of an achievement. So thank you for doing that.

Jim Sterne: 35:17 Well, I will tell you a secret. I do this conference thing because it's the best way for me to gather the smartest people I can find in order to get up on stage and teach me stuff.

Allison Hartsoe: 35:29 Oh, that's why I do the podcast.

Jim Sterne: 35:31 There we go. See, like minds.

Allison Hartsoe: 35:35 Exactly. Well, as always, links to everything we discussed, including Jim's conference, are at ambitiondata.com/podcast. Jim, thank you for joining us today.

Jim Sterne: 35:46 Allison, an absolute pleasure. Thank you for inviting me, and it's just always a treat to hang out with you.

Allison Hartsoe: 35:51 Thank you. Remember everyone, when you use your data effectively, you really can build customer equity. It is not magic. It is a very specific journey that you can follow to get results. Thank you for joining today's show. This is your host, Allison Hartsoe, and I have two gifts for you. First, I've written a guide for the customer-centric CMO, which contains some of the best ideas from this podcast, and you can receive it right now. Simply text ambitiondata, one word, to 31996, and after you get that white paper, you'll have the option for the second gift, which is to receive The Signal. Once a month, I put together a list of three to five things I've seen that represent customer equity signal, not noise, and believe me, there's a lot of noise out there. Things I include could be smart tools I've run across, articles I've shared, cool statistics, or people and companies I think are making amazing progress as they build customer equity. I hope you enjoy the CMO guide and The Signal. See you next week on the Customer Equity Accelerator.

 

Key Concepts:  Customer Lifetime Value, Marketing, Digital Data, Customer Centricity, Long-Term Customer Value, Marketing Leaders, Analytics, Creativity, Product Development, Audience Research

 

Who Should Listen:  CAOs, CCOs, CSOs, CDOs, Digital Marketers, Business Analysts, C-suite professionals, Entrepreneurs, eCommerce, Data Scientists, Analysts, CMOs, Customer Insights Leaders, CX Analysts, Data Services Leaders, Data Insights Leaders, SVPs or VPs of Marketing or Digital Marketing, SVPs or VPs of Customer Success, Customer Advocates, Product Managers, Product Developers


 

 

Recommended Episodes

 

Ep. 31: Inside the Customer Equity Accelerator Podcast

Ep. 51: 2018 Show Directory with Allison Hartsoe

Ep. 52: Holiday Break Message from Allison Hartsoe

Ep. 72: CAO Event Summary

Listen to the Customer Equity Accelerator Podcast
