Allison Hartsoe: 00:01 This is the Customer Equity Accelerator. If you are a marketing executive who wants to deliver bottom-line impact by identifying and connecting with revenue generating customers, then this is the show for you. I'm your host, Allison Hartsoe, CEO of Ambition Data. Each week I bring you the leaders behind the customer-centric revolution who share their expert advice. Are you ready to accelerate? Then let's go! Welcome everybody. Today's show is about the recent Chief Analytics Officer conference hosted by Corinium on May 15th and 16th, 2019 in San Diego. I'm Allison Hartsoe, CEO of Ambition Data, and we are a service that uses customer lifetime value to tell you how well your marketing is working across tools. I love that our customers treat us as a trusted advisor. So ping me if we can be your advisor too. So why should you care about this Chief Analytics Officer conference?
Allison Hartsoe: 01:08 There are plenty of events out there, right? Well, I've been to this particular event twice before, once in San Francisco, which I swear was pure gold, but that was about five years ago, and once in Miami, which was a little bit less so, and that was about two years ago. The companies at the show are mostly maturing or mature in their use of data, which I personally find very refreshing, and that puts them in the middle of the learn or sometimes higher into the lead stages of our customer-centric maturity curve. And you can always hear more about that maturity curve in episode one. Now I typically go to this event to hear what the leaders are thinking, and that's what I'm going to be sharing with you in this podcast, since conferences are so much more about the people that you meet as well as the event itself.
Allison Hartsoe: 01:58 I think it's fair to say that location is a good indicator of who will come. And this event attracted a lot of west coast gaming companies, medical companies, a bit of retail, and several banks. So there were several companies with a wide range of data team sizes, meaning that some had maybe 500 people, which I would consider a large data team, and some had a few dozen. But most of the people here had multiple functions under their control such as data engineering, data science, analysis, UX, and market research. And interestingly, just as a side note, not a single one of these leaders mentioned customer data platforms or CDPs, which we talked about a couple of weeks ago. I think that's very interesting. And I also think it's because most of them have already pivoted into customer databases. Maybe they did this a little while ago before CDPs became so popular, and they seem to be using cloud systems or other systems out of their own stack that they figured out how to grow.
Allison Hartsoe: 03:09 So I think it's just an interesting marker of maturity. There were two overriding themes that you might hear across the highlights that I'm about to share. See if you can pick them out. I will identify them succinctly at the very end. Now, the event kicked off with Alejandro Cantarero, who is the VP of data at the Los Angeles Times, which is a newspaper publication, and a lot of the things Alejandro was saying were the things that we might find with people in the middle of the maturity curve. So he said that they were asking, do we share a common data language? They were really thinking about internal customers across the company, and his presentation largely dealt with data as a means of company alignment. This is a key stage that every company has to go through, because what a particular data field means in one part of the company is almost never what it means in another part of the company, and sometimes the way people name these data fields is not intuitive.
Allison Hartsoe: 04:12 So this alignment is a natural part of the process when you go to bring all the data together. He spent a lot of time with his organization looking at how data was being consumed around the company, and I don't think people oftentimes spend enough focus in this area. He was kind of forced to do that because they were temporarily blocked from doing external analysis, and so they had to make internal analysis their initial focus. They started by attending meetings where data was presented to non-data employees who didn't have a lot of background, and they wanted to understand how those employees were using it. They also spent some time reviewing what the top-level corporate goals were. They looked at what was appearing on dashboards and TVs around the office.
Allison Hartsoe: 05:02 They looked at what was accessible in the analytics portals and what was being sent out via emails, and perhaps not surprisingly to some of you, they found many inconsistencies and a big mismatch to the corporate goals. Many data consumers did not even understand what the metrics meant. Now that's important, because even though they had consolidated the data and they had pushed it out to different groups, they still didn't have alignment. It wasn't enough. So he found that there were lots of opportunities to increase clarity. He looked at the classic three levers: what are executives looking at, what are analytics teams looking at, and what are employees looking at, the people who are going to take action on the data. And he found that, obviously, the execs look at the revenue drivers. The analysts tend to look at performance metrics, content traffic or ad revenue or subscriptions. And the employees.
Allison Hartsoe: 05:58 Surprisingly, they were only looking at traffic and social media numbers, which were things that they understood intuitively, but they weren't connecting to the more sophisticated metrics, and they certainly weren't connecting those sophisticated metrics with what the executives were looking at. So they had to go back and say, does everybody know what the key metrics are? They had to put the definitions in where people can find them, and that meant on the TVs, extra text on the sides, hover states on the dashboard. This all gets back to focus and execution, and they had to use examples to make the definitions stick and help people know what to do. I thought what was very interesting was that they actually pulled in their design team, and the design team came up with color templates, a color palette, to give governance to the data that was presented, because otherwise, he said, they would have 16 analysts with 16 different ways of presenting data. It means asking: when you're using red, does it always mean something is negative? If you're using blue, does it always represent subscriptions?
Allison Hartsoe: 07:02 There is a certain color intuition that should be part of your overall systems. We oftentimes talk about that with regard to one particular dashboard, but what he's doing, in this case, is pulling in his designers to use it systematically as a way of making governance and the understanding of the data stick. And lastly, he said that they had to coach the executives to come up with more precise goals. I think that's just worth calling out, that executives don't naturally come up with what should be measured. It's really a little bit of a dance back and forth, and we did see other examples of that in the show where people would come back to executives with, here are all the things you should be measuring, but then the execs would push back, and you kind of have to go back and forth until you get to the actual metrics that they care about and the metrics that you can actually hold to, that ideally have causation. Now, he didn't talk about causation of their metrics, but he just said that there were particular things that the execs were aligning to, and that once they got all the internal people aligned to those metrics, they had a much easier time getting people to use the data and use the systems. That will be echoed in Zach Anderson's presentation, which we'll cover in just a minute.
Allison Hartsoe: 08:14 Then we had a presentation from the conference host, Jose Antonio Murillo Garza, and he has been on our podcast before. He is the chief analytics officer at Banorte, which is a bank in Mexico, and he recapped a little bit of a fantastic case study. It was the subject of that podcast, but you can also find it in Harvard Business Review. It was a case where he talked about focusing the organization to move away from volume and more into a customer-centric approach, and part of the process of how they did this was they signed up for a certain amount of return on investment. The organization asked them to sign up for a 10x return: if it's going to cost a certain amount, we want a 10x return on that amount. But when they actually got in, the first year returned 46 times the cost, and in year two they got 197 times the cost back.
Allison Hartsoe: 09:09 It's a great story. I highly recommend looking it up, but as Jose was going through his slides, one of the things that really caught my attention as he was flipping past was this measurement of ROE. We oftentimes talk about ROI, which is return on investment, but what he was measuring, and this is important, is return on equity, which is how much they are increasing the value of the customer base, the future lifetime value, as opposed to ROI, which might be rear-view mirror: how much did I get back in dollars for x amount that was spent. So I think that's a very interesting twist when you're thinking about measurement, especially the measurement of your projects. He said they had a three-pronged strategy that looked at automation, AI, and experimentation, and as part of his digging into what those were, he talked a little bit about measuring cost cutting, and this is something I've heard before.
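To make the ROI versus ROE distinction concrete, here is a minimal sketch in Python. The function names and numbers are my own illustration, not Banorte's formulas: ROI looks backward at realized dollars per dollar spent, while the customer-equity view asks how much the projected value of the customer base grew per dollar spent.

```python
# Rear-view ROI: realized net return per dollar spent.
def roi(realized_revenue: float, cost: float) -> float:
    return (realized_revenue - cost) / cost

# Forward-looking "return on equity" in the customer sense: growth in
# projected customer lifetime value per dollar spent (illustrative).
def return_on_customer_equity(clv_before: float, clv_after: float,
                              cost: float) -> float:
    return (clv_after - clv_before) / cost

# The same campaign can look modest on ROI but strong on customer equity:
print(roi(150.0, 100.0))                                 # 0.5
print(return_on_customer_equity(1000.0, 1400.0, 100.0))  # 4.0
```

The point of the second function is that the numerator is a projection of future value, not cash already received, which is why it rewards work that deepens relationships rather than just closing immediate sales.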
Allison Hartsoe: 10:07 In most cases your big analytics projects should really start with what can be audited by everyone, and that means something that's cost-cutting focused. An easy go-to here is oftentimes where marketing is spending, maybe paid search optimization; I've heard that several times from different leaders. But in general, cost cutting is an easier place to start than revenue generation. If you're starting with revenue generation, then it's harder to measure the impact, and when you go to measure the impact, the recommendation is to measure it in terms of customer lifetime value, particularly from Jose, since that is what they do. Now, he looked at whether they are extending or deepening the relationship. That's really what they're after, not just an immediate close, but how well people are coming back and re-engaging with the company, and in order to understand that, they needed very well designed test and control groups to see if they were really having an impact.
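A test-and-control comparison in CLV terms boils down to something like the following sketch. The names and numbers here are illustrative, not Banorte's; a real design would also add randomization checks and significance testing.

```python
from statistics import mean

def clv_lift(test_clv: list[float], control_clv: list[float]) -> float:
    """Incremental lifetime value per customer attributable to the treatment."""
    return mean(test_clv) - mean(control_clv)

# Hypothetical projected CLVs for treated customers vs. a randomized holdout:
treated = [120.0, 135.0, 150.0]
holdout = [100.0, 110.0, 120.0]
print(clv_lift(treated, holdout))  # 25.0
```

The holdout group is what lets you claim the lift is caused by the treatment rather than by seasonality or selection effects.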
Allison Hartsoe: 11:09 In addition, he said it takes time to build consensus across the organization. Anytime you're dealing with the customer, you're really spreading across the organization, and you need that buy-in so that somebody doesn't immediately say, oh well, that doesn't have anything to do with what really happened. You almost have to get everybody on the bus, get them thinking, get them on your side before you run the test and control that will measure the impact. He also said that some of your estimates may need to be refined as you go. So it makes sense to start with conservative estimates and then update as you start to see the progression of where things are going. I completely agree with that. Jose talked about prioritization, and I thought a very interesting insight of his was that large teams tend to get budget restrictions. So once you get critical mass and you start getting your projects running through, you have to be careful about which projects you're going to pick.
Allison Hartsoe: 12:08 And that means you need to pick projects that are going to be more profitable, because they need to go back and support your budget ask. So the consideration of ROI impact starts to be very important, because once you have an impact, everybody comes in and starts wanting to work with you. But having someone who can help sort out the high-impact projects becomes a really key factor as you want to continue that critical mass. His example was particularly around credit cards. And in the course of giving his example, he mentioned some interesting things about cognitive biases that customers have, and he cited hyperbolic discounting, which is not a term that we use or talk about very often. So just to give you a quick definition, this refers to the tendency for people to increasingly choose smaller-but-sooner rewards over larger-but-later rewards, especially when they can get something immediately versus later in time.
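The standard hyperbolic model makes this easy to see: the perceived value of a reward A delayed by D periods is roughly V = A / (1 + kD), where k is an individual discount rate. Here is a quick sketch in Python; the k value is purely illustrative, not anything Banorte reported.

```python
def perceived_value(amount: float, delay: float, k: float = 0.1) -> float:
    """Hyperbolically discounted value: V = A / (1 + k * D)."""
    return amount / (1 + k * delay)

# A smaller immediate reward can feel worth more than a larger delayed one:
now = perceived_value(100.0, delay=0)     # 100.0
later = perceived_value(150.0, delay=12)  # ~68.2
assert now > later
```

This is exactly the shape of bias the credit card tests surfaced: the short-term benefit wins even when the long-term benefit is objectively larger.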
Allison Hartsoe: 13:08 So hyperbolic discounting is something that they actively tested for, and they found that there was indeed a cognitive bias, particularly around credit cards. Their tests showed that customers were more willing to take a product with a short-term benefit than one with a long-term benefit. And so what they had to figure out was what the right features were that were valued by customers, and that is what they architected their test design to do. So it's an interesting example of the interlock of testing, cognitive bias, and hyperbolic discounting all coming together when a company tries to say, is this going to move the needle or not? A very nice presentation from Jose Murillo at Banorte. Another presentation was given by Zach Anderson, who is the SVP and chief analytics officer at Electronic Arts, and Zach emphasized using the right tool for the job. Like a lot of leaders, they've moved so far ahead of the market that they tend to build their own tools in house.
Allison Hartsoe: 14:11 We've seen this before with Amazon, and historically with eBay and Yahoo when Paige was on the show. We've seen this example a lot when companies are moving so quickly that they aren't held back by what might be produced for everybody in the martech stack. And what they did that was very interesting is they combined the telemetry of the game and all of the data collection with what would essentially be like a dashboard, so that the designers and the people who were actively trying to program the games could actually see into the game and have the data superimposed on top of it. This allowed them to replay certain sessions and watch certain paths, almost like UX sessions. They could see where frustration was happening. They could see when players were dying. They could see when players were way off the map for some reason and maybe frustrated, and with that information they could arm the developers, really the product managers, with information that helped them know what to fix in the system.
Allison Hartsoe: 15:23 But even more than that, because EA uses a lot of customer analytics, they could cut it in a more refined way: not just all users, but maybe long-term users and where they were getting frustrated, versus people who were in the middle layers, versus people who were maybe just getting started, and one-timers. They can really cut the data in a lot of different ways. Now, Zach is very good about talking about their process of how it took them some time to get their metrics aligned across the organization, and he showed four core player metrics that the organization still uses today. One is unique players. Two is session days, which we oftentimes think about as engagement: how often are you in the game? How often are you coming back? Three is average spend, which obviously ties very closely to CLV. And the last one is NPS, net promoter score, which is a great way of getting the voice of the customer into the system.
Allison Hartsoe: 16:25 So he talked a little bit about how it was a process to get those metrics sold in house and to get access to data. It wasn't something that they came up with initially. Initially, they came up with about 20 metrics, and the execs were like, nope, can't use that. And then a mentor told him that nobody can remember more than four things. So he cut all of those items down and boiled them into four goals and said, okay, let's just focus on these four things. And I think that's somewhat true. I know for billboards, the rule of thumb I remember from journalism school was just seven words on a billboard; that's all somebody can capture at a glance. And I do think that's true. So more than two, less than seven: somewhere in there you've got magic, and four is certainly a great place to be. But no one can drive their company by looking at a top-level group of metrics that are all equally weighted as important but very voluminous in number.
Allison Hartsoe: 17:25 There's a tendency to look for the tip of the spear, and so spending the time to figure out what those tip-of-the-spear metrics are that really drive the business, where if this happens, everything else must be true, is a worthy exercise. It's a difficult exercise, and it's one of the things that we see as a key piece of the maturity curve process. So their core metrics obviously capture CLV as well as monetization and customer voice, and those are all really fantastic elements to drive your business around. But then he also talked about how, once they set the metrics down, it wasn't like everybody just signed up and started using them. They actually had to fit the rhythm of the business. So, similar to the LA Times story from Alejandro, they spent some time looking at how people were using the data, and they noticed that business owners regularly talked to finance, but they did not speak to the customers in the games.
Allison Hartsoe: 18:22 So they set a weekly meeting to start the habit. It's only a 30-minute meeting, which I thought was very interesting too. It's a short meeting where they talk about metrics and talk about player behavior, and they never come to the meeting without something interesting to say. So they always have an agenda. They always have something to communicate, but they're also making it a two-way meeting. So they're building a bit of back and forth: what's your session-day target? Why are your monthly active users falling? They're prompting the business owner with questions that they can see from the data in order to get that back and forth. And the side benefit was that it got a lot of the leaders used to looking at the data. And when they did that, they found flaws in the data that only they could see, and they started to care about whether it was good data.
Allison Hartsoe: 19:10 So this back and forth process was instrumental in increasing the quality of the data as well as getting ownership from the people who were using the data. Ultimately that became a really nice reward for both parties. And as a result, the core player metrics were set in the team goals, and they became something that the whole organization eventually pivoted into and started to drive off. And that was very important, particularly at Electronic Arts, where you've got a lot of different studios that may be kind of operating as their own city-states. So it's difficult enough when you have to reach across an organization, but when you have product lines that are almost like their own little entities, they really had to work hard to get these groups together. So kudos to EA for making that happen. Now, Zach did mention that he noticed adherence to that weekly stakeholder meeting was a great indicator of the maturity of a particular studio and how willing and receptive they were to using the data.
Allison Hartsoe: 20:14 I think that's a very good indicator that anybody can use internally. We all need those stakeholders who are going to sign up to use more data, to take a little bit of risk, and to trust in the data. And that can be a great way to figure out who's the right person. He also said, don't be afraid to cross boundaries. Customer analytics inherently crosses departments. Customers do not care about your org structure, but internally, as you're crossing boundaries, you inevitably will upset people, and you have to let the analysts roam freely in order to find those opportunities. You can't keep them pigeonholed. And he was lucky in that he had CEO backing. So when he would go to ask a particular group for data, he actually had a little letter from the CEO that said, please give Zach the data, and if you have a question about this, call me.
Allison Hartsoe: 21:05 And so that letter from the CEO helped him go around each part of the organization and crack open that dataset. Some actually did call the CEO, and in some cases he had to get around some very strict restrictions, but it always allowed him to get to the data, which allowed them to get a bigger and better view of the customer. Because you're just one person, and you're not exactly representing the customer; you're representing data that represents the customer. So the process of reaching out across the organization is difficult. You do need air cover. Getting it from the CEO is fantastic, and that's probably one of the best stories I've heard about somebody who was able to go across and do that. It's the same with Banorte as well. They oftentimes talk about how they have to politically work across the boundaries, and the nuances mean more time and effort to get across those different parts of the organization and make everybody feel comfortable.
Allison Hartsoe: 22:00 Do not underestimate that time. So next up, we had a session that was ironically talking about silo breaking, and this wasn't one where one particular person gave a presentation. It was an overall discussion session, and the key question was, okay, what does it take to break down those silos? Everybody knows it's an issue. Should you be hiring certain types of analytics practitioners who are able to do this? And there was a consensus that oftentimes when we're hiring analytics practitioners, we really don't think enough about the soft skills, the resilience, and the relationship building it takes to get across. And so finding reasons to break down those barriers is a bit of an art. Never show up to a meeting without something to teach people, never let them think it's a waste of their time, and always bring in something like, hey, did you know this was happening over here?
Allison Hartsoe: 22:55 You must become a little bit of a representative across the organization, because you might have information from one part of the organization that affects another part of the organization. And that's a very valuable conversation. It's not your job to resolve those issues, but it can be your job to highlight the considerations or concerns. The word that was given to this was cross-threaders, and I thought that was a very interesting description, because essentially what you are doing is inserting analysts into the business as subject matter experts, but they're cross-threading the knowledge across the business. And in order to get very good customer impact, you've got to have that cross-functional relationship building, more cross-functional information, and a culture where it's okay to challenge the data. So they talked about that as a little bit more like matchmaking than ownership. And when you start to see people who can share uncomfortable data, that's a good sign that people are starting to feel okay; they're starting to take some risks.
Allison Hartsoe: 23:57 They're not worried about being called on the carpet, because they know it's something that can be fixed, and they're going to move on, and everything's going to get better. So there are tons of reasons that people can come up with to shield the data, but the size of your impact is directly limited by the size of your data access. That comes from Zach at EA. So when you're hiring, think about the people you can hire who can sell analytics, relationship build, and perhaps be change agents, in addition to being great analysts. I have seen some organizations that hire that as a specific role, but I think it's great if you can get it into an analyst's skillset as well. Not just somebody whose head is down in the data; you can use both types. It's not like everybody has to be a relationship builder.
Allison Hartsoe: 24:44 And later in the day there was a short session by Ian McColly, and I'm not sure that I'm saying his name correctly, but he's the founder and chief innovation officer at Infinite 8 Institute. His session was about emotional AI, and it was somewhat fascinating and horrifying at the same time; I'll talk more about that at the end. Basically, their mission is to design solutions for thoughtful integration of AI into society. And he made the point, a good point, that people are creatures of emotion, not necessarily logic. They built, as a result, a deep neural net named Providence, and he had a number of applications where they could use this deep neural net. Providence, by the way, is specifically not in the cloud; it runs on a supercomputer, and it is specifically designed to understand emotional analytics. And at one point he said, why do I need a data scientist if I can tell what people are feeling right now and right there?
Allison Hartsoe: 25:52 That's scary for a variety of reasons. But as he went through what must have been at least 15 examples of emotional analytics, it started to feel a little bit like the Tom Cruise movie Minority Report, where they have the precogs. Whether it's security for a construction site, where they're understanding whether a person's intentions are good or bad, or consumer retail, where you're looking at what will fly off the shelf by understanding how people really feel about an item before they express that emotion verbally, it can all feel a little bit precog-related. Where is privacy? Do I have a right to think certain thoughts that might flash across my face but keep them private, before those are outed and someone tries to sell me something, or assumes that I had that intent? You know, in a sense there's always a check and balance. There is a superego on top of the ego that helps us control some of the thoughts that we might have.
Allison Hartsoe: 26:52 The idea of AI picking up on those micro-moments and surfacing things that we should be buying, or understanding what we really feel about a product, is definitely pushing the edge of what I think is allowable for AI. I'll come back to that at the end, but across the presentations, people didn't talk deeply about AI, with the exception of Ian's presentation and maybe one other that was happening at the same time. But they did always come back to the idea that they were planning to use AI to make the jobs that they were doing better, faster, smarter. So there's quite a lot of AI present, even if people aren't talking about it specifically as AI; it's almost a tool that they're using to do the existing things better. And along those lines, there was Meghna Sinha, who is a VP of testing and measurement at Target, with a well-structured presentation.
Allison Hartsoe: 27:48 She talked about her challenges of running an active test-and-learn organization. And this is an organization that's running test after test after test very quickly. And she hit on a couple of things that we had heard before. More data can lead to more confusion, so governance of the key metrics is very critical at large organizations. Robust test design, she believes, requires statistical expertise so that you're not coming to improper conclusions, and manual steps have to be continuously automated; you can hear AI knocking on the door here. End-to-end automation is important to scale AB testing. So they had an interesting way of framing testing scenarios. One they called pivot, which is: it didn't work. You can timebox it and move on to the next idea. A second key scenario was: I kind of got mixed results. Maybe you should iterate on it, fix, and try again.
Allison Hartsoe: 28:45 And the third one was: this is a slam dunk success. We should move at speed to implement and scale. This is the action group, so the result here is, push it. And of these three different outcomes, whether it's a pivot, an iteration, or an action, she said it's important to consider the ratio that you want. They use 40% pivot, 30% iterate, and 30% action, which is about what they're looking for, the kind of north star for outcomes that makes things self-sustainable. She also mentioned a very key concept that I find people don't think about enough until they make this mistake multiple times. And Warren Hadley from Ambition Data referred to this in his testing podcast just a few weeks ago. So she talked about the seven-question framework, and at the beginning of the questions are the table-setting questions: what would you need to be true?
Allison Hartsoe: 29:42 What is your assumption? You can't just say, oh, I want to test this. So there's a lot of discussion in the first question as well as in the second question, which is: what are you trying to learn from this test? She said it might even be a two or three-hour session with the cross-functional teams to get those first two down, to get all the voices heard, and to really understand, if it does X, we will do Y, or under what considerations; if it does X but not Y, we will do Z. So framing that up and making sure that you come into a test with a very clean hypothesis is the key consideration. The output of questions one and two is the third question: what is your test hypothesis? The fourth one is: what is your primary decision metric, and are there any levers that will impact that measure?
Allison Hartsoe: 30:33 And the fifth is: what is your expected lift on the decision metric goal? Six: what are the executional constraints, again echoed by Warren. And seven: what actions or decisions will you take based on the outcome? So spending the time to ground the test in the right hypothesis is incredibly important. Again, no one knows up front exactly what they want to measure, and it takes a little bit of back and forth between the analyst and the product managers, or whoever your stakeholder is on the other side, to make sure that you're going to set up the test for success. As a result, they did create at Target an internal, they call it, 10-page manifesto to help keep everybody aligned to these key concepts. I thought that was very well done. Now, finally, toward the end of the day, we had Richard Bachs, who is the VP of data science and analytics at Apex Parks Group, and they are very dialed into customer lifetime value.
Allison Hartsoe: 31:30 They're looking at predicting behavior coming in, the weather, and which areas of the park are being used, and what I thought was so interesting about this presentation wasn't just the segmentation, but the way that you could take the online world of understanding different product categories, and the usage or consumption across those categories, and apply that to the offline world in different physical areas of a park. So you might have a games area and a food area, or a waterpark and a rollercoaster area, and what they had found was that, probably just like any commerce, when people are active across multiple centers, they have higher lifetime value; they become better customers. So there's a very interesting connection there between the online and offline worlds. They also spent time looking at labor optimization, and this is not the first time I've heard this; I've heard it around casinos as well, where they're looking for ways to optimize either the volume of staff they have, or maybe the success metrics for their staff, and understand what the right number of people is to have at a particular time.
Allison Hartsoe: 32:50 One thing they found that I thought was very telling and very interesting was that for amusement parks, the weather forecast was more important than the actual weather, because people look at the forecast to plan in advance whether to go to the park, and that makes sense. They also have a very strong multi-dimensional approach to lifetime value. They've got LTV at the core, but then they're also looking at behaviors within the basket, the product, and the brand all around the edges. And I thought that was very interesting because when they see that a customer has lapsed, they look deeper to ask, is this a person with a similar frequency to someone else, but also, what was in their basket? Are they consuming fewer categories than before? Is the basket size going down while they're still coming? Is the amount they're spending decreasing, and should we be targeting them with different types of promotions in order to rectify that situation? So I thought it was very interesting how well they were using multidimensionality across the LTV spectrum to understand specifically what triggered a fall-off, and when, based on the depth of someone's behavior.
Allison Hartsoe: 34:11 So hopefully that gave you a taste of the conference and some good nuggets. I'm going to wrap up with the two themes that I think were overriding across the conference. Number one is that we're still wrestling with a lot of human factors. Bringing the data together does not solve for this, and it sometimes catches data lovers like myself off guard. Human nature continues to hold back logical, data-driven innovations: our natural biases, our fear of change, our fear of losing our jobs, and turf wars within the organization create a lot of friction that makes it difficult to be a truly data-driven organization. It creates a strong need, in response, to relationship-build and to trust not just the data but the team and the people behind it. And this gets me thinking about the second overriding theme, which was a little more subtle and has to do with AI. AI continues to push ahead, sometimes in very scary ways. The emotional AI development was truly shocking in the myriad of applications it could be applied to. And there's a lot of conversation around using AI for limited uses, like hyper-personalization, but also to dig into things like causality.
Allison Hartsoe: 35:41 Now, many systems like ours exist on faith. Democracy exists on faith, where we believe we're being represented, for example, or we believe a doctor has our best interests in mind and not their success ratio at heart. Something like AI has the potential to uproot a lot of these beliefs and lead us into a world we don't fully understand yet. But I think there's an opportunity to find a balance: to have enough human nature to slow us down, to consider the second- and third-order consequences behind each innovation. That means deeply asking, is this a good thing? And then what? And then what? But not so much that every little data innovation becomes a threat to someone's job. So if your job sounds like a scene from the movie Office Space, then maybe some innovation is acceptable.
Allison Hartsoe: 36:39 After all, we're not faxing anything anymore. Consider for a moment that our role is to do what AI never can: to think deeply and humanly, to guide the choices that collectively make our world a better place. And of course, that is also the heart of customer centricity. So there you have it, my summary of the 2019 spring CAO Conference. If you'd like to connect on this topic or others, you can always reach me at email@example.com, @aheartsoe on Twitter, or connect with me on LinkedIn. As always, everything we discussed is at ambitiondata.com/podcast. And remember, when you use your data effectively, you really can build customer equity. It's not magic. It's just a very specific journey that you can follow to get results. Thanks everyone.
Allison Hartsoe: 37:37 Thank you for joining today's show. This is your host, Allison Hartsoe, and I have two gifts for you. First, I've written a guide for the customer-centric CMO, which contains some of the best ideas from this podcast, and you can receive it right now. Simply text ambitiondata, one word, to 31996, and after you get that white paper, you'll have the option for the second gift, which is to receive The Signal. Once a month, I put together a list of three to five things I've seen that represent customer equity signal, not noise, and believe me, there's a lot of noise out there. Things I include could be smart tools I've run across, articles I've shared, cool statistics, or people and companies I think are making amazing progress as they build customer equity. I hope you enjoy the CMO guide and The Signal. See you next week on the Customer Equity Accelerator.
Key Concepts: Customer Lifetime Value, Marketing, Digital Data, Customer Centricity, Long-Term Customer Value, Marketing Leaders, Analytics, Creativity, Product Development, Audience Research
Who Should Listen: CAOs, CCOs, CSOs, CDOs, Digital Marketers, Business Analysts, C-suite professionals, Entrepreneurs, eCommerce, Data Scientists, Analysts, CMOs, Customer Insights Leaders, CX Analysts, Data Services Leaders, Data Insights Leaders, SVPs or VPs of Marketing or Digital Marketing, SVPs or VPs of Customer Success, Customer Advocates, Product Managers, Product Developers