Allison Hartsoe: 00:01 This is the Customer Equity Accelerator. If you are a marketing executive who wants to deliver bottom-line impact by identifying and connecting with revenue generating customers, then this is the show for you. I'm your host, Allison Hartsoe, CEO of Ambition Data. Each week I bring you the leaders behind the customer-centric revolution who share their expert advice. Are you ready to accelerate? Then let's go! Welcome everyone. Today's show is about how to create happy customers through privacy compliance. And to help me discuss this topic is Jodi Daniels. Jodi is the CEO of Red Clover Advisors, a firm that specializes in simplifying privacy requirements to gain customer trust. Jodi, welcome to the show.
Jodi Daniels: 00:52 Hi, it's so great to be here.
Allison Hartsoe: 00:54 So this is obviously a new topic. I doubt you went to school to study this, but how did you get into this privacy area of the industry in the first place? Well, tell us a little bit about your background.
Jodi Daniels: 01:06 Sure. So I had, well, a first half of a career, quite honestly, in accounting and finance. I went to undergrad for business and accounting. I started off at Deloitte in the audit practice, found my way to finance and strategy at Home Depot, and then the second half of my career started at Cox Enterprises, a big media conglomerate. I did strategy work for them, and then I got into digital marketing, where I created a targeted ad network for autotrader.com. So I stalked you for cars. If you were looking for a Toyota, I might help you buy that Toyota or encourage you to come back and look at it, or maybe a Honda. And from there, it's really where the online advertising industry banded together to try and prevent government legislation and came out with a program called AdChoices. I was responsible for our compliance, and at that point it was autotrader.com, Kelley Blue Book, and a variety of other brands. That was my entry into privacy. From there, I really liked it and thought it was interesting, and I jumped from the marketing side fully into privacy. So I built the privacy program for Cox Automotive, then I went to Bank of America as their digital privacy expert, and then I opened my own firm.
Allison Hartsoe: 02:19 Nice. That's great progress, though. Can you tell us a little bit more about the AdChoices program? Because I'm not sure everybody understands or really knows what that is.
Jodi Daniels: 02:29 It's very true. A lot of people don't until I tell them. So if anyone's ever noticed, in an ad that you've seen online, in the upper right-hand corner there's often a blue triangle. That's a signal that it is a targeted ad. Meaning, like, I was looking at blankets over the weekend for my couch, and now I'm being targeted everywhere with products like it. If I looked in the upper right-hand corner, I'd see AdChoices. I could click that, and it would tell me my ad was targeted, and if I wanted to, I could opt out. What happens next varies; different companies do it a little bit differently, but in theory, what's supposed to happen is I can click and then I can see the different companies that might be part of the whole process of making that targeted ad happen.
Jodi Daniels: 03:15 And I could opt out of all of that. The challenge, quite honestly, is most people have no idea what company A is compared to company 23 that might be listed, and you could have anywhere from one company to 50 companies. Most people then just say, well, I'm going to opt out of this whole thing because I don't know the difference. That's AdChoices. You will also see the words "ad choices" in the footer of a website, and that signifies, hey, this site uses data to target and to collect. So now you see it often in the footer of a website, in the privacy notice, and in the ad itself as that triangle, which is supposed to help educate the average person about the ads.
Allison Hartsoe: 03:51 And what happens on the company side? So let's say that you opt out through AdChoices. Does Google handle that automatically for you, or do you also have to purge your database?
Jodi Daniels: 04:01 The user doesn't have to do anything. It's all on the companies, and there are oodles of ad tech companies that make these targeted ads happen. When you click the opt-out, it actually drops another cookie in your browser that says don't track me, and then all the other ad tech companies listen to that signal. There's a private industry database, and the magic happens on that side, but you, the user, are not supposed to have to do anything else.
Allison Hartsoe: 04:26 That's hilarious that it's another cookie.
Jodi Daniels: 04:28 It's like when you opt out of phone calls, yeah. Because it's all cookie-based, there has to be another cookie that says don't read that cookie. So what happens if you then go and purge your cookies?
Allison Hartsoe: 04:40 You purge the do not track.
Jodi Daniels: 04:41 You're back in the targeting pool.
Allison Hartsoe: 04:44 Ah,
Jodi Daniels: 04:44 And you have to do it again. Yeah. It's not a perfect solution.
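The cookie-based opt-out Jodi describes can be sketched roughly like this. This is a minimal illustration, not any real ad network's implementation; the cookie names (`uid`, `optout`) are made up, and a dict stands in for the browser's cookie jar.

```python
# Hypothetical sketch of an AdChoices-style cookie opt-out.
# Cookie names ("uid", "optout") are illustrative, not a real spec.

def should_target(cookies: dict) -> bool:
    """The ad network only targets if no opt-out cookie is present."""
    return cookies.get("optout") != "1"

browser = {"uid": "abc123"}     # tracking cookie set by the ad network
print(should_target(browser))   # user is in the targeting pool

browser["optout"] = "1"         # clicking the opt-out drops another cookie
print(should_target(browser))   # now excluded from targeting

browser.clear()                 # purging ALL cookies also purges the opt-out...
print(should_target(browser))   # ...so the user is back in the targeting pool
```

This is why, as Jodi notes, the mechanism isn't perfect: the opt-out lives in the very storage the user is likely to clear.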
Jodi Daniels: 05:31 I think in these initial days it's going to feel like an operational headache to some companies, but when companies take the long-term view, they'll realize this will be a win for both them and the customer. It's been a little bit of the wild, wild West with the amount of data that we have and how it's collected and consumed and shared. There are oodles of data, quintillions of bytes being produced, and companies could honestly do whatever they wanted with it. That's been a bit of a wild, wild West, and consumers are catching on: well, I'm not sure I want you to do that with my data. Maybe I wouldn't have given it to you had I known you were going to share it, sell it, or package it the way you did. Maybe I'm okay with you following me and serving me a targeted ad for a blanket, but maybe not for the health information that I just clicked, because it actually wasn't about me. I was just looking it up because I read an article that was interesting, and the computer can't always figure out that it wasn't about me. Especially in the early days, it was just: computer 123 looked up this item, we're going to serve a targeted ad. I was profiled with that, but I might have been misprofiled, and the user had no control and no knowledge of what was happening.
Jodi Daniels: 06:44 And that's just in the advertising world. As we move into the Internet of Things, with smart TVs and cars and more sophisticated uses of the internet, it becomes more complex, and there really needs to be a little bit more reining in. When anything is new, it's different, it's change; it takes some adapting to get to the new norm, and I think that's where we're ultimately going to get: the new norm.
Allison Hartsoe: 07:08 Yeah, like a cultural change. I see these articles in the news all the time that somebody was seen on a Nest camera. There was actually a pretty interesting story about a 92-year-old man who was suspected of killing his wife's daughter, who was in her sixties, and everybody said, oh, that's impossible, that couldn't possibly happen. But the Nest camera basically picks him up walking into the house at the time of the murder, and you sit there and say, well, is that the new normal? Is that where we've gotten to, where all this IoT information can be used without your consent?
Jodi Daniels: 07:44 Yeah, it's a sticky situation, because from a law enforcement perspective, do you want it to be available? But on a personal side, maybe not. And where do you draw the line? Where's the gray area? As we move more into artificial intelligence, and again with more IoT, it gets more complex. We start having this intersection with data ethics: what's okay to use, what's not okay to use. It has an underpinning of privacy really at the core, and you start asking the question, well, whose data is it? Is it my data? Is it the company's data? Is it the data of the person whose home it is? Who owns that data? What choices do I have, and what should you tell me?
Allison Hartsoe: 08:19 I love that you mentioned ethics because I think, and correct me if I'm wrong here, but it seems like the California law is maybe the first foray into some government regulation of ethics, which is a little bit of an oxymoron, right? You don't always want the government to regulate ethics, but I think it's important when there's a lot of chaos in an industry, somebody's got to put something down to say, here's what we're going to comply with. Is that the first place we're seeing this? Is that fair?
Jodi Daniels: 08:46 I would actually say that you have that with HIPAA, and there are some financial privacy laws. There's the Gramm-Leach-Bliley Act, GLBA, so your bank and your credit card company have to comply with those as well. There's a lot of compliance in there about how companies can and can't use data, so there's a little bit of the ethics in that regard. From a California law perspective, the new CCPA is very focused on telling me what you're doing with data. A company now has to tell me what is happening with this data: I'm collecting XYZ, I'm using it for these purposes, and I'm sharing it with these kinds of providers. We've always had privacy notices, but these are going to be much more detailed.
Allison Hartsoe: 09:22 So what you said is that HIPAA and GLBA compliance were about how companies could use the data, but not about what rights the consumer had to the data. Companies were just required to say that they have the data. But the CCPA is taking it to another level. Is that what you're saying?
Jodi Daniels: 09:39 It is. And for anyone listening, there are a few rights that you have from the financial piece, like you could opt out of your bank sharing your data with third parties. So there are a few, but...
Allison Hartsoe: 09:49 Not easy to do.
Jodi Daniels: 09:50 So I don't want anyone listening to think, no, that's not true. There are a few, but not nearly to the extent that CCPA provides. So I would certainly feel comfortable saying CCPA is bringing GDPR-light individual rights to the United States, and that is for the first time. It is the most comprehensive privacy law at the state level, and it's really elevating almost all companies to have to comply across the board, because operationally it's very hard to say, well, I'm just going to treat my California people this way and treat all 49 other states differently.
Allison Hartsoe: 10:20 Yeah,
Jodi Daniels: 10:20 No, it's not really going to be very customer-friendly that way. If you called up and I said, I'm sorry, I can't really honor that because you're from Georgia and the law doesn't apply there.
Allison Hartsoe: 10:30 It's not like a bottle return.
Jodi Daniels: 10:33 It's not. I'm not going to get my 5 cents back like I did growing up. So with that being said, under the California piece, because of those individual rights, I have to tell you what I'm doing, and I have to give you some choices about it. But as a company, I can kind of still do what I want, as long as I can unwind what it is I've done. Does that make sense?
Allison Hartsoe: 10:52 Yeah. Oh, that's interesting. So I have to tell you what I'm doing, and I have to give you choices, which I have to comply with if I give you those choices. But essentially, the burden is still on the consumer to understand what those choices are, what the impact is, and whether they have additional rights behind that.
Jodi Daniels: 11:10 Yup. The company can still proceed as it wants to, which is very different from GDPR, which I'm sure everyone listening has heard at least something about. There, you have to really think first, in advance; you have to have a lawful basis to be able to process the data. Under the California law, you don't have to have that same lawful basis. Now there are a few exceptions, like if you're processing data for a minor and it qualifies as a sale of data, but those are very minute buckets. For the majority of companies, you just have to tell me what you're doing, you have to secure the data, and you have to give me my rights.
Allison Hartsoe: 11:43 So what are the choices that companies are offering or what are they required to offer, I guess?
Jodi Daniels: 11:48 Sure. One of the big ones is the ability to access: tell me what you have on me. I want to know the kinds of data that you have, by category, and the types of service providers or the types of other companies you might have shared it with. So give me an idea that you're collecting contact information and you've shared it with service providers, or you've collected my personal information and you've shared it with ad providers to target me.
Allison Hartsoe: 12:12 You only have to say the type of company? You don't have to name the specific companies?
Jodi Daniels: 12:15 you don't have to list the specific company. And I think that makes sense. If you think about some of the large companies, again, it would be kind of meaningless to the consumer if I had cause some companies work with thousands of companies and think about all of the service providers that they might be working with or all the different types of other companies. That would be I think a little unwieldy to list every single type of company. Like the actual name of every company. It wouldn't be as meaningful for the end-user if you were to look, you don't know company A versus company B, so you have to list the categories and the kinds of companies that are there. So I could go to a company and say, tell me what type of data you have on me. And now as the company, I have to go in my organization and be able to figure out, Oh gosh, this Jodie person, she wants to know all the data we have on her. What on earth do we have? I have to find it in my marketing. I have to find it in my transactions and my CRM. Maybe I have something in social. What kind of data do I have? And then from there, I might say, I want you to delete it.
Allison Hartsoe: 13:08 The part about finding the data you have on me, I mean that's easy if you have a fairly unique name like mine, I mean there's only one other Allison Hartsoe in the entire United States, but what if I have a name like Michael Smith or something that's very common? How on earth do they know that they're revealing the data that just applies to that one person versus somebody else with a similar name that just moved and happened to be in your area?
Jodi Daniels: 13:32 So that is a very good and challenging question for companies. They have to come up with some type of validation process to make sure that the Michael Smith who requested the data is the same Michael Smith who is getting that data, and not somebody else's Michael Smith data. There's a variety of ways companies might consider doing this. If you have an account, you log into the account, and maybe I send you a message through the account where you can make that access request. It might be another email, like a double-verification email. There are also verification services; there are companies whose business is verification, and some of those companies are starting to get into the privacy space to support these kinds of needs. If it was in the digital space, I might be able to provide the cookie: Michael Smith might have the cookie string, he can hand the cookie string over, and the company might be able to match on that cookie string. So those are a couple of different ways, but validation is absolutely needed and important.
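One of the validation approaches mentioned above, the double-verification email, can be sketched as follows. This is a simplified illustration under stated assumptions: the function names, the in-memory token store, and the example addresses are all hypothetical; a real system would add token expiry, rate limiting, and persistent storage.

```python
# Hypothetical sketch of email-based validation for an access request:
# mail a one-time token, and only release data when that token is echoed
# back along with the same email address that requested it.
import secrets

pending = {}  # token -> email address that made the request (illustrative store)

def start_request(email: str) -> str:
    """Issue a one-time token; in practice it would be emailed, with an expiry."""
    token = secrets.token_urlsafe(16)
    pending[token] = email
    return token

def verify(token: str, email: str) -> bool:
    """Only the inbox that received the token can complete the request."""
    return pending.get(token) == email

t = start_request("michael.smith@example.com")
print(verify(t, "michael.smith@example.com"))   # the requester can proceed
print(verify(t, "other.michael@example.com"))   # a different Michael Smith cannot
```

The point of the sketch is the binding: the data is released to a channel (the inbox) the requester has already proven they control, not merely to anyone who knows the name.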
Allison Hartsoe: 14:27 Wow. I can see how complex that gets. Okay. You were saying the third part was about deletion.
Jodi Daniels: 14:32 Yeah. So one of the big ones is, I might ask you to delete my data. Now, CCPA, compared to GDPR, is a little bit different. There are a variety of reasons I might need to keep that data, and I can't just honor the request completely. I need to keep my transaction information for financial purposes, for tax purposes. I might need it for a business purpose. I might need it for a legal purpose. However, let's take marketing data. One might argue I don't have to keep marketing data; that one is just a nice-to-have. So if Jodi came along and said, I want you to delete my data, and you have me in this CRM system, you've put me through Facebook matching, and you've sent my information to some agencies, you'd have to find all of that and delete it.
Allison Hartsoe: 15:15 What about areas where it's a little bit of a crossover? So like let's say, I'm allowed to keep my transaction information, but then I use that transaction information to calculate customer lifetime value and create various customer profiles for which I then do marketing. Is that something you can keep?
Jodi Daniels: 15:34 Yes, that's a good question. You'd have to really analyze how that works. If I asked you to delete it, I might come to the conclusion that I could delete it, or I could keep the transaction information for financial purposes, but I might not be able to continue to use it if this person requested that I delete the information. Because if I've requested deletion, that means I don't want you doing all those calculations; I don't want to be a part of your system. The company might say, well, I have to be able to prove that I made a sale if the IRS ever came and needed to audit me, or when my financial statement auditors come, so I might need to keep that information. But one would argue you don't have to keep that information to be able to do all the analytics. However, you could anonymize the data. If you anonymized or aggregated the data and took out the personal identifiers, then you'd still have access to the data, but you would not have the personal identifiers with it.
Allison Hartsoe: 16:28 That's tricky, because I think we've all had situations where you anonymize or you aggregate. I remember a particular example at a company I worked at. They had a survey, and everybody filled out the survey, but when you sliced it in a certain way, you could see that this person was in this geography and had this role, and therefore the answers must pertain to that particular person. So there was a way, even through anonymizing the data, that you could still pinpoint someone by the lack of information on others.
Jodi Daniels: 16:55 It's not foolproof, and you certainly have to think about the anonymization measures and whether you have enough data to truly anonymize. All of those are absolute factors.
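The survey example above is the classic small-group re-identification problem, and one common mitigation is a minimum group size before any slice is published (the idea behind k-anonymity). A minimal sketch, with a made-up threshold and made-up survey data:

```python
# Minimal sketch of the re-identification risk discussed above: aggregate
# survey responses by (geography, role), but suppress any group smaller
# than a threshold K, since a group of one effectively names the respondent.
from collections import Counter

K = 5  # minimum group size before a slice may be published (illustrative)

responses = [
    ("Atlanta", "Manager"), ("Atlanta", "Manager"),
    ("Atlanta", "Analyst"),
    ("Boise", "VP"),        # the only VP in Boise: trivially identifiable
]

counts = Counter(responses)
published  = {group: n for group, n in counts.items() if n >= K}
suppressed = {group: n for group, n in counts.items() if n < K}

print(published)    # with K=5, no group here is large enough to release
print(suppressed)   # includes ("Boise", "VP"), the identifiable slice
```

Even this is not foolproof, as Jodi says; it only guards against the most direct "slice down to one person" failure, not linkage with outside data.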
Allison Hartsoe: 17:04 Okay, so let's come back to what companies are doing, and we started talking at the beginning of the show about how they could take action to create happier customers or at least customers that felt like they were being cared for through the privacy regulations. I mentioned there's a lot of resistance from companies, but are there examples where companies are taking steps in the right direction?
Jodi Daniels: 17:26 Yup. Well, a lot of companies are a little late to the game. I think a lot of companies thought that this California law was surely going to be delayed, but it's not. The governor actually just signed into law, late last week, the rest of the clarifications and amendments, so it is here to stay. And especially what's happening on the B2B side, if I may say so, is that a lot of companies have a variety of contract amendments, and to do business with each other, they have to be able to sign those amendments. If I'm a service provider and you're a company, I'm often going to receive a contract that says I have to comply with these different privacy regulations: here's how I'm going to honor individual rights, here's how I'm going to protect your data, and I'm not going to use it in any other manner.
Jodi Daniels: 18:07 I'm simplifying, right? But those are the types of things, and the new company has to be able to sign off on that. So you have a bit of a cascading web of companies all pushing each other to comply. Then you also have some companies, B2C as well, that are very forward-thinking and believe this is the right thing. They're making their privacy notices more dynamic. You're seeing more summaries at the top, more hyperlinks explaining what data I'm gathering and why I need it. An interesting example: I was purchasing some new project management software, like a Trello or an Asana (I use Monday), and when I was looking at the page listing all the wonderful features, one of them was privacy, and it explained here's what we're doing from a privacy point of view. Now, I pay a little more attention to that than most, given what I see all day long, so I'm very cognizant of noticing what those privacy features are. But I think you'll find more and more of that, where the new norm will be really explaining what's happening with your data, not in four-point font buried at the bottom of a privacy footer.
Allison Hartsoe: 19:10 So it's almost a competitive advantage for those companies.
Jodi Daniels: 19:14 It is absolutely a competitive advantage. And a number of marketers are really realizing that we've spent so many years pulling in all this data, and oodles of companies are certainly still doing that, but it's almost like we're going back, if you will, to one-to-one human interaction. Instead of just looking at the number of emails or the number of data points, it's one-to-one personalized marketing, where you realize you're trying to talk to an actual person and not just a number. Marketers who realize this personal component are also realizing that privacy is a big part of it, and they're looking at it from a trust angle. At all the conferences I go to, you hear marketers especially talk a lot about trust, and to me, that's the whole essence of these privacy laws. It's because consumers feel like companies haven't earned their trust. There have been studies from leading research providers finding that Americans are not feeling so warm and fuzzy about all the targeted marketing and all the ways that data is being used. They want more regulation, they want more knowledge about what's happening, and the companies that realize that are certainly going to have a competitive advantage and win.
Allison Hartsoe: 20:28 Are we, though, walking into a bit of a rat's nest? The marketers say, okay, I'm going to default to telling you more about how we use your data, in the hopes that you won't back out; you'll realize that we're being very careful with your data, people will feel more comfortable, and therefore they'll allow the marketing to progress. But at the point where they decide, okay, that's too far, they get out of the system, and then they realize that maybe that wasn't what they wanted and they want to get back in. Are we building this complexity of I'm out of the system, I'm in the system, I'm back and forth, I'm all around, and it's just really, really hard to comply?
Jodi Daniels: 21:07 I will have to say that time will tell what will happen. I think it's a little too early to know operationally if that's even what people are going to do. What I would say is, I think the companies who can show the right value upfront and are consistent in providing the consumer, whether B2B or B2C, exactly what they're looking for will win out. It's the companies that inundate people all the time with messages that aren't relevant, or where the consumer or the customer feels like you've done something they don't really like, that lose trust. Let's think about some of the smart TVs that share data on us, or the location tracking in all kinds of apps; there's a whole business built on location tracking, right? You've lost my trust in that regard. Instead, if it's a scenario where I'm expecting you to use the data to keep serving me relevant messages and you keep serving me messages that aren't relevant, well, then you've kind of already lost me, and it wasn't really about my privacy. It was because you didn't serve me relevant messages.
Allison Hartsoe: 22:03 So they just need to stop broadcasting. And when they personalize, they have to realize that even if they are personalizing, there's a certain gentleness to it that needs to be taken into account.
Jodi Daniels: 22:13 Right. Again, I think it goes back to that whole personalized human interaction. You hear so much around storytelling and making sure you're putting the customer first, and that's true. Marketing shouldn't be a secret. It shouldn't be: I have to get all this data so that I can work in the background and try to find you. It should be: I found you, now let me show you why this is important and relevant to you. And if I've convinced you that my product and service is something you need, the way I deliver it should be consistent with that. Privacy should be part of the fabric of your company; it should just be part of your culture. It's not about being sneaky and trying to cover it up, or being sneaky and then saying, now I've just disclosed it. It should just be: this is the right way we're going to do things. One big, huge company once told me their mission was to deliver stellar customer service, and to deliver stellar customer service, they felt privacy had to be a part of that. When they're interacting with a customer, or sending messages to the customer, whether it be marketing or when you call for customer support, the customer comes first, and the customer should never feel like it was sneaky or that they were being tricked.
Allison Hartsoe: 23:17 Does that mean that these companies are applying, we used to call it the sunshine laws in journalism where you would apply this test that said, if you shined a light on this and everyone knew, would you be embarrassed? It sounds like the privacy regulations are becoming, as they become part of the culture or the fabric of the company as you said, the organizations might be applying a little bit of a sunshine law to it.
Jodi Daniels: 23:40 I like that. I hadn't heard of the sunshine law before, but I would say that if you were to shine a light on your company, would you be okay and proud of what it is that you're doing? Not to share your proprietary secrets, but are you doing anything that a customer might find a little shady? If you are, that's not the right practice, and customers are going to start seeing through that.
Allison Hartsoe: 23:59 Does it get more complicated, Jodi? Let's say that I've got a group of customers that's offended. Maybe it's a generational difference: one group of customers doesn't care, another group of customers kind of cares, and then a third group of customers really cares. Do you have to default to what that last group of customers really cares about, even if all the rest are kind of okay with it? How does privacy evolve culturally?
Jodi Daniels: 24:23 I love that question. Obviously, you want to go to what the law requires, so that's going to be your first basis. Then from there, I think you have to evaluate what size each group is in your current customer base, and who your product and service is really marketed toward. Is the group that doesn't care the growing group? It's that whole pros-and-cons, cost-benefit analysis, right? Where's the revenue? What's the size of the base? Who's going to be more vocal? What's the future? And then, like any other strategy, you would layer this piece on top. And I like your journalist comment about the sunshine law. If a journalist were to come and cover the story, you would want them to find all the amazing features and attributes of your company. You wouldn't want them to find that you're doing this thing that's not quite right.
Jodi Daniels: 25:07 So to your question, if you have all these different generations and different needs, I think it's not too different from deciding what your products and features should be. You're probably designing your products or services for those generations and however different they may or may not be. This privacy piece needs to be layered on top, just like any other risk would be. But if a journalist were to come and shine a light, would it still be okay? I love that philosophy, and I think you can layer it on top with the privacy strategy and the kind of risk analysis that you'd have to do.
Jodi Daniels: 26:00 Well, the very first thing is all companies really have to know what data they have, how they're using it, and who they're sharing it with. That's the first part, because your policy should match what your company is actually doing, and you can't decide how you might need to change if you don't know what you're doing in the first place. So the very first step is you have got to do what we as privacy professionals call a data inventory or data mapping, and it's generally done at the process-activity level. Think email marketing: How do I get emails? Where do they come from? What system do I put them in, and who do I send them to? Do I share them with anyone? What are my systems? Who are my vendors? What are those people doing with my data?
Jodi Daniels: 26:38 Understanding all of that, as an example, would be the very first part. From there, depending on whether you're B2B or B2C, you would probably want to look at your philosophy, so going into that strategy piece. Do you want to be front and center: privacy is important to us, we'll put that out there as a feature? You'd probably want to work with the marketing and communications team, or, if it's an online product, think about how that's communicated in the product and feature set on your website. Otherwise, maybe you need to work with the product and product development teams on how you're going to get any changes into the product; maybe you want people to opt in rather than be defaulted in.
Jodi Daniels: 27:18 So you've got to work with the product team to figure out how to make that change, or if it's a website, the website team. Those would be the big two or three starting areas. Then the other is going to be those individual rights: how are you going to honor them as a company? To me, that's really a big marketing, communication, and branding opportunity. If Jodi wants to come and make a change, what's that experience going to be like? It's really a consumer experience. Who do I talk to? Do I submit a form? How often are you going to monitor that? How are you going to communicate back with me? That's a unique user experience. Maybe you're going to change my mind, and I'm going to love it so much that it's okay, you can keep my information. Or maybe I just want to know what you have on me; you give it to me, I feel warm and fuzzy, I'm copacetic, I don't want to delete my data, and I'm a newly happy customer. What's my user experience going to be with the company? It's no different than when you call a company with an issue: how's your customer support experience? I think it should be the same.
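The data inventory Jodi describes, organized per process activity, can be sketched as a simple record structure. This is only one possible shape; the field names and the example entry are illustrative, not a standard schema.

```python
# Hypothetical shape of a data-inventory (data-mapping) record, kept at the
# process-activity level as described above. Field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class ProcessActivity:
    name: str                      # e.g. "email marketing"
    data_collected: list           # categories of data, not raw values
    sources: list                  # where the data comes from
    systems: list                  # where it is stored
    shared_with: list = field(default_factory=list)  # vendor categories

inventory = [
    ProcessActivity(
        name="email marketing",
        data_collected=["email address", "name"],
        sources=["website signup form"],
        systems=["CRM", "email platform"],
        shared_with=["email service provider"],
    ),
]

# An access request ("what categories do you have on me?") then becomes a
# walk over the inventory rather than a scramble across teams:
categories = {c for activity in inventory for c in activity.data_collected}
print(sorted(categories))
```

Having the inventory in one queryable place is what makes the access, deletion, and disclosure rights discussed earlier operationally answerable.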
Allison Hartsoe: 28:14 You know, that's perfect because it is customer service, customer support right there. And I had never thought about it as a full-blown experience, and yet I've been on the other side of it where I literally had this problem with a credit card, and I just wanted to know more information about the transaction and instead they took it to the extreme and canceled the card and gave me a new one. And I was like, no, I just wanted to understand. I didn't want to go through this huge change. And so I think that's a perfect opportunity to really win customers to a higher level of loyalty. Privacy is a loyalty strategy, perhaps.
Jodi Daniels: 28:48 Yes. Another easy example that companies can honestly do today, without thinking about any of the other privacy laws, is thinking about when you opt out of email. There are some emails where I'm forced into all or nothing, and actually, I would like the email, just not five a day. Maybe once every two weeks or once a month would be great, but I'm forced to do one or the other. And there are some companies that recognize consumers want choices. It's not all or nothing, so when you choose to update your preferences or unsubscribe, you're given a list, and the company has even really woven its branding and tone of voice into the landing page where I can opt out. They're using their own tone and giving me choices that make sense and mirror the brand. I don't have perfect stats, but my unscientific poll here is going to presume that they save more people from fully unsubscribing because they've given me a good experience.
Allison Hartsoe: 29:43 I think that's something we all underestimate, and it should not be underestimated. Jodi, this is obviously your area of expertise. If people want to reach you to ask more questions about privacy, or to have you simplify it for them, how can they get in touch with you?
Jodi Daniels: 29:58 Absolutely. So you can go to my website, redcloveradvisors.com. You can also just send an email to info at redcloveradvisors.com, and I'm also on social media. On Facebook, you can find Red Clover Advisors, and on LinkedIn, you can find Red Clover Advisors or my name, Jodi Daniels. I would be delighted to help any of you.
Allison Hartsoe: 30:17 Wonderful. Well, as you can tell, Jodi is incredibly well-spoken. I highly recommend reaching out to her if you have any questions about privacy and compliance. Jodi, it's such a pleasure to talk with you. As always, links to everything we discussed are at email@example.com/podcast. Thank you so much for joining us and sharing this wisdom.
Jodi Daniels: 30:35 It is my pleasure. Thank you for the opportunity. I'm so glad that we were able to talk about privacy, and honestly, it's kind of the next step.
Allison Hartsoe: 30:42 Yeah, it's really revealing. It's very interesting. I loved it. So remember, everyone, when you use your data effectively, you can build customer equity. This is not magic. It's just a very specific journey that you can follow to get results. Thank you for joining today's show. This is your host, Allison Hartsoe, and I have two gifts for you. First, I've written a guide for the customer-centric CMO, which contains some of the best ideas from this podcast, and you can receive it right now. Simply text ambitiondata, one word, to 31996, and after you get that white paper, you'll have the option for the second gift, which is to receive The Signal. Once a month, I put together a list of three to five things I've seen that represent customer equity signal, not noise, and believe me, there's a lot of noise out there. Things I include could be smart tools I've run across, articles I've shared, cool statistics, or people and companies I think are making amazing progress as they build customer equity. I hope you enjoy the CMO guide and The Signal. See you next week on the Customer Equity Accelerator.
Key Concepts: Customer Lifetime Value, Marketing, Digital Data, Customer Centricity, Long-Term Customer Value, Marketing Leaders, Analytics, Creativity, Product Development, Audience Research
Who Should Listen: CAOs, CCOs, CSOs, CDOs, Digital Marketers, Business Analysts, C-suite professionals, Entrepreneurs, eCommerce, Data Scientists, Analysts, CMOs, Customer Insights Leaders, CX Analysts, Data Services Leaders, Data Insights Leaders, SVPs or VPs of Marketing or Digital Marketing, SVPs or VPs of Customer Success, Customer Advocates, Product Managers, Product Developers