If you want to build customer equity, follow the steps to progress through the digital maturity curve. In this episode, host Allison Hartsoe goes in-depth on the key traits of the first two stages, plus the Pit of Reporting Despair (the real budget-buster in many organizations). She identifies critical blockers that prevent you from moving on, the exit criteria for each stage, and actions that individuals must take.
Who should listen: C-suite, Marketing, Analytics, Digital Transformation, Customer Transformation, Data Insights, Customer Experience, eCommerce, Digital Marketing, and Customer Satisfaction professionals.
Key Concepts: Organizational alignment with digital data, customer centricity, digital maturity, data metrics, customer signals, digital data v. customer data, data reporting, customer experience v. customer centricity
This is the Customer Equity Accelerator, a weekly show for marketing executives who need to accelerate customer-centric thinking and digital maturity. I'm your host, Allison Hartsoe of Ambition Data. This show features innovative guests who share quick wins on how to improve your bottom line while creating happier, more valuable customers. Ready to accelerate? Let's go!
In today's episode of the Customer Equity Accelerator, we're going to lightly dig into the first section of the maturity curve: the customer-centric maturity curve.
Stage 1, you might remember, is Foundation. Stage 2 is Early Insights, and finally, for you Princess Bride fans, the Pit of Reporting Despair. Now in each stage, I'm going to talk a little bit about the fundamental framework of people, process, and technology. But I put a little twist on it and add some different definitions, so let me cover those right now, and then we'll dig into the first stage. In addition to people, process, and technology, I always think that framework overlooks a key element: leadership. And leadership is important when it comes to getting organizations to drive by the data. So, our first element is leadership, as measured by organizational alignment around the customer portfolio. Second, when we talk about people, I oftentimes think that people and skills, yes, that's important, but what really matters are the actions people can take, the leverage they have. So, we're going to talk about people's actions, as measured by their behavior, which is the usage of tools and the output of customer-centric decisions. Then process is pretty much what you'd expect: it's measured by the ability to execute optimizations around the customer. And technology is measured by the agility and enablement of customer-centric business goals.
So, with that clarification in mind let's talk about the first stage.
Now for stakeholders, Stage 1 is Foundation. The key question here is: do you trust your data? Who's asking that key question? Oftentimes it's an individual or small team, possibly in different business units, usually around the manager level. The key piece in this stage, among our leadership, people's actions, process, and technology groups, is technology. You must be able to hear the customer through the capture of rich, robust data. And there are likely few or no monitors in place checking the quality of that data. So, there's nothing saying whether your site is up and connecting that to the data that comes in, and there's nothing that says whether the tracking is in place or the tracking is correct. So, technology is the number one piece of the Foundation stage.
In leadership, they're not sure what's possible. So, there are no data-driven goals in place, but there is usually someone who has a spark of vision. And it's that spark that kicks off the Foundation stage and starts to move the organization higher, because someone might not be able to articulate what's wrong, but they know that a better future solution exists. Around people's actions, what you have in this stage are a lot of one-off reports. Questions like: how many visitors did we get to the website? How many visitors came through paid search? How many page views did they get? Really basic questions that are not leveraging the power of the data. And further, the data may not be reliable, so people are probably just checking a box with that information and saying, "Yeah, I looked at that."
Around processes, if they exist at all, they're mostly how-to guides: how to set up a channel, like paid search, or how to set up a campaign, or how to start a social media campaign. It's all very basic how-to stuff, and nothing is really geared toward the measurement or the optimization of the information. So, our key activities in this stage are around preparing websites, mobile, and other channels for future data analysis through tagging, and it's that technical tagging of campaigns and properties that allows us to start to integrate social media and paid advertising. All of that starts to come together, and it's all a technology task. Now the critical blocker to overcome in this section, which you might have guessed, is the time that your I.T. department or your technical team must dedicate to this process. It's not a process that has high ROI, and it's not easy to prioritize. So, you really need some leadership in place to say, yes, this is what we want, and here's where it's going to take us long term. The exit criteria for this stage: you can collect fundamental customer signals and use them to start to frame your analysis.
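To make the tagging idea concrete (this sketch is not from the episode): campaign tagging usually means appending standard UTM parameters to every inbound link so that the analytics system can attribute each visit to its channel and campaign. The helper below is a hypothetical illustration, assuming the common UTM naming convention:

```python
from urllib.parse import urlencode, urlparse


def tag_campaign_url(base_url, source, medium, campaign):
    """Append standard UTM parameters to a landing-page URL so the
    analytics tool can attribute each visit to its channel/campaign.
    """
    params = urlencode({
        "utm_source": source,      # e.g. "newsletter", "facebook"
        "utm_medium": medium,      # e.g. "email", "paid-social"
        "utm_campaign": campaign,  # e.g. "spring-sale"
    })
    # Use "&" if the URL already carries a query string, otherwise "?"
    separator = "&" if urlparse(base_url).query else "?"
    return f"{base_url}{separator}{params}"
```

Generating every campaign link through one helper like this, rather than typing tags by hand, is one way a small team can keep Stage 1 tagging consistent.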
Now let's move on to Stage 2. Stage 2 I sometimes call "pockets of insight" and sometimes "early insights," but the key question at this stage is: did you make at least one decision based on your digital data this month? Notice I'm saying digital data, not customer data, because at this stage organizations are too immature for deep customer information. The stakeholder asking this question is generally somebody related to the brand, or the product lead. It might even be a campaign marketer, but we're still pretty deep in the organization: a senior manager or director is asking this question. What changes at this stage is that we're no longer so deep on technology. Instead, we're heavy on people's actions, because with the tech in place, the adoption of the tools that the tech supports begins. That might be your Google Analytics system or your paid search system. Once we have those tools, then we think about blending descriptive data reports together. By descriptive, I mean things that say what happened. We're not getting into predictive analytics at this point; we're just trying to understand where we are and what happened. So, we are looking for more results and impact, which is oftentimes heavily desired but held up by the misalignment of goals and metrics from higher up in the organization. What happens are small, sometimes sporadic optimization wins based on a particular campaign activity or a particular channel. Optimizing paid search based on click-throughs would be a common application. Use cases can be rolled out at this point, which is a technique we use to frame anonymous data as people, and use cases are a really great way to pivot the data that's coming out at this point into a more customer-centric perspective.
The second piece behind people's actions is process. This stage is heavy on both of those pieces; process comes right behind people's actions. Process means we're just starting to get some baselines and targets. We're just starting to understand what good looks like, and that's driven by campaign governance and tagging governance. So, when I set up my technology, I now have standards that say this item should appear in this variable, and it should always come through this way. Process is about bulletproofing what is appearing and making sure that it's consistent. It's like the pots and pans in your kitchen. You can cook without pots and pans; you can cook without the proper utensils. But when you have the right pieces in place, it is so much easier to get great insights. That's what process is.
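A tagging-governance standard only holds if something enforces it. As a purely illustrative sketch (the function name, the allowed-medium list, and the lowercase-with-hyphens naming rule are all hypothetical, not from the episode), a team could run a small check like this over new campaign tags before they reach reporting:

```python
# Hypothetical tagging-governance check: flag campaign tags that
# break the agreed standards before they pollute reporting.
ALLOWED_MEDIUMS = {"email", "paid-search", "paid-social", "organic-social"}


def validate_campaign_tag(tag):
    """Return a list of governance violations for one campaign tag
    (a dict of UTM-style key/value pairs)."""
    errors = []
    if not tag.get("utm_campaign"):
        errors.append("missing utm_campaign")
    if tag.get("utm_medium") not in ALLOWED_MEDIUMS:
        errors.append(f"unexpected utm_medium: {tag.get('utm_medium')!r}")
    # Example standard: campaign names are lowercase
    name = tag.get("utm_campaign", "")
    if name and name != name.lower():
        errors.append("utm_campaign must be lowercase")
    return errors
```

This is the "bulletproofing" idea in miniature: the standard lives in code, so every tag is checked the same way every time.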
After process, we have technology, which is where the tools and vendors start to increase, and they support the activity measurement. So maybe you decide to roll in a new tool that helps you analyze or publish social media better. Or maybe you start using an alert system or an SMS system. That's all part of technology. And more data starts coming through, which inevitably connects to the core reporting system, oftentimes attached to the website. So, when you think about Google Analytics, don't think about website tracking; think about anything that can be digital, anything like a kiosk. I've even seen people attach tracking to a coffeemaker, using one of Google's data loading systems and people swiping their ID keys. So, there's a lot you can do to turn data into a digital format. But everything that happens at this stage, when we talk about core reporting systems or digital analytics, we're really talking about the universe of everything that is digitized, connected into one central system, which is often resident around your website. So that is technology.
Leadership is our last section, and here the goals may be present, but they're just not aligned to metrics. A central data strategy committee may even be formed, but they're usually not thinking about goals; they're oftentimes thinking about how to manage budget and spending priorities. So, our key activities in this stage revolve around framing the data by customer behavior, even when customers are anonymous; breaking a few data silos down within the department, which can mean integrations left over from Stage 1 or new tool integrations; and some cross-business data sharing, which is usually manual. Maybe you get some call center data, or some long-term conversion data, or return data, or something else from another part of the organization that gives you a bigger, better picture of your initial analytics data. The critical blocker to overcome here is really budget, because remember, in the first stage we had no ROI, and in the second stage we're just scratching the surface with some early insights. How do you get the budget you need for the enhanced tools, more analysts, and everything else? That's a tricky scenario, and again it requires leadership and vision, especially when there are too many reports and not enough insight around the alignment. So, budget is the critical blocker to overcome. The danger is the next stage, which we'll talk about in just a second: the Pit of Reporting Despair, and that's a real budget burner. The exit criteria here: are people asking for specific data to make decisions? If you want more budget, get more people to demand the data they need to make business decisions. That's an excellent pull.
Now, finally, the Pit of Reporting Despair. I have to say, for consulting companies this is a dream: you just get paid money, money, money to produce stuff that nobody uses, and it looks like something is happening. This is a bad use of time, and I've riffed on reporting before because I'm just not a big fan of it. I think that you must be very careful about how you use it. You must be very careful that action will actually be taken from it. And I know a lot of people are talking about actionable reports and all that, but it's a complex thing to do, because you've got to balance a lot of levers to make it happen.
So anyway, back to our Pit of Reporting Despair. This typically happens when you've got so many reports coming out from all these different channels and all these different datasets that just don't fit together. I've got my social media report, and maybe even separate reports for different channels: a Facebook report, a Twitter report, a paid search report. I've got my agency reports that talk about what they did for me, and I've got my website reports, and I've got a campaign report, and none of it fits together. This is a common problem, and you must get over it very quickly.
One of the ways to get over it is to frame information around the customer. So, the key activity here is to reinforce the standards and keep the data from eroding so it stays useful. You're training and socializing people to have an eye for data, to read it correctly, and then to frame the information around the customer. I'll talk about that more in just a minute.
Leadership has really not moved forward at this point, and this is a big problem. This is actually what feeds the Pit of Reporting Despair: you don't have a person who's willing to step up and potentially offend their colleagues by saying, "You know what? We don't need any more of this," because people get bought into the stuff they're creating. And so, there's a lack of accountability, or sometimes you have a change in leadership, and the person who sparked that initial vision is no longer there.
In the people's actions part, people don't understand what to do with the data, and there's no urgency to do something with it right now. Again, it gets back to that accountability. So even though the data is there and being produced, not enough action is happening. That's what drove that whole actionable-report angle that was so popular a couple of years ago. But moreover, in the Pit of Reporting Despair, you almost inevitably have some problems lurking under the surface. Processes aren't sticking, because people don't find the data important, or because it's so splintered they can't make good decisions from it. They're not doing the things that cause the processes to yield good data. So, tagging may be eroding, and where you trusted your data before, now suddenly you don't trust it again, or you're not sure you should trust it.
Campaign tagging is another area where people just didn't set a campaign tag, or they ask, "Can't we just use the referrer?" No, you can't, actually; you have to pin it together. So, the process is really a mess at this point, and it's never seen; it's lurking under the covers. And it's one of those things where people just go, "Oh yeah, did we need to do that? I didn't understand why." Technology is also weak here, because the breakdown of the standards process causes the data quality to erode. And I have seen this more than once: a company spends a lot of money, well into six figures, and they get their datasets beautifully coded. They get everything set up. They get it all together, and then something changes: a stakeholder leaves, priorities change. They didn't realize they had to maintain it. And two years later they're at the exact same point, and they end up spending that money all over again. Don't let that happen to you.
One of the ways you can get around that is by overcoming this critical blocker, which is understanding the connection to, and the need to care for, the customers. Everyone needs to feel like their little bit, their social media campaign, their piece of marketing, is not just about creating noise in the market, but really about inviting customers to interact with your brand and have a good experience.
This is why I'm a big fan of customer experience information, even though I don't think it's as precise or as good as customer centricity. You shouldn't confuse customer centricity with customer experience, because customer experience solves for everyone, and that's not a bad thing. It's a good thing to reduce friction and deliver a good customer experience, but it is more precise to use customer centricity and understand the value of those customers and which ones will mean more to you over the long term. So, the exit criteria here get at what I was alluding to earlier around customer-centric use cases: the data has to come together. It has to merge into these customer-centric measurement frameworks.
The use case is a fabulous way to start putting that framework together. It uses anonymous data, and it doesn't require you to I.D. everybody under the sun, because not everybody's going to log in and not every site is an e-commerce site. But start speaking in the language of people who are confused, people who are deal seekers, people who regularly buy this particular product, and people who are new to us. When you speak that language and get out of the arcane analytics metrics, you reach a broader audience, and that is critical for getting out of the Pit of Reporting Despair. This, in turn, drives a clearer line of sight to missing pockets of customer data, and that in turn drives the next stage: department alignment. We will talk about that in our next podcast.
But again, what I always want you to remember is that when you use your data effectively, you really can build customer equity; you can make a bottom-line difference for marketing. It's not magic, it's just a very specific journey that you can follow to get results. As usual, everything we discuss today is available at ambitiondata.com/podcast. I look forward to seeing you on the next podcast, when we talk more about the following three stages.
Thank you for joining today's show. This is Allison. Just a few things before you head out. Every Friday I put together a short bulleted list of three to five things I've seen that represent customer equity signal, not noise. And believe me, there's a lot of noise out there. I call this email The Signal. Things I include could be smart tools I've run across, articles I've shared, cool statistics, or people and companies I think are doing amazing work building customer equity. If you'd like to receive this nugget of goodness each week, you can sign up at ambitiondata.com and you'll get the very next one. I hope you enjoy The Signal. See you next week on the Customer Equity Accelerator.