Customer experience and its sister discipline, the customer journey, are creating a more complete view of marketing. By moving beyond the click, the device, and the session, marketers shift to optimizing experiences for people. That’s a good thing, even when those optimizations target chunky segments (e.g. mobile visitors).
The pressure for better customer experiences is creating an interesting C-level power struggle among the emerging roles of Chief Analytics Officer, Chief Data Officer, and the new Chief Customer Officer, who seeks to champion the customer’s needs beyond the data. So far, the title itself doesn’t matter. What does matter to leaders, according to Bain’s research, is the ability to rack up wins. Getting to customer experience wins starts with unified data and progresses to continuous testing.
Understanding the customer experience requires a more complete customer reporting view, and that means the data sets are more complicated. The process of pulling, transforming, joining, and presenting data can be a daunting challenge. The challenge is larger if the right seeds for analysis were not planted (or maintained) at the early Digital Marketing stage.
You’ll need access to a wide range of data sources: cloud platforms, legacy systems, CRMs, and marketing and analytics tools. With access in place, the next step is to set up data pipelines that land the data and transform it into a usable format. We use Alteryx for this processing step, then connect the output to any number of visualization tools such as Tableau or Google Data Studio.
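To make the "land, transform, join" step concrete, here is a minimal sketch in plain Python. The record names (sessions, crm_orders) and fields are illustrative assumptions, not a real schema; in practice a tool like Alteryx handles this at scale.

```python
# Hypothetical landed data: a web-analytics export and a CRM export,
# keyed by a shared customer_id.
sessions = [
    {"customer_id": "c1", "visits": 12},
    {"customer_id": "c2", "visits": 3},
]
crm_orders = [
    {"customer_id": "c1", "revenue": 480.0},
    {"customer_id": "c2", "revenue": 35.0},
]

def join_on_customer(left, right, key="customer_id"):
    """Inner-join two record lists on a shared key."""
    index = {row[key]: row for row in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

unified = join_on_customer(sessions, crm_orders)
for row in unified:
    # Derive one "usable format" metric a visualization tool could chart.
    row["revenue_per_visit"] = round(row["revenue"] / row["visits"], 2)
```

The joined, enriched output is what you would hand to Tableau or Google Data Studio.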
The most important role on any high-powered data team is not the rock star analyst, or the Python coder, or even the business intelligence manager. It is the data communicator. This person sits at the intersection of all data streams and business impact. They know what the business needs, and they know what is possible within the data. Some more advanced companies turn this into a data product manager role. Either way, a good data communicator can guide executives to ask the right questions.
For example, in a conversation about eCommerce conversion, an executive may focus on cart and completion rates. A talented data communicator, however, will come prepared with a more complete story: the quality of acquisition based on long-term customer purchase trends, matched against conversion baselines to go deeper and communicate a more powerful story through the data.
Our advanced data visualizations do a great job of setting this person up for success.
You might think that, given the high volume of work that goes into deriving insights from combined data streams, those insights would generate the most powerful and persuasive stories. Sometimes that is the case, but more often it is the direct voice of the customer that is the match which ignites the dry powder of data.
Voice of the customer, when executed well, can be one of the most powerful customer experience research tools. First, online surveys should be connected to digital analytics tools (Adobe Analytics, Google Analytics) in order to compare answers to actual behavior. Second, customer responses should be sorted into different value groups. What high-value customers say is often very different from what low-value customers say. If customer value is ignored when developing customer experiences, long-term revenue goals are ignored too. In a time when leaders must show wins to increase marketing budgets, splitting the customer base by value is an excellent strategy.
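The value-group sort can be as simple as bucketing respondents by lifetime revenue before reading their answers. The sketch below is illustrative only: the respondent records, answers, and tier thresholds are assumptions, not real data.

```python
# Hypothetical survey respondents with lifetime revenue pulled
# from a joined analytics/CRM view.
respondents = [
    {"id": "r1", "lifetime_revenue": 1200.0, "answer": "Love the product"},
    {"id": "r2", "lifetime_revenue": 45.0, "answer": "Too expensive"},
    {"id": "r3", "lifetime_revenue": 600.0, "answer": "Checkout is slow"},
]

def value_tier(revenue, high=500.0, low=100.0):
    """Assign a value tier; thresholds are illustrative assumptions."""
    if revenue >= high:
        return "high"
    if revenue >= low:
        return "mid"
    return "low"

by_tier = {}
for r in respondents:
    by_tier.setdefault(value_tier(r["lifetime_revenue"]), []).append(r["answer"])
```

Reading `by_tier["high"]` separately from `by_tier["low"]` surfaces exactly the divergence the paragraph describes.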
Personas are commonly built top-down based on external demographic market data, then crossed with your own customer data to show saturation of each group within your customer database. There are two ways to improve this approach and make it more usable and dynamic.
First, align survey questions with persona assumptions. Are the demographics right? Do responses match assumed customer motivations? Second, develop a bottom-up, use-case-driven behavioral analysis to supplement the personas. Behavior is more predictive than any demographic data point. Once personas are connected to specific customers and their behavior, you will want to create a stronger, measurable model based on customer value. This connects ethereal assumptions to real, measurable performance, and further aligns with rich testing strategies.
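One common form of a bottom-up, behavior-based value model is a recency-frequency-monetary (RFM) score. This sketch is a minimal example of the idea; the cutoffs and the simple additive scoring are assumptions, and a real model would be calibrated against your own customer data.

```python
from datetime import date

def rfm_score(last_purchase, orders, revenue, today=date(2024, 1, 1)):
    """Toy RFM score: each component is binned 1-3, then summed.
    All cutoffs below are illustrative assumptions."""
    recency_days = (today - last_purchase).days
    r = 3 if recency_days <= 30 else 2 if recency_days <= 90 else 1
    f = 3 if orders >= 10 else 2 if orders >= 3 else 1
    m = 3 if revenue >= 1000 else 2 if revenue >= 200 else 1
    return r + f + m  # ranges from 3 (cold) to 9 (best)

# A recent, frequent, high-spend customer scores at the top of the range.
score = rfm_score(date(2023, 12, 20), orders=12, revenue=1500.0)
```

Because the score is built from observed behavior, it gives each persona a measurable anchor you can track and test against.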
Whether you are optimizing existing content or adding something fresh and new, the proliferation of testing tools makes A/B testing seem simple. And indeed, there are many testing best practices. So why do so many companies fail to realize the benefits of a testing program?
Testing tools are only the tip of the iceberg. Behind every successful testing program there is a mountain of strategy, process, governance and analytics. Testing programs should start with a good understanding of where and what types of testing make sense. Will a win create a material gain for the organization’s bottom line? Will a loss provide important learnings?
Even with free tools, there are resource and opportunity costs with every test. Here are several important questions to ask before launching tests:
Is there a good process to prioritize testing opportunities?
Is the organization capable of putting an idea into production if it does prove a winner?
Once a test is approved, is the team prepared to identify good test design from bad?
Can the test be executed in the tool, and if it can, is the experience clean and error-free?
For evaluating test performance, most tools will provide a win/lose reading on a single metric, with supporting statistics. While this seems simple and scientific, in the real world the changes being tested are influenced by a number of confounding factors. A simple win or lose can mask even greater opportunities, hide a nugget of genius in a loss, or miss important learnings. Best practice is to make sure your tests are fully instrumented in your analytics tool, and to think deeply about the customer’s state of mind before drawing any testing conclusion.
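For reference, the single-metric readout most tools report is essentially a two-proportion z-test on conversion rates. The sketch below shows that calculation with made-up example numbers; it is the statistical floor, not a substitute for the deeper instrumented analysis described above.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up example: 5.0% vs 6.5% conversion on 4,000 visitors each.
z, p = two_proportion_z(conv_a=200, n_a=4000, conv_b=260, n_b=4000)
significant = p < 0.05
```

Even when `significant` comes back true, the surrounding confounds (seasonality, traffic mix, acquisition quality) are exactly what a win/lose flag cannot see.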