Can you hear them hollering . . . in your data?
Sometimes technology outpaces and morphs concepts that came about in a different time and a more traditional setting. Voice of Customer [VoC] is such a notion, given the opportunities of social media and online interaction, which seem larger than the survey approaches and comparative marketing research studies that have traditionally defined the concept. This is not to say that surveys and studies are unimportant, but truly understanding the voice of a customer, to the point that a business is listening, conversing with customers, and providing options specific to segments of customers, goes beyond surveys and benchmarks. This is particularly true online, where the lack of human interaction most likely prevents reaching individuals with an offer specific to their needs, an offer that an experienced salesperson or customer service agent would otherwise close routinely.
The online equivalent of a sales close is called a goal conversion, where conversion means a goal event for the web site – a registration, a qualified lead, or a purchase. When comparing visitors who "convert" at the end of a funnel to those who enter the funnel as new visitors, the rate of conversion can be rather small – 2% to 3% on average. The question on everyone's mind is: who are the other 97%, and what can we do to get them through our online sales funnel?
Assuming that your business has completed a couple of cycles of the 5 difficult steps for tracking visitors, you feel you understand what visitors are doing on your site and believe you have quality visitor traffic data. You have gotten your 300% to 500% gains in conversions and ROI by implementing CPA-directed marketing but are still at a 5% conversion rate, with revenue at 10% of what the CEO would like. What is your next astonishing act? How do you take the online business to the next level?
First you need to ask: "If I increase the conversion rate by a factor of 10, what will happen to ROI?" There are several factors to consider before setting out for this lofty objective. How well have you modeled your business funnel? If your conversion funnels have already been "optimized", then increasing the size of the funnel will no doubt require "de-optimizing", resulting in decreased ROI. Perhaps there is still some low-hanging fruit that can increase revenue without decreasing ROI. Perhaps listening to the 95% who are not converting is a good place to start.
Can it be as simple as just asking?
Avinash Kaushik has strongly advocated using task completion rate as the primary metric for site optimization, recognizing that visitors come to your site with different expectations of what they want to accomplish. These may not be the purposes for which the site was designed, but whatever the reason: are your visitors finding what they were looking for? Did they complete the task they implicitly set for themselves when coming to the site? From this we have Avinash's "The Three Greatest Survey Questions Ever":
- What were you hoping to accomplish coming to the site?
- Did you complete what you expected to accomplish?
- If not, how did we fail? If so, how did we help?
With answers to these three questions one has the visitor's intent, task completion, and an earful if the customer was not satisfied. iPerceptions has implemented Avinash's questionnaire in a free offering called 4Q that asks four questions: the three above plus a customer satisfaction question, "How would you rate your experience on this site?" It is about the briefest yet most universally potent survey, in terms of actionable data, that one could devise.
The offering is 100% free with no strings attached, from a company that has an established reputation in online surveys. 4Q is very easy to set up. The hardest work is defining all the reasons you believe visitors come to your site: get information, compare brands, find jobs, no reason but found it interesting just the same, and so on. If your mind goes blank, there is a long list to select from, plus the ability to make suggestions that may show up on the list later. I would have preferred some grouping of the tasks by kind of site; I doubt I will ever be giving auto quotes.
You can select up to 6 tasks, including "Other", with the ability to refine these as you collect data. From this list visitors will select the reason that most closely fits what they were expecting, and the rest of the survey configuration is a fait accompli. Once configured and the script installed on the web site next to the web analytics tag, 4Q raises a permission window at the start of a visit, asking selected visitors to participate in a short survey at the completion of their engagement on the site. With the visitor's permission, a separate window appears, allowing them to continue on the site and complete the survey at their convenience once they have finished their business there. It's very cool. I can't wait until I get enough traffic to try it out.
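The invitation flow itself is easy to reason about even without the vendor's code. Below is a minimal sketch, assuming a hypothetical SURVEY_URL and sample rate (this is not 4Q's actual implementation), of the pattern just described: ask permission from a sampled subset of visitors at the start of the visit, then open a separate window they can return to once they have finished on the site.

```typescript
// Minimal sketch of an entry-permission / exit-survey pattern (not 4Q's actual code).
// SURVEY_URL and SAMPLE_RATE are illustrative assumptions.

const SAMPLE_RATE = 0.05;                         // invite roughly 5% of new visits
const SURVEY_URL = "https://example.com/survey";  // hypothetical survey endpoint

function maybeInviteVisitor(): void {
  // Only ask once per visit, and only for a sampled subset of visitors.
  if (sessionStorage.getItem("surveyInvited")) return;
  sessionStorage.setItem("surveyInvited", "1");
  if (Math.random() > SAMPLE_RATE) return;

  // Ask permission at the start of the visit.
  if (window.confirm("Would you answer four quick questions when you finish?")) {
    // Open a separate window the visitor returns to after completing their business on site.
    window.open(SURVEY_URL, "voc_survey", "width=480,height=640");
  }
}

maybeInviteVisitor();
```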
Now you are all set, collecting metrics on satisfaction and task completion rate as well as finding out why visitors come to your site. As for the comments of visitors who did or did not complete their task, these are the voice of your customers, and they allow you to begin to understand where you should concentrate effort to satisfy their needs and move them through your funnel. As you watch satisfaction and completion rates grow, you can practically count the dollars coming in. It's like printing money!!! If only it were that simple.
User experience gleaned from voice of customer interviews is an important component of improving site performance, but the experience still has to result eventually in completion of your business tasks: conversion. So why they come to your site (task completion) has to be combined with what they do on the site (conversion), and both have to be shaped by how you do business (ROI). As a first step one needs to combine the "what" and the "why" – the quantitative data of web analytics with the qualitative data of VoC.
VoC Analytics
iPerceptions provides a more powerful tool in webValidator, which supports more exhaustive surveys; allows mining and visualizing VoC comments for themes and persistent issues; and segments and cross-tabulates visitors by their responses to questions in the survey. In this way VoC analytics looks very much like quantitative web analytics, except that we are dealing with qualitative aspects of the customer experience.
To provide a quantitative metric that can be trended on top of qualitative visitor impressions, webValidator also includes the iPerceptions Satisfaction Index (iPSI), calibrated with years of research data from over 700 projects involving Fortune 2000 companies. This single metric allows comparison of your site's performance to other sites in your vertical. It gives a "360 degree" view of overall satisfaction that is sensitive to all the actions that affect satisfaction, such as addressing the qualitative issues raised in VoC.
If you are lucky, the visitor has told you exactly what they need or did not find on your site, and you can immediately apply this knowledge to improve the site and overall satisfaction. In practice there are other factors to consider. The most important is that the response may have to be specific to a certain segment of your customers and require identifying those customers as they come onto your site. The voice of the customer will present different marketing and user experience personas that will eventually have to map to visitor online profiles available at the time of a visit.
One of the advantages of having a metric calibrated to your vertical is that you can see whether others are in the same boat, and get some idea how big the boat is. One can compare task completion and satisfaction metrics across industries and verticals. This is important not only for identifying relative performance but also for seeing the potential upside of increasing these metrics. If you plan to have performance much greater than your competition, then you will need a business plan that is both cutting-edge innovative and very different from anything the competition presents.
Though this data is invaluable for developing personas, it does not necessarily translate into direct actions that will improve web site performance on key business performance indicators. The KPIs for VoC are only a subset of what the business must address and eventually must be related to how they affect overall business performance. Don't get me wrong, user experience is extremely important. Bad experiences have the most corrosive effect on conversion and brand. However, focusing on VoC KPIs alone can result in very happy and satisfied visitors who don't complete your business tasks. Your business has to work out how to monetize happiness and satisfaction (the topic of a future blog).
Could it be that you have the data already?
Now we are at a point often encountered in analytics where we have two different sources of data and want to understand their relative value. How are they equivalent and how do they differ? What does one contribute that is missing in the other? How can they be combined? To answer these questions, let us apply the methodology discussed here and determine how each conforms to the five axioms of web analytics.
Before dealing with task completion surveys such as iPerceptions, let us look at VoC sources in general. Though surveys have been the primary tool for VoC marketing, the Internet provides increasing opportunities for obtaining VoC data from diverse sources. The customer is empowered as never before to express an opinion and attract an audience. Customers can rate and critique your products in a number of places independent of your web site; blogs such as this one can give positive or negative mentions of your products; these can be picked up and go viral via Twitter and Facebook; and if a customer is really peeved or stoked, they can set up a flame or fan wall on Facebook or post a video to YouTube. Assume that your marketing is pulling data from all these sources [for example using Lithium Social CRM, RightNow CX CRM, or Seesmic Desktop]. How does this data compare with web analytic data?
The VoC data from external social sources is technically not owned and definitely not controlled by the business (Axiom 1). The context of the sources as well as the user comments has to be considered. The user may or may not be a customer and, even if identified as a customer, cannot be reliably linked to customer or visitor identifiers (Axiom 2). Time (Axiom 3) and site boundary (Axiom 4) are not applicable, and the content of the data will most likely apply to company brand, product, and functions within the company such as customer service, rather than customer behavior (Axiom 5), except the behavior of being vocal. So in the end, VoC data has at best a peripheral link to a web site or online property, through individual comments on specific web site experiences.
Analysis and diagnosis of themes and memes in the VoC data will require offline action to respond, which may eventually affect brand and product marketing, business and service improvements, and at times changes to the online user experience. It is in this sense that Voice of Customer marketing is independent of online web analytic efforts, and in this sense that the VoC analytics provided by iPerceptions is typically applied independently of web analytics. But what could be achieved if the two sources could be brought together?
Comparing data from site task completion VoC analytics to Web Analytics:
Axiom 1: The owner of the data is uniquely identified.
Both uniquely identify the owner of the data, and the owners can be the same or shared between the two sources, conforming to axiom 1. What is important is that the data is controlled by the business, so collection can be coordinated such that the sources complement each other and conform to data policies.
Axiom 2: The client source of the data is uniquely identified.
Both identify users uniquely via anonymous identifiers, except that VoC identifies a single person for a single survey event, whereas web analytics tracks the user agents used by the visitor over multiple visits. Both can share the same user identifier, conforming to axiom 2.
Axiom 3: The time of any event is known precisely and universally.
Both share the same clock, but VoC analytics records only one event for a subset of the unique visitors in the web analytics data. Therefore VoC violates the third axiom, under which events are recorded with sufficient resolution to distinguish their order. The main shortfall is that the visitor is not recorded again when returning for a different task, so we have no information about subsequent behavior as a result of completing, or not completing, the surveyed task. VoC data is applicable to diagnosing various types of visits and issues related to those visits. By itself, VoC data does not necessarily apply to the overall movement of a visitor through the sales and business process. If the survey can be linked to a visit in the web analytic stream, then one can attribute task completion to subsequent actions, including subsequent visits and conversions.
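As an illustration of that linkage, here is a minimal sketch, with assumed record shapes and a shared anonymous visitor identifier, of joining one-off survey responses to the web analytic event stream and comparing subsequent conversion rates for visitors who did and did not complete their task.

```typescript
// Sketch only: record shapes and field names are assumptions, not a vendor schema.

interface SurveyResponse { visitorId: string; timestamp: number; taskCompleted: boolean; }
interface AnalyticsEvent { visitorId: string; timestamp: number; event: string; } // e.g. "pageview", "conversion"

function conversionRateByTaskCompletion(
  surveys: SurveyResponse[],
  events: AnalyticsEvent[],
): { completed: number; notCompleted: number } {
  // Share of a survey group whose visitor later produced a conversion event.
  const rate = (group: SurveyResponse[]): number => {
    if (group.length === 0) return 0;
    const converted = group.filter(s =>
      events.some(e =>
        e.visitorId === s.visitorId && e.event === "conversion" && e.timestamp > s.timestamp,
      ),
    ).length;
    return converted / group.length;
  };
  return {
    completed: rate(surveys.filter(s => s.taskCompleted)),
    notCompleted: rate(surveys.filter(s => !s.taskCompleted)),
  };
}
```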
Axiom 4: The boundary of the web site or property is unambiguously defined.
The surveys are initiated at the site or property boundary at the beginning of sessions. Therefore both can capture the same visitor introduction state at that boundary and attribute channel and marketing information to the survey. In this sense VoC analytic data conforms to the fourth axiom. However, this is not available in 4Q and may not be available from the more advanced offerings. What needs to be implemented is bringing the introduction package – landing URL, referrer URL, user agent, and cookies – into the survey record for processing later.
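A minimal sketch of what that capture might look like, assuming a hypothetical /voc/survey endpoint and illustrative field names, is shown below; the point is simply that the same introduction state the web analytics tag sees at the boundary travels with the survey response.

```typescript
// Sketch of capturing the "introduction package" at the site boundary and attaching it
// to the survey record. The endpoint and field names are illustrative assumptions.

interface IntroductionPackage {
  landingUrl: string;
  referrerUrl: string;
  userAgent: string;
  cookies: string;
  capturedAt: string;
}

function captureIntroductionPackage(): IntroductionPackage {
  return {
    landingUrl: window.location.href,
    referrerUrl: document.referrer,
    userAgent: navigator.userAgent,
    cookies: document.cookie,
    capturedAt: new Date().toISOString(),
  };
}

// Attach the package when the survey response is submitted (hypothetical endpoint).
function submitSurvey(answers: Record<string, string>): Promise<Response> {
  const payload = { answers, introduction: captureIntroductionPackage() };
  return fetch("/voc/survey", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
}
```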
Axiom 5: Data that is the same will be interpreted the same.
The two data sets can be leveraged to provide a more complete view of visitor intent and behavior. Web Analytic data implicitly tracks behavior through variations in the event data stream whereas VoC analytic data explicitly characterizes behavior through the specific questions answered during a survey. By linking the two, the “what” of web analytics is joined with the “why” of VoC analytics.
By combining the data sources we can have validated models of intent and completion derived from behavior data, while presenting well-understood personas for marketing, user experience design, and content derived from VoC. Regardless, the personas derived from marketing research must map to visitor properties identified in the web analytic data stream to link online visitor behavior with marketing customers.
There is no reason that task completion cannot be congruent with conversion. After all, a purchase is the completion of a task even if it may not have been the original intention of the visitor. On the other hand, if the visitor intended to purchase but did not, that is an entirely different situation. How were they diverted from their intent? How many cart abandonments are actually failed purchase tasks as opposed to what-if exercises by the visitor? Would it be useful to separate these two visitor tasks in the web design and user experience?
As with any task a visitor can perform – learn about the product, request more information, read the privacy policy, review terms and conditions, place items in a shopping cart – there must be transitions in the visitor experience that can be observed in the analytic event stream and related to goal conversions. The survey's task completion and satisfaction data can greatly inform the construction of the funnel leading to conversion by identifying the important tasks visitors perform on site; where these fit in the overall scheme of multiple opportunities to interact with the visitor within the funnel; and observed behavior patterns that may predict completion or abandonment of the funnel.
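As a concrete sketch of the "transitions observable in the event stream" idea, the snippet below computes stage-to-stage advancement rates from visits summarized as the set of funnel stages reached; the stage names and the reduction of visits to stage sets are illustrative assumptions, not a prescribed funnel.

```typescript
// Illustrative funnel stages; real stage names would come from your own task definitions.
const FUNNEL = ["learn", "request_info", "add_to_cart", "purchase"] as const;
type Stage = (typeof FUNNEL)[number];

// Each visit is summarized as the set of funnel stages it reached.
function stageTransitionRates(visits: Set<Stage>[]): Record<string, number> {
  const rates: Record<string, number> = {};
  for (let i = 1; i < FUNNEL.length; i++) {
    const prev = FUNNEL[i - 1];
    const next = FUNNEL[i];
    const reachedPrev = visits.filter(v => v.has(prev));
    const advanced = reachedPrev.filter(v => v.has(next));
    rates[`${prev} -> ${next}`] =
      reachedPrev.length === 0 ? 0 : advanced.length / reachedPrev.length;
  }
  return rates;
}

// Example: three visits that dropped out at different points in the funnel.
const visits: Set<Stage>[] = [
  new Set<Stage>(["learn"]),
  new Set<Stage>(["learn", "request_info", "add_to_cart"]),
  new Set<Stage>(["learn", "request_info", "add_to_cart", "purchase"]),
];
console.log(stageTransitionRates(visits));
```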
So, theoretically, everything that describes visitor success or failure should be captured in the web analytic data stream. All that is needed is a way to bring it out. Besides surveys, are there other ways of listening to visitors, extracting the voice of the customer, and reacting to that voice to provide the necessary options to which the 95% will respond?
Customer Experience Management – VoC++
There are a number of offerings that combine web analytics with VoC data to provide complete, integrated solutions under the designation Customer Experience Management [CEM], an emerging specialty of Customer Relations Management [CRM]. Companies such as TeaLeaf and ResponseTek provide recommendation engines that use quantitative web analytics and qualitative VoC data not only to identify problems but also to diagnose the cause and present actionable business intelligence. Recently TeaLeaf has gained press [here] for its ability to connect all the dots in real time and bring together all the data, including content, that represents a visitor's total experience. This can be further mined for patterns that represent common experiences that lead to either success or failure.
To accomplish this requires the ultimate in data minding, where content and actions on both the server and client sides can be brought together, requiring integration of analytics into the web development and web presentation layers. These offerings are not so much the next generation of analytics as confirmation that analytics requires a great deal of conscious effort in both minding and mining data.
Unfortunately the methods behind this "magic" are hidden within mind-numbing marketecture and opaque product modules that the customer has to put together to achieve results [TeaLeaf, ResponseTek]. One can be assured that implicit behind the success of these offerings is the fact that web analytic data remarkably reflects visitor behavior and can express implicitly what VoC provides explicitly. Call it non-verbal communication that, when combined with verbal (survey) communication, becomes the complete voice of the customer.
Wisdom of the Crowds
One company I have looked at in detail for a number of years illustrates the power of this approach: Baynote. What I found remarkable about this start-up from the beginning is the confirmation that the analytic data is sufficient to extract usable visitor segments and to highlight both problems and appropriate responses through Wisdom of the Crowds (WoC) techniques. To be sure, there are a number of different methods and offerings that can perform similar feats, including gradient descent (Optimize), Bayesian classifiers (Autonomy Interwoven), vector clustering (Touch Clarity), neural networks (Certona), and statistical analysis (SAS). However, it is a fitting metaphor that the crowd speaking in the quantitative data is the dual of the voices speaking in the qualitative data. It is as though the customer, as visitor, has been speaking, if not hollering, all along.
Since my first encounter with Baynote, they have expanded their offering to include social media and VoC data in a Collective Intelligence Platform (CIP). As an integrated intelligence warehouse able to detect patterns in anonymous traffic and then predict subsequent behavior, the platform becomes an excellent offer recommendation engine that can customize content based upon how previous visitors have interacted with the site. In fact, when combined with Lithium Social CRM, the combination can provide truly integrated customer experience insights [here].
The core of the CIP is the affinity engine, which uses WoC to find and group users with similar intent. This grouping then provides a behavior context that can be used to shape content presentation. For example, most site search installations (referred to as internal search or second search) behave exactly like external search engines: the keywords entered are used to retrieve "relevant" content. Now what happens when the search is augmented by the behavior context? The results are more relevant to the intent of the user, and the products placed before the user are more appropriate and more likely to lead to a sale or a request for more information.
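To make the idea concrete, here is a minimal sketch, emphatically not Baynote's actual algorithm, of one simple behavior context: item-to-item co-visitation counts built from past sessions, used to re-rank keyword search results toward items that previous visitors with similar trails went on to view. The 0.1 boost weight and the record shapes are assumptions for illustration.

```typescript
// Build item-to-item co-visitation counts from past sessions (each session is a list of item ids).
function buildAffinity(sessions: string[][]): Map<string, Map<string, number>> {
  const affinity = new Map<string, Map<string, number>>();
  for (const session of sessions) {
    for (const a of session) {
      for (const b of session) {
        if (a === b) continue;
        const row = affinity.get(a) ?? new Map<string, number>();
        row.set(b, (row.get(b) ?? 0) + 1);
        affinity.set(a, row);
      }
    }
  }
  return affinity;
}

// Re-rank keyword matches: base relevance plus affinity with the items this visitor has viewed.
function rerank(
  keywordMatches: { item: string; relevance: number }[],
  viewed: string[],
  affinity: Map<string, Map<string, number>>,
): { item: string; relevance: number }[] {
  return keywordMatches
    .map(m => {
      const boost = viewed.reduce(
        (sum, v) => sum + (affinity.get(v)?.get(m.item) ?? 0),
        0,
      );
      return { ...m, relevance: m.relevance + 0.1 * boost }; // 0.1 weight is an assumed tuning knob
    })
    .sort((a, b) => b.relevance - a.relevance);
}
```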
This all works even though the data reflects an open system in which arbitrary events occur on the web and among individuals, and these events are never controlled or eliminated; and even though the various parameters collected have an indirect and most likely non-linear relationship to actual human behavior. It works because there is sufficient correlation between the data and human behavior that patterns can be readily extracted, with remarkable sensitivity to actions such as variations in content and presentation.
None of these offerings is for small or even medium-trafficked sites; it takes a great deal of traffic before Sheldon's Law of Quantitative History takes hold. For large sites with significant traffic, the various site optimization platforms with scalable data warehousing and data mining support are important to consider over much simpler out-of-the-box offerings specialized for a specific audience with tailored reports and tools. Eventually the business will outgrow the introductory offerings and need significant data warehouse capacity and processing capability to bring all the analytic and business intelligence data together. Fortunately these investments do not need to be made immediately, and not until it has been demonstrated that the data actually models your business and returns actionable insights.
Test and then Target
Now, I personally like these open-system, non-linear affinity engines because from experience with AI, neural networks, expert systems, and adaptive learning, plus using these tools and approaches in SEM/SEO and site optimization, I am confident that they can work. However, these techniques do not work unless you really mind your data before you mine your data. There is a more incremental approach, based on sound workflow development and discipline rather than simply letting the data miners go wild and crazy in the back room.
This involves incremental testing and, eventually, sufficient control of the treatments to present the best presentation to each customer segment (behavioral targeting) or individual (personalization). Survey tools such as iPerceptions, SurveyMonkey, and others are valuable marketing research tools for understanding your customers' intentions and developing real online personas. These personas can guide marketing content and user experience, tailoring content authoring and marketing presentation to each audience's particular needs and understanding. Instead of fretting about which content or treatment is appropriate, establish A/B tests or multivariate tests to determine what is best for each persona.
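As a sketch of what testing per persona can look like in practice, and assuming hypothetical persona and variant names, the snippet below deterministically assigns each visitor to a variant within their persona segment, so the same visitor always sees the same treatment and results can be read out per persona.

```typescript
// Simple deterministic string hash; any stable hash would do.
function hash(s: string): number {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) >>> 0;
  }
  return h;
}

// Hash on visitor + persona so the split is stable per visitor and independent across personas.
function assignVariant(visitorId: string, persona: string, variants: string[]): string {
  return variants[hash(`${visitorId}:${persona}`) % variants.length];
}

// Usage (illustrative names): a "researcher" persona tests a comparison table against the control.
const variant = assignVariant("visitor-123", "researcher", ["control", "comparison_table"]);
console.log(variant);
```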
There are a number of test platform offerings, such as Omniture's Test and Target (formerly Offermatica), GA Optimizer, and SiteScope, that provide an introductory platform to meet at least your initial testing needs. These are integrated into, or run alongside, web analytic packages and depend ultimately on web analytic data to define visitor profiles and attribute test results. Testing emerges as an extension of your continual web analytic monitoring and reporting.
Eventually, once the value of testing has been confirmed, you will want to consider an experimentation platform that combines content management and web presentation with a back-end data warehouse and advanced data mining and analysis, so that the transition from testing to behavioral targeting is seamless and part of standard practice.
This requires more discussion, but if you are interested, a good starting place is the Microsoft Experimentation Platform ACM papers, which are the current gold standard for defining what an experiment-based web site is about. None of this could be accomplished if the quantitative data that makes up the web analytic data stream did not also carry some aspect of the customer's voice: the Wisdom of Crowds is dual to the Voice of Customer. Clearly both should be included in understanding user experience, and they are most powerful when combined.
Adapt and then Adapt more Quickly
To summarize the points above, what has been presented is a maturity model in which one incorporates the Voice of the Customer in steps to meet incremental performance objectives. Only after value has been demonstrated does one expend the cost and resources to move to the next level.
- If you have web analytics and want to understand and apply the voice of the customer, then iPerceptions 4Q is an excellent place to start, and for many this may be all you need. If the business is responsive to this voice, then you should see satisfaction and task completion measurably improve. Hopefully this translates directly into improved business performance. If not, then ...
- If you find that you want to employ more extensive survey and voice of customer approaches using iPerceptions or SurveyMonkey, and to incorporate social CRM such as Lithium, then insist that these integrate with your quantitative web analytic data. As stand-alone offerings they may help at first, but a great deal of what customers are telling you will be left in the quantitative data. You will begin to develop an integrated data warehouse to support your particular internal reporting needs across sales, marketing, and business finance.
- To go to the next level, use offerings such as TeaLeaf, Baynote, Interwoven TeamSite / Optimost, or Omniture's Recommendations, which has embedded a number of world-class technologies from Touch Clarity, Visual Sciences, ClickZ, and Offermatica, to connect all the dots between web analytics, content, and customer comments. These will draw out of the web analytic data stream actionable, often real-time, intelligence to target content and improve business funnels based upon VoC from surveys and social media as well as the implicit Holler of Customer (HoC) found in the event-content streams. At this point you should have a full-fledged data warehouse capable of supporting sophisticated analysis and data mining.
- An alternative incremental approach is to first adopt a test strategy in which content personas (derived from VoC) are mapped to visitor profiles available in the web analytic data stream to maximize both user experience and business performance metrics. This will set up the discipline and coordinated workflows for marketing, product, and business to test new content, products, and policies in a cost-effective and timely manner.
- Eventually, move to a full experimentation platform that incorporates both content management and presentation, allowing targeting that varies treatments across visitor segments and eventually individuals, for a personalized experience that meets both customer satisfaction and business performance objectives. At this point you will have a data warehouse that not only supports continuous experimentation but also data mining using some of the offerings in the third step above.
In whatever you do, your objective will always be to align visitor expectations (the tasks they perform) with business objectives (the transitions they make) so that customer satisfaction translates directly into business KPIs.
Nice article. One of the interesting things about Baynote (no capital N) is that we actually can start producing good recommendations for medium trafficked sites pretty quickly, given the fact that we base our recommendations on the behavior of all site visitors. For an example of how this works in action, see Sun & Ski case study at Internet Retailer. http://www.internetretailer.com/ECTR/article.asp?id=34447
Thanks, Kathleen, for bringing up how good results can come from even moderate traffic. That was one of the features that impressed me when I first saw Baynote (with no capital N).