On the Need to Reorganize the Approach
Maturity Models for Web Analytics have been a continual topic of discussion among web analytics evangelists. Stephane Hamel, one of my WABITs on Twitter, has put considerable thought into developing the Online Analytics Maturity Model, while others such as Jim Sterne, founder of the WAA, have recognized that the actual path to maturity is strewn with speed bumps. I took my own crack at one in “Tracking Multi-channel Behavior in 5 Difficult Steps”, where the steps are iterated again and again to achieve more refined, more inclusive, more difficult objectives set by an enterprise. However, when you are in the trenches putting the processes, organization and communications in place to reach the next landing on the staircase to maturity, it can be difficult to recognize when you have “landed”, and harder still to accept that this means starting all over again to reach the next level. That is why, on an emotional level, the steps are difficult.
Starting again means rethinking approaches and reorganizing processes. Companies are always reorganizing to meet new opportunities and challenges, so it should come as no surprise that the organizations within the company that provide the data necessary to advance the business must also change. For web analytics, that is actually the first landing – getting the company to recognize that the data and insights provided by web analytics have value and can be incorporated into business decisions and processes.
Web analytics vendors such as Adobe Analytics (Omniture), Google Analytics, and WebTrends have made it “easy” to introduce web analytics into a business, primarily by focusing on marketing and conversion and by minimizing the need to involve IT in deployment. But to go beyond merely validating the need for this data requires more involvement from the business and deeper engagement with IT. A paradigm shift on the path to maturity, so to speak.
Maturity in web analytics is the same as maturity in business: it is about establishing processes and communications within an organization. Each advancement requires introducing new processes and extending communications to new areas within the organization. What got you to the first landing – the rules, restrictions, organization, lines of communication, processes – will not necessarily get you to the next. Most of the time it may simply mean tweaking or refining what you have done, but at other times it may require a dreaded “paradigm shift”.
Enough philosophy, let us get to practical cases.
For the past 18 months I have been working to establish within a large enterprise what Omniture (Adobe Analytics) refers to as the Analytics Center of Excellence (COE) model for managing web analytics. The intent is to concentrate web analytics expertise and aptitude in one entity that is then applied as a service across multiple web development projects and different business concerns.
Key to implementing the COE model are the roles of Solution Designer and Implementation Lead:
- Solution Designer – responsible for establishing the tag plans for collecting data and designing reports that cover the entire range of business needs.
- Implementation Lead – a web developer knowledgeable in the mechanics of implementing tag plans and responsible for minding data from applications.
Other important roles are the Web Analysts, who work directly with business owners to determine analytic data requirements and support the business with reports and insights, and the Project Manager, who prioritizes the many requests and projects and manages them through the development process. All of this is established under the recognized Owner and Advocate for web analytics within the organization.
Details on how this is organized to meet business needs are discussed here. Prior to my arrival, the business had already matured to the point of realizing that the data collected by web analytics was important in supporting its operations, and of recognizing that a ‘paradigm shift’ was necessary to go to the next level. Up to that point, it had followed a distributed model of web analytics in which business owners, business analysts, and IT developers had to treat web analytics as just another thing to do in developing applications.
So when I arrived first as a consultant and then in the role of Solution Designer, I was already on a landing ready to proceed on the first step to the next landing. The organizing rules of engagement were simple and clear:
- Do not ask the business what they don’t know (eVars, sprops, events, correlations, relations, etc) but ask what they do know (business processes, objectives, and key performance indicators).
- Don’t ask IT to implement special requirements for web analytics but ask IT to be systematic and consistent in what they do implement for the business.
The motivation for the first rule is self-evident. It changed the discussion from which individual data elements should be collected to how the business gains insight into application performance, user experience and business outcomes. Omniture-speak of eVars and sprops, a confusing cacophony of concepts with illogical restrictions and limits, at times in language unrelated to web analytics, was quickly expunged from conversations among business managers and analysts.
Here the solution designer acts as a knowledge engineer, interviewing domain experts to construct a conceptual framework for collecting data and designing actionable reports that provide knowledge back to the domain experts. However, this requires communicating the conceptual business/user experience models that are the organizing principle for the reporting, and gaining the business’s acceptance of them.
The last rule assumes an Aspect Oriented Programming paradigm that has been fundamental to web analytics collection from the very beginning. This is a level beyond adding static “tags” to each page: an instrumentation script pulls data (including static tags) from the HTML Document Object Model (DOM) and performs additional processing and grooming before sending the data to Omniture. This approach attempts to leverage regular patterns in the web design to extract visitor actions and usage not normally available in static tags, and to make ‘corrections’ when patterns are not so regular.
For example, to track click placement, if there is a regular layout design, the collection script can associate clicks within a page with their placement within the layout. Key performance indicators can then be attributed not only to a referring internal channel or page but also to a location on the page. Another example is form analysis, a relatively straightforward capability in HTML but much less so within Rich Interactive Applications (RIA) using Dojo or Ruby.
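The click-placement idea can be sketched as a small lookup: when the layout is regular, the instrumentation script can walk from the clicked element up through its ancestors and match the first one that belongs to a known layout region. The region names and the ancestor-walk representation below are illustrative, not from any specific implementation.

```javascript
// Sketch: attribute a click to its region in a regular page layout.
// Region names and element ids are hypothetical.
function attributeClick(regions, elementPath) {
  // elementPath: array of element ids from the clicked element up to
  // <body>, as a DOM ancestor walk would produce.
  for (const step of elementPath) {
    if (regions[step]) return regions[step];
  }
  return "unclassified";
}

const regions = {
  "nav-left": "Left Navigation",
  "hero": "Hero Banner",
  "footer": "Footer",
};

// A click inside the hero banner, walked up to the body element:
console.log(attributeClick(regions, ["cta-button", "hero", "body"]));
// → "Hero Banner"
```

With this in place, a conversion event can carry both the referring page and the on-page location that produced the click.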
The motivation for the second rule is that any additional burdens on development, in particular tasks that the developer does not understand or even have the motivation to understand, will quickly become broken, incomplete and forgotten in the ongoing life cycle of the application. The downside is that different developers seldom do the same thing, so the focus and challenge is getting regularity into the engineering process, where conformance to specific patterns and processes is enforced. The quality of the data increases many times over when the application must conform to these patterns to pass unit and acceptance testing.
Starting as a proof of concept (POC) for specific strategic projects, these rules have propelled the process such that the Web Analytics COE is now standard operating practice across all projects and applications. Most of the processes are in place, and communications with the business, project management, and IT have been formalized with appropriate documentation. We have even progressed to where we can cut loose the data miners that are building predictive models based upon the data that has been so studiously minded.
But just as we are reaching the next landing in the progress to maturity and everything should be just humming along, the rules of engagement must now change!
The Next Paradigm
As the Web Analytics COE has matured, so has the enterprise as a whole, both in recognizing the role of web analytics and its integration with enterprise business data / intelligence, and in seeing the need to communicate the structures of the business throughout the organization. By structures I mean the formalizing of the business processes, through user experience and web design, down to the uniformity of the platform that supports these processes. We are asking questions as a business such as:
- Is each business process different or simply variations of a more fundamental process and should not the fundamental process be recognized and characterized in defining the variations?
- How are these processes communicated in user experience and web design principles for integration into one consistent user experience?
- Why do we have or need a dozen different date picker widgets and can we not better align the components of web design with business processes and web design principles?
Formalizing this means developing and documenting Business, User Domain and System Architectures that align and provide seamless transitions and traceability. As a start, it means aligning individual businesses and projects with global initiatives and objectives.
This is great news to anyone who practices web analytics or any data analytics! As solution designers, we thrive on regularity in business patterns and web design. However, in business, in particular one that is expanding and evolving, this is very rare.
As a consultant, I usually have to do this myself. My initial effort is to develop conceptual graphs of the business to provide a framework for the analytics. An example of one such conceptual graph (mind map) can be found here. As that article explains, I use these graphs to overlay the business semantics (how people within the business describe their business) with analytic concepts (how analytics differentiates the elements of the business).
These are developed primarily to organize my own knowledge of the business into a conceptual framework that I can use to define web analytic solutions. Since I am taking established and well understood business processes and reframing them into my own model of analytic processes, I seldom need to communicate this model back to the customer except to communicate a specific concept behind a solution design.
For example, I distinguish user intent from the tasks the business requires of the user to fulfill that intent. Often intent has to be inferred from the actions the user performs. So if I am developing a report for measuring user intent and success in fulfilling that intent, I need to communicate my criteria and how intent is differentiated from, yet related to, business tasks. In this setting it is not difficult to communicate the concept in terms the business understands, but in the background are my models of business processes and user experience.
Now the question is: why not make these models explicit, baked into the web design and platform implementation? Why not make user intent a conscious aspect of the application? These models form a common language for communicating business concerns throughout the development process. So rule one must change to: ask the business what they do know, but now framed in a common conceptual framework (Business Architecture) that can be communicated to both analytics and business/application development.
This leads directly to the change in the second rule. We must now abandon the Aspect Oriented Programming approach and replace it with the notion of an observable application development approach. Observation is baked into the system architecture and application design. The common conceptual framework of the Business, carried through the user experience and web design to the product/platform implementation, is now an integral aspect of the framework implementation, which web analytics can observe.
Continuing with the example above, this would mean that distinguishing user intent from business tasks is an aspect of the application design, available to the analytics as part of the application’s execution context. Tasks are clearly delineated, as well as how the user was brought to those tasks. Conformance is enforced through the requirements, design, implementation and test phases, which means that there is a common understanding of language and agreement on the process for communicating requirements through each phase.
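As a sketch of what this could look like, the snippet below treats user intent and the current business task as explicit entries in the application's execution context, so analytics can observe task completion directly rather than inferring it from user actions. All variable and value names are hypothetical.

```javascript
// Sketch: user intent and business tasks declared explicitly in the
// execution context (all names are hypothetical).
const executionContext = {
  "user.intent": "compare-plans",   // declared when the entry path implies it
  "business.task": "plan-selector", // the task put in front of the user
  "business.task.state": "completed",
};

// Analytics observes completion of the task in service of the intent,
// with no inference from clicks or page flows required.
function taskCompletedForIntent(ctx) {
  return ctx["business.task.state"] === "completed"
    ? { intent: ctx["user.intent"], task: ctx["business.task"] }
    : null;
}

console.log(taskCompletedForIntent(executionContext));
// → { intent: 'compare-plans', task: 'plan-selector' }
```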
The rub is that this is an ideal and is seldom realized in actual practice, yet the need for quality data persists regardless of whether this ideal is met. So what is an interim approach that can be put in place as the business and IT development approach the ideal in the next cycle of maturity? What is the second rule of engagement?
The Need to Change Thinking
There are several motivations for changing the rules of engagement with IT.
The move from HTML to RIA and Mobile Apps.
The application execution context is the data that represents the operational and business state of the application as the user progresses through the various services and tasks the application implements. It captures when specific visitor and business milestones have been met, and the characteristics of those milestones.
With HTML this is in most cases the Document Object Model (DOM), but even then, certain data, such as what content or offers are being communicated to the user and where the user is in the business process, must be included as part of the execution context of the page or web flow. Traditionally this is addressed by having web developers add static tags or context variables before the page is served to the client browser. Often the instructions are as simple as “include this tag on this page and this tag on that page”, assigning the parameter values that should be sent to web analytics.
With rich interactive applications (RIA), where much of the server processing is brought into the client, whether as a mobile application or a Web 2.0 browser application, the execution context, constructed from numerous service calls, can vary greatly. To address this, the analytics must now provide development with a formal API that defines the context variables necessary to support a web analytic capability, along with instructions on what data from the execution context must be mapped to these variables. This is similar to the tags that were set in the original approach, but now can be communicated with context variable names that identify to the developer more explicitly what is being requested and when the variables should be set within an asynchronously evolving operational environment.
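A minimal sketch of such a formal API, assuming hypothetical context-variable names: development sets named variables as the asynchronous execution context evolves, and the analytics layer observes the changes rather than scraping the DOM.

```javascript
// Sketch of a context-variable API between development and analytics.
// Variable names like "business.task" are illustrative only.
const analyticsContext = (function () {
  const vars = {};
  const listeners = [];
  return {
    // Developers declare milestones by setting named context variables
    // at the appropriate point in the asynchronous flow.
    set(name, value) {
      vars[name] = value;
      listeners.forEach((fn) => fn(name, value));
    },
    get(name) {
      return vars[name];
    },
    // The analytics layer subscribes to changes instead of polling.
    onChange(fn) {
      listeners.push(fn);
    },
  };
})();

// A service call completes and the application declares the milestone:
analyticsContext.onChange((name, value) => console.log(name, "=", value));
analyticsContext.set("business.task", "quote:submitted");
```

The variable names themselves become the contract: they tell the developer what is being requested and when it should be set.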
IT Development wants to apply the same code standards and quality control to web analytics as the application code.
In addressing this need, IT recognizes that web analytics remains a shared service common to all applications, and that development will need to take more responsibility for providing the context data necessary to support the analytics. The analytics instrumentation can no longer attempt to bridge cases where patterns break during execution. These cases have to be addressed in the application, in a manner that conforms to web design quality controls and the needs of analytics.
This means engaging the analytics COE early in the design to ensure that the required data is brought to the client for collection, which often means that the architects and designers need to understand why the data is necessary for analytics. Fortunately, the discussions are not about what eVars must be set but about what context variables must be exposed and how they relate to the business. It also means that ‘tricks’ typically performed in the instrumentation script must have an alternative solution in the application. Where form analysis would scan the DOM for forms and attach code to collect user actions within the form, the form template and widget components must now provide an observability dimension that passes this data to the analytics.
This has resulted in an Analytics Framework with a formalized API and well defined mechanisms for communicating context to the framework. The data collection is broken down into coherent testable units called Web Analytic Capabilities (similar to configurable plug-ins from Omniture) that have their own tag plans (sets of Omniture variables), internal data model (state engine) and API in the form of context variables to which the application must conform. Once deployed, like a plug-in, the capability can be applied across all projects and applications.
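A capability of this kind might be sketched as follows, using a hypothetical form-analysis example: it declares the context variables it consumes (its API), maintains a small internal state engine, and emits its own tag plan. The variable names and Omniture slots are illustrative.

```javascript
// Sketch of a Web Analytic Capability as a coherent testable unit.
// Context-variable names and Omniture slots are hypothetical.
function createFormAnalysisCapability() {
  let state = { formName: null, fieldsTouched: 0 };
  return {
    // Context variables the application must set (the capability's API)
    requiredContext: ["form.name", "form.fieldTouched"],
    // State engine: react to context-variable updates
    update(name, value) {
      if (name === "form.name") state = { formName: value, fieldsTouched: 0 };
      if (name === "form.fieldTouched") state.fieldsTouched += 1;
    },
    // Tag plan: the Omniture variables this capability owns
    toTagPlan() {
      return { eVar10: state.formName, event5: state.fieldsTouched };
    },
  };
}

const formCap = createFormAnalysisCapability();
formCap.update("form.name", "shipping-address");
formCap.update("form.fieldTouched", "zip");
console.log(formCap.toTagPlan());
// → { eVar10: 'shipping-address', event5: 1 }
```

Because the capability owns its state and its tag plan, it can be unit tested on its own and then, like a plug-in, applied across all projects and applications.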
The implication is that the developer must now set context variables at the appropriate places and times. Whereas in an AOP approach events could be fired on specific page names or user actions, these now have to be declared explicitly in the application.
The developer will need to understand some of the conceptual framework for setting variables. Hopefully this can be communicated more as a requirement of building observable applications than as following recipes specific to analytics. Time will tell if this is simply splitting hairs. Hopefully (again) this dialog with IT will result in baking concepts such as intent and business actions into the design and application framework.
Increasing use of agile development with independent timelines.
In some respects agile development, where separate teams work to incrementally build applications in cycles called sprints, runs counter to the COE model. Embedding an analytics implementation lead in each project quickly disperses talent, with the potential of turning a centralized model back into a distributed one. However, it does present a number of opportunities.
First, analytics is an integral aspect of every sprint, requiring demonstration of the appropriate analytics at the end of each cycle. Every story implemented within a sprint must demonstrate an analytic dimension. This reinforces the concept of observable applications, as analytics implementation leads and application developers work through the concepts necessary for building observation in as a critical aspect of the application. From this discussion has evolved the Analytics Framework, with the implementation leads attempting to systematize and leverage their effort over multiple projects.
Second, it provides an opportunity to bring analytics into the discussions with the product managers and developers that make up the team. This includes understanding, and at times discovering, how users will interact with the application, how their experience will be observed, and where that experience can break down. In these cases, concepts such as what user intent is, what the business tasks are, and how completion of tasks and user accomplishment can be observed become part of the stories developed during the planning of a sprint.
With a more formalized API in the Analytics Framework and standardization of web analytic capabilities, the participation of the implementation leads can consist more of communicating requirements and assisting developers in meeting them, and less of actual code development specific to each project. The implementation leads can concentrate on generalizing existing capabilities and, when necessary, developing new capabilities that may be introduced within a project timeline. The agile sprints are excellent opportunities for refining and improving web analytic capabilities and verifying (a.k.a. minding) data.
So the revised rules of engagement become:
- Do not ask the business what they don’t know (eVars, sprops, events, correlations, relations, etc.) but ask what they do know (business processes, objectives, and key performance indicators), framed in a common conceptual framework (Business Architecture) that can be communicated to both analytics and business/application development.
- Don’t ask IT to implement special requirements for web analytics but ask IT to be systematic and consistent in what they implement for the business, providing an observable framework that can be directly mapped to well defined Web Analytic Capabilities.
Two final points in summary.
What is being attempted here is not unlike what was done for tracking campaigns. Once the mechanism – using tracking parameters in the query strings of campaign landing page URLs – was defined, the entire industry organized to apply it, from advertising networks such as Google and Yahoo!, through all the web analytic tools, to campaign management systems, so that campaigns can be tracked and measured uniformly everywhere. The concept is the same, only now applied to business requirements.
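For comparison, the campaign mechanism amounts to little more than a shared parameter convention on the landing-page URL, along the lines of the sketch below (the parameter name `cid` is illustrative; each tool defines its own).

```javascript
// Sketch of the campaign-tracking convention: a tracking parameter in
// the landing-page URL identifies the campaign uniformly to every tool.
// The parameter name "cid" is illustrative.
function getCampaignId(landingUrl) {
  const params = new URL(landingUrl).searchParams;
  return params.get("cid"); // null when the visit is not from a campaign
}

console.log(getCampaignId("https://example.com/landing?cid=spring-email-01"));
// → "spring-email-01"
```

Because every system along the chain reads the same parameter, the campaign can be attributed consistently from ad network to analytics tool to campaign management system.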
I find the paradigm shift discussed here similar to the moment when, after years of being taught in grade school that one should never subtract big numbers from little ones, I was FINALLY introduced to the concept of negative numbers. I remember being both shocked and angry. Shocked that there was such a thing as negative numbers. Angry that my teachers had kept this from me for so long. These shifts in thinking are often welcome indicators of maturity, yet come with their own shock and awe.