Toward the end of 2008, I attended the Gartner CRM and Analytics conference in Washington, D.C. The groups and sessions provided valuable information - everything from statistics on how poorly most people rate their average customer experience (glad I'm not alone there) to prognostication on the importance of utilizing - or, more importantly, not overutilizing - new channels for direct marketing success.

As a lifelong BI guy, I'm always looking for the analytics angle to a story, and I didn't have to look hard here. In fact, I was struck by a common thread in the sessions and presentations I attended, one that reminded me that as much as things change, they really do stay the same. The case in point was Web analytics. Even in sessions not specifically geared to analytics, Web-based information kept surfacing as a crucial element of the current business environment and the multichannel customer experience. What do I mean by things not changing? Consider a few keys to success for Web analytics projects:

  • C-level support for the initiative
  • Well-defined requirements
  • Well-defined and understood ROI
  • An accurate testing approach

I thought I was listening to a discussion from the early '90s about why data warehousing projects succeed or fail - but the fact remains that all of the points are true.
Based on the stories I heard at the conference and the current initiatives people spoke about, it is fair to say that Web analytics is still a distant prospect for a large number of firms. To see why, consider these definitions. A practical definition of analytics from Wikipedia is: how an entity (e.g., a business) arrives at an optimal or realistic decision based on existing data. Also from Wikipedia, a report is a document characterized by information or other content reflective of inquiry or investigation, tailored to the context of a given situation and audience.

I think Web reporting best describes where most folks are right now. Firms know how to count page views, where visitors came from, where they went, how long they stayed and where they were when they left. Yet they struggle to understand what to do with that information. Successful projects have a clearly defined goal and a way to measure the value of that goal. While it is true that analysis may provide valuable insight that was not intended, a more concrete strategy provides a higher probability of success. When sites are being designed or redesigned, what data should be captured to understand whether the design or enhancement is inducing the desired behavior? That should be stated as part of the project definition and measured in the ROI calculation.
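
To make the reporting side concrete, here is a minimal sketch of the kind of counting most firms already have covered, written in Python against a hypothetical clickstream log (the field names and records are my own illustration, not any vendor's schema):

```python
from collections import Counter
from datetime import datetime

# Hypothetical clickstream records - one dict per page view.
page_views = [
    {"session": "s1", "page": "/home", "referrer": "google.com",
     "ts": datetime(2008, 11, 3, 9, 0)},
    {"session": "s1", "page": "/pricing", "referrer": None,
     "ts": datetime(2008, 11, 3, 9, 4)},
    {"session": "s2", "page": "/home", "referrer": "partner-site.com",
     "ts": datetime(2008, 11, 3, 9, 10)},
]

# Classic Web reporting: page counts and traffic sources.
views_per_page = Counter(v["page"] for v in page_views)
referrers = Counter(v["referrer"] for v in page_views if v["referrer"])

# Group views into sessions to get time on site and exit pages.
sessions = {}
for v in sorted(page_views, key=lambda v: v["ts"]):
    sessions.setdefault(v["session"], []).append(v)

for sid, hits in sessions.items():
    duration = (hits[-1]["ts"] - hits[0]["ts"]).seconds
    print(f"{sid}: {len(hits)} views, {duration}s on site, "
          f"left from {hits[-1]['page']}")

print(views_per_page.most_common())
print(referrers.most_common())
```

Every number this produces is a report; none of it, by itself, says what to do next.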

While one of the keys to a successful Web analytics project is knowing the right questions to ask and having expectations about what the answers should be, another critical aspect is application flexibility and the ability to adapt. A common problem I've seen is the inability, from a technical perspective, to move quickly when a different question needs to be asked. Forward thinking is critical when setting up the major tools on a Web site. Having to retool or retag a site to capture information for a different question is time-consuming. With proper setup, the appropriate information is captured up front, and a new question can (potentially) be asked and answered quickly. If firms miss the boat on the setup and the tool is not capturing everything it needs from the site, time will be lost.
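
One way to build in that flexibility, sketched here in Python purely for illustration (the event schema and the log_event helper are hypothetical, not any particular tool's API), is to tag once with a generic event shape - a name plus arbitrary attributes - so that a new question usually means a new query rather than a retagged site:

```python
import json
from datetime import datetime, timezone

def log_event(name: str, session_id: str, **attributes) -> dict:
    """Record one event as a generic name plus an attribute bag.

    Because every event shares this shape, answering a new question
    later means filtering on attributes already being captured,
    not re-instrumenting the site.
    """
    record = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "event": name,
        "session": session_id,
        "attrs": attributes,
    }
    print(json.dumps(record))  # a real collector endpoint would go here
    return record

# Tagged once, queryable many ways later.
log_event("page_view", "s42", page="/whitepaper", referrer="google.com")
log_event("form_shown", "s42", form="lead_capture", page="/whitepaper")
log_event("form_abandoned", "s42", form="lead_capture", fields_completed=1)
```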

A firm moves from reporting to analysis by asking questions, reviewing the results to prove or disprove a theory and taking action based on that analysis. Taking action was another theme that swirled around the conference. The key to actionable information is understanding what the action should be and defining how to measure its consequences. Here's a very simple fictional example: 80 percent of the time, users leave my site when I ask them to fill out a form to receive a free success story or product review. Should we remove the form, on the theory that more people will download the success story and proactively contact us? Or should we leave the form in place because, although we are sending out fewer success stories, we at least have a contact to follow up with? The answer depends, and a testing methodology must be applied. Does having the contact information lead to traceable sales? Or did we see an increase in inbound contacts when we removed the form and allowed anyone access to the success story?
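
To show what "a testing methodology must be applied" can look like in practice, here is a minimal sketch of a two-proportion z-test comparing downstream contact rates with and without the form. The counts are invented for illustration, and the choice of test is mine, not something prescribed at the conference:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(successes_a: int, n_a: int,
                         successes_b: int, n_b: int):
    """Pooled two-sided z-test: do two conversion rates really differ?"""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Invented numbers: visitors who later became inbound contacts,
# with the form in place (A) versus with the form removed (B).
z, p = two_proportion_ztest(successes_a=40, n_a=1000,
                            successes_b=70, n_b=1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests a real difference
```

Either way, the decision then rests on a measured difference rather than a hunch.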

This is not to say that some companies are not already doing these things and more. At the same conference, I heard a story about a CMO putting the kibosh on a large Web site redesign deployment (development and testing were complete - this was at the deployment phase) because no one could explain how success was defined or would be measured.

Future columns will bring stories and overviews of the major players and issues being encountered in the Web analytics space. I'll talk about other topics, including:

  • Predictive Web analytics - how to predict the future.
  • Paid search/AdWords - what is working?
  • Web 2.0 - understanding feedback.
  • ISP data - mining value.
  • Search engine optimization.
