Web Analytics: Process the Data Vortex

Web analytics products and services can help you process a deluge of raw site visitor data, revealing truths about customer behavior.

August 6, 2004


The concept behind Web analytics is straightforward: Track site visitors' behavior with the goal of separating them from their cash or personal data. To this end, information on a user's every click, shopping cart addition or abandonment, text entry and purchase is gathered, compared and correlated six ways to Sunday. Some analytics vendors even claim that every market/service/content/lead click can be predicted, and that you can use this information to make your sites more appealing and efficient.

Of course, all this collecting equals a torrent of data. And, as a rule, we don't put much faith in vendor predictions, but in this case we've made an exception. Many of the 10 analytics services we tested in "Inside Information" did a great job of collecting and dissecting data, and providing complex browser overlay reports, well-organized spreadsheets and sophisticated campaign correlation. But the job of converting all that data into useful information and acting on it falls to us mortals. That means Web analytics tools, in this case provided by ASPs, must serve your everyday, run-of-the-mill business unit end users. Some users will be engaged and analytically savvy; others will have neither the time nor the inclination to dig through the mountains of reports. Still others will be dumb as rocks and call you for help every 10 minutes. The better analytics vendors are aware of and cater to these differences, striving to provide good help, training and navigation, and understandable reports.

All this assistance does not make these services a complete outsource, however. IT support and expertise are necessary. Access control, data collection, marketing-campaign tracking and funneling the purchase process into reports all require your intervention. In addition, given the varying range of user expertise with analytics, you'll likely have to help your end users with the interface and explain the data enough to make it actionable.

Working Knowledge

Most enterprise Web sites fall into one of four categories, and this delineation determines the metrics that matter. Once you ascertain your type, you can focus on the analytic features important to you.

  • E-commerce sites are about making money. Key metrics are sales and shopping-cart conversions per visit, measured against the number of returns.

  • Lead-generation sites want to know who visits and what information they want to download. Unique visitors and top URLs are important.

  • Customer-service sites want to off-load user requests from customer-service humans to the site, reducing inbound calls and call length while still satisfying the customer. Success is measured by the number of visitors with successful searches and by the impact on service-desk call volume. Are you getting fewer calls from lost customers? Can you reduce the number of service-desk operators?

  • Content sites, like Network Computing's, have dual needs. We care about visitors finding the information they need, and the business side wouldn't mind selling a few ads. This calls for top URL reports by segment--in our case, site-, magazine- and technology-focused--and for measuring across sites or groups of sites, which requires flexible segmentation.

How does all this tracking happen? Every one of the services we tested places JavaScript "tags" on each Web page tracked. When a tracked page is loaded by a visitor's browser, this JavaScript downloads, runs and communicates with the ASP's tracking servers. On large sites, tags generally are placed in the header or footer of every page by content-management software, as they were in our tests.
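To make the mechanics concrete, here's a minimal sketch of what such a page tag might look like. The collector URL, account ID and parameter names are our own illustrative inventions, not any vendor's actual API; real tags vary by service.

```html
<!-- Hypothetical page tag, dropped into the footer by the CMS. -->
<script type="text/javascript">
  var acct = "nwc-main";                 // illustrative account/segment ID
  var beacon = new Image(1, 1);          // a 1x1 image request carries the data
  beacon.src = "https://collect.example-asp.com/t.gif" +
               "?acct=" + acct +
               "&url="  + encodeURIComponent(location.href) +
               "&ref="  + encodeURIComponent(document.referrer);
</script>
```

The 1x1-image request is a common trick: The browser never renders anything visible, but the image URL's query string delivers the pageview details to the tracking servers.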

The services place cookies on visitors' machines to identify them. Cookies provide session, return-visit and similar metrics. How effective are cookies? Very. Vendors say--and we concur based on our tests--that only about 2 percent of the browsing public's workstations have cookies disabled.
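As a rough sketch, visitor identification by cookie can be as simple as the following. The cookie name, random-ID scheme and two-year expiry here are assumptions for illustration, not any tested service's actual behavior.

```javascript
// Hypothetical visitor cookie: a random ID with a long expiry lets the
// service tell new visitors from returning ones.
function getCookie(name) {
  var m = document.cookie.match(new RegExp("(?:^|; )" + name + "=([^;]*)"));
  return m ? m[1] : null;
}

var visitorId = getCookie("asp_vid");
if (!visitorId) {                        // first visit: mint an ID
  visitorId = String(Math.floor(Math.random() * 1e12));
  var expires = new Date();
  expires.setFullYear(expires.getFullYear() + 2);
  document.cookie = "asp_vid=" + visitorId +
                    "; expires=" + expires.toUTCString() + "; path=/";
}
```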

This tagging not only identifies visitors, it also supports site segmentation. When an enterprise Web presence comprises various collaborating sites--a common setup--tracking browsing behavior to and from those sites is important. For example, our testing took as a model of complex segmentation the 150 Web sites run by CMP Media LLC, which publishes Network Computing. We measure and manage our site, as do all our sister magazines, but corporate also needs the big picture of overall Web usage. And like any other organization, CMP has divisional perspectives, which might relate to different products, services or, in our case, magazine groups that are monitored and managed. Each, in analytic terms, is considered a segment. Generally, each segment needs a unique tag.

This means that a single site, such as www.nwc.com, must be tracked three ways--alone, as part of a division and as part of the corporate whole--so it needs three unique tags. Why not use a single tag for all of CMP and then parse out data for specific sites? The short answer is, each service provider manages its data process, and thus administration and reports, along the boundaries set by the tags. So in this single-tag example, a report of total visitors would reflect the total visitors for all of CMP. Of course, it seems reasonable to expect that a database query on a unique index, such as www.nwc.com URLs, would sort out data specific to Network Computing. However, this data first must be saved in a raw format for our publication, and then it must be processed just for us. That effort equals more costs to the service providers.
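In practice, triple tracking might look like the sketch below: one page firing a beacon for each segment it belongs to. The three account IDs are hypothetical stand-ins for the site, division and corporate segments.

```html
<!-- Hypothetical triple tagging: one page reports to three segments. -->
<script type="text/javascript">
  var accounts = ["nwc-site", "cmp-enterprise-division", "cmp-corporate"];
  for (var i = 0; i < accounts.length; i++) {
    var beacon = new Image(1, 1);
    beacon.src = "https://collect.example-asp.com/t.gif" +
                 "?acct=" + accounts[i] +
                 "&url=" + encodeURIComponent(location.href);
  }
</script>
```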

Multiple tags may seem excessive from a client browser point of view, but our tests showed this isn't the case. Using the globally situated Gomez GPN and Last Mile Web performance services, which let us monitor availability and, more important, specific download times of each tag between the Internet and user desktops (including broadband and dial-up last miles), we found tag overhead to be negligible, usually subsecond. (Read more on the Gomez service.)

More important are the static segmentation and resulting costs that multiple tags imply. Although our site is predictably part of various CMP Web segments, it's also a member of unpredictable segments. It's natural for visitors to CMP's sites to search along technology or business lines, for example. To track this across multiple Web sites, you need still more segments--and additional unique tags. This increases the amount of data gathered by the Web analytics services and, consequently, the cost of the service.

Tags are only one way to segment sites. Almost all the analytics services we tested could also segment by URL and directory structure, among other methods. But when user behavior is determined first and the sites to which it applies must be identified after the fact, as in our testing on live CMP sites, a more dynamic segmentation is needed. This requires a database and query mechanism, which wasn't the norm: Only Omniture's SiteCatalyst ships with one.
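The idea behind such a query mechanism, reduced to a sketch: carve a segment out of raw pageview records by URL after the data is collected, rather than by a pre-assigned tag. The record shape and function below are illustrative assumptions, not SiteCatalyst's actual interface.

```javascript
// After-the-fact segmentation over raw pageview records (illustrative).
var pageviews = [
  { url: "http://www.nwc.com/reviews/analytics", visitor: "a1" },
  { url: "http://www.nwc.com/news/",             visitor: "b2" },
  { url: "http://www.cmp.com/home",              visitor: "a1" }
];

// Collect the records whose URL falls under a given prefix.
function segmentByPrefix(records, prefix) {
  var hits = [];
  for (var i = 0; i < records.length; i++) {
    if (records[i].url.indexOf(prefix) === 0) hits.push(records[i]);
  }
  return hits;
}

var nwcOnly = segmentByPrefix(pageviews, "http://www.nwc.com/");
// nwcOnly now holds only Network Computing's traffic, no extra tag required.
```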

We tested services, but there are also many products available. In choosing between the two, consider the amount of data: The more you have, the more it costs to use a service. And data-intensive shops likely have some in-house expertise and the infrastructure required to run a product.

Services do have one financial benefit, especially if you're struggling to get resources dedicated to making the most of your Web presence. That is, services cost hard dollars, while the amount of time allocated by in-house resources is often fuzzy. A service defines and dedicates resources, and builds a cost basis for possible future in-house resource cost justification. If you have a monthly bill of "X" and can bring the work in house for "X-1," you've got a case.

Be aware, though, that service providers succeed based on cost sharing, so you may be hard-pressed to run the service cheaper. On the other hand, service providers succeed in part because they take a one-size-fits-all approach; if you want granular control, and data analysis represents a competitive advantage, do it yourself.

Who's Asking?

Data security and privacy concerns generally aren't an issue for these services. Exceptions are e-mail and gated sites, where visitor identity can be correlated to specific behaviors. This tends to be an advanced function, promising to allow for cross- and reference-selling. However, even in this case, data is used only to determine general user behavior. For example, if a customer bought a red scoop-neck sweater last time she was at your site and a yellow scoop-neck sweater is on sale, she's a good candidate to get a yellow-sweater pop-up. But that's not because of who she is; rather, it's because of what she previously bought. Obviously, it's a fine distinction that could be argued. The bottom line: Those worried about who has insight into their habits and motivations shouldn't browse the Internet, or should visit only gated sites with privacy policies they agree with.
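The sweater example reduces to a simple rule keyed on past purchases rather than identity; a minimal sketch, with entirely made-up product records, might look like this.

```javascript
// Offer selection keyed on what was bought, not on who the shopper is.
// Product fields and records are made up for illustration.
function pickOffer(purchaseHistory, saleItems) {
  for (var i = 0; i < purchaseHistory.length; i++) {
    for (var j = 0; j < saleItems.length; j++) {
      if (purchaseHistory[i].style === saleItems[j].style) {
        return saleItems[j];             // candidate for the pop-up
      }
    }
  }
  return null;                           // no match, no pop-up
}

var offer = pickOffer(
  [{ style: "scoop-neck sweater", color: "red"    }],
  [{ style: "scoop-neck sweater", color: "yellow" }]
);
// offer is the yellow scoop-neck sweater: same style as the prior purchase.
```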

This sort of analytics slice-and-dice tends to be for large organizations. It makes sense that big companies are more likely to have the chops to digest analytics and make the data meaningful. Our tests found some affordable services, though, and the collected data could help a smart small business get a leg up on larger competitors without having to build an in-house infrastructure. Of course, being able to understand and act on analytics data is up to you.

Bruce Boardman, executive editor of Network Computing, tests and writes about network management and systems. He has 12 years' experience managing networks and distributed computing for a financial service provider. Write to him at [email protected].

If you build it, will they come? In this case, "it" is your Web site, and it doesn't matter if you're selling bobblehead dolls, recruiting players for a role-playing game, explaining how to set the clocks on your company's VCRs or complementing a family of print publications. If you don't give the people what they want, within a few clicks they will go elsewhere. That's where Web site analytics come in. Available as services or as products you can run in-house, these systems track the actions of site visitors to predict behavior and help you improve the efficiency of your designs.

Depending on the size of your Web infrastructure, the amount of information these products return ranges from a lot to "Help, I'm drowning." Turning this deluge into actionable data will fall to you and your end users. But, as we explain in "Analyze This," the rewards can make it all worthwhile.

Our parent company, CMP Media, was in the market for a new Web analytics service, so we set up a joint review: We did our usual battery of Real-World Labs tests on services from IBM, Intellitracker, Manticore Technology, NetIQ, NetRatings, Omniture, ThinkMetrics, Watchfire, WebAbacus and WebSideStory. In addition, CMP Web analytics manager Emily Sunderman and her staff filled us in on CMP's requirements, assisted with our tests, met with vendors and offered us some real live guinea pigs, er, end users. We tested all 10 services on CMP's live Web sites, gathered feedback from users, and applied the requirements that are being used to purchase a companywide solution.

    In "Inside Information", we give you Sunderman's and our analysis of the entries. Overall, we were pleased. Our winner, Omniture's SiteCatalyst, beat out WebSideStory's HBX On-Demand Web Analytics by a nose; we think both should be on anyone's shortlist, along with NetIQ's WebTrends and IBM SurfAid. Those looking to host their analytics in-house will want to pay special attention to WebTrends, which is available as both a product and a service.
