Sunday, August 14, 2005
How Successful is that Web Page? A Benchmark
Well, I've been away for too long - so much work to do this summer. While I was away from this blog, I was tasked with devising an objective measure of how "effective" a particular web page is relative to others. The result is a new benchmark: the Weighted Performance Score. First, some background:
I recently built a "work in progress" web site for a client who needed a basic presence while the full version of the site was in development. Even though this was a "no frills" site of just a couple of pages, it occurred to me that even a small site presented a good opportunity to stage a test and see which messaging would be most effective when the full site launched.
Plan the Test
For the test, I devised an A/B/C methodology for the "work in progress" version of the web site. The limited number of pages made for a manageable number of branching links to track. Upon entering the main URL of the site, visitors were randomly connected to one of three home pages. All aspects of the home page were the same across the A, B, and C versions, with the exception of some bulleted copy. The pages linked from the home page were exactly the same across all three versions. Version A contained bullets heavy in "marketing speak" and served as my control. Version B offered a bulleted profile of the sales team's experience. Version C consisted of several bullets of hard data on the results achieved for clients.
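The post doesn't say how the random split was implemented, but here is a minimal sketch in Python of one common approach. Hashing a stable visitor identifier, rather than rolling a die on every request, keeps a returning visitor on the same version. The function name and the visitor_id parameter are illustrative, not part of the original setup.

```python
import hashlib

VARIANTS = ["A", "B", "C"]  # the three home-page versions

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into version A, B, or C.

    Hashing a stable identifier (e.g. a session cookie value)
    instead of calling random() means a returning visitor always
    sees the same home page, which keeps the test groups clean.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# e.g. redirect the visitor to the matching home page
print(assign_variant("session-12345"))  # prints "A", "B", or "C"
```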
Measure the Test
After collecting data from site visitors, I compiled several metrics, including how many visitors followed links from the home page to other pages and how long they viewed each page. I called these the Followed Link Rate and the Average Read Time.
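The post names the two metrics but not their exact definitions. The sketch below assumes the Followed Link Rate is the percent of home-page visitors who clicked through to at least one other page, and that the Average Read Time is the mean viewing time per page; the function names and sample numbers are illustrative only.

```python
def followed_link_rate(home_page_visits: int, visits_that_clicked: int) -> float:
    """Followed Link Rate: percent of home-page visitors who
    clicked through to at least one other page."""
    if home_page_visits == 0:
        return 0.0
    return 100.0 * visits_that_clicked / home_page_visits

def average_read_time(view_times_seconds: list[float]) -> float:
    """Average Read Time: mean seconds visitors spent viewing a page."""
    if not view_times_seconds:
        return 0.0
    return sum(view_times_seconds) / len(view_times_seconds)

# Hypothetical numbers for illustration only:
print(followed_link_rate(200, 80))            # 40.0 (percent)
print(average_read_time([30.0, 75.0, 60.0]))  # 55.0 (seconds)
```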
Analyze the Test
To analyze and understand the data, I created the Weighted Performance Score (WPS). It is designed to objectively measure and compare the performance of a web page against itself or against other pages on a site. The Weighted Performance Score combines the percent of visitors who follow a link to a new page with the average amount of time they spend viewing that page to arrive at a single performance score. The most scientific approach is to compare only slightly different versions of the same page to score a "winner." You could also use the Weighted Performance Score to compare different pages within a site, but that is somewhat like comparing apples to oranges - you won't be measuring exactly the same thing.
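The post describes what the WPS combines but does not publish the formula itself. The Python sketch below is one plausible interpretation only: normalize each metric against the best-performing variant, then take a weighted average on a 0-100 scale. The 50/50 weighting and all per-variant numbers are assumptions for illustration.

```python
def weighted_performance_score(follow_rate: float, avg_read_time: float,
                               best_follow_rate: float, best_read_time: float,
                               link_weight: float = 0.5) -> float:
    """One plausible reading of the WPS: normalize each metric against
    the best-performing variant, then take a weighted average, yielding
    a score from 0 to 100. The 50/50 weighting is an assumption; the
    original post does not publish the actual formula."""
    link_score = follow_rate / best_follow_rate if best_follow_rate else 0.0
    time_score = avg_read_time / best_read_time if best_read_time else 0.0
    return 100.0 * (link_weight * link_score + (1.0 - link_weight) * time_score)

# Hypothetical per-variant metrics (follow rate %, avg read time in seconds):
variants = {"A": (22.0, 35.0), "B": (31.0, 48.0), "C": (40.0, 72.0)}
best_rate = max(rate for rate, _ in variants.values())
best_time = max(secs for _, secs in variants.values())
for name, (rate, secs) in sorted(variants.items()):
    wps = weighted_performance_score(rate, secs, best_rate, best_time)
    print(f"{name}: {wps:.1f}")  # with these invented numbers, C scores highest
```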
In the test, the results borne out by the WPS matched my hypothesis that the site version containing hard data would do the best job of getting visitors to dig deeper into the site and spend more time viewing pages. In this case, I had a good gut sense going in of what the result would be, but it was helpful to have the validation of data - like most of our visitors, I like solid facts too!
Joseph Mann