Online Advertising Effectiveness:
Search Engine / Keyword Value
First published 5/23/01
The following is a compilation of pieces from the Drilling Down
Newsletter on tracking visitor value using web log analysis. If
you want to get neat stuff like this delivered to your e-mail inbox, sign
up for the newsletter here.
You might need a little background on web logs first, so if you
haven't already, first read this "kick-off" article to the
discussion below:
Monitoring Visitor Conversion Using
WebTrends
New!
Download the Visitor Quality / Engagement
Calculator for WebTrends (Excel spreadsheet)
Series: Online Advertising Effectiveness
Online Advertising Effectiveness?
Tell Me About It #1
=====================
Well, I got some positive feedback on the last WebTrends
article, so I figured I would toss in another. I don't want
to sound like a shill for WebTrends, but I don't know how you manage a
web business without detailed log analysis. WebTrends is
not nearly as good as the system I used for the CBS/SportsLine
"points for page views" loyalty program, but then again, not
many of you probably need something with that much horsepower.
Or do you? Let me know...
Take my current little pet peeve - I'm getting ripped off on
advertising, it would seem. Or am I? Oh, not on the
response rate side; I get great response rates with Google AdWords and
GoTo. I'm talking about the quality of the visitors
generated. It seems that visitors coming from my ads might be of
a lower quality than free visitors coming directly through the
search engines.
Check out this little chart, all based on visitor
sessions (times in minutes):
Metric                        Ad Visitors   Search Visitors
____________________________________________________________
Avg. Visit Length                 7.43            9.12
% 1 Page Visits                    38%             51%
% >10 Page Visits                   9%              5%
% >19 Minute Visits                11%              7%
% Downloading Book Sample         2.2%           10.2%
% Bookmarking Site                2.2%           11.0%
% Newsletter Subscribes           3.7%            5.1%
Hmmm, he said. Not much to make a decision on here, but the
differences are striking enough to warrant further investigation, I'd
say. The page-viewing activity seems to indicate the ad-driven
visitors are of higher quality (fewer one-page visits, a higher
percentage of high-activity users), but the "engagement behavior" of
the search-driven visitors (downloading, bookmarking, subscribing) is
far more valuable, as these visitors are the most likely to turn into
book buyers. What's really going on here? Why should I pay for ads
if the "free" search visitors are of higher quality? Huh? Huh?
We'll "drill down" another level next month and see.
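(A housekeeping note for the do-it-yourselfers: WebTrends produced the
numbers above, but if your analyzer only hands you raw sessions, the
arithmetic is easy to do yourself. Below is a minimal sketch in Python;
the Session fields and the "ad" / "search" source labels are
hypothetical stand-ins for whatever your own tool exports, not a
WebTrends format.)

from dataclasses import dataclass

@dataclass
class Session:
    source: str        # "ad" or "search" - how the visitor arrived (hypothetical label)
    minutes: float     # visit length in minutes
    pages: int         # page views in the session
    downloaded: bool   # downloaded the book sample
    bookmarked: bool   # bookmarked the site
    subscribed: bool   # subscribed to the newsletter

def quality_metrics(sessions):
    """Compute the visitor-quality metrics used in the chart above."""
    n = len(sessions)
    pct = lambda test: 100.0 * sum(1 for s in sessions if test(s)) / n
    return {
        "Avg. Visit Length": sum(s.minutes for s in sessions) / n,
        "% 1 Page Visits": pct(lambda s: s.pages == 1),
        "% >10 Page Visits": pct(lambda s: s.pages > 10),
        "% >19 Minute Visits": pct(lambda s: s.minutes > 19),
        "% Downloading Book Sample": pct(lambda s: s.downloaded),
        "% Bookmarking Site": pct(lambda s: s.bookmarked),
        "% Newsletter Subscribes": pct(lambda s: s.subscribed),
    }

# Usage: split the sessions by source, then compare the two dictionaries.
# ad_metrics = quality_metrics([s for s in all_sessions if s.source == "ad"])
# search_metrics = quality_metrics([s for s in all_sessions if s.source == "search"])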
Online Advertising Effectiveness?
Tell Me About It #2
=====================
OK, is Jim getting ripped off on his online advertising or not?
The only advertising I buy is highly targeted to search terms,
primarily through GoTo and the Google AdWords program. This
means I get two kinds of traffic from the same search engine - paid
and unpaid - for the same search terms! Last month, we looked at
a chart comparing the value of these visitors by
source.
Because I'm a Drilling Down kind of guy, I took these
numbers down to the next level. I wanted to see if there was
variation by the search phrase used, not just an average of all search
phrases.
So I took my top 3 search terms (relationship marketing, customer
retention, customer loyalty) and did a similar breakout. The
following is a chart of visitor behavior for the 3 search terms above,
broken out by whether the visitor clicked on an ad displayed in
response to the search term or clicked on the search engine listing
itself.
By the way, in many cases both are
displayed at the same time (if I rank high enough for the search term
in the engines involved):
Top 3 Search Terms Comparison -
Paid versus "Free" (times in minutes):
Metric                        Ad Visitors   Search Visitors
____________________________________________________________
Avg. Visit Length                 8.75            3.53
% 1 Page Visits                    22%             20%
% Downloading Book Sample         6.0%            2.2%
% Bookmarking Site                9.8%            3.8%
% Newsletter Subscribes           3.8%            0.6%
Well, I'll be darned. Now the visitors from ads are of better
quality - higher rates of downloading, bookmarking, and newsletter
subscription. The variation is really best understood not by the
method of arrival (ad or free search), but by the search term itself!
Or by some other, yet-undiscovered combination of variables. If
there can be this much change just by looking at the search term, then
I must have some paid ads keyed to search terms that generate very
poor-quality visitors.
I know what you're thinking - he's going to Drill Down
some more, take it down another level in next month's newsletter...
And you would be right!
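(For those wondering about the mechanics of the breakout: the search
phrase rides along in the referrer URL's query string, and paid clicks
can be separated from free ones if your ads point at tagged landing
URLs. The sketch below illustrates the idea in Python; the GoTo
parameter name and the "src=" tagging convention are assumptions for
illustration, not a description of my actual setup.)

from urllib.parse import urlparse, parse_qs

# Query-string parameter that carries the search phrase, per engine.
# Google's "q" is real; the GoTo entry is a hypothetical placeholder.
SEARCH_PARAMS = {
    "google.com": "q",
    "goto.com": "Keywords",
}

def classify_click(referrer_url, landing_url):
    """Return (search_phrase, 'paid' or 'free') for one log entry."""
    ref = urlparse(referrer_url)
    host = ref.netloc.lower().replace("www.", "")
    param = SEARCH_PARAMS.get(host)
    phrase = parse_qs(ref.query).get(param, [None])[0] if param else None
    # Paid clicks are only identifiable if the ad's destination URL carries
    # a tracking tag, e.g. ?src=google-ad -- an assumed convention here.
    paid = "src=" in urlparse(landing_url).query
    return phrase, ("paid" if paid else "free")

# classify_click("http://www.google.com/search?q=customer+retention",
#                "http://www.example.com/index.html?src=google-ad")
# -> ("customer retention", "paid")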
Online Advertising
Effectiveness? Tell Me About It #3
=====================
OK, is Jim getting ripped off on his online advertising or not?
The only advertising I buy is highly targeted to search terms,
primarily through GoTo and the Google AdWords program. This
means I get two kinds of traffic from the same search engine - paid
and unpaid - for the same search terms!
Last month, we looked at a chart comparing the
value of these visitors for my top 3 search terms (relationship
marketing, customer retention, customer loyalty) and did a break out
of visitor value by source - paid ad or "free" search.
By the way, in many cases both paid and free links are displayed at
the same time (if I rank high enough for the search term involved).
Visitors from paid ads are clearly of better quality - higher rates of
downloading, bookmarking, and newsletter subscription. Paid ad
visitors also stay twice as long on the web site. This is a monster
change from the previous analysis, which showed that when looking at
all search terms (not just the top 3), paid versus unpaid, the free
visitors appeared to be of higher value based on their behavior.
The implication of the above shift: there is variability in the
quality of visitors generated according to the search phrase, and
this may account for some or all of the difference between the quality
of a paid versus free visitor. Intuitively, this makes sense to me,
because I only pay for relevant search terms, and "free
visitors" may be arriving as a result of a non-relevant search.
This is tremendously important to know, especially in light of the
general industry commentary that paid search listings result in poorer
search quality for users.
Hmm...
So, let's take a closer look at search term quality by busting up
the aggregate "paid" search results above by search term,
and see what we get. The following table compares each search
term individually with the total site statistics, where RM =
Relationship Marketing, CR = Customer Retention, CL = Customer
Loyalty, and TS = Total Site statistics (Avg. Visit Length in
minutes):
Metric                RM      CR      CL      TS
__________________________________________________
Avg. Visit Length    8.49    8.44    6.87    8.21
% 1 Page Visits       24%     22%     20%     43%
% Downloading        8.2%    6.1%    3.7%    3.1%
% Bookmarking        9.6%    7.6%   12.2%    5.9%
% Subscribing        4.5%    4.5%    2.4%    3.2%
Clearly, the paid ads on average generate a higher quality visitor,
and there is substantial variability even among the top 3 search terms
in visitor quality. The term Customer Loyalty generates visitors
with a shorter visit length and lower newsletter subscribe rate than
the overall site! But at the same time, they bookmark at much
higher rates. A bit puzzling, and whenever we behavioral marketers
see data sets with potentially conflicting indicators such as those for
the term Customer Loyalty, we know there is probably something else
going on that we need to find out about.
Online Advertising Effectiveness?
Tell Me About It... (Part 4)
=====================
Last month we took a look at the quality of visitors
generated by my paid search listing ads on Google and GoTo.
Well, we're getting there. We've previously shown that visitors
clicking on a paid listing are of higher quality than "free
search" visitors for the same search term, and now we see there
is also significant variability in visitor quality by the term
itself, according to the chart above. Look
at Customer Loyalty (CL). Much shorter visits, and lower
download and newsletter subscribe percentages than the other two
terms, but much higher bookmarking percentages. What could this
mean? Why the difference?
The stats above are a combination of all visitors for the same
terms from both Google AdWords and GoTo, so it seems logical the next
"Drill Down" would be to look at each source individually,
and that is just what I have done. For clarity, instead of
creating two charts and having you bust your eyeballs trying to
compare them, I have created a ratio between the Google and
GoTo numbers.
Google / GoTo Ratio
================
Metric                 RM      CR      CL
___________________________________________
Avg. Visit Length      65%    125%    308%
% 1 Page Visits       110%    115%     91%
% Downloading          48%    112%    570%
% Bookmarking          72%     44%    160%
% Subscribing          85%     84%    140%
If you were to read down the Relationship Marketing (RM) column, this
chart says:
"For the paid search term Relationship Marketing, the Average
Visit Length for visitors from Google is 65% that of GoTo, the percent
one page visits is 110% that of GoTo, the percent Downloading is 48%
that of GoTo," and so on. A number over 100% means Google
is higher than GoTo; a number under 100% means Google is lower than GoTo.
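(If you want to build the same kind of ratio chart from your own
numbers, the computation is one nested loop. A minimal sketch,
assuming you have per-engine, per-term metrics in a nested dictionary -
the layout is hypothetical, not an export format of any particular log
analyzer.)

def ratio_table(stats, metrics, terms, a="Google", b="GoTo"):
    """stats[engine][term][metric] -> {term: {metric: engine a as % of engine b}}"""
    return {
        term: {m: round(100.0 * stats[a][term][m] / stats[b][term][m])
               for m in metrics}
        for term in terms
    }

# Usage:
# ratio_table(stats,
#             ["Avg. Visit Length", "% 1 Page Visits", "% Downloading",
#              "% Bookmarking", "% Subscribing"],
#             ["RM", "CR", "CL"])
# produces a table like the one above, with values over 100 meaning
# Google is higher than GoTo for that term and metric.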
One thing is perfectly clear from this chart - Google dramatically
under-performs GoTo for the paid search term Relationship Marketing,
and outperforms GoTo on the paid search term Customer Loyalty, across
the board, in every category (note a lower number on % 1 Page Visits
is better).
Things are less clear-cut for the term Customer Retention, although
I'd have to give it to GoTo because Bookmarking and Subscribing to the
newsletter are highly correlated to future purchase of a book.
Where does this leave us? Overall, it appears you cannot
attribute "quality" as defined here to either a search term
or a search engine alone; there is a combined contribution that
creates dramatic visitor quality differences. This is a perfect
example of the mistake people make when using "averages" or
looking at the "average customer" - rarely does the average
customer represent the true underlying behavior of the actual
customers.
Tactically, it means I should budget paid search expenses by term
by engine, and in the case above, shift most if not all the budget for
Relationship Marketing to GoTo, and most if not all the budget for
Customer Loyalty to Google. Customer Retention might need a
little more work to resolve, but instead of running the budget 50 / 50
as initially set up, it would make sense to maybe run 70% on GoTo, and
30% on Google, from what I see here. Hey, it doesn't always come
out black and white, you know?
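(One way to make that reallocation mechanical rather than eyeballed:
give each term / engine combination a quality score - weighted toward
the behaviors that correlate with book purchase - and split each term's
budget in proportion to the scores. A minimal sketch; the scores
themselves are hypothetical and simply echo the judgment calls above.)

def split_budget(term_budget, quality_by_engine):
    """quality_by_engine: {engine: quality score}. Returns {engine: dollars}."""
    total = sum(quality_by_engine.values())
    return {engine: round(term_budget * score / total, 2)
            for engine, score in quality_by_engine.items()}

# A 70/30 Customer Retention split falls out of scores like these
# (made-up scores, chosen only to mirror the call above):
# split_budget(1000, {"GoTo": 7, "Google": 3})  -> {"GoTo": 700.0, "Google": 300.0}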
As far as why this occurs, it's fun to speculate, but a
marketing behaviorist cares more that it does happen - it's a
fact, Jack - and takes action based on this fact. There's plenty
of time to wonder about it later, after the spending has been
reallocated and the highest ROI possible is being realized.
A "gun to the head" guess? It's the content at the
other end of the click making the difference. The content on the
Customer Loyalty page appeals more to a Google user, and the content
on the Relationship Marketing page appeals more to a GoTo user.
Why? I haven't got a clue. Check them out for yourself:
Customer Loyalty (favored by the Google user)
Relationship Marketing (favored by the GoTo user)
Let me know what you think. If the responses seem to be
trending one way or the other, I'll present the arguments in the next
newsletter. Meanwhile, the idea of content making the difference
(a 3rd variable in addition to term and engine?) is kind of
interesting - maybe there's a way to test the idea.
I'll let you know...
Practice What You Preach: Online Advertising
Effectiveness? Tell Me About It... (Part 5)
=====================
Last month we took a look at the quality of visitors generated by my
paid search listing ads on Google and GoTo.
One thing was perfectly clear from this chart -
Google dramatically under-performs GoTo for the paid search term
Relationship Marketing, and outperforms GoTo on the paid search term
Customer Loyalty, across the board, in every category (note a lower
number on % 1 Page Visits is better).
Things are less clear-cut for the term Customer Retention, although
I'd have to give it to GoTo because Bookmarking and Subscribing to the
newsletter are highly correlated to future purchase of a book.
This analysis brings up an interesting question, though. What is
the effect of the content searchers land on when clicking on a search
listing? Could the variances above be at least partially explained
by a good or poor match of the content with the expectations of the
searcher? How large could this effect be - a doubling or a tripling
of response?
That's what I tried to find out, by sending all these searchers to
the same page - my home page, which covered all three subjects in a
generic sense, and had prominent links to the same pages searchers
were sent to previously - Custom Landing pages written to match the
search term used. Note: The current Home Page is different
from the one used when this test was run. The Home Page used in
the test was similar to this
page with links to the Custom landing pages displayed prominently
at the top of the page.
The chart below shows the conversion metrics of visitors for my
three primary search terms - Relationship Marketing, Customer
Retention, and Customer Loyalty - when they are all sent to the Home
Page (far left column) and when they are sent to a Custom Page
designed to reflect the search term they were using (far right
column). Also provided for comparison are the same metrics
generated by All Search visitors and All Google search visitors (Avg. Visit Length
in minutes):
Search-Driven Visitor Conversion Metrics
================
Metric              Home Page   All Search   All Google   Custom Landing
__________________________________________________________________________
Avg. Visit Length      3.35        3.15         2.61           2.60
% 1 Page Visits         40%         44%          52%            53%
% Downloading          3.19%       3.42%        3.63%          6.01%
% Bookmarking          3.72%       5.36%        7.44%          9.84%
% Subscribing          3.19%       3.57%        3.82%          3.83%
If you were to read down the Home Page column, this chart says:
"When visitors searched the terms Relationship Marketing,
Customer Retention, and Customer Loyalty on Google and GoTo and
clicked through to the Home Page, they stayed an average of 3.35
minutes, 39.9% viewed just this page and then left, 3.19% downloaded a
book sample, 3.72% bookmarked the site, and 3.19% subscribed to the
Drilling Down newsletter (which you are reading now)."
But check out what happens when they land on a page designed for
the topic they were searching. Shorter visit (bad), higher
abandonment (bad), higher download, bookmark, and subscribe (very
good, since these stats directly correlate to future purchase of my
book).
What does this mean? Can we reconcile the "bad" and
the "good" in terms of the behavioral marketing
approach?
Well, sure. Two possibilities come to mind:
1. When I dump highly targeted visitors on the generic home page,
they stay longer and view more pages *looking for what they came to
find*, but a higher percentage then leave without engaging in the
desired behavior. When I take the exact same traffic and dump it
to Custom Landing Pages, they stay for a shorter length of time and
view fewer pages, but they download, bookmark, and subscribe at a much
higher rate, because they found exactly what they were looking for.
2. It's also likely the targeting of the Custom Landing page
itself is causing shorter visits / higher abandonment. In other
words, a visitor types in "Customer Loyalty," a pretty
generic concept, and lands on a page with a specific view on the
search term. It's more likely this specific content differs from
what was desired by the visitor *relative* to the Home Page, which by
nature is meant to have a generic appeal. The generic approach
gets the longer visit and deeper site penetration relative to the
specific approach, but also ends up driving away the specific
visitors I am looking for (those who might want to buy a book on
measuring and tracking loyalty metrics) at a higher rate.
This kind of effect is seen quite frequently in direct marketing
efforts; the more targeted you get on the front end, the lower the
"initial response" but the higher the "final
conversion" to the desired outcome you are looking for. In
this case, the listings (paid or free) are acting like the "outer
envelope" of a direct mail letter, and the landing pages are the
letter. Same kind of idea. The
results may seem intuitive to you (give them what they want and they
respond at a higher rate) but you don't know for sure until you
measure the effect. To maximize the ultimate conversion of the
whole site, you have to find the "perfect balance"
between the initial response and final conversion to the behavior.
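(To put a number on that balance, here is the trade-off per 1,000 paid
visitors, using the downloading figures straight from the chart above.)

# Downloads per 1,000 paid search visitors, Home Page versus Custom Landing page
visitors = 1000
home_downloads   = visitors * 0.0319   # 3.19% download rate -> ~32 samples
custom_downloads = visitors * 0.0601   # 6.01% download rate -> ~60 samples
# The Custom Landing page bounces more visitors (53% one-page visits vs. 40%),
# yet it nearly doubles the behavior that actually leads to book sales.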
Did you notice how the stats get better and better as you read from
the left to the right of the chart? Scroll
up and look at it again. Weird, huh? Almost mystical
in consistency. I get better performance from natural search
traffic than I get from driving highly targeted (and paid for) traffic
to the generic Home Page. And "natural" Google traffic
is even better than "All Search" engine traffic. What
does this mean?
That's right, you guessed it. I'm going to have to go down
another layer and find out what the heck is going on. Next month
we'll have the last Drill Down on this topic, I promise.
Practice What You Preach: Online Advertising
Effectiveness? Tell Me About It... (Part 6)
=====================
Last month we took a look at the influence of a custom landing page on
the quality of visitors generated by the GoTo and Google AdWords
programs. The custom landing page resulted in higher abandonment
(bad), and higher download, bookmark, and subscribe rates (very good,
since these stats directly correlate to future purchase of my book)
when compared with the home page. This was expected, since the
very targeted nature of the custom landing page tends to screen out
everybody but the most focused visitors, and for the same reason,
drives higher "action behavior" (bookmark, subscribe,
download).
This makes me wonder - do the different engines really deliver
traffic all that different in quality? Google is a bit of a
strange bird, because it is currently a media favorite and never got
into the "portal" business. What about all the other
search engines?
Here's what the "action behavior" (behavior leading to
book purchase) stats look like on the rest of them, in order of the
percent of traffic they deliver to my site:
Note: "Yahoo" excludes Google default pages
Engine______MSN__A. Vista__Yahoo__Excite
% of Search 35% 20%
19% 7%
% Download 3.2% 4.8%
2.1% 1.4%
% Bookmark 6.5% 8.1%
4.2% 8.6%
% Subscribing 2.3% 2.8%
3.6% 2.8%
Note: "Lycos" excludes Hotbot
Engine_________Netscape__Lycos__Hotbot
% of Search
5.7% 4.6%
3.5%
% Downloading 3.5%
3.2% 1.4%
% Bookmarking 1.8%
5.4% 8.6%
% Subscribing 6.1%
6.5% 2.9%
Engine__________N. Light___FAST___AOL
% of
Search
3% 1.2%
1.1%
% Downloading
1.6% 8.3% 0.0%
% Bookmarking
12.7% 12.5% 4.4%
% Subscribing
0.0% 4.1% 8.7%
Hmmm. Sure are different, aren't they? The same metric
frequently varies by a factor of two or three across the
engines. But traffic also matters. FAST delivers great
overall stats but hardly any traffic, so I should probably look into
what is going on there.
And I will. Fortunately, you will be spared the results, as
this is the promised end of the series on analyzing web logs.
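(If you do want to weigh traffic volume against visitor quality, a
rough way is to multiply each engine's action rates by its share of
search traffic. A minimal sketch, using two engines from the tables
above; treating downloads, bookmarks, and subscribes as interchangeable
"actions" is a simplification for illustration.)

# Expected "actions" per 1,000 search-engine visitors to the site, by engine.
engines = {
    #        (share of search, % download, % bookmark, % subscribe)
    "MSN":  (0.35,  0.032, 0.065, 0.023),
    "FAST": (0.012, 0.083, 0.125, 0.041),
}

for name, (share, download, bookmark, subscribe) in engines.items():
    visitors = 1000 * share
    actions = visitors * (download + bookmark + subscribe)
    print(f"{name}: {actions:.1f} expected actions per 1,000 search visitors")

# MSN yields about 42 actions, FAST about 3: FAST converts far better per
# visitor but delivers too little traffic to matter without more volume.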
What did we learn? Keywords, landing pages, paid search
links, and the search engine itself all have a tremendous impact on
the quality of your visitor traffic. Not just "an
impact," but a huge impact. All traffic is not created
equal, and if you are not doing this kind of analysis for your site,
you are undoubtedly wasting resources chasing what you think is
working, as opposed to what you know is working. My
advice - let the behavior of the visitor tell the tale.
Make sure to download and try the free visitor metrics
calculator; it works with just about any traffic analyzer and
contains 22 more metrics like the ones above. Not all of them
will apply to your web site, but you will probably find that many
of them do. If you really want to get serious about
this area, check out the book on creating
/ using visitor metrics.
Visitor metrics are all about getting customers. Once
you've mastered visitor metrics, some of you might be interested
in making more money from and keeping customers; that is what my other book, Drilling Down,
is all about - the metrics you need to create and track High ROI customer
marketing programs.