Chapter 11. Honing the Craft: SEO Research and Study

Websites

A large number of online sites cover the search marketing space. Here is a short list of some of the most well-known ones:

• Search Engine Land (http://www.searchengineland.com), owned and operated by Third Door Media
• Search Engine Watch (http://www.searchenginewatch.com), owned and operated by Incisive Media
• SEOmoz (http://www.seomoz.org), owned and operated by SEOmoz

Each of these sites publishes columns on a daily basis, with Search Engine Land and Search Engine Watch publishing multiple posts every weekday. The columns are typically written by industry experts who have been chosen for their ability to communicate information of value to their reader bases. SEOmoz also provides a wide range of tools and resources for SEO practitioners.



Magazines

Magazines and journals that cover search and web marketing issues provide an additional information stream. Although they provide less frequent coverage (they are published monthly or bimonthly), the following are worth investigating:

• Search Engine Marketing Research Journal (http://www.semj.org/)
• Search Marketing Standard (http://www.searchmarketingstandard.com/)
• Website Magazine (http://www.websitemagazine.com/)



Commentary from search engine employees

Search engine representatives sometimes actively participate in forums, or publish blog posts and/or videos designed for webmasters. The main blogs for each of the three major search engines at the time of this writing are:

• Google Webmaster Central Blog (http://googlewebmastercentral.blogspot.com/)
• Bing’s WebLog (http://blogs.msdn.com/livesearch/)
• Yahoo! Search Blog (http://ysearchblog.com/)

The search engines use these blogs to communicate official policy, announce new features, and provide webmasters with useful tips. You can reach Google personnel via the Google Webmaster Help group in Google Groups (http://groups.google.com/group/Google_Webmaster_Help/topics). Members of the Google webspam team are active in this group, answering questions and even starting their own new threads from time to time.

You can also interact with search engine representatives in various forums, such as WebmasterWorld (http://www.webmasterworld.com) and Search Engine Roundtable (http://forums.seroundtable.com/). Sometimes they use a nickname such as “googleguy” or “msndude,” so watch for those. You can also watch for search engine personnel who leave comments in popular SEO blogs. We will discuss the value of forums in more detail in “The SEO Industry on the Web” on page 500.



Interpreting commentary

Search engine reps are “managed” by their corporate communications departments. Some aren’t allowed to go on the record. Some need approval before doing so, and/or need their comments edited before publication. A rare few have free rein (e.g., Matt Cutts). Often they can’t be very specific, or they can’t answer questions at all. The algorithms the search engines use are highly proprietary, and they need to keep them secret.

This means there are certain types of questions they won’t answer, such as “What do I have to do to move from position 3 to position 1 on a particular search?” or “How come this spammy site ranks so much higher than mine?”

In addition, they have their own motives and goals. They want to reduce the amount of spam in their search index and on the Web overall (which is a good thing), but this may lead them to take positions on certain topics based on those goals.

As an example, Google does not talk about its capability for detecting paid links, but it suggests that its ability to detect them is greater than the general webmaster community believes. Taking this position is, in itself, a spam-fighting tactic, since it may scare away people who otherwise might have chosen to buy links (as we indicated in Chapter 7, we do not recommend purchasing links, but this example is meant to illustrate how a policy might affect communications).

In spite of these limitations, you can gather a lot of useful data from interacting with search engine representatives.



SEO Testing

SEO is both an art and a science. As with any scientific discipline, it requires rigorous testing of hypotheses. The results need to be reproducible, and you have to take an experimental approach so as not to modify too many variables at once. Otherwise, you will not be able to tell which changes were responsible for the results.

And although you can glean a tremendous amount of knowledge of SEO best practices, latest trends, and tactics from SEO blogs, forums, and e-books, it is hard to separate the wheat from the chaff and to know with any degree of certainty that an SEO-related claim will hold true. That’s where testing your SEO comes in: proving what works and what doesn’t.

Unlike multivariate testing for optimizing conversion rates, where many experiments can be run in parallel, SEO testing requires a serial approach. Everything must filter through the search engines before the impact can be gauged. This is made more difficult by the fact that there’s a lag between making changes and having the revised pages get spidered, as well as another lag while the spidered content makes it into the index and onto the search engine results pages (SERPs). On top of that, the results delivered depend on the user’s search history, the Google data center accessed, and other variables that you cannot hold constant.



Sample experimental approach

Let’s imagine you have a product page with a particular ranking in Google for a specific search term, and you want to improve the ranking and resultant traffic. Rather than applying a number of different SEO tactics at once, start varying things one at a time:

1. Tweak just the title tag and see what happens.
2. Continue making further revisions to the title tag in multiple iterations until your search engine results show that the tag truly is optimal.
3. Move on to the H1 tag, tweaking that and nothing else.
4. Watch what happens. Optimize it in multiple iterations.
5. Move on to the intro copy, then the breadcrumb navigation, and so on.

You can test many different elements in this scenario, such as:

• Title tag
• Headline (H1) tag
• Placement of the body copy in the HTML
• Presence of keywords in the body copy
• Keyword prominence
• Keyword repetitions
• Anchor text of internal links to that page
• Anchor text of inbound links to that page from sites over which you have influence

Testing should be iterative and ongoing, not just a “one-off” in which you give it your best shot and you’re done. If you’re testing title tags, continue trying different things to see what works best. Shorten it; lengthen it; move words around; substitute words with synonyms. If all else fails, you can always put it back to the way it was.

When doing iterative testing, it’s good to do what you can to speed up the spidering and indexation so that you don’t have to wait as long between iterations to see the impact. You can do this by flowing more link juice to the pages you want to test: that means linking to them from higher in the site tree (e.g., from the home page). But be sure to give it some time before forming your baseline, because you will want the impact of changing the internal links to show in the search engines before initiating your test (to prevent the two changes from interacting).






Or you can use the Google Sitemaps protocol to set a priority for each page from 0 to 1.0. Dial up the priority to 1.0 to increase the frequency with which your test pages will be spidered.
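To put the protocol to work, here is a minimal sketch that writes such a sitemap, assuming hypothetical URLs; only the page under test is dialed up to 1.0, for the reason given in the note that follows.

    # A minimal sketch, assuming hypothetical URLs: write a sitemap that
    # raises crawl priority only for the page under test, leaving other
    # pages differentiated below it.
    pages = {
        "http://www.example.com/": 0.8,
        "http://www.example.com/products/widget": 1.0,  # page under test
        "http://www.example.com/about": 0.3,
    }

    entries = "\n".join(
        "  <url>\n    <loc>%s</loc>\n    <priority>%.1f</priority>\n  </url>"
        % (url, priority)
        for url, priority in pages.items()
    )

    with open("sitemap.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        f.write(entries + "\n</urlset>\n")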



NOTE

Don’t make the mistake of setting all your pages to 1.0; if you do, none of your pages will be differentiated from each other in priority, and thus none will get preferential treatment from Googlebot.



Since geolocation and personalization mean that not everyone is seeing the same search results, you shouldn’t rely on rankings as your only bellwether regarding what worked or didn’t work.



Other useful SEO metrics

As we discussed in Chapter 9, many other meaningful SEO metrics exist, including:

• Traffic to the page
• Spider activity
• Search terms driving traffic per page
• Number and percentage of pages yielding search traffic
• Searchers delivered per search term
• Ratio of brand to nonbrand search terms
• Unique pages spidered
• Unique pages indexed
• Ratio of pages spidered to pages indexed
• Conversion rate
• And many others
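Several of these metrics can be computed straight from your server logs. As a simplified illustration (the log file name is an assumption, and matching the literal token "Googlebot" is a naive stand-in for real bot verification), here is a sketch for two of them: spider activity and unique pages spidered.

    # A simplified sketch: count spider activity and unique pages
    # spidered from a combined-format access log. Matching "Googlebot"
    # in the raw line ignores UA spoofing and other engines' bots, but
    # it illustrates the metric.
    import re
    from collections import Counter

    LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+"')

    hits = Counter()
    with open("access.log") as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            m = LINE.search(line)
            if m:
                hits[m.group("path")] += 1

    print("Googlebot requests:", sum(hits.values()))
    print("Unique pages spidered:", len(hits))
    for path, count in hits.most_common(10):
        print(count, path)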

But just having better metrics isn’t enough. An effective testing regimen also requires a platform that is conducive to performing rapid-fire iterative tests, in which each test can be associated with reporting based on these new metrics. Such a platform comes in very handy with experiments that are difficult to conduct under normal circumstances.

Testing a category name revision applied sitewide is harder than, say, testing a title tag revision applied to a single page. Specifically, consider a scenario where you’re asked to make a business case for changing the category name “kitchen electrics” to a more search-engine-optimal “kitchen small appliances” or “small kitchen appliances”. Conducting the test to quantify the value would require applying the change to every occurrence of “kitchen electrics” across the website. A tall order indeed, unless you can conduct the test as a simple search-and-replace operation, which you can do by applying it through a proxy server platform.






By acting as a middleman between the web server and the spider, a proxy server can facilitate useful tests that normally would be invasive on the e-commerce platform and time-intensive for the IT team to implement.
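To illustrate the mechanics, here is a minimal sketch of such a rewriting proxy, assuming a hypothetical origin site and the single phrase swap from the example above; a production proxy platform would also rewrite links and headers, handle other HTTP methods and content types, and cache responses.

    # A minimal sketch of a rewriting proxy, assuming a hypothetical
    # origin server and one find-and-replace rule.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    UPSTREAM = "http://www.example.com"  # hypothetical origin server
    FIND, REPLACE = b"kitchen electrics", b"small kitchen appliances"

    class RewritingProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            # Fetch the original page from the real site...
            with urlopen(UPSTREAM + self.path) as resp:
                body = resp.read()
            # ...then serve it with the test phrase swapped in sitewide.
            body = body.replace(FIND, REPLACE)
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8080), RewritingProxy).serve_forever()

Pointed at a test hostname, a setup like this lets you gauge the renamed category without touching the e-commerce platform itself.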



NOTE

During the proxying process, not only can words be replaced, but also HTML, site navigation, Flash, JavaScript, frames, even HTTP headers—almost anything. You can also do some worthwhile side-by-side comparison tests: a champion/challenger sort of model that compares the proxy site to the native website.



Start with a hypothesis

A sound experiment always starts with a hypothesis. For example, if a page isn’t performing well in the engines and it’s an important product category, you might hypothesize that the page isn’t performing well because it’s not well linked from within your site. Or you may hypothesize that the page isn’t ranking well because it is targeting unpopular keywords, or because it doesn’t have enough copy.

Once you have your hypothesis, you can set up a test to gauge its truth. Try these steps:

1. In the case of the first hypothesis, link to that page from the home page and measure the impact.
2. Wait at least a few weeks for the impact of the test to be reflected in the rankings.
3. If the rankings don’t improve, formulate another hypothesis and conduct another test.

Granted, this can be a slow process if you have to wait a month for the impact of each test to be revealed, but in SEO, patience is a virtue. Reacting too soon to changes you see in the SERPs can lead you to false conclusions. You need to give the search engines time to fully process what you have done so that you can improve the chances that you are drawing the right conclusions based on your tests. You also need to remember that the search engines may be making changes in their algorithms at the same time.



Analysis of Top-Ranking Sites and Pages

There are many reasons for wanting to analyze top-ranking sites, particularly those that are top-ranking in your market space. They may be your competitors’ sites—which is reason enough to explore what they are doing—but even if they are not, it can be very helpful to understand the types of things they are doing to see how those things helped them get their top rankings. With this information in hand you can be better informed as you decide how to put together the strategy for your site.






Let’s start by reviewing a number of metrics of interest and how to get them:

1. Start with a simple business analysis to see how a particular company’s business overlaps with yours and with other top-ranked sites in your market space. It is good to know who is competing directly and who is competing only indirectly.
2. Find out the starting year for the website. This can be helpful in evaluating the site’s momentum. Determining the domain age is easy; you can do it by checking the domain’s Whois records. Obtaining the age of the site is trickier. However, you can use the Wayback Machine (http://www.archive.org) to get an idea of when a site was launched (or at least when it had enough exposure that Archive.org started tracking it). A scripted way to do this, along with the date-restricted queries in steps 3 and 4, is sketched after this list.
3. Determine the number of Google results for a search for the domain name (including the extension) for the past six months, excluding the domain itself. To get this information, search for theirdomain.com -site:theirdomain.com in Google. Then append &as_qdr=m6 to the end of the results page URL and reload the page.
4. Determine the number of Google results for a search for the domain name (including the extension) for the past three months, excluding the domain itself. This time modify the results page URL by adding &as_qdr=m3 to the end of it.
5. Perform a query on Google Blog Search (http://blogsearch.google.com/) for the domain name, excluding the domain itself, on the default settings (no particular timeline).
6. Find out from Google Blog Search how many posts have appeared about the site in the past month. To do this, search for the domain in Google Blog Search, but then append &as_qdr=m1 to the end of the results page URL and reload the page.
7. Obtain the PageRank of the domain’s home page as reported by the Google Toolbar.
8. Use Yahoo! Site Explorer to determine how many backlinks the site has. Better still is if you have an industrial-strength tool such as SEOmoz’s Linkscape (http://www.seomoz.org/linkscape) or Majestic-SEO (http://www.majesticseo.com). These tools provide a much richer set of link data based on their own crawl of the Web, including additional critical details such as the anchor text of the links.
9. Pull a traffic chart from Compete (http://www.compete.com) that compares the site’s traffic to that of its direct competitors to see how it is doing trafficwise. Repeat the traffic analysis with Quantcast (http://www.quantcast.com) and Alexa (http://www.alexa.com). Be careful not to put too much stock in the specific numbers, as the measurement techniques of these services are crude, but the relative traffic numbers should be information of great interest (e.g., competitor A is doing better than competitor B, and how they both compare to you).
10. If you are able to access a paid service such as Hitwise (http://www.hitwise.com) or comScore (http://www.comscore.com), you can pull a rich set of additional data, breaking out the site’s traffic by source (e.g., organic versus paid versus direct traffic versus other referrers). You can also pull information on their highest-volume search terms for both paid and organic search.
11. Determine the number of indexed pages in each of the three major search engines, using site:theirdomain.com.
12. If relevant, obtain Technorati’s authority number for the site, which derives from the number of individual, unique blogs that have linked to the site in the past 90 days.
13. If relevant, get Google’s feed subscriber numbers for the site, which you can find by searching for the domain inside Google Reader.
14. If relevant, determine Bloglines’ subscription numbers for the site, which derive from searches performed inside Bloglines.
15. Search on the company brand name at Google, restricted to the past six months (by appending &as_qdr=m6 as outlined earlier).
16. Repeat the preceding step, but for only the past three months (using &as_qdr=m3).
17. Perform a Google Blog Search for the brand name using the default settings (no time frame).
18. Repeat the preceding step, but limit it to blog posts from the past month (using &as_qdr=m1).
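As referenced in step 2, these checks lend themselves to scripting. Here is a hedged sketch: it queries the Wayback Machine’s public availability API for an early snapshot, and it builds (but does not fetch) the date-restricted Google query URLs from steps 3 and 4, since those are meant to be opened in a browser. The domain is a placeholder.

    # A sketch of steps 2-4, assuming a placeholder domain. The Wayback
    # Machine availability API returns the snapshot closest to a given
    # timestamp; asking for a very early date approximates the site's
    # first capture.
    import json
    from urllib.parse import quote_plus
    from urllib.request import urlopen

    domain = "theirdomain.com"  # placeholder

    # Step 2: earliest Wayback Machine snapshot as a proxy for site age.
    api = ("http://archive.org/wayback/available?url=%s&timestamp=19960101"
           % quote_plus(domain))
    data = json.load(urlopen(api))
    snapshot = data.get("archived_snapshots", {}).get("closest", {})
    print("Earliest known capture:", snapshot.get("timestamp", "none found"))

    # Steps 3-4: date-restricted mention queries, built for manual use.
    query = quote_plus("%s -site:%s" % (domain, domain))
    for months in ("m6", "m3"):
        print("https://www.google.com/search?q=%s&as_qdr=%s" % (query, months))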

Of course, this is a pretty extensive analysis to perform, but it’s certainly worthwhile for the few sites that are the most important ones in your space. You might want to pick a subset of these steps for other related sites.



NOTE

As valuable as website metrics are, brand names can sometimes provide even more insight. After all, not everyone is going to use the domain name when talking about a particular brand, nor will everyone link to the site. Thus, looking at brand mentions over the past few months can provide valuable analysis.



Analysis of Algorithmic Differentiation Across Engines and Search Types

Each search engine makes use of its own proprietary algorithms to crawl and index the Web. Although many of the basic elements are the same (such as links being used as votes), there are significant differences among the engines. Here are some examples of elements that can vary in on-page SEO analysis:

• Weight of title tags
• Weight of heading tags
• Emphasis on title tags
• Weight placed on synonyms
• Value of internal link anchor text
• How internal links are weighted as votes for a page
• Duplicate content filtering methods
• And many, many more

Similarly, there are many different ways a search engine can tune its algorithm for evaluating links:

• Percentage of a page’s link juice that it can vote for other pages
• Weight of anchor text
• Weight of text near the anchor text
• Weight of overall linking page relevance
• Weight of overall relevance of the site with the linking page
• Factoring in placement of the link on the page
• Precise treatment of NoFollow
• Other reasons for discounting a link (obviously paid, manually tagged as paid, etc.)
• And many, many more

A detailed understanding of the specifics of a search engine’s ranking system is not possible. However, with determination you can uncover various aspects of how the search engines differ. One tactic for researching search engine differences is to conduct comparative searches across the engines. For example, if you search on blog in Google and Yahoo!, you get the data shown in Table 11-1.

TABLE 11-1. Comparison of top five results for “blog” in Google and Yahoo!

  Google                               Yahoo!
  http://www.blogger.com               http://www.blogger.com
  http://googleblog.blogspot.com       http://360.yahoo.com
  http://en.wikipedia.org/wiki/Blog    http://blogsearch.google.com
  http://sethgodin.typepad.com         http://en.wikipedia.org/wiki/Blog
  http://kanyeuniversecity.com/blog    http://wordpress.com

There are some pretty significant differences. For example, notice how Yahoo! has Google Blog Search as its #3 result. Google has most likely filtered this out of its results because it is a search engine home page, and Yahoo! has not. Yahoo! also has 360.Yahoo.com in the #2 position, whereas Google does not list it as a result at all.

You can also try to conduct some detailed analysis to guess why Seth Godin shows up in Google but not in Yahoo!, or why Yahoo! is showing the WordPress site. You may find that Google is weighting anchor text more heavily, whereas Yahoo! places greater value on contextual analysis.

It is also interesting to analyze the similarities. Both search engines have Blogger.com in the #1 position and the Wikipedia page in a high position. This speaks to things the algorithms have in common.
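This kind of comparison is easy to quantify. Here is a toy sketch that takes the two result lists from Table 11-1 and measures their overlap and per-URL rank differences; run across a basket of queries, the same arithmetic gives a rough signal of how differently two engines order the same pool of pages.

    # A toy comparison using the two columns of Table 11-1.
    google = [
        "blogger.com", "googleblog.blogspot.com",
        "en.wikipedia.org/wiki/Blog", "sethgodin.typepad.com",
        "kanyeuniversecity.com/blog",
    ]
    yahoo = [
        "blogger.com", "360.yahoo.com", "blogsearch.google.com",
        "en.wikipedia.org/wiki/Blog", "wordpress.com",
    ]

    shared = set(google) & set(yahoo)
    print("Overlap: %d of %d results" % (len(shared), len(google)))
    for url in shared:
        # Positions are 1-based; a positive delta means Yahoo! ranks it lower.
        delta = yahoo.index(url) - google.index(url)
        print("%s: Google #%d, Yahoo! #%d (delta %+d)"
              % (url, google.index(url) + 1, yahoo.index(url) + 1, delta))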



Using Experience and Instinct

There are some commonly perceived differences among the search engines. For example, Yahoo! is perceived to be more focused on contextual analysis than Google, and Google is believed to place greater weight on link analysis than Yahoo! or Bing (note that link analysis is very important to all three engines).

There are also institutional biases to consider. Yahoo!, for example, runs some of the world’s most popular websites (e.g., Yahoo.com), and this gives Yahoo! access to data from those sites that the other search engines don’t have. Given its huge market share, Google has the richest array of actual search data. Similarly, Microsoft has substantial assets in terms of its software and operating system businesses, MSN (http://www.msn.com), and products such as Hotmail. The nature of the data each company has available can influence how its search engine makes decisions.

Over time and with experience, you can develop a sixth sense for the SERPs, so that when you look at a set of search results you will have a good grasp of the key factors in play without having to analyze dozens of them.



Competitive Analysis

Everything we discussed previously regarding analyzing top sites applies to analyzing competitors as well, and there are additional analysis methods that can help you gain a thorough understanding of how your competitors in search are implementing their SEO strategies.



Content Analysis

When examining a competing website, ask yourself the following questions:

• What content do they currently have on their site(s)? Answering this question will tell you a lot, including how they view their customers and how they use content to get links. For example, they may have a series of how-to articles related to their products, or a blog, or some nifty free tools that users may like. If they do have a blog, develop a sense of what they write about. Also, see whether the content they are developing is noncommercial or simply a thinly disguised ad.
• How rapidly is that content changing? Publishers with rapidly changing websites are actively investing in them, whereas those who are not adding new articles or updating content may not be.
• What type of content is it? Articles? Videos? Images? Music? News feeds? The type of content they focus on can give you insight into their marketing strategy.
• Are they collecting user-generated content? Sites that gather a meaningful amount of user-generated content tend to have an engaged user audience.
• Are they trying to generate sign-ups or conversions in a direct way with their content? Or is it editorial in tone and structure? This can give you more visibility into how their strategy is put together.



Internal Link Structure and Site Architecture

Your competitors’ site organization and internal linking structure can indicate their priorities. Content linked to from the home page is typically important. For example, the great majority of websites have a hierarchy in which the major subsections of the site are linked to from the home page, and perhaps also from global navigation that appears on all or most of the pages on the site.

But what else is linked to from the home page? If the competitor is SEO-savvy, this could be a clue to something they are focusing on. Alternatively, they may have discovered that traffic to a given piece of content converts well into business. Either way, this is interesting to know.
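A quick way to survey what a competitor links to from the home page is to extract the links programmatically. Here is a minimal sketch using only the Python standard library, with a placeholder URL; it prints each link’s anchor text and target, which you can then scan for anything beyond the standard global navigation.

    # A minimal sketch: list the links (anchor text and href) on a
    # competitor's home page. The URL is a placeholder.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkLister(HTMLParser):
        def __init__(self):
            super().__init__()
            self.current_href = None
            self.text = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.current_href = dict(attrs).get("href")
                self.text = []

        def handle_data(self, data):
            if self.current_href is not None:
                self.text.append(data.strip())

        def handle_endtag(self, tag):
            if tag == "a" and self.current_href:
                anchor = " ".join(t for t in self.text if t)
                print("%-40s -> %s" % (anchor[:40], self.current_href))
                self.current_href = None

    page = urlopen("http://www.example.com/").read().decode("utf-8", "replace")
    LinkLister().feed(page)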



External Link Attraction Analysis

You can extract a tremendous amount of information via link analysis. Yahoo! Site Explorer provides a start, but going past a simple list of links to gathering link data such as anchor text and link authority requires a more advanced tool, such as Link Diagnosis (http://www.linkdiagnosis.com), Linkscape (http://www.seomoz.org/linkscape), or Majestic-SEO (http://www.majesticseo.com).

In addition, you can do many other things by conducting an external link analysis of your competitors:

Dig deeper
What pages on the site are attracting the most links? These are probably the most important pages on the site, particularly if those pages also rank well for competitive search terms.

Determine where the content is getting its links
This can help you develop your own content and link-building strategies.

Analyze anchor text
Do they have an unusually large number of people linking to them using highly optimized anchor text? If so, this could be a clue that they are buying links (a quick way to measure this is sketched after this list). If your analysis shows that they are not buying links, take a deeper look to see how they are getting optimized anchor text. This may be an indicator of a good strategy for you to use with your own site.

Determine whether they build manual links
Manual links come in many forms, so look to see whether they appear in lots of directories, or whether they are exchanging a lot of links.

Determine whether they are using a direct or indirect approach
Try to see whether they are using indirect approaches such as PR campaigns or social media campaigns to build links. Some sites do well building their businesses just through basic PR. If there are no signs that they are doing a significant amount of PR or social-media-based link building, it could mean they are aggressively reaching out to potential linkers by contacting them directly. If they are using social media campaigns, try to figure out what content is working for them and what content is not.

Determine whether they are engaging in incentive-based link building
Are they offering incentives in return for links, such as award programs or membership badges? Do those programs appear to be successful?

Determine their overall link-building focus
Break down the data some more and see whether you can determine where they are focusing their link-building efforts. Are they promoting viral videos, or implementing a news feed and reaching influencers through Google News and Yahoo! News?
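As referenced under “Analyze anchor text,” here is a hedged sketch of that measurement. It assumes you have exported a competitor’s backlinks to a CSV with source_url and anchor_text columns (the exact file layout is hypothetical; the link tools above offer various exports) and reports how concentrated the anchor text is.

    # A sketch assuming a hypothetical backlinks.csv export with
    # "source_url" and "anchor_text" columns. Heavy concentration in a
    # few exact-match commercial phrases, rather than brand names or
    # "click here"-style anchors, is one possible sign of paid links.
    import csv
    from collections import Counter

    anchors = Counter()
    with open("backlinks.csv", newline="") as f:
        for row in csv.DictReader(f):
            anchors[row["anchor_text"].strip().lower()] += 1

    total = sum(anchors.values())
    print("Top anchor texts by share of %d links:" % total)
    for text, count in anchors.most_common(10):
        print("%5.1f%%  %s" % (100.0 * count / total, text))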



What Is Their SEO Strategy?

Wrap up your competitive analysis by figuring out what their SEO strategy is. First, are they SEO-savvy? Here are some quick signals that flag a competitor who is not SEO-savvy:

• Poorly optimized anchor text (if you see lots of links that say “Read more” or “Click here,” it is a dead giveaway that they are not SEO-savvy).
• Content buried many levels deep. As you know from Chapter 6, flat site architectures are a must in SEO.
• No signs of an active link-building campaign. Some sites can still rise to the top just because of a very effective (traditional) PR campaign.
• Critical content made inaccessible to spiders, such as content behind forms or content that can be reached only through JavaScript.
• And so forth.

Basically, if you see people making obvious errors and not following best or even good practices in basic ways, they are not very SEO-savvy. These competitors may still be dangerous if they
