E-Marketing Performance Blog

Google's Webmaster Guidelines: Design and Content Guidelines

Over the next several issues I'm going to go over Google's Webmaster Guidelines. Many of the guidelines posted here are no-brainers; however, some are often overlooked or considered not-so-important. Certainly the importance of any single guideline presented by Google can be up for debate, but each must carry some measure of importance if Google has gone out of its way to publish it on their site.

Each of Google's guidelines is quoted below, followed by a brief discussion.

Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

We all know by now that every page should be reachable by clicking through from within the site itself. In the past it was common to create orphan pages to act as doorways into a site. These orphan pages, not being accessible by clicking from any other page on the site, are simply considered of little importance by the search engines. If the visitor can't find a page, then that page must not be of significant enough value; after all, if it were an important page to share with your visitors, you would have provided a link to it.
What catches my eye in this guideline is that Google recommends your site be built with a clear hierarchical structure. Many webmasters, and even SEOs, feel that you should have as many pages as possible linked from the home page (without overwhelming the visitor with links). Google, on the other hand, wants to get a sense of what your main categories are: what's most important and what is less important.

We do the same thing with textual content. Sure, all the content on the page is important, but some sentences and words get bolded for emphasis. Google is looking for the same kind of emphasis here. If we bold all the content on our pages, essentially all of it is seen as the same and no emphasis is added. The same goes for sites without a clear hierarchical structure: Google has no way of knowing which pages are most important to the visitor. If no structure is in place, because we feel that all the pages are most important, we are eliminating a potential relevance boost that some pages might otherwise receive.
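As an illustration, a clear hierarchy might look something like this (the section names here are hypothetical):

    Home
      Products
        Widgets
        Gadgets
      Services
        Consulting
      About Us
        Contact

The home page links to the main categories, each category page links to its own sub-pages, and the structure itself tells Google which pages carry the most weight.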

Finally, notice that Google recommends text links. There are several kinds of links that simply don't play well with search engine spiders, namely JavaScript menu systems and links inside Flash elements. You will want to make sure that any links in these kinds of menu systems are duplicated with crawlable text links. What isn't clear here is whether Google is recommending text links over image links. Certainly there is value in having a link with readable text, whereas images cannot be read (the image's alt attribute can), but is a text link superior to an image link with alt text? I don't know, but it certainly would not hurt to duplicate any image links as well, if and when possible.
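As a rough sketch of what that duplication might look like (the URLs, file names, and function names here are all hypothetical):

    <!-- JavaScript menu entry: spiders may not follow this -->
    <a href="#" onclick="showMenu('products'); return false;">Products</a>

    <!-- image link: only the alt text is readable -->
    <a href="/products/"><img src="nav-products.gif" alt="Products"></a>

    <!-- plain text duplicate: always crawlable -->
    <a href="/products/">Products</a>

A simple row of text links in the page footer is a common, unobtrusive way to provide these duplicates.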

Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.

Site maps can certainly be useful to visitors. If they are unsure where to click to find the information they want, they can find it by getting to the site map. Site maps also make it easier for search engines to find all the pages on your site without having to dig too hard, which makes each of your pages more accessible for indexing. For large sites, site maps can be quite cumbersome. Notice that Google emphasizes linking to the "important parts of your site": a site map doesn't have to include every page of your site, though if that can be done it would certainly be recommended.
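A bare-bones site map page, sketched in HTML with hypothetical section and page names, might look like this:

    <h1>Site Map</h1>

    <h2>Products</h2>
    <ul>
      <li><a href="/products/widgets/">Widgets</a></li>
      <li><a href="/products/gadgets/">Gadgets</a></li>
    </ul>

    <h2>Support</h2>
    <ul>
      <li><a href="/support/faq/">Frequently Asked Questions</a></li>
      <li><a href="/support/contact/">Contact Us</a></li>
    </ul>

Grouping the links under headings keeps the page useful to visitors and, if it grows past 100 or so links, gives you natural places to split it into separate pages.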

Create a useful, information-rich site and write pages that clearly and accurately describe your content.

This one is a no-brainer. Sites with little or no content will ultimately perform poorly. Search engines need to be able to read textual content in order to know what your site is about and how to assign its relevance for search rankings. The key words here are "useful" and "information-rich." Google wants to offer up sites that provide the best user experience. If your site is rich with information and visitors find it of value (i.e., spend time on your site), then your site is more likely to be seen as having higher importance.

Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

This is essentially keyword targeting. Many people come to us wondering why they don't rank well for certain phrases, but when I go and look, those phrases are not used on the site. That's not going to get you very far. Another trap many sites fall into is using phrases that the site owner knows and understands but the average person does not. It's important to consider how searchers might look for what you offer without knowing what it's really called.

Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images.

Graphics look good and allow you to use any font and style you want, but they don't provide any readable content for the search engines. I've seen sites where the entire text was laid out in a graphic because that is how it looked best. Unfortunately, unless the site owner is willing to part with that picture-perfect look, the search engines won't find the site of much use. It's pretty common to use images for headings or titles. These headings are usually keyword-rich and important, but by placing them in an image you are keeping that "important" element from being considered by the search engines.
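To illustrate, here is a heading locked inside an image versus the same heading as styled text (the file name and styling are hypothetical):

    <!-- heading inside an image: only the alt text is readable -->
    <img src="heading-widgets.gif" alt="Industrial Widgets">

    <!-- the same heading as styled text: fully readable by spiders -->
    <h1 style="font-family: Georgia, serif; color: #336699;">Industrial Widgets</h1>

With a little CSS you can usually get close to the look of the graphic while keeping the words themselves on the page.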

Make sure that your TITLE and ALT tags are descriptive and accurate.

In other words, don't stuff your title and alt attributes with junk. To me, descriptive means using natural language and properly worded descriptions. A title or alt attribute that is simply jammed with keywords will not fare well.
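A quick sketch of the difference (the product and company names are made up):

    <!-- stuffed: unlikely to fare well -->
    <title>widgets cheap widgets buy widgets best widget store widgets</title>
    <img src="widget.jpg" alt="widgets cheap widgets buy widgets">

    <!-- descriptive: natural language -->
    <title>Acme Widget Co. - Hand-Machined Brass Widgets</title>
    <img src="widget.jpg" alt="A hand-machined brass widget on a workbench">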

Check for broken links and correct HTML.

When a site is first designed we usually check for broken links, but after launch, changes get made: pages and files are moved or renamed, and sometimes we neglect to verify that all the links still work properly. Even if a page is accessible via another route, a broken link sends a message, not only to the search engine but to the visitor as well, about your professionalism.
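One quick way to recheck a live site, assuming you have wget available, is to let it spider the site and then scan the log for errors (the domain here is a placeholder):

    # crawl the site without saving pages, writing the log to wget.log
    wget --spider -r -o wget.log http://www.example.com/

    # pull out any broken links
    grep -B 2 "404 Not Found" wget.log

Dedicated link checkers will give friendlier reports, but this catches the obvious breaks.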

Checking for correct HTML is probably the most overlooked recommendation, overlooked by web designers and SEOs alike. You don't need perfect HTML for a page to look right in a browser; in fact, you can have very poor HTML and still have a great-looking site. Keep in mind, however, that browsers are much more forgiving of bad code than search engine spiders are. While the page may look good to the viewer, bad code may actually be preventing the search engines from properly indexing your entire site, or even individual pages.
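For illustration, here is the kind of sloppy markup a browser will happily render but a spider may stumble over, alongside a corrected version:

    <!-- sloppy: renders fine in most browsers -->
    <ul>
      <li>First item
      <li>Second item
    <p>Unclosed paragraph
    <b><i>Mismatched nesting</b></i>

    <!-- corrected -->
    <ul>
      <li>First item</li>
      <li>Second item</li>
    </ul>
    <p>Properly closed paragraph</p>
    <b><i>Correct nesting</i></b>

The W3C's free validator at http://validator.w3.org will flag these kinds of errors and more.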

I would strongly recommend going a step beyond simply fixing any issues that might prevent spidering and getting your pages to validate according to W3C standards. Many SEOs will disagree with me on the necessity of full validation. They will point out that poorly coded sites can still perform excellently in the search engines, and I won't disagree with that, as I've had my share of experiences getting good rankings for sites that simply cannot validate 100%. On the other hand, I believe that Google's algorithm looks beyond the simple relevance of on- and off-the-page content to the credibility of the site itself. It's well known that some sites reach a certain threshold and are considered sites of "authority." I think it would be less difficult to reach that threshold if you present a site constructed in a manner that complies with other established authorities, in this case the W3C, which sets the standards by which HTML is validated.

If you decide to use dynamic pages (i.e., the URL contains a ‘?’ character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them small.

Search engines are getting smarter about indexing dynamic content, but there are still pitfalls with certain dynamic systems. It's best if your dynamic pages can be displayed at static URLs, and I would still recommend implementing that if and when possible. For some sites that isn't possible, or plausible, and the dynamic URLs have to stay. In those cases, it's best not to have too many parameters in the URL. Most engines will reach a certain threshold (I've heard five is the maximum, but cannot confirm this) and then stop. Keep these URLs as short as possible, and if your system won't give you what you need, there are a number of search-engine-friendly shopping cart systems available.
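For sites running on Apache, one common way to present static-looking URLs is mod_rewrite. This is only a sketch, and the paths and parameter names are hypothetical:

    # in .htaccess: map /widgets/42/ to the real dynamic script
    RewriteEngine On
    RewriteRule ^widgets/([0-9]+)/?$ /catalog.php?category=widgets&item=$1 [L]

Visitors and spiders see /widgets/42/, while the server quietly serves the same dynamic page behind the scenes.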

Keep the links on a given page to a reasonable number (fewer than 100).

Reasonable is a subjective term, but Google apparently defines it as 100. My personal recommendation is to keep the links to a much more conservative number, which I will define as 25. This isn't always possible, but in most cases it is. If you are creating a directory, 100 is the maximum, but if the purpose of that directory is to provide value to the pages being linked to, then your best bet is to keep those links down to 25 or fewer.
