Global Marketing Plus

Tips and Tricks for Small Business Success

Archive for July, 2008

Strategy #6. Keyword Density

July 19, 2008 By: Ron Coleman Category: SEO, Website Design

Search engines love relevant text. They want to match the keywords that an end user types into a search engine with keywords that are located on your website.

Keyword density analysis measures one of the most important of these ratios: how often your keywords appear on an individual webpage relative to its total word count.

What is keyword density? It’s a percentage, calculated this way:

Keyword Density = (Number of times keyword appears on a page / Total word count on page) × 100

So, if you have a page that has 100 words on it, and you have a keyword appear 5 times on the page, your page would have a keyword density of 5%. (5 / 100 = 5%)
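A rough sketch of this calculation in Python (the function name and the simple word-splitting approach are my own assumptions, not from the post):

```python
import re

def keyword_density(text, keyword):
    """Percentage of a page's words taken up by occurrences of a keyword phrase."""
    words = re.findall(r"[\w'-]+", text.lower())
    phrase = keyword.lower().split()
    n = len(phrase)
    # Count occurrences of the (possibly multi-word) keyword phrase.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits / len(words) if words else 0.0

# 5 occurrences in a 100-word page -> prints 5.0
print(keyword_density(" ".join(["seo"] * 5 + ["filler"] * 95), "seo"))
```

Real tools count words in slightly different ways (some skip stop words or HTML markup), so expect small variations from one analyzer to the next.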

In a real life example, the search term “website design” has an overall keyword density on this page of 4.13%:


Click the graphic above to see a live sample of a keyword page.

(10 instances of the keyword / 242 total words on the page = 4.13%)

However, not all keywords on a page are treated the same. Keywords in the title tags, page name and section headings are often given higher weight than keywords that appear in the regular content area of the page.

Here’s how the keywords break down in the different areas of the site:

Description     Keywords    Total Words    Percentage
Title Tag       1           6              16.6%
Page Name       1           1              100%
Linked Text     4           83             4.8%

So, how much keyword density is too much? It depends on which study you read, but it’s generally best to keep your keyword density between 3% and 6%. Anything more, and you risk being penalized for trying to spam the search engines.

As a general rule of thumb, if the copy of the site makes sense to a human reading it, you should be fine. But if you repeat the same keyword five times in a row (Website Design, Website Design, Website Design, etc.), you can be penalized or even banned from search engines.

Let me know if you’d like us to do a keyword density analysis on your site…

Strategy #5. Why Sitemaps are Baby Food For Search Engines

July 18, 2008 By: Ron Coleman Category: SEO, Website Design

When websites were brand new, a sitemap was used to help people find their way around a disjointed site. As websites became easier to navigate, sitemaps fell out of favor.

But now they’re back… because they are the equivalent of baby food for search engines. Just a few years ago, the philosophy about sitemaps went something like this:

If your customers need to use a sitemap to find their way around your website, you haven’t done your job organizing your content and creating a navigational system that is easy to understand.

But sitemaps are now back in favor. Why? It’s less about human visitors and more about search engines.

What is a sitemap? A sitemap is a page that lists all of the other pages on your site, usually in a bulleted list.

Here’s an example of a sitemap:


As I’ve discussed before, search engines are easily confused. Many pages of a website are often ’hidden’ behind tricky menus or drop-down lists. Or, the links to reach a specific page are too deep (i.e. more than a couple of pages down from the home page).

A sitemap, linked from the home page of the site, will list every page of your site in one convenient place.

When a search engine visits your sitemap, it can easily gather a list of every page on your site, then crawl, digest, and include all of your content in its index.

We generally recommend having the link to your sitemap on the bottom footer navigation of your site.


But you need to make sure that as your site changes, your sitemap is updated. Otherwise, Google and other search engines may not index the latest pages added to your site.

And even better than an HTML sitemap is an XML sitemap. An XML sitemap is a sitemap that is specifically formatted for search engines like Google. It’s a machine-readable version that allows you to specify all of the pages of the site.

Click the graphic above to see a live sample of an XML sitemap.


Adding an XML sitemap ensures that a site will get indexed much more quickly than it would be otherwise.

For my new sites, the XML sitemap allows the sites to be indexed in 3-4 days vs. the usual 3-4 months.
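For illustration, here’s a minimal Python sketch that builds an XML sitemap in the standard sitemaps.org 0.9 format; the URLs below are placeholders, not real pages:

```python
import xml.etree.ElementTree as ET

# Placeholder URLs -- substitute the real pages of your site.
PAGES = [
    "http://www.example.com/",
    "http://www.example.com/services.html",
    "http://www.example.com/contact.html",
]

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol, version 0.9)."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(PAGES))
```

Save the output as sitemap.xml in the root directory of your site; the protocol also supports optional tags like last-modified dates for each page.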

Let me know if you need help with a sitemap or XML sitemap for your site. We’re here to help.

Strategy #4. What does a search engine look for?

July 13, 2008 By: Ron Coleman Category: Marketing, SEO, Website Design

We’ve discussed local search, and how to make sure you don’t confuse search engines with graphics and flash animation. We’ve also talked about the all-important title tags.

This week, we’ll take a higher-level view to discuss what search engines look for when ranking your site.

At the end of the day, a search engine is in business to help you find the most relevant results possible when you conduct a search. Search engines make their money by selling relevant advertising to supplement the natural, organic search results.

Because a top ranking in Google or another search engine can translate into a great deal of business, it’s important to know how search engines determine who gets placed at the top of the list.

The two biggest ways search engines rank you are based on:

  1. Relevant Content: Search engines are really good at reading text. The more relevant copy you have on your site, the better chance you have of getting your page indexed. Search engines love pages that have more than 500 words of text on them.

    Why? A page with a lot of content is usually more beneficial to the end user. (Though for every rule like this one, there are many exceptions.)

    Adding articles, press releases, detailed information about your products and services all can help quickly increase the amount of relevant content that you have on your site.

  2. Inbound Links: The more sites that link to you, the more important your site becomes to search engines. If the sites that link to you are very relevant and/or important, those inbound links are worth more. And domains that end in .gov or .edu often carry more weight for inbound links than .com domains.

    It’s kind of like a high school popularity contest. If the most popular kids all point to you and say that your website is better than anyone else’s, in the eyes of the community, your ranking is elevated.

Many other factors affect search engine ranking as well. I can’t go into great detail on the entire list, but even small changes can translate into higher rankings.

  1. Title Tags: See last week’s email.

  2. Page Names: Keywords in page names increase the relevance of the search and are displayed in a Google search result.

  3. Image Names: Putting relevant keywords into image names helps your ranking.

  4. Alt Text for Images: The text that appears when you hover over an image in some browsers; it’s also used by screen readers to tell blind visitors what an image represents.

  5. Keyword Density: How often specific keywords appear on a page as a percentage of all of the words on a page.

  6. Section Headings: In the HTML code, section headings like H1 or H2 are treated as more important content than the information on the rest of the page.

  7. Words contained in links: A link like: “Global Marketing Plus offers Web Marketing and Search Engine Optimization Services” can help boost rankings.

  8. Clean HTML code: Search engines are easily confused if your website’s code is a mess.

  9. How often pages are updated: Search engines like new content, but also have a bias toward pages that have been up on the web for a long time.

  10. Site Map: If you have a site map (and an XML site map as well), it’s easier for search engines to crawl through all of the pages of your site.

  11. Keywords in your domain name.

  12. The age of your domain name: Older domain names are perceived as more relevant than something registered last week.

  13. Keywords in subdomains.

  14. Keywords in file directory structures.
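To make a few of these on-page factors concrete, here is a toy Python checker (the class name and the sample page are illustrative, not from the post) that pulls out the title tag, section headings, and images missing alt text:

```python
from html.parser import HTMLParser

class OnPageFactors(HTMLParser):
    """Collect a few on-page SEO signals: title, H1/H2 headings, missing alt text."""

    def __init__(self):
        super().__init__()
        self._current = None
        self.title = ""
        self.headings = []
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1", "h2"):
            self._current = tag
        elif tag == "img" and not dict(attrs).get("alt"):
            self.images_missing_alt += 1  # image with empty or missing alt text

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current in ("h1", "h2"):
            self.headings.append(data)

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

# A made-up sample page for demonstration.
page = """<html><head><title>Website Design | Global Marketing Plus</title></head>
<body><h1>Website Design Services</h1>
<img src="logo.gif"><img src="team.jpg" alt="Our design team">
</body></html>"""

checker = OnPageFactors()
checker.feed(page)
print(checker.title)               # keywords in the title tag
print(checker.headings)            # section headings (H1/H2)
print(checker.images_missing_alt)  # images with no alt text
```

A real audit tool would cover more of the list above (page names, link text, keyword density), but the idea is the same: each factor is something a program can read straight out of your HTML.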

In the coming weeks, I’ll delve into many of these points in more detail. Let me know if you’d like me to discuss specific areas, or if you’d like to discuss your site with our team.