Build a Sitemap to Put your Site on the Map

May 28th, 2009

Dear Reader,

Our Support Angels receive many SEO questions along with Web CEO-related queries. We want to share our knowledge publicly. Feel free to send the topics you are interested in to editorial@webceo.com

Web CEO Editorial Team

SEO Cartoon


Build a Sitemap to Put your Site on the Map

What is a Sitemap?

It is extremely important for search engines to have their index up to date. That is why Google, MSN, Yahoo, and Ask jointly support the sitemaps protocol. The simplest form of a sitemap is an XML file that lists URLs for a site, along with additional metadata about each URL (when it was last updated, how often it changes, and how important it is relative to other URLs across the site) so that search engines can crawl the site more intelligently.

Sitemaps are particularly beneficial for Web sites where some areas are not reachable through the browsable interface, or where Webmasters use rich Ajax or Flash content that search engines do not normally process.

Sitemaps supplement, rather than replace, the existing crawl-based mechanisms search engines use to discover URLs, and using the Sitemaps protocol does not guarantee that all Web pages will be indexed. However, a sitemap provides crawlers with hints that help them do a better job of crawling the site.

Sitemaps Protocol

A sitemap can be an XML file built from the Sitemaps Protocol tags, or simply a plain text list of URLs (an example of the plain-text form appears after the tag descriptions below). In either case, the file must be UTF-8 encoded.

Here is a sample sitemap that contains just one URL and uses all optional tags:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2005-01-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

Within each <url> record:

  • loc is a required field, representing the URL of the Web page.
  • lastmod is an optional field in the W3C Datetime format (in most cases simply YYYY-MM-DD, with an optional time portion), representing the last time the URL was known to change.
  • changefreq is an optional field representing how frequently the page is likely to change. Valid values are always, hourly, daily, weekly, monthly, yearly, and never.
  • priority is an optional field representing the relative importance of this URL within the Web site; values range from 0.0 to 1.0, and the default is 0.5.

The full specification of the protocol is available at Sitemaps.org.
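For comparison, the plain-text form mentioned above is simply a UTF-8 file listing one URL per line and nothing else; here is a minimal illustration using placeholder addresses:

    http://www.example.com/
    http://www.example.com/products.html
    http://www.example.com/contact.html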

Sitemaps, SEO, and Usability

The main benefits of a Sitemap from the SEO point of view are:

  • With a sitemap search engines can crawl pages that aren’t otherwise discoverable;
  • You can give search engines information about each page’s relative importance with the optional priority tag in the sitemap. This can help them order their crawling of the Web site based on that information;
  • You can use two more optional tags: “lastmod” tells search engines when a page last changed, and “changefreq” indicates how often the page is likely to change.

In addition to providing more intelligent navigation for search engines, a sitemap (for visitors, usually a human-readable HTML sitemap page rather than the XML file) is useful as a secondary navigation aid. Research studies confirm that a sitemap is an important benefit for visitors because it gives them an overview of the site’s areas at a single glance. Besides visualizing the site’s information architecture, a sitemap acts as a guide to the Web site.

You can generate an XML sitemap using the Web CEO Editor tool and then submit it to the search engines with the help of the Web CEO Submission tool.
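Whichever tool you use, search engines that support the protocol can also discover a sitemap on their own if it is referenced from the site’s robots.txt file; a minimal sketch (the URL is a placeholder) looks like this:

    # robots.txt at the root of the site
    Sitemap: http://www.example.com/sitemap.xml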


SEO Companies’ Visibility Rate

Are SEO companies as good as they claim to be on their sites? Will they deliver the efficiency they promise? Are they as skilled as they say? The only way to find out is to check how they optimize and promote their own sites.

Here we share the Top 10 SEO companies according to their search visibility rate for April 2009.

1. submitexpress.com
2. networksolutions.com
3. bruceclay.com
4. iprospect.com
5. majon.com
6. seoconsultants.com
7. seoinc.com
8. webmetro.com
9. seo.com
10. mediumblue.com

Web CEO analysts use objective evidence to rate SEO firms according to their search engine visibility. An SEO company’s visibility rate is calculated using a special formula that considers the positions of the company’s site in search engine results pages for the keywords its potential clients use, the popularity of these keywords, and the number of competitors. Learn more about the formula.

Facts about Sitemap Usage:

  • Approximately 35 million Web sites publish Sitemaps. The top ten TLDs for these Web sites are .com, .net, .cn, .org, .jp, .de, .cz, .ru, .uk, and .nl; a long tail of other TLDs accounts for the remaining 5% of these Web sites.
  • Most popular sitemap formats include XML (77%), URL list (3.5%), Atom (1.6%), and RSS (0.11%).
  • The remaining sitemaps, roughly 17.5%, are in an unknown or unrecognized format.
  • 58% of URLs include a last modification date, 7% include a change frequency field, and 61% include a priority field.

Sitemaps: Above and Beyond the Crawl of Duty

Technical Recommendations for Natural Linking

May 14th, 2009

SEO Cartoon


Safe Links In Spider Food Chains: Technical Recommendations for Natural Linking

While Yahoo and MSN have not made themselves clear on this issue, Google is crusading against webmasters who pay for links or earn revenue from selling them. In Google’s own words, “Our goal is to provide users the best search experience by presenting equitable and accurate results.” The reasoning is that a paid link may come from an irrelevant website and exist for the sole purpose of passing PageRank to push the target site higher in search results. Consequently, Google users end up dissatisfied with irrelevant sites occupying artificially inflated positions in search results (and such link schemes violate Google’s webmaster guidelines).

What Google Recommends

Google allows ad publishers to flag their paid links with the rel=”nofollow” attribute or to use other techniques that prevent paid links from passing Google’s PageRank, such as blocking the page that carries the ad links from being crawled via the robots.txt file.
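As a concrete illustration (the URL and directory below are placeholders, not prescribed values), a paid link flagged with nofollow looks like this:

    <!-- a paid or sponsored link marked so that it does not pass PageRank -->
    <a href="http://www.example.com/sponsor" rel="nofollow">Our sponsor</a>

Alternatively, a robots.txt rule can keep crawlers away from a page that carries only ad links:

    # robots.txt: keep crawlers out of the directory that holds sponsored links
    User-agent: *
    Disallow: /sponsored/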

What criteria should you follow when selecting link partners? Pay close attention to the following factors:

  • Sites linking to yours should be relevant; this way they will attract visitors and deliver traffic, not just pass PageRank. All of the neighboring links on the page linking to your site should also be relevant to your topic (the fewer links on the page, the better).
  • If you exchange more than one link, ask your link partner to vary the link anchor text. Using the same text for every link only advertises your target keyword to Google and reflects a lack of consideration for visitors.
  • Avoid site-wide links (i.e. links that go from every page of another site to yours). They look unnatural, because it is unlikely that someone would link to you in this way unless you paid for it.
  • Search engines detect sites that publish information on how to buy or sell links. This is a red flag, so don’t exchange links with such sites.
  • Search engines easily detect words such as “Advertisement”, “Sponsors”, and “Our Partners”, so carefully read the page where the link to your site is placed. Search engines may also take the location of your link into account (for instance, a footer or side column) and treat such links as paid.
  • Finally, your site can be reported to Google as one that pays for links. It is hard to protect yourself from others in this case, so just try to be friendly :-)

Google uses most of these flags not to penalize sites automatically, but as indicators that sites are suspicious and require further investigation.

What can Google do?

“Buying or selling links that pass PageRank is in violation of Google’s webmaster guidelines and can negatively impact a site’s ranking in search results.” (c) Google Webmaster Central.

Google’s goal is to prevent a paid link from passing PageRank. The first thing Google can do is discount a paid link, i.e. lower or even remove the PageRank it would otherwise pass from the site that sells it. The last resort is to remove a violator from the index; this is done only if Google is completely sure that the Web site is buying or selling links.

What if your site has been penalized?

According to Google’s webmaster guidelines: “The site owner can address the violations of the webmaster guidelines and submit a reconsideration request in Google’s Webmaster Central console. Before doing a reconsideration request, please make sure that all sold links either do not pass PageRank or are removed.”

Take care of your links!


Web CEO Metrics

Here we share generalized numbers from our HitLens Web Analytics service, which covers 300,000+ websites from all over the world.

[Chart: search engine market share, April 2008 vs. April 2009]

Global Search Engines (%)

This chart gives an idea of the market share of each of the three major search engines.

Google and Yahoo are holding their positions, while MSN has gained one point compared to 2008.

[Chart: visitor referrers, April 2009]

Visitor Referrers (%)

You can see how visitors are referred to websites. The shares of search engines and bookmarking have grown noticeably compared to 2008, and e-marketers are spending less on paid advertising this year than they did last year.

What Experts Say:

What matters is that there is an html link and if it’s paid, it would have either been opened as a JavaScript pop up window (so it would not be crawled by robots because they don’t pop windows) or use rel=”nofollow” to at least indicate to robots (Google especially) that that link is not to be crawled (not to pass PR).

Matt Cutts,
Google
