SEO: Developing a search engine-friendly website
October 6, 2006 • Glenn Murray
Search engines don’t see websites the way you and I do. They require your site to be designed a particular way. If you don’t observe a few rules of thumb, you can severely hamper your search engine presence.
Following is a list of the main things you need to be aware of. I’m no web developer, so don’t take the list below as gospel – just discuss these things with your developer first. If you engage an experienced SEO web designer, they’ll already know all the issues (far more than are listed here).
- Design your site in HTML – i.e. HTML-based copy and headings, with text-based links at the foot of each page
- Use static URLs (rather than dynamic, parameter-laden ones)
- Use standard rollovers and/or CSS formatting for navigation menu
- Create a robots.txt file. This file tells search engine spiders which pages on your site should not be crawled.
- Alternatively, you can do a similar thing by placing meta tags in the header section of your HTML for search engine robots/spiders to read. These tags are as follows:
- `<meta name="robots" content="index,follow">` – tells the spiders to crawl and index your entire site
- `<meta name="robots" content="noindex,nofollow">` – tells the spiders not to index anything or follow any links.
- `<meta name="robots" content="noindex,follow">` – says don’t index this particular page, but follow its links to other pages (e.g. for use on secure or private pages).
- `<meta name="robots" content="index,nofollow">` – says to index the page but not follow its links.
- Create a 404 error handling page, and place a sitemap on the 404 page.
- Create a text based site map containing links to every page in your site (see http://www.divinewrite.com/site.htm for an example).
- Create a Google Sitemap (Read an overview of Google Sitemaps and download a great tool for generating Google Sitemaps)
- Don’t embed your copy within a graphic (the search engines won’t be able to read it)
- Don’t use frames (this is a contentious one – some people use frames quite effectively)
- Don’t use “&id=” as a parameter if you want Googlebot to crawl your site fully (many sites use “&id=” for session IDs, and Googlebot usually avoids URLs containing that parameter)
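To illustrate the robots.txt point above, here’s a minimal sketch of such a file. The directory names are just placeholders for illustration – substitute your own:

```
# robots.txt – place this in the root of your site
# (e.g. http://www.example.com/robots.txt)

User-agent: *
Disallow: /private/
Disallow: /cgi-bin/
```

`User-agent: *` means the rules apply to all spiders; each `Disallow` line names a path the spiders shouldn’t crawl. An empty `Disallow:` line would allow everything.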
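And for the Google Sitemap point, a sitemap is just an XML file listing your pages. Here’s a minimal sketch (the URLs and date are placeholder values, and the namespace shown is the standard sitemap protocol’s):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-10-06</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/services.htm</loc>
  </url>
</urlset>
```

Only `<loc>` is required for each page; `<lastmod>` and the other optional elements just give the spiders extra hints.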
Hope this helps.