How to Build a Solid Site Structure for SEO


  • Author James Whitrow
  • Published December 12, 2010
  • Word count 950

The SEO of a website is more than just the scattering of keywords, and it means more than just achieving a page 1 ranking in the search engines. Websites should be built primarily for your visitors, with search engine friendliness a close second. It's all very well to focus entirely on search engine promotion and get a number 1 ranking, but what good is all that work if the people who visit your site take one look at it and decide to take their business elsewhere?

When planning your website design and development, make sure that the designer you are using understands SEO and how to create a search engine friendly website. If they don't, bring in your SEO consultant at the start to advise and work with the designer to get the best results. Waiting until your website is complete before beginning the SEO process can be a costly, frustrating and complicated mistake. A search engine friendly design lays a solid foundation on which to build your SEO campaign.

We have found that WordPress sites perform the best for search engine visibility, especially when you utilise some of the SEO plugins such as 'All In One SEO', 'XML-Sitemap' and the social plugins.

If you already have a website up and running and don't think it is ranking as it should, a website analysis or audit will highlight any structural and on-page issues that may be affecting your site.

So, what are the structural areas of a website?

There are three initial areas to look at: coding, sitemaps and the robots file.

If your website contains excessive coding, search engine spiders can have a hard time wading through it all to find what they're looking for. Make sure your coding is clean, concise and well laid out. Use Cascading Style Sheets (CSS) as much as possible and avoid using tables for layout.

A great way to check how easy your pages are for search engine crawlers to read is to use the Firefox browser with the 'Web Developer' add-on. Once you have installed the add-on and browsed to your website, press Ctrl+Shift+S on your keyboard to remove all styles from the page and see it roughly as the search engines do. If all of the text and images align themselves on the left hand side of the page in basic looking fonts (i.e. no colours or special formatting), that is a good indication that your coding is clean. However, if the layout of the page stays much the same and elements don't move to the left, it indicates that you aren't utilising CSS as much as you could. Press Ctrl+Shift+S again to return the webpage to its normal view.
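As a rough sketch of the difference (the markup, IDs and content below are invented purely for illustration), a table-based layout forces the spiders through presentational code:

    <table>
      <tr>
        <td width="200">Menu</td>
        <td><h1>Our Services</h1>Welcome to our services page...</td>
      </tr>
    </table>

whereas clean markup keeps the content simple and pushes the presentation into a stylesheet:

    <div id="menu">Menu</div>
    <div id="content"><h1>Our Services</h1>Welcome to our services page...</div>

    /* styles.css */
    #menu    { float: left; width: 200px; }
    #content { margin-left: 220px; }

With the second version, disabling styles in Firefox leaves nothing but plain, left-aligned text for the spiders to read.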

A robots file tells search engine spiders what they should look at and what they shouldn't. This is usually used to your advantage, but if it's applied incorrectly it can be very damaging to your rankings. Google provides guidelines and an online generator through their Webmaster Tools (all you need is a Google account and access to your web server). We tend to use the robots file to point the crawlers to the XML sitemap, which contains an exact list of all the pages we want the search engines to crawl.
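As a hedged example (the disallowed folder and domain below are placeholders, not recommendations for any particular site), a minimal robots.txt written along these lines might look like:

    User-agent: *
    Disallow: /admin/
    Sitemap: http://www.example.com/sitemap.xml

The Sitemap line is the part referred to above: it tells the crawlers exactly where to find the XML sitemap.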

A comprehensive, well designed XML sitemap can be very helpful in distributing your PageRank evenly throughout all the pages on your site, and search engine spiders also use it to navigate efficiently around your website. Is your sitemap correct and up to date? It should be updated every time content is added or modified on your website.
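For illustration only (the URLs and dates are placeholders), a minimal sitemap following the sitemaps.org protocol looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2010-12-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/services/</loc>
        <lastmod>2010-11-20</lastmod>
      </url>
    </urlset>

Most content management systems (including WordPress, via the sitemap plugin mentioned earlier) can generate and update this file automatically whenever content changes.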

Double checking and correcting any issues with these three structural elements will ensure that the search engines' spiders have every advantage when crawling your website.

That's just the beginning. Some other points to consider when creating a sound website structure are:

Try to avoid JavaScript and Flash where possible.

Flash and JavaScript often give a website that extra special, eye-catching element, but unfortunately search engine spiders can't see it. This in turn means that your potential clients probably won't see it either…

Use interlinking between your pages

Search engine spiders follow links to get from page to page, so internal linking really helps them get around your website. When links to other pages of your site are placed within the page content, it also shows the spiders that your pages contain information related to other areas of your website. Your human visitors will also appreciate the quick links to related pages and the easy navigation.
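A simple sketch of what this looks like in practice (the page names and anchor text are made up for the example):

    <p>We offer a full range of <a href="/services/">SEO services</a>,
    and you can see typical results on our <a href="/case-studies/">case studies</a> page.</p>

Descriptive anchor text like this gives the spiders extra context about the page being linked to, as well as helping visitors.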

Keywords within URLs

If possible, it is highly recommended to incorporate your main keywords within your URLs. Descriptive, keyword-rich URLs give both search engines and visitors a clear idea of what each page is about.
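For example (a made-up address purely for illustration), a descriptive URL such as:

    http://www.example.com/services/emergency-plumbing/

tells both visitors and search engines far more about the page than a generic one like:

    http://www.example.com/index.php?page=27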

Optimize Meta Data and Images

Make sure all your meta tags are complete and keyword optimized, and separate keywords with commas. If you can, keyword optimize your image file names, alt text and title attributes as well. Every little bit helps.
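A hedged example of what this might look like in the page source (the business name, keywords and file names below are invented):

    <head>
      <title>Emergency Plumbing Adelaide | Example Plumbing</title>
      <meta name="description" content="24-hour emergency plumbing, hot water repairs and blocked drains across Adelaide.">
      <meta name="keywords" content="emergency plumbing, hot water repairs, blocked drains">
    </head>

    <img src="emergency-plumbing-van.jpg" alt="Emergency plumbing van" title="Emergency plumbing van">

Note that the keywords in the meta tag are separated with commas, and the image file name, alt text and title all describe the image using the same terms.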

Last, but not least, make sure your site validates…

The W3C (World Wide Web Consortium) develops the technical specifications and guidelines to which the majority of websites and search engines adhere. Validating your site will bring to light any problems within its structure that go against those guidelines. Complete adherence to the guidelines is not essential, but it will improve your visibility within the search engines. Just Google 'W3C validator' to find it.

Once you have built a solid website structure, you can then focus your attention on the on-page website factors. Remember... never underestimate the factors that you cannot see.

James Whitrow is the developer and director of Get Seen Online, a South Australian SEO provider. For more information on the elements of successful SEO, or to contact James visit

Article source: https://articlebiz.com