Search Engine Optimization - An Historical Perspective


  • Author Donovan Baldwin
  • Published August 8, 2007
  • Word count 1,539

It seems as if every website or ezine article which mentions Internet marketing includes the term, "search engine optimization", or SEO. It's the current buzzword which everybody seems to use as if they were all experts on the subject. It appears everywhere from public forums to commercial advertising alike. Internet marketing "professionals" offer their "SEO services" for fees ranging anywhere from a few bucks to hundreds or even thousands of dollars, and Internet "gurus" abound willing to take your money and help you out. Everybody else seems poised to offer free advice on how to effectively incorporate SEO into YOUR website.

However, hardly anyone ever comes out and says WHAT search engine optimization really is! So, as we explore the history of SEO, let's try to get an idea of what it is and what it does.

At its simplest, search engine optimization is just the art and/or science (often more art than science) of making web pages attractive, or MORE attractive, to the Internet search engines. Without a doubt, an Internet business would be remiss if it did not consider search engine optimization an integral part of any search engine marketing program or plan.

So, how did a need for "optimizing" a website so as to attract the attention of search engines come about?

The relatively arcane art of Search Engine Optimization (SEO) began to shine in the dark ages of the Internet, around the mid-1990s. Maybe it was the Renaissance, but "dark ages" is easier to spell. However, search engine optimization was pretty basic back in those days. As a matter of fact, many of the available "search engines" back then really weren't much more than web-crawling (sorry, Spider-Man) directories, eventually extracting a bit more data from the site than was submitted originally by the website owner.

Even in those dark days, a good quality search engine was able to perform some discriminatory evaluation and assign a weight, or search engine rank, based on the relevance of the site's informational content, and other data, such as keywords, description, and textual and graphic content, to certain topics and queries. Unfortunately, although the web crawler, or spider, of the search engine was able to extract a certain amount of data, a large portion of a site's ability to achieve high search engine rank depended on material submitted by the webmaster.

Webmasters aren't stupid, you know, and they soon realized that by using various techniques they could increase their site's search engine rank. One such technique was manipulating content by increasing the usage of keywords, often to huge multiples which might be hidden in the background of the site, for example. In this way they could increase their website's search engine rank. A higher rank meant more visitors, which usually meant more money. A fact the webmasters easily understood.

Enter the search engine algorithm.

"Algorithm" is possibly one of the least understood words commonly found on the Internet. All it means is the system, or instructions, which, in this case, the search engine follows in its quest to rank websites. To be absolutely silly, a search engine owner could decide that his or her algorithm will include instructions to assign the lowest rank to websites with the word "blue" in them. The point is that the magical, mystical ALGORITHM is simply the set of instructions that has been provided to the software that the search engine uses to assign search engine ranking.

Certainly, search engine algorithms existed previously, but as with criminals and cops, as webmasters got better at sneaking around the existing algorithms, the search engines improved their algorithms to prevent their doing so.

A major change came about as search engines began to rely less and less on the information provided by the webmasters and created software which could investigate the site independently and form conclusions on what it found there. Instead of the webmaster filling in a form providing a title, description, and a bunch of keywords which was checked by a "Mortimer Snerd" indexer which said, "Yup, Mr. Bergen. Them keywords you asked about is there, you bet, and there's a whole bunch of 'em!", the search engine software began to look more deeply for itself and make logical, or at least quasi-logical, determinations about what it found.

BREAK FOR THOSE UNDER 50: Okay, 200 per cent of Internet users are people nowhere near my age, so here's the skinny on Mortimer Snerd. Back in the 1930s and into the '60s, I believe, there was a popular ventriloquist named Edgar Bergen, father of Candice Bergen. He mainly worked with two dummies, Charlie McCarthy and Mortimer Snerd. Charlie McCarthy, although a smart aleck, was usually dressed in tie and tails and seemed to be up on the comings and goings of society. Bergen's other major dummy was Mortimer Snerd, a hick straight off the turnip truck who believed whatever he was told...and believed it literally.

Back to the subject of SEO.

Okay, rather than just accepting the webmaster's word that keywords "weight loss", "diet", and "exercise" were applicable to the subject matter of the site and then checking to see if those words were there, the software began looking at a long list of factors. It would check the domain name, and the words used in the title. It would check to see how often keywords appeared, how close they were together, and the sequence in which they appeared. It would check such things as what the "ALT" attribute attached to images contained, and what the META tags had to say. Most important of all, it would check the textual content of the site to get a major feel for the way all these things came together and how they matched the claims of the webmaster and the expectations of the search engine's clients.
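Two of the factors mentioned above, how often keywords appear and how close together they sit, are easy to sketch. The snippet below is a rough illustration of that idea only (the function name and the particular "average gap" measure are my own assumptions, not any search engine's actual formula):

```python
import re

def keyword_stats(text, keyword):
    """Count how often a keyword appears in the page text, and how
    tightly its occurrences cluster (average gap, in words, between
    consecutive hits). Returns (count, average_gap); average_gap is
    None when the keyword appears fewer than two times."""
    words = re.findall(r"[a-z']+", text.lower())
    positions = [i for i, w in enumerate(words) if w == keyword]
    count = len(positions)
    if count < 2:
        return count, None
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    return count, sum(gaps) / len(gaps)
```

A real indexer would weigh dozens of signals like this one, plus the title, ALT attributes, and META tags, against each other; the sketch shows only how mechanical each individual check can be.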

Now you see why so many people say, "Content is king!"

However, for a major search engine such as Google, website content alone was not enough to ensure that its customers were seeing the most valuable search results and that websites were getting the most accurate page rank. As a result, search engine giant Google came up with its "PageRank" system, which also takes a look at the quantity of inbound links to a site. In other words, how many other sites around the Internet considered this site relevant to the interests of their visitors, and hence of value to the interests of the search engine's clients.
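The core idea behind PageRank, that a link is a vote, and votes from highly ranked pages count for more, can be sketched in a few lines. This is a simplified textbook version of the published algorithm, not Google's actual implementation; the damping factor of 0.85 comes from the original PageRank paper, and the iteration count is an arbitrary choice:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank. 'links' maps each page to the list of
    pages it links to. Each page's rank is shared equally among the
    pages it links to, with a damping factor modeling a surfer who
    occasionally jumps to a random page. Returns page -> score."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page in pages:
            outs = links.get(page, [])
            if outs:
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:
                # a page with no outbound links spreads its rank evenly
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank
```

Run it on a tiny web of three pages and the page with the most (and best-connected) inbound links comes out on top, which is exactly the behavior the paragraph above describes: other sites vouching for yours raises your rank.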

As search engines became bigger and more powerful, and as webmasters became more inventive at circumventing their algorithms, the major search engines such as Google made their particular algorithms tightly controlled secrets. This has made it extremely difficult for amateur webmasters and search engine optimization services alike to predict exactly which technique or tactic is going to be the most successful for achieving a high web page rank on a given search engine.

However, some deductions have been made based on the pages and sites that DO seem to achieve high page ranks with Google and other search engines.

Techniques such as picking a relevant domain name, including important keywords and phrases in the title, having keywords show up in such places as the image ALT tag, and stressing keywords through the use of headline text and by placement at the beginning and end of the page are all of importance. Having lots of inbound links from relevant sites is important, as is internal linking (the development and value of the sitemap is another important topic).

Over and above all the smooth moves, however, it appears that as search engine algorithms expand their capabilities, based of course on the instructions they have been provided, they begin to approach the viewpoint of the human website viewer. As a human would ask, "Does this site make sense and provide relevant data in an understandable manner?", so too are search engines becoming more interested in the structure and content of the website.

The search engine web crawlers are also becoming more proficient in tracking down your site if someone else has seen fit to include a link to your website from theirs. This is another reason why links from other pages can be important for getting your website indexed in the first place, as well as helping get it a good page rank.

As in the good old days of the Internet in the previous century (I needed to say that), the most common means of offering your website to a search engine for its consideration is the simple task of filling in a form. You will notice in the modern era, however, that the search engines are asking you for less and less information about the site. They prefer to go and get it themselves. On the other hand, filling in the form does not guarantee immediate, or even soon, indexing of your site...if it happens at all.

From the viewpoint of the search engine or the human visitor, while various techniques of search engine optimization are important, the quality of the content provided to your visitor is probably going to be the best search engine optimization method of all.

Donovan Baldwin is a 62 year old bodybuilder, internet marketer, and freelance writer living near Austin, Texas. He is a University of West Florida alumnus (1973), a member of Mensa, and is retired from the U. S. Army after 21 years of service. He has recently joined with Global Domains International (GDI), and offers information on his involvement with this international company at http://donsdomains.ws/

Article source: https://articlebiz.com