A History Of Search Engine Optimization May Help Us Understand Today’s SEO Process
Anyone involved in any sort of Internet marketing sees the term “search engine optimization,” or SEO, everywhere these days. Everybody throws it around as if they were an expert on the subject. It crops up again and again in forums and advertising alike. People provide their “SEO services” for fees ranging from a few bucks to hundreds or even thousands of dollars. Everybody seems poised to provide free advice on how to effectively incorporate SEO into YOUR website.
However, hardly anyone ever comes out and says WHAT search engine optimization really is! So, as we explore the history of SEO, let’s try to get an idea of what it is and what it does.
At its simplest, search engine optimization is just the art and/or science (often more art than science) of making web pages attractive, or MORE attractive, to the Internet search engines. Obviously, most Internet businesses will consider search engine optimization to be one of THE major factors of any search engine marketing plan or program.
So, how did a need for “optimizing” a website so as to attract the attention of search engines come about?
Well, it was back in the dark ages of the Internet, say the mid-’90s, that the arcane art of SEO began to blossom. Maybe it was the Renaissance, but “dark ages” is easier to spell. However, search engine optimization was pretty basic back in those days. As a matter of fact, many of the available “search engines” back then really weren’t much more than web crawling (sorry Spider-Man) directories, eventually extracting a bit more data from the site than was submitted originally by the website owner.
Even in those dark days, a good quality search engine was able to perform some discriminatory evaluation and assign a weight, or search engine rank, based on the relevance of the site’s informational content, and other data, such as keywords, description, and textual and graphic content, to certain topics and queries. Unfortunately, although the web crawler, or spider, of the search engine was able to extract a certain amount of data, a large portion of a site’s ability to achieve high search engine rank depended on material submitted by the webmaster.
Webmasters aren’t stupid, you know, and they soon realized that by using various techniques they could improve their site’s search engine rank. One such technique was manipulating content by increasing the usage of keywords, often to huge multiples which might be hidden in the background of the site, for example. In this way they could raise their website’s search engine rank. A higher rank meant more visitors, which usually meant more money, a fact the webmasters easily understood.
Enter the search engine algorithm. “Algorithm” is possibly one of the least understood words commonly found on the Internet. All it means is the system, or instructions, which, in this case, the search engine follows in its quest to rank websites. To be absolutely silly, a search engine owner could decide that his or her algorithm will include instructions to assign the lowest rank to websites with the word “blue” in them. The point is that the magical, mystical ALGORITHM is simply the set of instructions that has been provided to the software that the search engine uses to assign search engine ranking.
Now, it isn’t as if search engine algorithms didn’t exist before, but, as with cops and robbers, as the webmasters got better at subverting the existing algorithms, the search engines tweaked their algorithms to counter their tricks and ploys.
One major change was that search engines began to place less faith on the presentations and protestations of the webmasters and developed software capable of investigating the site itself and forming conclusions on what it actually found there. Instead of the webmaster filling in a form providing a title, description, and a bunch of keywords which was checked by a “Mortimer Snerd” indexer which said, “Yup, Mr. Bergen. Them keywords is there, all right, and there’s a bunch of ’em!”, the search engine software began to look more deeply for itself and make logical, or at least quasi-logical, determinations about what it found.
BREAK FOR THOSE UNDER 50: Okay, 200 per cent of Internet users are people nowhere near my age, so here’s the skinny on Mortimer Snerd. Back in the 1930s and into the ’60s, I believe, there was a popular ventriloquist named Edgar Bergen, father of Candice Bergen. He mainly worked with two dummies, Charlie McCarthy and Mortimer Snerd. Charlie McCarthy, although a smart aleck, was usually dressed in tie and tails and seemed to be up on the comings and goings of society. Bergen’s other major dummy was Mortimer Snerd, a hick straight off the turnip truck who believed whatever he was told…and believed it literally.
Back to search engine optimization.
Okay, rather than just accepting the webmaster’s word that keywords “weight loss”, “diet”, and “exercise” were applicable to the subject matter of the site and then checking to see if those words were there, the software began looking at a long list of factors. It would look at the domain name, and the words used in the title. It would see how often keywords appeared, how close they were together, and the sequence in which they appeared. It would analyze such things as what the “ALT” attribute attached to images contained, and what the META tags had to say. Most important of all, it would look at the textual content of the site to get a major feel for the way all these things came together and how they matched the claims of the webmaster and the expectations of the search engine’s clients.
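To make the idea concrete, here is a toy sketch (not any real engine’s algorithm, and the function name and signal choices are my own invention) of the kind of on-page checks described above: does a keyword appear in the title, how often does it show up in the body text, and what fraction of the words does it account for?

```python
import re

def keyword_signals(title, body, keyword):
    """Toy illustration of early on-page ranking signals:
    keyword presence in the title, raw count in the body,
    and keyword density (count divided by total words)."""
    words = re.findall(r"[a-z']+", body.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return {
        "in_title": keyword.lower() in title.lower(),
        "count": hits,
        "density": hits / len(words) if words else 0.0,
    }

signals = keyword_signals(
    title="Easy Weight Loss Tips",
    body="Weight loss starts with diet and exercise. Track your loss weekly.",
    keyword="loss",
)
print(signals)  # e.g. {'in_title': True, 'count': 2, 'density': 0.18...}
```

A density that is suspiciously high is exactly the kind of keyword stuffing the early algorithms eventually learned to penalize rather than reward.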
Now you see why so many people say, “Content is king!”
However, for a major search engine such as Google, website content alone was not enough to ensure that its customers were seeing the most valuable search results and that websites were getting the most accurate rank. Therefore, Google developed a system known as “PageRank,” which also looks at the number of incoming links to a site. In other words, it measures how many other sites around the Internet considered this site relevant to the interests of their visitors, and hence of value to the search engine’s clients.
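The core idea behind PageRank can be sketched in a few lines. This is a minimal textbook-style power iteration, not Google’s actual production system, and the damping value and link graph below are illustrative assumptions:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal PageRank by power iteration: a page's score grows
    when many pages (or a few highly ranked ones) link to it.
    `links` maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:
                for target in outgoing:
                    new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank

# Tiny example web: "c" receives links from both "a" and "b"
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(links)
print(ranks)
```

Notice that “c”, the most linked-to page, ends up with the highest score, which is the whole point: links act as votes of relevance cast by other sites.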
As search engines became bigger and more powerful, and as webmasters became more inventive at circumventing their algorithms, the major search engines such as Google made their particular algorithms tightly controlled secrets. This has made it extremely difficult for amateur webmasters and search engine optimization services alike to predict exactly which technique or tactic is going to be the most successful for achieving a high web page rank on a given search engine.
However, some deductions have been made based on the pages and sites that DO seem to achieve high page ranks with Google and other search engines.
Techniques such as picking a relevant domain name, including important keywords and phrases in the title, having keywords show up in such places as the ALT attribute of images, and stressing keywords through the use of headline text and by placement at the beginning and end of the page are all of importance. Having lots of inbound links from relevant sites is important, as is internal linking (the development and value of the sitemap is another important topic).
Over and above all the smooth moves, however, it appears that as search engine algorithms expand their capabilities, based of course on the instructions they have been provided, they begin to approach the viewpoint of the human website visitor. Just as a human would ask, “Does this site make sense and provide relevant data in an understandable manner?”, so too are search engines becoming more interested in the structure and content of the website.
Search engine crawlers are also becoming more efficient at simply finding your site if someone somewhere has considered it important enough to provide a link from their website to yours. This is another reason why links from other pages can be important, both for getting your website indexed in the first place and for helping it earn a good page rank.
As in the good old days of the Internet in the previous century (I needed to say that), the most common means of offering your website to a search engine for its consideration is the simple task of filling in a form. You will notice in the modern era, however, that the search engines are asking you for less and less information about the site. They prefer to go and get it themselves. On the other hand, filling in the form does not guarantee immediate, or even soon, indexing of your site…if it happens at all.
From the viewpoint of the search engine or the human visitor, while various techniques of search engine optimization are important, the quality of the content provided to your visitor is probably going to be the best search engine optimization method of all.
About The Author
Donovan Baldwin is a 62-year-old bodybuilder, internet marketer, and freelance writer living near Fort Hood, Texas. He is a University of West Florida alumnus, a member of Mensa, and is retired from the U.S. Army after 21 years of service.