Search Engine Ranking Articles

Top 10 Search Engine Positioning Mistakes
By Sumantra Roy


Choosing the Correct Keywords for a Site
By Sumantra Roy


Page Cloaking - To Cloak or Not to Cloak.
By Sumantra Roy


Use Doorway Pages At Your Own Risk.
Subia Creative, September 2002


Top 10 Search Engine Positioning Mistakes
By Sumantra Roy

When it comes to search engine optimization, there are certain mistakes that I see people making over and over again. Here's a list of the 10 most common ones. By avoiding these mistakes, you can save yourself a lot of anguish and frustration in the long run.

1) Optimizing your site for the wrong keywords

The first step in any search engine optimization campaign is to choose the keywords for which you should optimize your site. If you initially choose the wrong keywords, all the time and effort you devote to getting your site a high ranking will go down the drain. If you choose keywords which no one searches for, or keywords which won't bring targeted traffic to your site, what good will the top rankings do?

In order to learn how you can choose the correct keywords for which you should optimize your site, see my article on this topic at http://www.1stSearchRanking.com/t.cgi?1953&keywords.htm


2) Putting too many keywords in the Meta Keywords tag

I often see sites which have hundreds of keywords listed in the Meta Keywords tag, in the hope that this alone will earn them a high ranking for those keywords. Nothing could be further from the truth. Contrary to popular opinion, the Meta Keywords tag has almost completely lost its importance as far as search engine positioning is concerned. Hence, just by listing keywords in the Meta Keywords tag, you will never be able to get a high ranking. To get a high ranking for those keywords, you need to put the keywords in the actual body content of your site.
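As a quick self-audit, you can check whether the keywords listed in your Meta Keywords tag actually appear in your body content. Here is a minimal sketch in Python (the tag and page fragment are made-up examples, not from any real site):

def keywords_missing_from_body(meta_keywords, body_text):
    # Return the meta keywords that never appear in the page's body text.
    body = body_text.lower()
    phrases = [k.strip().lower() for k in meta_keywords.split(",") if k.strip()]
    return [p for p in phrases if p not in body]

meta = "tourism in australia, travel to australia, koala photos"
body = "We organize packaged tours and travel to Australia. Tourism in Australia is booming."

print(keywords_missing_from_body(meta, body))  # prints: ['koala photos']

Any phrase the script reports is one you are relying on the Meta Keywords tag alone for - and, as explained above, that will not earn you a ranking.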


3) Repeating the same keyword too many times

Another common mistake that people make is to endlessly repeat their target keywords in the body of their pages and in their Meta Keywords tags. Because so many people have used this tactic in the past (and continue to use it), the search engines keep a sharp lookout for this, and may penalize a site which repeats keywords in this fashion. Sure, you do need to repeat the keywords a number of times. But, the way you place the keywords in your pages needs to make grammatical sense. Simply repeating the keywords endlessly no longer works. Furthermore, a particular keyword should ideally not be present more than thrice in your Meta Keywords tag.
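If you want to check your own pages against this guideline, a few lines of code can count the occurrences for you. A minimal sketch in Python - the tag below is a deliberately over-stuffed example:

import re
from collections import Counter

def keyword_counts(meta_keywords):
    # Count how often each individual word appears in a Meta Keywords tag.
    words = re.findall(r"[a-z0-9]+", meta_keywords.lower())
    return Counter(words)

meta = "travel australia, australia tours, australia travel, cheap australia trips"

for word, count in keyword_counts(meta).items():
    if count > 3:  # the guideline above: no more than three occurrences
        print(f"'{word}' appears {count} times - consider trimming it")

Running this on the sample tag flags "australia", which appears four times.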


4) Creating lots of similar doorway pages

Another myth prevalent among people is that since the algorithm of each search engine is different, they need to create different pages for different search engines. While this is great in theory, it is counter-productive in practice. If you use this tactic, you will soon end up with hundreds of pages, which can quickly become an administrative nightmare. Also, just imagine the amount of time you will need to spend constantly updating the pages in response to the changes that the search engines make to their algorithms. Furthermore, although the pages are meant for different engines, they will actually end up being pretty similar to each other. The search engines are often able to detect when a site has created such similar pages, and may penalize or even ban this site from their index. Hence, instead of creating different pages for different search engines, create one page which is optimized for one keyword for all the search engines. In order to learn how to create such pages, see my article on this topic at http://www.1stSearchRanking.com/t.cgi?1953&keyword-rich-pages.htm


5) Using Hidden Text

Hidden text is text with the same color as the background color of your page. For example, if the background color of your page is white and you have added some white text to that page, that text is hidden. Many webmasters, in order to get high rankings in the search engines, try to make their pages as keyword rich as possible. However, there is a limit to the number of times you can repeat a keyword in a page without making it sound odd to your human visitors. Hence, in order to keep a page keyword rich without the human visitors perceiving the text as odd, many webmasters add text (containing the keywords) in the same color as the background. This ensures that while the search engines can see the keywords, the human visitors cannot. The search engines have long since caught up with this technique, and ignore or penalize pages which contain such text. They may also penalize the entire site if even one of its pages contains such hidden text.

However, the problem with this is that the search engines may often end up penalizing sites which did not intend to use hidden text. For instance, suppose you have a page with a white background and a table in that page with a black background. Further suppose that you have added some white text in that table. This text will, in fact, be visible to your human visitors, i.e. this shouldn't be called hidden text. However, the search engines can interpret this to be hidden text because they may often ignore the fact that the background of the table is black. Hence, in order to ensure that your site is not penalized because of this, you should go through all the pages in your site and see whether you have inadvertently made any such mistake.
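If you want to audit your pages for accidental clashes of this kind, the comparison involved can be sketched in a few lines of Python. This is only an illustration of the logic described above - real pages nest backgrounds (body, tables, cells), and the colors here are made up:

def hidden_text_check(text_color, page_background, local_background=None):
    # A naive engine compares the text color against the page background only;
    # a human sees the text against its nearest (local) background.
    naively_flagged = text_color.lower() == page_background.lower()
    visible_to_humans = text_color.lower() != (local_background or page_background).lower()
    return naively_flagged, visible_to_humans

# White text in a black table on a white page: visible to humans,
# yet flagged by an engine that ignores the table's background.
flagged, visible = hidden_text_check("#FFFFFF", "#FFFFFF", local_background="#000000")
print(f"naively flagged: {flagged}, visible to humans: {visible}")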


6) Creating Pages Containing Only Graphics

The search engines only understand text - they don't understand graphics. Hence, if your site contains lots of graphics but little text, it is unlikely to get a high ranking in the search engines. To improve your rankings, you need to replace the graphics with keyword-rich text for the search engine spiders to feed on.


7) Not using the NOFRAMES tag if your site uses frames

Many search engines don't understand frames. For sites which use frames, these search engines only consider what is present in the NOFRAMES tag. Yet, many webmasters make the mistake of adding something like this to the NOFRAMES tag: "This site uses frames, but your browser doesn't support them". For the search engines which don't understand frames, this is all the text they ever get to see on the site, which means the chances of the site getting a good ranking in those engines are non-existent. Hence, if your site uses frames, you need to add plenty of keyword-rich text to the NOFRAMES tag. For more information on the different issues that arise when you use frames in your site, see my article on this topic at http://www.1stSearchRanking.com/t.cgi?1953&frames.htm
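As a rough self-check, you can extract exactly what such frame-blind engines would see from your NOFRAMES section. A minimal sketch using Python's standard library - the page fragment and the 50-word threshold are illustrative assumptions, not a published rule:

import re

def noframes_text(html):
    # Return the plain text inside the <noframes> section, or '' if absent.
    match = re.search(r"<noframes[^>]*>(.*?)</noframes>", html, re.IGNORECASE | re.DOTALL)
    if not match:
        return ""
    return re.sub(r"<[^>]+>", " ", match.group(1)).strip()

page = """<frameset cols="20%,80%"><frame src="menu.htm"><frame src="main.htm"></frameset>
<noframes>This site uses frames, but your browser doesn't support them.</noframes>"""

text = noframes_text(page)
if len(text.split()) < 50:  # arbitrary threshold for illustration
    print("Thin NOFRAMES section - this is all frame-blind engines will see:")
    print(text)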


8) Using Page Cloaking

Page cloaking is a technique used to deliver different web pages under different circumstances. People generally use page cloaking for two reasons: i) in order to hide the source code of their search engine optimized pages from their competitors and ii) in order to prevent human visitors from having to see a page which looks good to the search engines but does not necessarily look good to humans. The problem with this is that when a site uses cloaking, it prevents the search engines from being able to spider the same page that their users are going to see. And if the search engines can't do this, they can no longer be confident of providing relevant results to their users. Thus, if a search engine discovers that a site has used cloaking, it will probably ban the site forever from their index. Hence, my advice is that you should not even think about using cloaking in your site. For more information on what page cloaking is, how it is implemented, and why you should not use cloaking, see my article on this topic at http://www.1stSearchRanking.com/t.cgi?1953&page-cloaking.htm


9) Using Automatic Submission Tools

In order to save time, many people use automatic submission software or services to submit their sites to the major search engines. It is true that submitting your site manually to the search engines takes a lot of time and that an automatic submission tool can save you much of it. However, the search engines don't like automatic submission tools and may ignore your pages if you use them. In my opinion, the major search engines are simply too important for you not to spend the time to submit your site to them manually. To speed up the process, you can use our free submission tool, which allows you to submit your site manually to all the search engines without having to visit the "ADD URL" pages of the individual engines. It is available at http://www.1stSearchRanking.com/t.cgi?1953&submission.htm


10) Submitting too many pages per day

People often make the mistake of submitting too many pages per day to the search engines. This often results in the search engines simply ignoring many of the pages which have been submitted from that site. Ideally, you should submit no more than 1 page per day to the search engines. While many search engines accept more than 1 page per day from a particular domain, there are some which only accept 1 page per day. Hence, by limiting yourself to a maximum of one page per day, you ensure that you stay within the limits of all the search engines.


11) Devoting too much time to search engine positioning

Yes - I lied. There's another common mistake that people make when it comes to search engine optimization - they spend too much time on it. Sure, search engine placement is the most cost-effective way of driving traffic to your site, and you do need to spend some time every day learning how the search engines work and optimizing your site for them. However, you must remember that search engine optimization is a means to an end - it's not the end in itself. The end is to increase the sales of your products and services. Hence, apart from trying to improve your site's position in the search engines, you also need to spend time on all the other factors which determine the success or failure of your web site - the quality of the products and services that you are selling, the quality of your customer service, and so on. You may have excellent rankings in the search engines, but if the quality of your products and services is poor, or if your customer service leaves a lot to be desired, those high rankings aren't going to do you much good.

Article by Sumantra Roy. Sumantra is one of the most respected search engine positioning specialists on the Internet. To have Sumantra's company place your site at the top of the search engines, go to http://www.1stSearchRanking.com/t.cgi?1953. For more advice on how you can take your web site to the top of the search engines, subscribe to his FREE newsletter by going to http://www.1stSearchRanking.com/t.cgi?1953&newsletter.htm

Choosing the Correct Keywords for a Site
By Sumantra Roy

In this article, we focus on the correct way of finding the keywords for which you should optimize your site for the search engines. This article will give you the formula for the Keyword Effectiveness Index (KEI) - a mathematical formula which I have developed to help you determine which keywords you should be optimizing your site for.

Step 1: Open your text editor or word processor and write down all the words and phrases that you might have searched for if you were looking for a company which offers products and services similar to yours. For example, suppose your company organizes packaged tours to Australia. Here's a list of phrases that I might have searched for if I were planning to make a trip to Australia:

tourism in Australia
travel to Australia
travelling in Australia
travel agencies in Australia
travelling agencies in Australia
Australian travel agencies

Of course, the keywords that came to your mind may have been different. But that's not important - the important thing is to get an initial list of keywords.

You may be wondering why I have not used single word keywords. Here's why:

Firstly, single word keywords tend to be hyper-competitive. A search for "tourism" or "travelling" in any search engine will probably generate hundreds of thousands of pages. While it is possible that you may get your page in the top 10 for such a single word keyword, it is quite unlikely.

Secondly, because of the sheer number of pages that single word searches can throw up, most search engine users have realized that they can get more relevant pages if they search for phrases rather than individual words. Statistical research has shown that most people are now searching for 2 or 3 word phrases rather than for single words.

Thirdly, single word keywords won't get you targeted traffic. When people search for "tourism", they are not necessarily looking for tourist destinations in Australia - they may be interested in any other country of the world. Even if you got your site into the top 10 for "tourism", you would gain nothing from such visitors. However, when someone searches for "tourism in Australia", he/she is your potential customer, and hence, it makes sense for you to try and get a top ranking for your site for that keyword. Hence, whenever you are trying to generate keywords, try to be location specific. Try to think of keywords which apply to the geographic area that your product or service is designed to serve.

Step 2: Open any spreadsheet program installed on your computer. I'll assume you are using Microsoft Excel. If you are using some other spreadsheet program, just adapt the spreadsheet related procedures outlined here to fit your program.

Create 4 columns - one for the keyword, one for the popularity of the keyword, one for the number of sites that appear in AltaVista for that keyword and the last for something I call the Keyword Effectiveness Index (don't worry - I'll explain what KEI means later on). In order to ensure that you can follow what I am saying, I recommend that you add the following column headers to the first four columns of the first row of your spreadsheet:

Keyword
Popularity
No. of Competitors
KEI

In case you don't want to take the trouble of creating your own spreadsheet, download the keywords.zip file from http://www.1stSearchRanking.com/t.cgi?1195&download.htm - the file contains a sample spreadsheet in Excel 97 format.

Step 3: A great way to obtain a list of keywords related to the ones you have developed in the first step is to use WordTracker's keyword generation service by going to http://www.1stSearchRanking.com/t.cgi?1195&wordtracker/

Click on the "Trial" option at the top of the site. In the page that appears, type in your name and email address and click on the "Start the trial >>" button. In the next page, click on "Click here to start the trial". In the next page, type in the first keyword that you developed in Step 1, i.e. "tourism in Australia", in the text box. Click on the "Proceed >>" button.

Step 4: In the next page, WordTracker will display a list of keywords related to the keyword that you had typed in. (Just scroll down the left pane to see the keywords). Now, click on the first keyword in the left pane which is applicable for your site. In the right pane, WordTracker will show a list of keywords which contain the keyword you had clicked on in the left pane. Then in the table that you have created in your spreadsheet, copy each of the keywords in the right pane and paste them in the first column of the table. Also, copy the number of times those keywords have been used (i.e. the figure present in the Count column in WordTracker) and paste them in the second column. In order to ensure that you can follow me, make sure that you type the first keyword in the second row of your spreadsheet. Of course, you should only bother adding a keyword to your spreadsheet if it is applicable for your site.

Once you have added all the keywords in the right pane which are applicable for your site, click on the next keyword in the left pane which is applicable for your site. Once again, WordTracker will display a list of keywords in the right pane which contain the keyword you had clicked on in the left pane. Again, copy the keywords in the right pane which are applicable for your site and paste them in the first column of your spreadsheet. Also, copy the figures present in the Count column and paste them in the second column beside the corresponding keywords. Repeat this process for each of the keywords in the left pane.

Step 5: Once you have finished with all the keywords in the left pane, press your browser's Back button a number of times until WordTracker again displays the text box which asks you to type in a keyword. Type in the second keyword in your original list (i.e. "travel to Australia"), click on the "Proceed >>" button and repeat Step 4. Do this for each of the keywords that you developed in Step 1.

Step 6: Go to AltaVista and search for the first keyword in your spreadsheet using an exact match search (i.e. wrap the keyword in quotation marks). AltaVista will return the number of sites which match that keyword. Add this number to the third column of the spreadsheet, in the same row as the keyword. Repeat this process for each of the keywords in your spreadsheet. Once you have done that, your first column will contain the keywords, your second column will show the popularity of the keywords, and your third column will contain the number of sites you are competing against for a high ranking for those keywords.

Now it's time to calculate the KEI!

Step 7: The Keyword Effectiveness Index is the square of the popularity of a keyword, multiplied by 1000 and divided by the number of sites which appear in AltaVista for that keyword - in other words, KEI = (Popularity^2 x 1000) / Competitors. It is designed to measure which keywords are worth optimizing your site for. The higher the KEI, the better the keyword. How the formula for the KEI is arrived at is beyond the scope of this article. If you want to know, send a blank email to mailto:kei@1stSearchRanking.com

If you are using the spreadsheet file that I created for you (see Step 2), you won't need to enter the formula for calculating the KEI yourself - the KEI is calculated automatically the moment you enter the values in columns 2 and 3, and you can go straight to Step 8.

In case you didn't download the file, here's how you can calculate the KEI.

I am assuming that you have created the spreadsheet columns in the way I recommended in Step 2 and that you are using Microsoft Excel. If you are using some other spreadsheet program, you will need to adjust the formula to the requirements of your program. Click on cell D2 and type in the following exactly as it is shown:

=IF(C2<>0,B2^2/C2*1000,0)

Then click on the Copy button to copy the formula, select all the cells in column 4 which have keywords associated with them and press the Paste button to paste the formula. The KEI for each keyword will be displayed.

Step 8: Use your spreadsheet program's Sort feature to sort the rows in descending order of the KEI. In Excel 97, you would click on the Data menu, click on the Sort menu item, choose KEI from the drop-down combo box named "Sort by", click on the "Descending" option next to it, and then click on OK.
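If you'd rather script the calculation and the sort than use a spreadsheet, the same two steps can be expressed in a few lines of Python. A minimal sketch - the popularity and competitor figures below are invented for illustration:

# Each row: (keyword, popularity, number of competing sites) - figures are made up.
rows = [
    ("tourism in australia", 170, 8000),
    ("travel to australia", 260, 52000),
    ("australian travel agencies", 45, 900),
]

def kei(popularity, competitors):
    # KEI = popularity squared, times 1000, divided by the number of competitors.
    return popularity ** 2 / competitors * 1000 if competitors else 0

# Sort in descending order of KEI, as in Step 8.
for keyword, pop, comp in sorted(rows, key=lambda r: kei(r[1], r[2]), reverse=True):
    print(f"{keyword}: KEI = {kei(pop, comp):.1f}")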

And guess what - that's it! You now know the keywords for which you should optimize your site. You can now optimize your site for each keyword in turn, starting with the keyword with the highest KEI. Exactly how many of the keywords you choose to optimize your site for largely depends on the amount of time that you can spare from your normal business activities. But whatever the number of keywords you target, it obviously makes sense to go for the most effective keywords first.

Tying up the loose ends:

The number of related keywords that WordTracker displays in the trial version is limited. In order to get all the keywords which are related to the keywords you had developed in Step 1, you would need to subscribe to WordTracker's paid service.

Article by Sumantra Roy. Sumantra is one of the most respected search engine positioning specialists on the Internet. To have Sumantra's company place your site at the top of the search engines, go to http://www.1stSearchRanking.com/t.cgi?1195. For more advice on how you can take your web site to the top of the search engines, subscribe to his FREE newsletter by going to http://www.1stSearchRanking.com/t.cgi?1195&newsletter.htm

Page Cloaking - To Cloak or Not to Cloak.
By Sumantra Roy

Page cloaking can broadly be defined as a technique used to deliver different web pages under different circumstances. There are two primary reasons that people use page cloaking:

i) It allows them to create a separate optimized page for each search engine and another page which is aesthetically pleasing and designed for their human visitors. When a search engine spider visits a site, the page which has been optimized for that search engine is delivered to it. When a human visits a site, the page which was designed for the human visitors is shown. The primary benefit of doing this is that the human visitors don't need to be shown the pages which have been optimized for the search engines, because the pages which are meant for the search engines may not be aesthetically pleasing, and may contain an over-repetition of keywords.

ii) It allows them to hide the source code of the optimized pages that they have created, and hence prevents their competitors from being able to copy the source code.

Page cloaking is implemented by using some specialized cloaking scripts. A cloaking script is installed on the server, which detects whether it is a search engine or a human being that is requesting a page. If a search engine is requesting a page, the cloaking script delivers the page which has been optimized for that search engine. If a human being is requesting the page, the cloaking script delivers the page which has been designed for humans.

There are two primary ways by which the cloaking script can detect whether a search engine or a human being is visiting a site (a short sketch of both methods follows below):

i) The first and simplest way is by checking the User-Agent variable. Each time anyone (be it a search engine spider or a browser operated by a human) requests a page from a site, it reports a User-Agent name to the site. Generally, if a search engine spider requests a page, the User-Agent variable contains the name of the search engine. Hence, if the cloaking script detects that the User-Agent variable contains the name of a search engine, it delivers the page which has been optimized for that search engine. If the cloaking script does not detect the name of a search engine in the User-Agent variable, it assumes that the request has been made by a human being and delivers the page which was designed for human beings.

However, while this is the simplest way to implement a cloaking script, it is also the least safe. It is pretty easy to fake the User-Agent variable, and hence, someone who wants to see the optimized pages that are being delivered to different search engines can easily do so.

ii) The second and more complicated way is to use I.P. (Internet Protocol) based cloaking. This involves the use of an I.P. database which contains a list of the I.P. addresses of all known search engine spiders. When a visitor (a search engine or a human) requests a page, the cloaking script checks the I.P. address of the visitor. If the I.P. address is present in the I.P. database, the cloaking script knows that the visitor is a search engine and delivers the page optimized for that search engine. If the I.P. address is not present in the I.P. database, the cloaking script assumes that a human has requested the page, and delivers the page which is meant for human visitors.

Although more complicated than User-Agent based cloaking, I.P. based cloaking is more reliable and safe because it is very difficult to fake I.P. addresses.
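To make both mechanisms concrete, here is roughly what each check boils down to, sketched in Python. The spider names, I.P. addresses, and page names are placeholders, not a real spider database:

# Illustrative spider names and placeholder (TEST-NET) I.P. addresses; a real
# cloaking script would maintain a much larger, constantly updated database.
KNOWN_SPIDER_NAMES = ("googlebot", "scooter", "slurp")
KNOWN_SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}

def page_by_user_agent(user_agent):
    # Method i): decide which page to serve from the User-Agent string alone.
    agent = user_agent.lower()
    if any(name in agent for name in KNOWN_SPIDER_NAMES):
        return "optimized_page.html"  # keyword-rich page meant for the spider
    return "human_page.html"          # the page designed for human visitors

def page_by_ip(visitor_ip):
    # Method ii): decide which page to serve from the visitor's I.P. address.
    if visitor_ip in KNOWN_SPIDER_IPS:
        return "optimized_page.html"
    return "human_page.html"

print(page_by_user_agent("Googlebot/2.1"))  # optimized_page.html
print(page_by_ip("203.0.113.7"))            # human_page.html

As the article goes on to explain, the fact that these checks are easy to write does not make them safe to use.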

Now that you have an idea of what cloaking is all about and how it is implemented, the question arises as to whether you should use page cloaking. The one-word answer is "NO". The reason is simple: the search engines don't like it, and will probably ban your site from their index if they find out that your site uses cloaking. The reason that the search engines don't like page cloaking is that it prevents them from being able to spider the same page that their visitors are going to see. And if the search engines are prevented from doing so, they cannot be confident of delivering relevant results to their users. In the past, many people have created optimized pages for some highly popular keywords and then used page cloaking to take people to their real sites which had nothing to do with those keywords. If the search engines allowed this to happen, they would suffer because their users would abandon them and go to another search engine which produced more relevant results.

Of course, a question arises as to how a search engine can detect whether or not a site uses page cloaking. There are three ways by which it can do so (a sketch of the automated checks appears after the list):

i) If the site uses User-Agent cloaking, the search engines can simply send a spider to a site which does not report the name of the search engine in the User-Agent variable. If the search engine sees that the page delivered to this spider is different from the page which is delivered to a spider which reports the name of the search engine in the User-Agent variable, it knows that the site has used page cloaking.

ii) If the site uses I.P. based cloaking, the search engines can send a spider from a different I.P. address than any I.P. address which it has used previously. Since this is a new I.P. address, the I.P. database that is used for cloaking will not contain this address. If the search engine detects that the page delivered to the spider with the new I.P. address is different from the page that is delivered to a spider with a known I.P. address, it knows that the site has used page cloaking.

iii) A human representative from a search engine may visit a site to see whether it uses cloaking. If she sees that the page which is delivered to her is different from the one being delivered to the search engine spider, she knows that the site uses cloaking.
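The automated checks in (i) and (ii) amount to fetching the same URL under two different identities and comparing what comes back. A minimal sketch using the third-party requests library - the User-Agent strings and URL are illustrative, and note that dynamic pages can differ between fetches for entirely innocent reasons:

import requests

def looks_cloaked(url):
    # Fetch a page once as a known spider and once as an ordinary browser;
    # a different response body suggests the site serves cloaked pages.
    as_spider = requests.get(url, headers={"User-Agent": "Googlebot/2.1"})
    as_browser = requests.get(url, headers={"User-Agent": "Mozilla/4.0 (compatible)"})
    return as_spider.text != as_browser.text

# Hypothetical usage:
# if looks_cloaked("http://www.example.com/"):
#     print("Responses differ by User-Agent - possible cloaking")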

Hence, when it comes to page cloaking, my advice is simple: don't even think about using it.

Article by Sumantra Roy. Sumantra is one of the most respected search engine positioning specialists on the Internet. To have Sumantra's company place your site at the top of the search engines, go to http://www.1stSearchRanking.com/t.cgi?1195. For more advice on how you can take your web site to the top of the search engines, subscribe to his FREE newsletter by going to http://www.1stSearchRanking.com/t.cgi?1195&newsletter.htm


Use Doorway Pages At Your Own Risk.

Subia Creative, September 2002
Doorway pages are smoke and mirrors - an SEO version of “sleight of hand” that is not without controversy.
The practicing SEO firm claims to optimize your web pages, but really doesn't. Although no one ever touches your web pages, search engine rankings improve. It's the deceptive practice of presenting a “fake page” to a search engine spider while delivering an entirely different page to the user - a simple bait-and-switch using redirection code.
 
Although you're not likely to ever see one, a doorway page is generally nothing more than a jumble of keywords crudely arranged within unintelligible content whose sole purpose is to satisfy search engine ranking algorithms. Search engine optimization firms that use this method do so either to cover up a lack of real optimization skill or as a timesaving mechanism to enhance their profit margins.
We call it “lazy optimization” because it shows little regard for a client's long-term web visibility. Most search engines now consider doorway techniques spam, particularly when the doorway copy is not relevant to the actual web page content. Upon discovery, a search engine may apply severe ranking penalties and, in some cases, ban the offending website from its index.
The sure and safe way to protect the long-term value of prominent web visibility is to optimize your actual web pages without spam tactics. If it takes adding more web pages to target a wider range of search terms, then add them. It takes more time and greater skill, but the reward of a lifetime flow of traffic is worth the investment.
Before retaining the services of a search engine optimization firm, ask if it uses redirection doorway methods. If so, proceed cautiously. Some SEO firms retain ownership of doorway pages under a separate URL; if you decline an “optional” ranking maintenance agreement, it's easy for the firm to pull those doorway pages off the web, and your rankings can disappear almost overnight.
Despite the downside risk of penalties, if you elect to use doorway techniques, make sure any agreement specifies your ownership of all doorway pages.
Reprint rights granted with credit and link to this website.
© Subia. Search Engine Marketing