There are a number of things to know about search engines and what they consider acceptable.
In the early days, webmasters found plenty of little tricks to help them get top rankings, such as repeating keywords numerous times in various formats. This is what is considered "spam," the term used to describe unscrupulous marketing tactics. Some sites would use invisible text that was the same color as the background; others would hide the text behind graphics or simply stuff it into the META tags.
Search engines quickly caught on to these approaches and started penalizing the perpetrators. Abusers of the submission privilege now either see their pages ranked as less relevant or are banned from the listings altogether. So, what we are saying is... don't spam! It only hurts you and your site's chances of getting seen. The content of a web page ought to be enough for search engines to determine relevance and appropriate placement, without promoters and webmasters resorting to excessive keyword repetition for no other reason than to chase higher rankings. Netizens, in general, have come to despise sites that spam and tend to avoid sites promoted this way.
For more information on the correct ways to get better listings, read through our other search engine tips. TargetedListings, as a standard, always offers its basic submission clients the option of entry pages designed specifically for each search engine. We create these separate pages so that each one focuses on a single keyword phrase or a small group of keywords and phrases. Overuse of this technique, however, can be a problem. If you go all out by creating a different page for eight different engines and then submit every page to all eight engines, you could get your pages banned for "spamdexing" if the pages are very similar in content. Of course, it is only a danger if several of those similar pages wind up ranking highly for the same keyword. That is something nobody likes to see, and somebody will likely report you for spamdexing. Even though it isn't a frequent occurrence, it can happen.
To avoid this kind of problem, it is best to submit each specifically designed entry page only to its corresponding engine. You should also make sure the links from your entry pages are a "one-way" situation: do not place links to your doorway pages on your homepage. Otherwise, spiders will eventually crawl through and find entry pages that were never intended to be listed in that engine's database.
It is also good to set up a robots.txt file to exclude certain entry pages from specific engines. For instance, make sure the entry page designed for AltaVista is excluded from all engines besides AltaVista. To do this, create a text file with any editor that can save plain ASCII .txt files, using the following syntax:
User-agent: {SpiderName}
Disallow: {Filename}
For example, to tell
the AltaVista spider, called Scooter, not to index the files secure.html and private.html, create a robots.txt file as follows (Disallow takes a URL path relative to the site root, so each entry begins with a slash):
User-agent: Scooter
Disallow: /secure.html
Disallow: /private.html
You would then upload this robots.txt file to the root directory of your Web site. Understand that it will not work in any other location. So, if your ISP does not allow you to change or save files in your root directory, you will not be able to use this method. In any case, this is a voluntary protocol, and most major search engines will honor it.
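Because compliance is voluntary, a well-behaved crawler is expected to check the rules itself before fetching a page. As a sketch of how the Scooter example above is interpreted, Python's standard urllib.robotparser applies the same matching a spider would (the spider name and file names mirror the example):

```python
from urllib.robotparser import RobotFileParser

# Parse the same rules shown in the Scooter example above.
rp = RobotFileParser()
rp.parse([
    "User-agent: Scooter",
    "Disallow: /secure.html",
    "Disallow: /private.html",
])

# Scooter is blocked from the disallowed files...
print(rp.can_fetch("Scooter", "/secure.html"))      # False
# ...but may still fetch everything else.
print(rp.can_fetch("Scooter", "/index.html"))       # True
# The record names only Scooter, so other spiders are unaffected.
print(rp.can_fetch("SomeOtherBot", "/secure.html")) # True
```

Note that the rules bind only the spider named in the User-agent line; any spider not covered by a record is free to fetch everything.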
You can add as many Disallow lines under a particular engine as is necessary. If you'd like a page to be excluded from indexing on all of the engines, place a * where the spider name would go:
User-agent: *
Disallow: {Filename}
To exclude an entire directory, disallow the path with a trailing slash:
Disallow: /mydirectory/
You can also disallow your entire website from being indexed by using:
Disallow: /
Make sure you use the proper syntax! If you misspell something, it will not work.
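Putting these pieces together, a complete robots.txt for the entry-page scheme described above might look like the following sketch. The file names here are hypothetical examples, and ArchitextSpider is used as a second spider name for illustration; each named spider is told to skip the entry page built for the other engine, and the catch-all record keeps every other spider away from all doorway pages:

```
# Record for AltaVista's spider: skip the page built for the other engine
User-agent: Scooter
Disallow: /excite-entry.html

# Record for a second spider (hypothetical example): skip AltaVista's page
User-agent: ArchitextSpider
Disallow: /altavista-entry.html

# Any other spider: skip all doorway pages
User-agent: *
Disallow: /excite-entry.html
Disallow: /altavista-entry.html
```

Remember that a spider obeys only the most specific record that matches its name, so each named record must list every page that spider should skip; the rules under User-agent: * are not added on top.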