Some web developers spam the search services in attempts to get higher rankings. In the beginning, spam worked, but search services have become more sophisticated, and spam now results in pages, and sometimes whole sites, being dropped from search service databases. Each service has its own definition of spam, but the following list characterizes common spam techniques.
Repetition of keywords. Search services give higher rankings for multiple occurrences of keywords. Developers abused this by repeating keywords many times, and services now penalize pages that repeat keywords excessively. Some services allow a few (2 or 3) repetitions, but use caution in how much repetition you use.
Scatter your keywords and phrases throughout the page, and use variations of the words. Make the use of the keywords appropriate for the context of the page. Search services now do context checking of keywords.
Keywords not related to the web site. Certain keywords, such as "sex" and "naked", have high usage in web searches. Some developers add those keywords to their pages in hopes of getting more hits. Search services now do context checking of keywords and penalize pages whose keywords are not related to the themes of the pages.
More than one page title (HTML <title> tag). Many search services put a high priority on keywords found in page titles, assuming that titles will be read even if the pages aren't read. Pages that had more than one title tag used to get higher rankings, but search services now penalize this abuse.
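As a sketch (the title, keywords, and description here are hypothetical sample text), a page's head should contain exactly one title tag, with wording that matches the page's actual content:

```html
<!-- A well-formed head: exactly one <title> tag, with keywords
     that match the page's actual content (sample text only) -->
<head>
  <title>Handmade Oak Furniture - Tables and Chairs</title>
  <meta name="keywords" content="oak furniture, handmade tables, chairs">
  <meta name="description" content="Handmade oak tables and chairs built to order.">
</head>
```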
Text the same color as the background. Most search services now index every word in the text of the page and give higher rankings for words that occur more often. People abused this by repeating keywords in the text many, many times. The repeated keywords were set to the color of the background so they were invisible to the human visitors but visible to the spider software used by the services. Services now penalize this technique.
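For illustration only, this is the kind of era-typical markup (with hypothetical filler text) that services penalize:

```html
<!-- DO NOT do this: the repeated keywords are white text on a white
     background, invisible to visitors but indexed by spiders -->
<body bgcolor="#FFFFFF">
  <font color="#FFFFFF">oak furniture oak furniture oak furniture</font>
  <p>The visible content of the page goes here.</p>
</body>
```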
Duplication of pages with different URLs. A technique that has been used to get multiple listings in search results has been to duplicate a page several times and assign different names (URLs) to the clones. Search services penalize the use of this technique.
A better technique is to divide the site into several unique pages and submit the site to search services. The spiders will eventually visit each page. Some people create different versions of the same page and optimize the versions for different search services. The pages may be interpreted as spam. If you do this, use a robots.txt file to limit access to a given page to only the service for which that page is optimized.
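As a sketch (the robot name and filename are hypothetical examples; check each service's documentation for its robot's actual name), a robots.txt file in the site's root directory can restrict an optimized page to a single service's robot:

```text
# robots.txt - let only one service's robot fetch the page
# optimized for it; all other robots are told to stay away

# The intended robot (example name): no restrictions
User-agent: ExampleBot
Disallow:

# Every other robot: keep out of the optimized page
User-agent: *
Disallow: /examplebot-version.html
```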
Different splash pages linking to the same home page. Another technique to get multiple listings in search results has been to have several splash pages link to the same home page. This technique is also being penalized. Again, a better technique is to divide the site into several unique pages.
Different pages given to search services than to human visitors. This is known as cloaking: the server detects the spider and serves it a page different from the one human visitors receive. Search services penalize this technique.
Only link to/from appropriate sites. Search services give higher rankings to sites that have a lot of links coming to them. However, the linking sites must be appropriate for the theme of your site. Avoid linking to, and requesting links from, sites that aren't related to yours. Search services want quality of links, not quantity.
Don't resubmit your site many times. After you have submitted your site to a search service, wait until you are indexed before you resubmit your site. Make significant changes to the content of the site before resubmitting.
© Copyright 1998, 2011 Allen Leigh