
Do search engines recognize your page elements?

You have a beautiful website with great products, solid guarantees, comprehensive pages and excellent customer service. Unfortunately, Google and other search engines still won’t give your website high rankings.

There are several reasons why search engines fail to list websites even though they look great and offer quality content:

1. Your Web Pages are Meaningless to Search Engine Spiders
Search engines use relatively simple software programs, called spiders or crawlers, to visit your web pages. In general, search engine spiders won’t see anything that is displayed in images, Flash elements, JavaScript (with few exceptions) or other multimedia formats.

If the main content of your website is displayed in images or Flash, your website can be totally meaningless to search engines. If your website navigation is pure JavaScript, chances are that search engines won’t find your inner pages.

To a search engine, your website will then look like a single-page site even though it consists of many different pages.
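As a sketch, compare a navigation menu that spiders can follow with one they usually cannot (the page paths here are made up for illustration):

```html
<!-- Crawlable: plain HTML links that any spider can follow -->
<nav>
  <a href="/products.html">Products</a>
  <a href="/guarantees.html">Guarantees</a>
</nav>

<!-- Risky: the link target only exists inside JavaScript, so many
     spiders never discover /products.html from this element -->
<span onclick="window.location='/products.html'">Products</span>
```

If you must use scripted navigation, also provide ordinary HTML links (for example in a sitemap page or footer) so spiders can still reach every page.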

2. The HTML Code of Your Web Pages Contains Major Errors
Most web pages have minor errors in their HTML code. While most search engine spiders can handle minor HTML code errors, some errors can prevent search engine spiders from indexing your web pages.

For example, a stray closing tag near the top of a web page could tell search engine spiders that they have reached the end of the document before the main content has been indexed.
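A contrived sketch of such an error: a misplaced closing tag near the top can make a strict parser believe the document has already ended:

```html
<html>
  <head>
    <title>My Shop</title>
  </head>
</html> <!-- stray closing tag: a strict parser may stop reading here -->
  <body>
    <p>Main content that may never get indexed.</p>
  </body>
</html>
```

Running your pages through an HTML validator is an easy way to catch this kind of mistake.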

3. The HTML Code of Your Web Pages Doesn’t Contain the Right Elements
If you want high rankings for certain keywords, those keywords must appear in the right places on your web pages. For example, it usually helps to use the keyword in the web page title.

Many other page elements are important for high rankings as well, and all of them should be in place.
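As a sketch, for a page targeting the hypothetical keyword “garden furniture”, the keyword would appear in elements like these:

```html
<head>
  <title>Garden Furniture - Example Shop</title>
  <meta name="description" content="Durable garden furniture with free delivery.">
</head>
<body>
  <h1>Garden Furniture</h1>
  <p>Our garden furniture is built from weatherproof hardwood.</p>
</body>
```

The title, meta description, headings and body text all give spiders signals about what the page is about.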

4. Your Web Server Sends The Wrong Status Codes
Some web servers send the wrong status codes to search engine spiders and visitors. When a search engine spider requests a web page from your site, your server answers with an HTTP status code. For a page that exists, this should be the “200 OK” code.

Some servers send a “302 Found” redirect or even a “404 Not Found” response code to search engine spiders although the web page displays normally in a web browser.

If your web server sends the wrong response code, search engine spiders will think that the web page doesn’t exist and they won’t index the page.

5. Your robots.txt File Rejects All Search Engine Spiders
If your robots.txt file does not allow search engine spiders to visit your web pages, your website won’t be included in the search results. Some robots.txt files contain errors, and search engine spiders end up blocked by mistake.

Solution: Check the contents of your robots.txt file. In general, you only need a robots.txt file if you want to block certain areas of your website.
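For example, these two robots.txt files differ by a single character, but the first blocks every spider from the entire site while the second blocks nothing:

```text
# Blocks ALL spiders from the ENTIRE site - a common mistake
User-agent: *
Disallow: /

# Blocks nothing: an empty Disallow means "crawl everything"
User-agent: *
Disallow:
```

A lone “/” after Disallow is all it takes to make your whole site invisible to search engines.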

Search engine spiders must be able to understand your web pages if you want high rankings on Google, Bing and other search engines. The tips above help you make sure that spiders see what you want them to see.

© 2005-2011 Frihost, forums powered by phpBB.