Although SEO is a formal discipline that requires in-depth research and analysis to make your website visible in search engines, don’t treat it purely as a technical exercise, and don’t spend all your time tweaking tags, measuring keyword metrics, and watching your rankings. Do just enough simple, smart SEO to make your pages indexable by the search engines, then spend most of your time and money on writing great pages. Make sure you have a clear USP so that your target audience picks your site from the list in the SERPs.
Indexing Issues in SEO:
Let’s look at a few issues that can cause indexing problems for your website. They fall into two categories: using technologies that search engines can’t index, and using methods that search engines ban. If your web pages aren’t being indexed, review these issues to see whether one of them is the cause.
Here are the details:
- Ajax lets web developers easily build interactive sites, but search engines can’t index content that is loaded via Ajax. If you use Ajax on a page, also include the important content in plain HTML that the search engines can index (see the first sketch after this list).
- Do not create a website that uses only images. Some designers do this to use unique fonts; it gives a nice look and feel, but search engines can’t read text inside images and won’t index it. At a minimum, describe each image in its alt attribute (see the second sketch after this list).
- Password-protected and registration-required pages will not be indexed by the search engines. If you offer an FAQ but require registration to read it, its content won’t be indexed.
- If a link is broken, the search engines won’t follow it. Use link-validation software (available in most HTML editors) to test your links and make sure the search engine spiders can follow them to the next level.
- Frames were popular in the 90s, but search engines can’t index framed pages properly, people can’t link directly to them, and you can’t point PPC ads at them. If your website uses frames, rebuild it without them.
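As promised above, here is a minimal sketch of an Ajax page that keeps its core content in plain HTML. The URL, element ID, and wording are invented for the example; the point is the pattern: spiders that don’t run JavaScript still see the HTML, while Ajax only enhances the page for visitors.

<div id="product-description">
  <h1>Acme Widget</h1>
  <p>Full product description in plain HTML, readable by
     search engine spiders even if JavaScript never runs.</p>
</div>
<script type="text/javascript">
// Enhancement only: fetch customer reviews with Ajax and append them.
// The core content above is already in the page for spiders to index.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/reviews/acme-widget.html", true); // hypothetical URL
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    document.getElementById("product-description").innerHTML += xhr.responseText;
  }
};
xhr.send();
</script>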
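For the image-only problem, the simplest safeguard is to describe each image in its alt attribute and repeat any important text in plain HTML. A small sketch (the file name and wording are invented):

<img src="sale-heading-fancy-font.gif" alt="Spring Sale: 20% Off All Widgets">
<p>Spring Sale: 20% off all widgets through March 31.</p>

The alt text and the HTML paragraph give the spiders something to read, while visitors still see your unique font.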
Keep the Search Engines Out:
In some cases, you may not want a search engine to index your web page. Perhaps you have pages that are not for the public. You have two ways to keep search engines out of your site: the robots.txt file and the robots meta tag.
The robots.txt file simply lists the files and directories that search engines should not crawl and index. Create a text file, add the list, save it as robots.txt, and place it in the root directory of your site, where your home page resides. Search engines look for this file before crawling. Here’s an example:
User-agent: *
Disallow: /images
Disallow: /user/welcome.html
In this example, the robots.txt file tells the search engines not to index the contents of the images folder or the welcome.html page. You can also use the robots meta tag, which is placed in the HEAD section of each web page you want excluded. Here’s an example:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
NOINDEX means the page should not be indexed; NOFOLLOW means the spider should not follow any links from the page. Note that using these directives doesn’t guarantee the page won’t be indexed: there are hundreds of search engines, and not all of them honor these rules. If you have text that must never appear in search results, put it on a password-protected page.
Here is what Google can index: HTML, Flash, TXT, Word, Excel, PowerPoint, Microsoft Works, Microsoft Write, RTF, Adobe PDF, Adobe PostScript, Lotus 1-2-3, Lotus WordPro, and MacWrite files. If you put these files on your website and point links to them, Google should be able to index the files.
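For example, an ordinary link is all Google needs to find and index such a file; the file name here is invented:

<a href="/docs/annual-report.pdf">Download our annual report (PDF)</a>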