views:

310

answers:

8

Hi friend,

I had read that SEO is applicable for static websites, which hold their information in the initial page itself. I want to know whether it is possible to achieve SEO for dynamically added information.

I mean, here I used AJAX for loading information; in this situation, how can I achieve SEO? Is it possible? Please help me.

Thanks Praveenjayapal

A: 

As long as each page has a unique URL (either by URL rewriting or by query string parameters) and uses that to drive the content being displayed, SEO will work.
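A minimal sketch of what "using the URL to drive the content" can look like; the parameter name `page` and the function name are hypothetical, chosen for illustration only:

```javascript
// Hypothetical sketch: derive the content to render from the URL's
// query string, so each piece of content has its own crawlable address.
// The "page" parameter name is an assumption, not from the answer.
function pageFromUrl(url) {
  const params = new URL(url).searchParams;
  // Fall back to a default page when no parameter is present.
  return params.get("page") || "home";
}
```

With this, `pageFromUrl("http://www.site.com/?page=products")` yields `"products"`, and the server (or client) renders the matching content for that URL.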

I've done this a number of times in the past.

Mauro
A: 

Ensure that your content is accessible to clients without JavaScript. You may have JavaScript on your pages that changes the content based on the URL.

Liam
A: 

I don't really know about this, but IMHO, using semantic markup and submitting a sitemap to Google helps a lot.

andyk
+2  A: 

You have to make all your content accessible without JavaScript (i.e. AJAX). Otherwise the search engine spiders cannot index your content.

Garry Shutler
A: 

SEO is ultimately based on having a good site. Things that will help you are links from other "good sites"; descriptive, friendly URLs; and good page titles and H1 headings. Submitting sitemaps to Google and using their webmaster tools is a great starting place.

UndertheFold
Although what you say is true, it's not answering the question at all.
allesklar
that's because the question has been edited and no longer reflects what i answered
UndertheFold
A: 

You can create a website that has AJAX and is search-engine compatible, but it must be built so that the same information can be accessed without AJAX through the same URL. Search engines cannot execute JavaScript, so any content only available through JavaScript will be inaccessible to them.

You need to either provide this content within the <noscript> tag, or include it in the page by default and have the JavaScript hide it for your AJAX version.
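A rough sketch of the second option (content in the page by default, swapped out when JavaScript runs); the element id and the `loadViaAjax` function are hypothetical stand-ins for your own markup and loader:

```html
<!-- Hypothetical sketch: the content is present in the page by default,
     so crawlers and no-JS clients see it as-is. -->
<div id="content">
  <p>Full, indexable content rendered by the server goes here.</p>
</div>
<script>
  // Only runs when JavaScript is available: replace the server-rendered
  // content with the AJAX-driven version. loadViaAjax() is a hypothetical
  // function standing in for whatever loader the site actually uses.
  document.getElementById("content").innerHTML = "";
  loadViaAjax("content");
</script>
```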

You cannot deliver a different page to a search engine such as Google: they will generally crawl a page both as their bot and masquerading as a user, by sending a user-agent string purporting to be, say, Internet Explorer. This is their method of ensuring you're not trying to game the search engines and that they're seeing the same content as a regular user.

Michael Glenn
+1  A: 

The proper way to use JavaScript and AJAX is to first code your pages and deliver content without JavaScript. All content should show in a logically organized manner. Once this is done, you can use JS/AJAX to provide superior usability to the visitors who have JS enabled.

That will benefit all your users, JavaScript enabled and disabled, and the search engines.

allesklar
A: 

To solve this problem I created a sitemap of the site. For example, in my sitemap I have

www.site.com/level_one/level_two/page1.html, www.site.com/level_one/level_two/page2.html, ...

So the crawlers (Google, Yahoo, Bing, etc.) know what to look for. But when a user goes to www.site.com, they always get the pure AJAX site. So you need to make the pages in the sitemap accessible like a static site.
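A sketch of what such a sitemap file might look like, using the example URLs from this answer and the standard sitemaps.org format (the `<lastmod>`/priority fields are omitted for brevity):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap sketch: lists the static-style URLs so
     crawlers can find content the AJAX interface would otherwise hide. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.site.com/level_one/level_two/page1.html</loc></url>
  <url><loc>http://www.site.com/level_one/level_two/page2.html</loc></url>
</urlset>
```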

Another way to solve this (more work) is to make the page work without JavaScript; then, if the user can execute JavaScript, you rewrite all the hrefs to "#" (for example).

Please check : http://www.mattcutts.com/blog/give-each-store-a-url/

Hope it helps

llazzaro