tags:

views:

60

answers:

1

I have an application that makes a series of WCF calls that return JSON via JSONP. Client-side JavaScript then binds the returned data to HTML controls.
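For context, the pattern looks roughly like this. The endpoint path, callback name, and field names (`GetProducts`, `renderProducts`, `Name`) are illustrative placeholders, not my actual service:

```javascript
// Pure helper: turns the returned items into list markup.
// Kept separate so the same transformation could, in principle,
// be mirrored server side.
function itemsToHtml(items) {
  return items
    .map(function (p) { return '<li>' + p.Name + '</li>'; })
    .join('');
}

// JSONP callback invoked by the injected script tag.
function renderProducts(items) {
  document.getElementById('products').innerHTML = itemsToHtml(items);
}

// Kick off the JSONP request: the WCF service wraps its JSON
// response in a call to the named callback.
function loadProducts() {
  var script = document.createElement('script');
  script.src = '/Service.svc/GetProducts?callback=renderProducts';
  document.head.appendChild(script);
}
```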

When a bot/spider crawls my application, none of that data gets indexed, because the bot does not execute JavaScript.

What are some good patterns for dealing with this problem? Ideally I'd like to not have to maintain two sets of data-binding code (one on the server side and one on the client side).

Essentially I need the resulting data to be present in the response that comes downstream. Some ideas I had:

1) Link to equivalent RSS/Atom data
2) A backdoor HTML page
3) An HTML renderer that can execute an ASPX page server side ahead of time and then pass the result off to the client

Any guidance would be helpful.

A: 

I think option 3 can solve the problem. I would suggest trying this:

  1. Detect whether the request comes from a bot, or whether JavaScript is disabled in the browser
  2. If it is a bot or JS is disabled, skip the web service call and render the page with server-side data binding
  3. Otherwise, serve the JS version
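Step 1 is usually a user-agent check. A minimal sketch of that check, assuming a hand-maintained token list (the tokens below are common crawler names, not an exhaustive or authoritative set):

```javascript
// Illustrative list of substrings found in well-known crawler
// user-agent strings; extend as needed for your traffic.
var BOT_TOKENS = ['googlebot', 'bingbot', 'slurp', 'duckduckbot', 'baiduspider'];

// Returns true when the user-agent string looks like a known crawler.
function isBot(userAgent) {
  var ua = String(userAgent || '').toLowerCase();
  return BOT_TOKENS.some(function (token) {
    return ua.indexOf(token) !== -1;
  });
}
```

On the server you would branch on this result (or on ASP.NET's own `Request.Browser.Crawler` flag) to decide between the pre-rendered and the JS version of the page.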

I would recommend this approach if your data volume is relatively small and the implementation cost is not too high.

lakhlaniprashant.blogspot.com