views:

87

answers:

4

If I'm adding content to a page through JavaScript, will it be crawlable by search engine spiders and accessible to screen readers?

For example, this:

var tip = "<p>Most computers will open PDF documents ";
tip += "automatically, but you may ";
tip += "need to download <a title='Link to Adobe website - opens in a new window'";
tip += " href='http://www.adobe.com/products/acrobat/readstep2.html'";
tip += " target='_blank'>Adobe Reader</a>.</p>";

$(document).ready(function () {

    // If the div with id "maincontent" contains at least one PDF link,
    // append the tip paragraph as its last child.
    if ($("div#maincontent a[href*='/pdf']").length > 0) {
        $("div#maincontent").children(":last-child").after(tip);
    }
});

Edit: I want to hide this from search engines but at the same time keep it accessible to screen readers. Is that possible?

+2  A: 

It depends on the crawler, but don't expect most bots to interpret JavaScript.

David Titarenco
+2  A: 

Short answer: probably not. But Google is getting more sophisticated all the time, so I have my suspicions that they do actually render JavaScript as part of the indexing process.

Is there a particular reason to do it this way? I'd recommend doing this logic server-side if possible; then you know your HTML is readable by search engines.
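A rough sketch of the same check done server-side, written as a plain JavaScript function purely for illustration (the helper name and the regex are made up; the equivalent logic in PHP, Ruby or .NET would look much the same): build the tip into the HTML before it is sent, so crawlers and screen readers get it without any client-side script.

// Hypothetical server-side helper, run before the page is sent:
// if the content markup already links to a PDF, append the tip paragraph.
function appendPdfTip(contentHtml) {
    var tip = "<p>Most computers will open PDF documents automatically, but you may " +
              "need to download <a title='Link to Adobe website - opens in a new window' " +
              "href='http://www.adobe.com/products/acrobat/readstep2.html' " +
              "target='_blank'>Adobe Reader</a>.</p>";

    // Naive check for a link whose href ends in .pdf or sits under a /pdf path.
    if (/href=['"][^'"]*(\.pdf|\/pdf)/i.test(contentHtml)) {
        contentHtml += tip;
    }
    return contentHtml;
}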

Matt Sherman
But this generated code shows up when I view the rendered source of the page, and in Firebug as well.
metal-gear-solid
Yes, Firebug will show you the DOM as it exists *after* the browser has fully rendered the page. I don't know if Google does that as part of the indexing process. If you want to be sure, send the HTML from the server if possible. I mean, that particular bit of HTML in your example is not going to be interesting to a search engine anyway, so it's not a big deal. But if there is real content, I'd do the above.
Matt Sherman
Google does not generally execute JavaScript (though they do use some heuristics to try to figure out what's happening). In this particular case, they'd probably just see the embedded HTML in the strings in your JavaScript and index it anyway. Personally, I agree with this answer: there is no advantage to doing this with JavaScript, except that it won't work for people with JavaScript turned off (which is a disadvantage, of course)...
Dean Harding
@codeka - Through this code snippet I'm giving additional info to the user (a download link for Adobe Reader). It isn't actually part of my page's content, so I don't want to add it to my content for SE spiders. The second reason is that this code saves me time as well: it automatically adds the Adobe Reader link if the page has any PDF, and not otherwise.
metal-gear-solid
+1  A: 

Re: will content generated dynamically (in the browser) be crawlable by a search engine?

Normally, no.

But Google has invented a way to solve the problem: see ajax crawling.

Note: they do it by crawling your URLs with various query parameters representing the different states of the dynamic page. They do not attempt to run the JS on your page.
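As a rough illustration of that scheme (the URLs and server below are invented, and it's sketched in Node-style JavaScript only to keep the example concrete): a stateful URL such as http://example.com/page#!tip=pdf is fetched by Googlebot as http://example.com/page?_escaped_fragment_=tip=pdf, and your server is expected to answer that request with a plain HTML snapshot of the page in that state.

// Minimal sketch (hypothetical Node.js server, for illustration only) of answering
// Googlebot's _escaped_fragment_ requests with a pre-rendered HTML snapshot.
var http = require('http');
var url  = require('url');

http.createServer(function (req, res) {
    var query = url.parse(req.url, true).query;
    res.writeHead(200, { 'Content-Type': 'text/html' });
    if ('_escaped_fragment_' in query) {
        // Crawler request: serve the fully rendered HTML for this state, no JS required.
        res.end('<div id="maincontent"><!-- pre-rendered snapshot --></div>');
    } else {
        // Normal request: serve the page whose state is built client-side.
        res.end('<div id="maincontent"><!-- page enhanced by JavaScript --></div>');
    }
}).listen(8080);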

Larry K
A: 

No, most web crawlers do not execute JavaScript, and older screen readers do not read it either. Your best bet would be to use JavaScript for presentation purposes only, handle the logic server-side (PHP, Ruby, .NET, etc.), and use some CSS magic to achieve what you are trying to do above with the content. Always insert content server-side if you are concerned about web crawlers and screen readers, and use JavaScript for presentation only. Alternatively, you can use a Flash and JavaScript sniffer for screen readers to redirect the user to an alternate page that does not rely on dynamic content.
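A rough sketch of that "JavaScript for presentation only" idea (the #pdf-tip id and offscreen class are invented for illustration): the tip lives in the server-sent HTML, pushed off-screen with CSS so screen readers can still reach it, and the script merely reveals it visually when a PDF link is present.

// The tip paragraph is already in the HTML delivered by the server, marked with a
// hypothetical "offscreen" class (e.g. position:absolute; left:-10000px;) so screen
// readers can still announce it. JavaScript only changes its presentation.
$(document).ready(function () {
    if ($("div#maincontent a[href*='/pdf']").length > 0) {
        $("#pdf-tip").removeClass("offscreen");
    }
});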

joseeight