Hello there,

I'm about to launch a new website which will have 15,000 pages (an ecommerce store). Of course, I will not publish all of these pages at the same time, but I'm looking for information on how many pages I should start with without ending up in the sandbox. For example, could I start with a 50-page website, or is that too much?

Then, do you have an idea (I know there are no precise rules here) of the frequency/volume at which I could add pages later? Is 50 pages a day OK? (At that rate, the full 15,000 pages would take about 300 days to publish.)

Thank you very much for your advice, and sorry for my English :-)

A: 

As far as I am aware, the sandbox is an automatic process for all new sites (except .gov and .edu). Of course, most of the information on SEO is based on conjecture anyway.

I have to admit I am curious where you got the idea that the number of pages a site launches with determines whether it is put in the sandbox or not. It sounds like a somewhat preposterous assertion.

Simon
Hi, well, I've read somewhere that in order to avoid Google penalties a website should have natural growth. I assumed the sandbox was actually the penalty. Maybe a website stays longer in the sandbox if its growth is too fast? So, do you have an idea of the number of pages I should start with (and then regularly add)? Thanks
François
I would be very careful what you pay attention to in the SEO world. There is an awful lot of rubbish mixed in with a relatively small amount of useful information.
Simon
A: 

There is no maximum number of pages, although many large sites have seen their number of indexed pages drop.

According to a blog post by Rand Fishkin (founder of SeoMoz), there are 8 factors by which Google might limit the number of pages it indexes:

  • Importance on the Web's Link Graph
  • Backlink Profile of the Domain
  • Trustworthiness of the Domain
  • Rate of Growth in Pages vs. Backlinks
  • Depth & Frequency of Linking to Pages on the Domain
  • Content Uniqueness
  • Visitor CTR and Usage Data Metrics
  • Search Quality Rater Analysis + Manual Spam Reports

You can read this interesting blog post: Google's Indexation Cap.

Having a good site architecture will help you too. Also from Rand, this related post: Diagrams for Solving Crawl Priority & Indexation Issues.

dampee
Thanks, very interesting reading you gave me. As I understand it, this is also very empirical!
François
That's right, François. You never know for sure if this is the right answer until you try it out yourself. But SeoMoz is well enough known in the SEO community to trust that they are pointing you in the right direction.
dampee
A: 

Another great way to make sure Google can access all of your pages is to have a good sitemap. This shows Google that you are not trying to hide anything, and it also gives your site good interlinking. The best way to get such a large site indexed is to have good linking between your pages and to have a sitemap. You can auto-generate a sitemap at:

http://www.xml-sitemaps.com/
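If you would rather not depend on an external generator for a 15,000-page store, a sitemap is simple enough to produce yourself. Below is a minimal sketch in Python, assuming you can pull the list of page URLs from your store's database (the example URLs are hypothetical); it just emits the sitemaps.org <urlset> format, which caps a single sitemap file at 50,000 URLs.

```python
# Minimal sketch: write a sitemaps.org-compliant sitemap for a list of URLs.
# The URL list here is a hypothetical stand-in for your store's page URLs.
from xml.sax.saxutils import escape

def write_sitemap(urls, path="sitemap.xml"):
    """Write a <urlset> sitemap file containing one <url>/<loc> per page."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls[:50000]:  # the protocol caps one sitemap file at 50,000 URLs
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines))

# Hypothetical example: a few product pages.
write_sitemap([
    "http://www.example.com/",
    "http://www.example.com/products/widget-1",
    "http://www.example.com/products/widget-2",
])
```

Once generated, you can point crawlers at the file with a Sitemap: line in your robots.txt or submit it through Google Webmaster Tools.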

Good luck

bvandrunen