
Hello, I'm working on a social network similar to FriendFeed. When a user adds his feed links, I use a cron job to parse each user's feeds. Is this feasible with a large number of users, e.g. parsing 10,000 links every hour, or will it cause problems? If it isn't feasible, what techniques do FriendFeed or RSS readers use to do this?
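For concreteness, a minimal sketch of the kind of hourly job described above, assuming the feedparser library plus two placeholder functions (get_feed_urls and store_entries) that stand in for the database layer:

    import feedparser  # third-party: pip install feedparser

    def get_feed_urls():
        # Placeholder: load every subscribed feed URL from the database.
        return ["https://example.com/feed.xml"]

    def store_entries(url, entries):
        # Placeholder: persist any new entries for this feed.
        pass

    def run_hourly():
        for url in get_feed_urls():
            parsed = feedparser.parse(url)  # fetches and parses in one call
            if parsed.bozo:                 # feedparser flags malformed feeds
                continue                    # skip broken feeds, retry next run
            store_entries(url, parsed.entries)

    if __name__ == "__main__":
        run_hourly()  # invoked by cron, e.g. "0 * * * * python fetch_feeds.py"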

A: 

Don't have quite enough information to judge whether this design is good or not, but to answer the basic question: unless you are doing some very intensive processing on those 10,000 feeds, that should be trivial for an hourly cron job to handle.

More information on how you process the feeds, and in particular on how the process scales with the number of users who have feeds and the number of feeds per user, would be useful in giving you further advice.

Adam Bellaire
+2  A: 

You might consider adding some information about your hardware to your question; it makes a big difference to anyone advising you on how easily your implementation will scale.

If you end up parsing millions of links, one big cron job is going to become problematic. I am assuming you are doing the following (if not, you probably should be):

  • Noticing when users subscribe to the same feed, so you avoid fetching it twice (see the sketch below).
  • Checking, when you fetch a new feed, for a site map that tells you how often the feed is likely to change, and re-visiting each feed at a sensible interval based on that value.
  • Checking system load and memory usage to know when to 'back off' and go to sleep for a while.

This reduces the amount of sweat that an hourly cron would produce.
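Here is a rough sketch of the first two points, with an in-memory dict standing in for a database table; all the names and the one-hour default interval are illustrative, not a real API:

    import time

    feeds = {}  # canonical URL -> {"subscribers": set, "next_fetch": epoch seconds}

    def canonicalize(url):
        # Very rough normalization so identical feeds map to a single entry.
        return url.strip().rstrip("/").lower()

    def subscribe(user_id, url):
        key = canonicalize(url)
        entry = feeds.setdefault(key, {"subscribers": set(), "next_fetch": 0})
        entry["subscribers"].add(user_id)  # many subscribers, one fetch

    def due_feeds():
        now = time.time()
        return [url for url, e in feeds.items() if e["next_fetch"] <= now]

    def mark_fetched(url, interval_secs=3600):
        # In practice, derive interval_secs from how often the feed has
        # actually changed; 3600 here is only a placeholder default.
        feeds[canonicalize(url)]["next_fetch"] = time.time() + interval_secs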

If you are harvesting millions of feeds, you'll probably want to distribute that work, something that you might want to keep in mind while you're still designing your database.
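One simple way to split the harvest, sketched below, is to hash each feed URL into a shard so that every worker owns a stable, disjoint subset; this illustrates the idea and is not a claim about how FriendFeed actually partitions its crawl:

    import hashlib

    NUM_WORKERS = 8  # assumed fleet size; tune to your hardware

    def shard_for(url, num_workers=NUM_WORKERS):
        # md5 gives a hash that is stable across processes and restarts.
        digest = hashlib.md5(url.encode("utf-8")).hexdigest()
        return int(digest, 16) % num_workers

    def my_feeds(all_urls, worker_id):
        # Each worker calls this with its own id and fetches only its shard.
        return [u for u in all_urls if shard_for(u) == worker_id]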

Again, please update your question with details on the hardware you are using and on how far your solution needs to scale. Nothing scales 'infinitely', so please be realistic :)

Tim Post
A: 

Your limiting factor will be the network access to these 10,000 feeds. You could process the feeds serially and likely do 10,000 in an hour, but that means averaging about 360 ms per feed (3,600 seconds / 10,000 feeds).

Of course you'd want to have more than one process doing the work simultaneously to speed things up.
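As a sketch of that, here is one way to overlap the network latency with a thread pool; the pool size is a guess to tune against your bandwidth, and feedparser is assumed as the parsing library:

    from concurrent.futures import ThreadPoolExecutor, as_completed

    import feedparser  # third-party: pip install feedparser

    def fetch(url):
        # feedparser fetches and parses in one call; the network wait
        # releases the GIL, so threads genuinely overlap here.
        return url, feedparser.parse(url)

    def fetch_all(urls, workers=50):
        results = {}
        with ThreadPoolExecutor(max_workers=workers) as pool:
            futures = [pool.submit(fetch, u) for u in urls]
            for future in as_completed(futures):
                url, parsed = future.result()
                results[url] = parsed
        return results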

Steven Huwig
A: 

Whatever solution you select, if you meet with success (which I hope you do), you will have performance issues.

As the founder of FriendFeed has said many times, the only way to select the best solution is to profile and measure. With real numbers, the choice becomes obvious.

So: build a test architecture close to the (realistic) situation you expect in a few months, then profile and measure.
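A minimal measurement harness in that spirit, timing one polling pass at a realistic feed count before committing to an architecture (fetch_all here is any function that processes a list of feed URLs, such as the thread-pool sketch in an earlier answer):

    import time

    def timed_pass(urls, fetch_all):
        start = time.perf_counter()
        fetch_all(urls)
        elapsed = time.perf_counter() - start
        per_feed_ms = elapsed / max(len(urls), 1) * 1000
        print(f"{len(urls)} feeds in {elapsed:.1f}s ({per_feed_ms:.0f} ms/feed)")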

Toto