views: 54
answers: 2

Hey,

I am developing a website that has some sort of realtime update. The page is generated with a JavaScript variable holding the current ID of the dataset. Then, at an interval of a few seconds, an AJAX call is made passing that ID, and if there's something new the server returns it along with the latest ID, which is then updated in the JavaScript. Very simple, but here comes the problem.

If the user opens the same page multiple times, every window makes these AJAX requests, which produces heavy server load.

Now I thought about the following approach:

The page is loaded with a JavaScript variable holding the current timestamp and the ID of the current dataset. My desired refresh interval is, for example, 3 seconds.

In the page, an interval counter counts up every second, and every time the timestamp reaches a state where (timestamp % 3 === 0) returns true, the content is updated. The URL looks like http://www.example.com/refresh.php?my-revision=123&timestamp=123456

Now this should ensure that every browser window calls the same URL. Then I can turn on browser-level caching.
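
To make that concrete, here is a rough sketch of what I mean (the revision variable, the 3-second interval and refresh.php are from above; the response format and the rest of the details are just assumptions):

    // Sketch of the synchronized-polling idea: every window fires on the same
    // timestamps and builds the identical URL, so after the first request the
    // browser cache can answer the rest.
    var currentRevision = 123;    // injected by the server when the page is rendered
    var REFRESH_INTERVAL = 3;     // seconds

    setInterval(function () {
        var now = Math.floor(Date.now() / 1000);
        if (now % REFRESH_INTERVAL !== 0) {
            return; // only fire on timestamps every window agrees on
        }
        var url = 'http://www.example.com/refresh.php?my-revision=' + currentRevision +
                  '&timestamp=' + now;

        var xhr = new XMLHttpRequest();
        xhr.open('GET', url, true);
        xhr.onload = function () {
            var data = JSON.parse(xhr.responseText); // response format is an assumption
            if (data.revision > currentRevision) {
                currentRevision = data.revision;
                // ...update the page content here...
            }
        };
        xhr.send();
    }, 1000);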

But I don't really like this solution. I would prefer adding another layer of data sharing via a cookie. This shouldn't be much of a problem: I can just store every response in a cookie named after the timestamp and data revision, with a TTL of 10 seconds or so, and check for its existence first.
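
Roughly, the cookie check could look like this (the cookie naming and the 10-second TTL are the values from above; the helper functions and response format are made up for illustration):

    // Sketch of the cookie-sharing idea: whichever window fetches first stores
    // the response in a short-lived cookie, and the other windows read it from
    // there instead of hitting the server again.
    function readCookie(name) {
        var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
        return match ? decodeURIComponent(match[1]) : null;
    }

    function writeCookie(name, value, ttlSeconds) {
        var expires = new Date(Date.now() + ttlSeconds * 1000).toUTCString();
        document.cookie = name + '=' + encodeURIComponent(value) +
                          '; expires=' + expires + '; path=/';
    }

    function fetchUpdate(revision, timestamp, onData) {
        var cookieName = 'update_' + revision + '_' + timestamp;
        var cached = readCookie(cookieName);
        if (cached !== null) {
            onData(JSON.parse(cached)); // another window already fetched this
            return;
        }
        var xhr = new XMLHttpRequest();
        xhr.open('GET', 'refresh.php?my-revision=' + revision + '&timestamp=' + timestamp, true);
        xhr.onload = function () {
            writeCookie(cookieName, xhr.responseText, 10); // share with the other windows
            onData(JSON.parse(xhr.responseText));
        };
        xhr.send();
    }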

BUT

The pages will do the request at the same time. So the whole logic of browser caching and cookies might not work, because the requests occur simultaneously and not one after another.

So I thought about limiting concurrent connections to 1 on the server side. But then I would need at least an extra vhost, because I really don't want to do that for the whole page, and that gets me into problems with cross-site policies!

Of course there are some super complicated load-balancing / server-side solutions bound to request URI and IP address or something, but that's all extreme overkill!

It must be a common problem! Just think of Facebook chat. I really don't think they do all the requests in every window you have open...

Any ideas? I'm really stuck with this one!

Maybe I can do some inter-window JavaScript communication? That shouldn't be a problem if it's all on the same domain?
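
For example, in browsers that support localStorage and its "storage" event, one window could do the request and broadcast the result to the other windows on the same domain (the key name and the onUpdate callback are just placeholders):

    // Illustrative only: the "storage" event fires in every other window on the
    // same domain when one window writes to localStorage, so a single polling
    // window can share each result with the rest.
    window.addEventListener('storage', function (e) {
        if (e.key === 'latest-update' && e.newValue) {
            onUpdate(JSON.parse(e.newValue)); // runs in the windows that did NOT poll
        }
    });

    function publishUpdate(data) {
        localStorage.setItem('latest-update', JSON.stringify(data)); // notifies the other windows
        onUpdate(data); // the polling window handles it directly
    }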

One thing I can do, of course, is server-side caching, which at least avoids DB connections and intensive calculations... but it is still a request, which I would like to avoid.

+2  A: 

You might want to check out Comet and Orbited. This is best solved with server push technology.

dekomote
Thanks, I'll check it out right away!
Joe Hopfgartner
Thanks, this is a good article about the techniques for real-time and incremental updating! But I'm afraid it doesn't cover the specific problem I have... but maybe it points me in different directions where I might discover a solution! (+1 for the good resources!)
Joe Hopfgartner
By using STOMP (or XMPP) you can define different channels for different dataset IDs. Orbited supports both protocols. Then you will have a separate channel and real-time updates for each ID of the dataset. I proposed Comet as it reduces load on the front end a lot. If you integrate a good backend (signals/triggers on insert/update of the dataset), you will optimize the backend a lot as well.
dekomote
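
For illustration, the general shape of the server-push approach dekomote describes, sketched here with the browser's EventSource API rather than Orbited/STOMP specifically (the /updates endpoint, the JSON payload and the currentDatasetId/currentRevision variables are assumptions):

    // Server-push sketch: each page subscribes once to a channel for its dataset
    // ID and simply waits. No polling; each open window holds one idle connection
    // instead of repeatedly hitting refresh.php.
    var source = new EventSource('/updates?dataset=' + currentDatasetId);

    source.onmessage = function (e) {
        var data = JSON.parse(e.data);
        if (data.revision > currentRevision) {
            currentRevision = data.revision;
            // ...update the page content here...
        }
    };
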
+1  A: 
T.J. Crowder
Thanks! You used the right words, "guarantee of atomicity in writing cookies", which was exactly my concern! Anyway, in addition to your solution I will just add a random delay of 2-3 seconds before the request is made in each window, to minimize the chance of collisions, and if some happen... it's not a big problem!
Joe Hopfgartner
@Joe Hopfgartner: Note that I said I *don't* know that there's a guarantee. :-) So be careful. Have fun!
T.J. Crowder
Yes, and I assume there most likely isn't and it's up to the browser anyway. Therefore the random time shift :)
Joe Hopfgartner