views: 257
answers: 8

looking for some general advice and/or thoughts...

i'm creating what i think is more of a web application than a web page, because i intend it to be like the gmail app, where you would leave the page open all day long while getting updates "pushed" to the page (for the interested, i'm using the comet programming technique). i've never created a web page before that was so rich in ajax and javascript (i am now a huge fan of jquery). because of this, time and time again when i'm implementing a new feature that requires a dynamic change in the UI that the server needs to know about, i am faced with the same question:

1) should i do all the processing on the client in javascript and post back as little as possible via ajax, or

2) should i post a request to the server via ajax, have the server do all the processing, and then send back the new html? then on the ajax response i do a simple assignment with the new HTML.
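to make the two options concrete, here's a hypothetical sketch (the endpoint names, data shape, and `renderUpdate` helper are all made up for illustration). option 1 sends compact JSON and builds markup client-side; option 2 just assigns server-rendered HTML:

```javascript
// Option 1: server sends compact JSON; the client builds the markup.
// (renderUpdate is a hypothetical helper; the data shape is invented.)
function renderUpdate(update) {
  return '<li id="msg-' + update.id + '">' +
         '<b>' + update.from + '</b>: ' + update.subject +
         '</li>';
}

// With jQuery this would be used roughly like:
//   $.getJSON('/updates', function (updates) {
//     $.each(updates, function (i, u) {
//       $('#inbox').append(renderUpdate(u));
//     });
//   });

// Option 2: the server does the rendering; the client just assigns it.
//   $.get('/updates.html', function (html) {
//     $('#inbox').html(html);
//   });
```

option 1 keeps the wire format to a few hundred bytes per update; option 2 keeps the javascript trivial at the cost of larger responses.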

i have been inclined to always follow #1. i imagine this web app may get pretty chatty with all the ajax requests. my thought is to minimize as much as possible the size of the requests and responses, and rely on the continuously improving javascript engines to do as much of the processing and UI updates as possible. i've discovered with jquery i can do so much on the client side that i wouldn't have been able to do very easily before. my javascript code is actually much bigger and more complex than my server-side code. there are also simple calculations i need to perform, and i've pushed those to the client side, too.

i guess the main question i have is: should we ALWAYS strive for client side processing over server side processing whenever possible? i've always felt the less the server has to handle, the better for scalability/performance. let the power of the client's processor do all the hard work (if possible).

thoughts?

A: 

Of course it depends on the data, but a majority of the time, if you can push it client side, do. Make the client do more of the processing and use less bandwidth. (Again, this depends on the data; you can get into cases where you have to send more data across to do it client side.)

Myles
in my case the data sent across will never be larger because of doing it on the client side. in most cases all i need to do is send a few hundred bytes at most, and based on that data the client can rebuild the necessary UI piece. of course this complicates the javascript pretty significantly (jquery makes it reasonable), but if i did it on the server side the ajax response could be thousands of bytes..
mdz
The site will be more responsive and scale better the more you can push to the client side. Client side code does present its own problems though, with things like unit testing, security, etc., but it's definitely the way to go if possible.
Myles
+1  A: 

I agree with you. Push as much as possible to users, but not too much. If your app slows or, even worse, crashes their browser, you lose.

My advice is to actually test how your application behaves when left on all day. Check that there are no memory leaks. Check that there isn't an ajax request being created every half second after working with the application for a while (timers in JS can be a pain sometimes).

Apart from that, never perform user input validation with javascript alone. Always duplicate it on the server.
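a minimal sketch of what "duplicate it on the server" means in practice: the same rule expressed twice, once for fast client-side feedback and once as the real check (the field name and validation rule here are invented for illustration):

```javascript
// Client side: fast feedback only. This check is trivially bypassed
// (disable JS, forge the request), so it's a UX nicety, not security.
function isValidUsername(name) {
  // Hypothetical rule: 3-20 chars, letters/digits/underscore only.
  return /^[A-Za-z0-9_]{3,20}$/.test(name);
}

// The server must re-run the exact same rule on every request,
// because anything arriving from the client can be forged.
```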

Edit

Use jquery live binding. It will save you a lot of time when rebinding generated content and will make your architecture clearer. Sadly, when I was developing with jQuery it wasn't available yet; we used other tools with the same effect.
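The win with live binding is that the handler survives content regeneration, because it is keyed to a selector checked at event time rather than bound to the elements themselves. A DOM-free sketch of that idea (the selector and handler below are invented):

```javascript
// Handlers are stored by selector, not attached to elements, so
// replacing the elements (e.g. via $('#list').html(...)) loses nothing.
var liveHandlers = {};
function live(selector, handler) {
  liveHandlers[selector] = handler;
}
// At event time, look the handler up by the selector the target matches.
function dispatch(selector, event) {
  var h = liveHandlers[selector];
  return h ? h(event) : undefined;
}

// With real jQuery (1.3+) the equivalent is simply:
//   $('a.delete').live('click', function () { /* ... */ });
```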

In the past I also had a problem where generating one part of a page via ajax depended on another part being generated first. Generating the first part, then the second, will make your page slower than expected. Plan for this up front: develop pages so that they already have all their content when opened.

Also (this applies to simple pages too), keep the number of files referenced from one server low. Join javascript and css libraries into one file on the server side. Keep images on a separate host, or better, several separate hosts (even just a third-level domain will do). This is only worth doing in production, though; it makes the development process more difficult.

Sergej Andrejev
the great thing is there are NO JS timers being used. no polling at all. it's interesting to see how fast chrome's JS engine is compared to other browsers.... things seem to happen instantly on chrome, but you can actually watch things happen sometimes in firefox (which actually isn't a bad feature)
mdz
A: 

Some stuff like security checks should always be done on the server. If you have a computation that takes a lot of data and produces less data, also put it on the server.

Incidentally, did you know you could run Javascript on the server side, rendering templates and hitting databases? Check out the CommonJS ecosystem.

Tobu
A: 

There could also be cross-browser support issues. If you're using a cross-browser, client-side library (e.g. jQuery) and it can handle all the processing you need, then you can let the library take care of it. Generating cross-browser HTML server-side can be harder (it tends to be more manual), depending on the complexity of the markup.

AUSteve
+1  A: 

There are several considerations when deciding if new HTML fragments created by an ajax request should be constructed on the server or client side. Some things to consider:

  • Performance. The work your server has to do is what you should be concerned with. By doing more of the processing on the client side, you reduce the amount of work the server does, and speed things up. If the server can send a small bit of JSON instead of a giant HTML fragment, for example, it'd be much more efficient to let the client do it. In situations where it's a small amount of data being sent either way, the difference is probably negligible.

  • Readability. The disadvantage to generating markup in your JavaScript is that it's much harder to read and maintain the code. Embedding HTML in quoted strings is nasty to look at in a text editor with syntax coloring set to JavaScript and makes for more difficult editing.

  • Separation of data, presentation, and behavior. Along the lines of readability, having HTML fragments in your JavaScript doesn't make much sense for code organization. HTML templates should handle the markup and JavaScript should be left alone to handle the behavior of your application. The contents of an HTML fragment being inserted into a page are not relevant to your JavaScript code; all that matters is that it's being inserted, where, and when.

I tend to lean more toward returning HTML fragments from the server when dealing with ajax responses, for the readability and code organization reasons I mention above. Of course, it all depends on how your application works, how processing intensive the ajax responses are, and how much traffic the app is getting. If the server is having to do significant work in generating these responses and is causing a bottleneck, then it may be more important to push the work to the client and forego other considerations.
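A common middle ground between the two approaches is a tiny client-side template helper, so the markup stays in one readable string instead of being scattered across concatenations. This is a toy sketch, not a jQuery feature (note that a real template helper would also HTML-escape the substituted values):

```javascript
// Toy template helper: substitutes {name}-style placeholders from a
// data object. Unknown placeholders are left untouched.
function template(tpl, data) {
  return tpl.replace(/\{(\w+)\}/g, function (match, key) {
    return key in data ? String(data[key]) : match;
  });
}

// Markup lives in one readable string, away from the behavior code:
var rowTpl = '<tr><td>{name}</td><td>{email}</td></tr>';
// template(rowTpl, {name: 'Ann', email: 'a@x.com'})
//   -> '<tr><td>Ann</td><td>a@x.com</td></tr>'
```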

Jimmy Cuadra
good points on readability and separation. i started to get concerned because my javascript is looking pretty ugly with html fragments and escaped quotes all over the place.. started to pull out as much as i can and declare the HTML fragments in another constants js file...
mdz
You can maintain readability of your code by using XSLT to transform XML into HTML on the client side. Since the XSL documents get cached, it really adds little overhead. The catch, of course, is that you can only transform XML using an XSL document. All major browsers (including IE6) support XSLT on the client side (though not necessarily via javascript for ajax responses). Obviously there are ways around this, but beware the added compatibility testing.
Kevin Peno
+1  A: 

I'm currently working on a pretty computationally-heavy application right now and I'm rendering almost all of it on the client-side. I don't know exactly what your application is going to be doing (more details would be great), but I'd say your application could probably do the same. Just make sure all of your security- and database-related code lies on the server-side, because not doing so will open security holes in your application. Here are some general guidelines that I follow:

  • Don't ever rely on the user having a super-fast browser or computer. Some people are using Internet Explorer 7 on old machines, and if it's too slow for them, you're going to lose a lot of potential customers. Test on as many different browsers and machines as possible.
  • Any time you have some code that could potentially slow down or freeze the browser momentarily, show a feedback mechanism (in most cases a simple "Loading" message will do) to tell the user that something is indeed going on, and the browser didn't just randomly freeze.
  • Try to load as much as you can during initialization and cache everything. In my application, I'm doing something similar to Gmail: show a loading bar, load up everything that the application will ever need, and then give the user a smooth experience from there on out. Yes, they're going to have to potentially wait a couple seconds for it to load, but after that there should be no problems.
  • Minimize DOM manipulation. Raw number-crunching JavaScript performance might be "fast enough", but access to the DOM is still slow. Avoid creating and destroying elements; instead simply hide them if you don't need them at the moment.
musicfreak
thanks for the tips. can you expand on the "cache everything" comment? i don't have experience using caching on the client side. are you referring to possible plugins or just plain global variables?
mdz
What I'm saying is don't throw anything away. If you do a bunch of computation, don't throw away the results afterwards; keep them in memory for next time. I'm not talking about using plugins, just plain variables or whatever, although you could use the client-side storage available in Gears and most browsers.
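the "keep results in memory for next time" idea is essentially memoization. a plain-JS sketch of the pattern (the cache-key scheme here is simplistic and assumed; it only suits JSON-serializable arguments):

```javascript
// Wrap an expensive function so repeat calls with the same arguments
// return the stored result instead of recomputing.
function memoize(fn) {
  var cache = {};
  return function () {
    // Naive key: the serialized argument list (assumption: args are
    // JSON-serializable; a real app might need a smarter key).
    var key = JSON.stringify([].slice.call(arguments));
    if (!(key in cache)) {
      cache[key] = fn.apply(this, arguments);
    }
    return cache[key];
  };
}

var calls = 0;
var slowSquare = memoize(function (n) { calls += 1; return n * n; });
slowSquare(12); // computed
slowSquare(12); // served from cache; calls is still 1
```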
musicfreak
I wouldn't use Gears. Even though it had some hype among developers, it's not that popular. If you really need to store something between requests, consider using cookies or Flash storage because of their higher availability.
Sergej Andrejev
@Sergej: It doesn't hurt to check for it, does it? Just check if the user has it installed, and if not then oh well. I'm not saying you should rely on it, simply use it if available.
musicfreak
@Sergej: I also wouldn't downplay Gears. While the plugin itself is not gaining much traction, the same functionality (sandboxed multi-process javascript, local caching of files via manifest, and local SQL storage) is being proposed in the HTML 5 spec (the iPhone has even implemented it, and others I cannot recall). Thus, if you implemented it now via the plugin, you could switch over later.
Kevin Peno
+1  A: 

I recently ran into the same problem and decided to go with browser-side processing. Everything worked great in FF and IE8 (and IE8 in IE7 compatibility mode), but then our client, using actual Internet Explorer 7, ran into problems: the application would freeze up and a script timeout box would appear. I had put too much work into the solution to throw it away, so I ended up spending an hour or so optimizing the script and adding setTimeout wherever possible.

My suggestions?

  • If possible, keep non-critical calculations client side.
  • To keep data transfers low, use JSON and let the client side sort out the HTML.
  • Test your script using the lowest common denominator.
  • If needed use the profiling feature in FireBug. Corollary: use the uncompressed (development) version of jQuery.
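The setTimeout fix mentioned above usually amounts to this pattern: split a long loop into chunks and yield control between them, so the browser repaints and IE7's script-timeout dialog never fires. A sketch with the scheduler injectable for testability (function and parameter names are invented):

```javascript
// Process items in small chunks, yielding between chunks so the UI
// stays responsive during long computations.
function processInChunks(items, worker, chunkSize, schedule) {
  // Default scheduler: a zero-delay timeout, which yields to the browser.
  schedule = schedule || function (fn) { setTimeout(fn, 0); };
  var i = 0;
  function step() {
    var end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) {
      worker(items[i]);
    }
    if (i < items.length) {
      schedule(step); // yield, then continue with the next chunk
    }
  }
  step();
}

// In the browser you'd call something like:
//   processInChunks(rows, renderRow, 50);
```

Injecting `schedule` also lets unit tests run the whole thing synchronously by passing `function (fn) { fn(); }`.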
Kristoffer S Hansen
glad i posted this question. getting a lot of good suggestions, and getting confirmation of my original thoughts. i too am a strong believer in keeping the server as simple and lean as possible, but because of that the complexity of the client side code has increased substantially. however, jquery has mitigated that TREMENDOUSLY. also i like using chrome's task manager. so far my chrome session takes up like 20MB (gmail is 50), and cpu gets to 15 at the heaviest point during my iterating calculations. it shouldn't get too much heavier from here so i think i'm ok.
mdz
dynatrace has an excellent profiler out (for free!) specific to IE. you can see exactly what's taking so long (network, paint, script time), and it'll tell you which line of javascript is hanging up / getting called 100,000 times, etc... I'd recommend you grab a copy; it's helped me a lot in fixing IE performance issues. http://ajax.dynatrace.com/
Nick Craver
A: 

this is possible, but with a heavy initial page load and heavy use of caching. take gmail as an example:

  • On initial page load, it downloads most of the js files it needs to run, and most of them are cached.
  • don't overuse images and graphics.
  • Load all the data needed for the initial view, along with predictable subsequent user data. in gmail & the latest yahoo mail, the inbox is not populated with only a single mail conversation body; it loads the first few full email messages in advance at page load time. the secret of high responsiveness comes at a cost (gmail asks you to load the light version if bandwidth is low; i bet most of us have experienced this).
  • follow the KISS principle, i.e. keep your design simple.
  • And never try to render the whole page using javascript in any case; you cannot assume all your end users have high-spec systems or high-bandwidth connections.

It's smart to split the workload between your server and client.
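the "load predictable data in advance" point can be sketched as a small prefetch cache. everything here is hypothetical (the fetch function, message ids, and class name are invented); in a real app `fetchFn` would be an ajax call:

```javascript
// Prefetch the first few full messages at page load, so opening them
// later is instant; fall back to fetching on demand for the rest.
function MessageCache(fetchFn) {
  this.fetch = fetchFn;   // e.g. a synchronous-looking ajax wrapper
  this.store = {};
}
MessageCache.prototype.prefetch = function (ids) {
  for (var i = 0; i < ids.length; i++) {
    this.store[ids[i]] = this.fetch(ids[i]);
  }
};
MessageCache.prototype.get = function (id) {
  if (!(id in this.store)) {
    this.store[id] = this.fetch(id); // cache miss: fetch on demand
  }
  return this.store[id];
};
```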

Cheers


Ramesh Vel