views:

59

answers:

2

Possible Duplicate:
Which is the future of web development: HTML5 or Silverlight (or other RIA framework)?

What do you think about the distinction between server-side web languages like PHP, Ruby, Python, ... and client-side language(s) in the web browser, like JavaScript? I mean, with the evolution of web applications, I think that a lot of code will move from the server side to the client side. Will other languages be supported on the client side (web browser)?

A: 

Whilst it may seem the 'evolution' of the web application moves more code to the client side, it only does so to make the client side a more interesting and usable experience. It actually does not move code from the server side at all, nor does it replace the server-side code one bit. JavaScript, Silverlight and Flash are all about GUI/view logic and offering another pipeline to the server (Ajax).

Ajax offered new possibilities, sure, but it's nothing more than a variation of the simple 'can I', 'may I' request scheme we already had in place. No way in hell will user validation/authentication be moved to the client side, no way in hell should client-side code be able to access databases indiscriminately, and so on. This is all server-side logic no sane person would want on the client side. Any programmer that is doing that should be fired immediately and kept away from user data not belonging to him/her, perhaps even arrested for being a public menace.
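To make the 'can I', 'may I' point concrete, here is a minimal sketch (the form and field names are made up): the client-side check is only a usability shortcut, and the server still has to repeat the real validation when the request arrives.

    // Hypothetical example: client-side validation is only a convenience.
    // Anyone can bypass it, so the server must re-validate the request.
    function submitComment(form) {
        if (form.comment.value.length === 0) {
            alert('Please write something first.'); // purely cosmetic check
            return false;                           // skip the round trip
        }
        return true; // the POST still goes to the server, which decides
    }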

Other languages being supported on the client side will never happen through means other than a plugin-based one. Imagine the havoc one could wreak with C++ or Java in the browser, or at least try to fathom how many years of security patching that would take. PHP might be a possibility if you throw out all the I/O and other potential security holes, but it will fail and fail hard due to legacy (start with IE6 and ponder from there on).

The best chance for other languages to be supported client side (browser) is to compile to JavaScript, but then... it's not really support for another language now, is it...

BGerrissen
A: 

Well, I cannot call myself the most qualified to answer this question, but here is what I think from my work experience so far:

It is true that with modern computers, interpreted languages are fast enough to be used for developing applications (Python, etc.). JavaScript engines have greatly improved in the last decade, to the point that talking about 200k+ of (uncompressed) JavaScript is not a ludicrous thing anymore (e.g. GWT applications can even grow to the order of megabytes). I remember installing very advanced servers about 6 years ago; they had two dual-core processors with virtualization capabilities, a capacity of 16 GB of RAM and 8 SATA slots where we put 2 hard drives separately plus 6 others on RAID. Well, nowadays even laptops have some of that technology, and the price of those machines is almost four to five times cheaper than what it was back then.

However, thinking that because of this there will be fewer and fewer server-side scripts is not understanding where application development comes from and where it is going now. In fact, the model applications follow more and more is a 2-tier (or n-tier) model, and this solves (among many other things) the dependency and packaging issues that desktop (client-only) applications have. I'm not talking about the average user here, but rather how enterprises deploy and maintain their environments across their hundreds (even many thousands) of workstations.

On average, there will always be as much code on the client as there is on the server. The difference is what the code actually does. Server-side code usually runs in a stateless (or semi-stateless) mode, whereas the client code is more dynamic. On a general level, the server handles data, while the client renders the UI and handles various user inputs. (On virtual clients, the server handles just about everything, but we're not talking about that.)
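As a rough sketch of that split (the endpoint and element names are assumptions, not any particular framework): the server only serves data, and the client turns it into UI and reacts to input.

    // Hypothetical example: the server returns plain data (JSON),
    // the client renders it and handles user input.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/products', true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            var products = JSON.parse(xhr.responseText); // data from the server
            var list = document.getElementById('product-list');
            for (var i = 0; i < products.length; i++) {
                var item = document.createElement('li');
                item.appendChild(document.createTextNode(products[i].name));
                list.appendChild(item);                  // UI built on the client
            }
        }
    };
    xhr.send(null);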

At the moment, the heaviest calculations are performed by servers with lower-level languages. But as JavaScript engines and browsers become more and more efficient, some calculations will slowly move to the client side and lighten the load on servers (making them also more efficient). It's all a question of balancing workload, because there will always be a benefit in having the server run some calculations: for sharing search results, highly complex graphical models, or any cached data, etc.
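A small sketch of that balancing act (the data shape is an assumption): once the server has sent a result set, re-sorting or filtering it is a calculation the client can do on its own, saving the server a round trip.

    // Hypothetical example: the result set was fetched once; further
    // sorting and filtering happens on the client, not on the server.
    function sortByPrice(products) {
        return products.slice().sort(function (a, b) {
            return a.price - b.price;
        });
    }

    function cheaperThan(products, limit) {
        return products.filter(function (p) {
            return p.price < limit;
        });
    }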

If laptops now have what servers had about 6 years ago, today's server machines are way beyond what the average consumer's PC is. The fact is that a client machine will never (not in our lifetime, anyway) be able to compete with what a cluster of servers can do, and this is very powerful in terms of applications.

What I personally think is that the layers between interpreted languages and the CPU will get narrower, perhaps even to the point that languages like JavaScript will be executed directly without an engine... Well, maybe not, but at least human-readable code could also be readable natively by the computer itself. If this happens, the word "compiling" will become archaic.

Also, not everything will always be online, but installing an application would be a matter of caching the web application in an application cache on the client so it can be used even offline, then synchronizing the data later once a connection can be established (or upon user request). Upgrades and updates will simply be a matter of refreshing the application cache.
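A minimal sketch of that offline-then-sync idea, assuming localStorage for the pending queue and a made-up /sync endpoint (the real mechanism, e.g. the HTML5 application cache, involves more than this):

    // Hypothetical example: queue changes while offline, flush them
    // to the server once the browser reports a connection again.
    function saveChange(change) {
        var queue = JSON.parse(localStorage.getItem('pending') || '[]');
        queue.push(change);
        localStorage.setItem('pending', JSON.stringify(queue));
        if (navigator.onLine) { flushQueue(); }
    }

    function flushQueue() {
        var queue = JSON.parse(localStorage.getItem('pending') || '[]');
        if (queue.length === 0) { return; }
        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/sync', true);
        xhr.setRequestHeader('Content-Type', 'application/json');
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                localStorage.setItem('pending', '[]'); // server confirmed the sync
            }
        };
        xhr.send(JSON.stringify(queue));
    }

    window.addEventListener('online', flushQueue, false);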

For example, download your favorite office suite, edit your document offline, then once a connection is made through an access point, sync/save the document online (and perform application updates in the background). No installation required. All the online parts would be handled by PHP, Python, J2EE, etc., and backed by some SQL database.

My two cents.

Yanick Rochon