Well, I can't claim to be the best qualified to answer this question, but here is what I think based on my work experience so far:
It is true that with modern computers, interpreted languages are fast enough to be used for developing applications (Python, etc.). JavaScript engines have greatly improved in the last decade, to the point that talking about 200k+ of (uncompressed) JavaScript is not a ludicrous thing anymore (GWT applications can even grow into the megabytes). I remember installing very advanced servers about 6 years ago: they had two dual-core processors with virtualization capabilities, 16 GB of RAM, and 8 SATA slots, where we put 2 drives as separate disks plus 6 others in a RAID array. Well, nowadays even laptops have some of that technology, and such machines cost four to five times less than they did back then.
However, thinking that because of this there will be fewer and fewer server-side scripts misunderstands where application development comes from and where it is going. In fact, the model applications increasingly follow is a 2-tier (or n-tier) model, and this solves (among many other things) the dependency and packaging issues that desktop (client-only) applications have. I'm not talking about the average user here, but rather about how enterprises deploy and maintain their environments across hundreds (even many thousands) of workstations.
On average, there will always be about as much code on the client as there is on the server. The difference is what that code actually does. Server-side code usually runs in a stateless (or semi-stateless) mode, whereas the client code is more dynamic. Generally speaking, the server handles data, while the client renders the UI and handles the various user inputs. (On virtual clients, the server handles just about everything, but we're not talking about that.)
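To make that split concrete, here is a minimal sketch (in TypeScript, purely for illustration; the endpoint, data, and element ID are all hypothetical): a stateless server handler that only hands out data, and client code that turns that data into UI.

```typescript
// server.ts (Node) -- a stateless handler: every request carries everything
// it needs, the handler just looks data up and returns it as JSON.
import { createServer } from "http";

const products = [{ id: 1, name: "Widget", price: 9.99 }]; // stand-in for a real database

createServer((req, res) => {
  if (req.url === "/api/products") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify(products));
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(3000);

// client.ts (browser) -- fetch the data and build the UI locally,
// without another trip to the server for presentation logic.
async function renderProducts(): Promise<void> {
  const items: { id: number; name: string; price: number }[] =
    await (await fetch("/api/products")).json();
  const list = document.querySelector("#product-list")!;
  list.innerHTML = items.map(p => `<li>${p.name}: $${p.price}</li>`).join("");
}
void renderProducts();
```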
At the moment, the heaviest calculations are performed by servers with lower-level languages. But as JavaScript engines and browsers become more and more efficient, some calculations will slowly move to the client side and lighten the load on servers (making them more efficient too). It's all a question of balancing the workload, because there will always be a benefit to having the server run some calculations: sharing search results, highly complex graphical models, any cached data, and so on.
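As a small illustration of pushing a calculation to the client, here is a hedged sketch (TypeScript; the file names are made up and worker.ts assumes the WebWorker typings) that runs a CPU-heavy task in a Web Worker, so the browser does the work and the server is never involved:

```typescript
// worker.ts -- runs off the main thread, entirely on the client.
self.onmessage = (e: MessageEvent<number>) => {
  // Naive prime count up to a limit: a stand-in for any CPU-heavy task
  // that might otherwise have been a server round trip.
  let count = 0;
  for (let n = 2; n <= e.data; n++) {
    let isPrime = true;
    for (let d = 2; d * d <= n; d++) {
      if (n % d === 0) { isPrime = false; break; }
    }
    if (isPrime) count++;
  }
  postMessage(count);
};

// main.ts -- hand the work to the browser instead of the server;
// the UI thread stays responsive while the worker grinds away.
const worker = new Worker("worker.js");
worker.onmessage = (e: MessageEvent<number>) =>
  console.log(`primes found: ${e.data}`);
worker.postMessage(5_000_000);
```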
If laptops now have what servers had about 6 years ago, today's server machines are way beyond what the average consumer's PC is. The fact is that a client machine will never (not in our lifetime anyway) be able to compete with what a cluster of servers can do, and that is very powerful in terms of what applications can offer.
What I personally think is that the layers between interpreted languages and the CPU will get thinner, perhaps even to the point that languages like JavaScript are executed directly without an engine... Well, maybe not, but at least human-readable code may also become readable by the computer itself natively. If that happens, the word "compiling" will become archaic.
Also, not everything will always be online, but installing an application would simply be a matter of caching the web application in an application cache on the client so it can be used even offline, then synchronizing the data later, once a connection can be established (or upon user request). Upgrades and updates would simply mean refreshing that application cache.
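One way this can look in practice is a service worker using the browser's Cache API (a minimal sketch; the cache name and asset list are made up):

```typescript
// sw.ts -- a service worker that caches the application shell on install,
// so the app keeps working with no connection at all.
const APP_CACHE = "my-app-v1";
const APP_SHELL = ["/", "/index.html", "/app.js", "/styles.css"]; // made-up asset list

self.addEventListener("install", (event: any) => {
  event.waitUntil(caches.open(APP_CACHE).then(cache => cache.addAll(APP_SHELL)));
});

// Serve from the cache first, and fall back to the network when online.
self.addEventListener("fetch", (event: any) => {
  event.respondWith(
    caches.match(event.request).then(cached => cached ?? fetch(event.request))
  );
});
```

Updating the application then boils down to bumping APP_CACHE (or refreshing its contents) so the new version of the shell replaces the old one.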
For example, download your favorite office suite, edit your document offline, then once a connection is made through an access point, sync/save the document online (and perform application updates in the background). No installation required. The online part would be handled by PHP, Python, J2EE, etc., backed by some SQL database.
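And a rough sketch of the offline-edit-then-sync part on the client side (the /api/documents endpoint and the "draft:" storage keys are hypothetical; whatever runs behind that endpoint, PHP, Python, J2EE, would write the document to the SQL database):

```typescript
// Save edits locally while offline, then push them to the server once the
// browser reports that the connection is back (or when the user asks).
function saveDraft(docId: string, content: string): void {
  localStorage.setItem(`draft:${docId}`, content);
}

async function syncDrafts(): Promise<void> {
  const keys = Object.keys(localStorage).filter(k => k.startsWith("draft:"));
  for (const key of keys) {
    const body = localStorage.getItem(key);
    if (body === null) continue;
    // Hypothetical endpoint; the server side stores the document.
    const res = await fetch(`/api/documents/${key.slice("draft:".length)}`, {
      method: "PUT",
      body,
    });
    if (res.ok) localStorage.removeItem(key); // keep the draft until the server has it
  }
}

window.addEventListener("online", () => { void syncDrafts(); });
```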
My two cents.