Just wondering what your comments are regarding the current trend of everything moving to the web, or even the cloud. The significance of the OS and desktop applications is getting less attention than web applications. So, to those folks out there who still develop Windows applications, such as WPF: why still do it? Why not move to web programming instead? Silverlight, for example...
Because there are still levels of security and interactivity that you can deliver with a desktop application but not with a web application.
There isn't a hard divide between desktop applications and web development/the cloud. Whether the UI layer is a website or an application, the cloud can still back your application. I think thin clients are on the rise, and RIA technologies like Silverlight are just one facilitator of that.
That said, some tasks just make more sense to do locally. If I had to connect to the cloud to build my console Java applications, I would be rather upset.
One reason is performance. Web applications will always run in a sandbox and won't have full access to native resources, so they can't be as fast as desktop applications. This is very important for applications like IDEs, CAD systems and games.
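To make the native-resources point concrete, here's a minimal C# sketch (purely illustrative; the specific Win32 call is just an example) of the kind of direct native interop a desktop app can do freely but a browser sandbox normally forbids:

    using System;
    using System.Runtime.InteropServices;

    class NativeAccessDemo
    {
        // Direct P/Invoke into the Win32 API - something a sandboxed
        // browser or RIA app is normally not allowed to do.
        [DllImport("kernel32.dll")]
        static extern ulong GetTickCount64();

        static void Main()
        {
            // Milliseconds since boot, straight from the operating system.
            Console.WriteLine("Uptime: {0} ms", GetTickCount64());
        }
    }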
For now:
- Desktop software is faster (you don't have to go over a network to read, write, or process data)
- Hardware access is easier, and it is preferable to do it locally for security reasons (see the sketch after this list)
- Not needing an internet connection can be good (for example, a laptop that has to connect to the internet just to play an mp3 or view your own photos is neither fast enough nor private enough)
- Perhaps one day the cloud will be preferable to, and faster than, local storage, but whenever we're looking for something stand-alone, fail-safe, or minimalist, we will need to be able to stay unplugged
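As a rough illustration of the hardware-access point, here's a hypothetical C# sketch (it assumes a Windows machine with a COM1 serial port) of things that are trivial from a desktop app but off-limits to a browser-sandboxed one:

    using System;
    using System.IO;
    using System.IO.Ports;

    class LocalAccessDemo
    {
        static void Main()
        {
            // Enumerate local drives: no network round-trip, no sandbox in the way.
            foreach (DriveInfo drive in DriveInfo.GetDrives())
                Console.WriteLine("{0} ({1})", drive.Name, drive.DriveType);

            // Talk to a serial device directly (assumes a COM1 port exists;
            // purely illustrative).
            using (var port = new SerialPort("COM1", 9600))
            {
                port.Open();
                port.WriteLine("hello, hardware");
            }
        }
    }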
The applications I have running right now:
- Excel
- Wireshark
- WinZip
- a file manager
- several SSH consoles
I don't see what benefit any of those (perhaps bar Excel, but if you've ever tried to make a big pivot table with HTML + AJAX, you're just annoying your users) would have as a web application. People make a lot of applications with similar needs. It'd also be silly to enforce a network/web connection for applications that don't strictly need it.
Silverlight is something in between; I'm not sure I'd consider it a typical "web" application, more a web-deployed application with some sandboxing, though the distinctions get blurry.
A well-designed desktop app will be considerably more usable than a well-designed web app. This is because a web app is limited by what the browser allows, and by the fact that the browser has controls (toolbar buttons, menus, etc.) that aren't related to the application.
For many types of problems this isn't important, and the benefits of a web-based app outweigh the usability concerns. But at least with the state of the browser today, web apps simply can't compete with desktop apps in the areas of usability and responsiveness.
It depends on the environment. A corporate user can be expected to be online all or most of the time, whereas someone at home may not be connected continually. Plus you then have the various security, control, and update problems that you run into with every app. In a business environment, web apps are great because you can update the application once and everyone immediately sees the change. I do this type of work now, but most of the apps are data-driven web apps. For something graphical or processor-intensive, you'd want to work at the desktop level (as a few other people above mentioned).
I am sick and tired of the continuous division of "web app vs. desktop app".
There's no such thing as a "web app" - the HTML and the JavaScript are executed on your desktop by the browser; they just happen to be delivered on demand when you navigate to the URL. But even that might not be true, if they happen to be in the cache.
What people usually consider a "web app" is a browser-hosted app that happens to be manipulating resources hosted on the cloud; and quite often this manipulation happens to involve code running on the server as well.
But there's nothing that prevents a "desktop app" from being just a rich front-end for the user to manipulate cloud-based resources. Just look at all the WPF, AIR and Objective-C Twitter clients out there.
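As a rough sketch of what such a cloud-backed desktop front-end boils down to (in C#; the endpoint URL is a placeholder, not a real API):

    using System;
    using System.Net;

    class CloudBackedDesktopClient
    {
        static void Main()
        {
            // A desktop (e.g. WPF) app pulling its data from a cloud-hosted service.
            // The URL is a placeholder; any REST/JSON endpoint would do.
            using (var client = new WebClient())
            {
                string json = client.DownloadString("http://example.com/api/timeline.json");
                Console.WriteLine(json);
            }
            // From here the app would parse the JSON and bind it to a rich native UI.
        }
    }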
Also, there's nothing that prevents you from delivering a browser-hosted HTML5/JS-based app that manipulates local resources only. Just look at the canonical example of a "web app", Gmail, and its ability to work offline, without network access.
Not to mention that with all the advances in the browsers and the RIA platforms (both Flash and Silverlight) - removing the browser chrome, separating tabs into processes, supporting out-of-browser execution - what is considered a "web app" nowadays is quickly turning into something that would better be called a "light-weight desktop app" or "portable desktop app". :-)
The real division is "rich apps" vs. "reach apps". Developers might invest heavily in a rich application that relies on the abilities of a particular platform to deliver a level of user experience that can't be achieved otherwise. Or they might invest heavily in an application that aims to reach as many users as possible, on as many platforms as possible, and be willing to limit the user experience in certain ways in favor of penetration.
In that context, your question becomes moot. The choice of a presentation layer is completely orthogonal to the choice of where the resources your application manipulates are located.