views: 233 · answers: 4

A line in this answer, "Get used to using the Mac on its own terms", took me aback a little bit, and made me start to think about something I pondered ages ago and still haven't come up with a good answer to.

Each OS has its own guidelines for how to develop an application's interface so it fits with the OS, and consequently the mindset of the user.

While no one really enforces this to any great degree (aside from the usual "made for xyz OS" programs), it bothers me that when developing an application I sometimes have to choose functionality that differs from what the OS suggests. The application itself may be better with a different user interface, or the difference may be forced by the implementation, such as being a webapp or cross-platform.

App vs OS:

  • What are the ramifications if I choose to develop against those UI recommendations assuming I have a good reason to do so?

Webapp vs OS:

  • If I'm developing a webapp that's meant to be used as if it were a desktop app, what do I do:
    • Develop and follow my own convention (or one of the major OS's conventions)
    • Check the user's OS and follow that convention (and thus a user would get a different experience on different computers even if using the same account)
    • Follow the convention of another major webapp (Gmail/Docs, Live, etc.)
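To make the second option concrete, a minimal sketch of OS detection in a webapp might look like this. `pickConvention` is a hypothetical helper name, and the platform strings are illustrative; a real app might also consult `navigator.userAgentData` where available rather than the older `navigator.platform`:

```javascript
// Sketch: map the browser-reported platform string to a UI convention.
// Hypothetical helper -- names and fallback value are assumptions.
function pickConvention(platform) {
  const p = platform.toLowerCase();
  if (p.includes("mac")) return "mac";
  if (p.includes("win")) return "windows";
  if (p.includes("linux")) return "linux";
  return "web-default"; // no match: fall back to your own convention
}

console.log(pickConvention("MacIntel")); // → "mac"
console.log(pickConvention("Win32"));    // → "windows"
```

Note the trade-off the question raises: with this approach the same account gets a different experience on different machines.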

Cross platform vs OS:

  • In developing an app that is cross-platform
    • Follow one OS's convention
    • Use a cross platform library that follows most of the OS conventions depending on the OS it's being run on, though none are perfect
    • Custom interface for each OS that is fully native
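One small, concrete instance of per-OS convention-following is keyboard-shortcut labeling: Mac users expect "⌘S" where Windows and Linux users expect "Ctrl+S". A minimal sketch, with `formatShortcut` as a hypothetical helper:

```javascript
// Sketch: render a shortcut label per platform convention.
// Hypothetical helper -- the "mac"/"windows"/"linux" os tags are assumptions.
function formatShortcut(key, os) {
  // "\u2318" is the Mac command symbol (⌘)
  return os === "mac" ? `\u2318${key}` : `Ctrl+${key}`;
}

console.log(formatShortcut("S", "mac"));     // → "⌘S"
console.log(formatShortcut("S", "windows")); // → "Ctrl+S"
```

Even a cross-platform toolkit that handles widgets for you often leaves details like this to the application.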

I realize it depends greatly on resources available, and various other unknowables, but what considerations, tactics, and arguments do you use when weighing this choice?

+6  A: 

I remember some app that was adamant about keeping the same look and feel across platforms, touting all sorts of benefits when moving between platforms. The problem is that most people don't use one app on different platforms, they use multiple apps on one platform. If one of those apps behaves significantly differently than what they expect based on every other app they use, they aren't going to like it.

Paul Tomblin
+3  A: 

I don't think there's a really good answer here, other than just to do what you can to follow the de facto conventions of the platform you're targeting. If you're developing a webapp with a rich GUI, you should still try to follow general web design principles.

If you have a multi-platform app and you can't have a different front end for each, you can try to fuse together the conventions of the various platforms, but perhaps follow the conventions of the platform where you will have the most users.

You must know the rules before you can break them.

Terrapin
+1  A: 

I think that the general idea is to give the user a sense of familiarity. So it is best to make your app behave like any other app on the platform you target. This is particularly true on Mac OS X, where the OS, Apple apps, and third-party apps have a consistent look and feel (exceptions exist, of course, but they are recognized as such: not the best user experience on the platform).

For webapps, consistency is measured against other webapps, where the rule seems to be creativity (as with games or DVD menus). And usually it works.

For cross-platform apps, I would suggest a common core (e.g., an app menu with File, Edit, View, ... Help), plus platform-specific touches on each platform.

mouviciel
+6  A: 

The general rule is that you can deviate from the style guide of a platform when you have compelling reason to believe that the deviation results in a net improvement in user performance. That is, the gains the deviation provides to the user are greater than the cost.

“Compelling reason” generally means empirical evidence: usability test data confirming the net improvement from the deviation. A designer’s hunch is not good enough. Most OS guidelines are not arbitrary but are associated with user performance advantages over the alternatives, probably including the one you’re thinking of. The usability tests should be formulated to realistically assess both the performance costs and gains so they can be compared quantitatively. Ties go to following the style guide.

Even when a guideline is arbitrary, there is still always a cost associated with external inconsistency for any deviation, a cost that can be difficult to quantify but must be accounted for through the designer’s judgment. External inconsistency costs are primarily the effort to learn the deviation and the errors made when trying to use the deviation in the context of other apps. The effective cost of learning is less if training is provided and/or the frequency of use is high (the latter justifying the learning overhead). The frequency of errors for a deviation is associated with the frequency and context of app use. The effective cost of errors is less if your app is used extensively and separately from other apps of the platform. As a rule of thumb, unless users use your app for hours a day every day, you need to demonstrate overwhelmingly superior user performance to justify a deviation.

Generally, inconsistency in UI behavior is a more serious problem than inconsistency in UI appearance. Also, contradictions (something that looks like something from the style guide but means something different) are more serious than irregularities (something that looks different from something in the style guide but means the same thing).

Conformance with OS style guides is generally more important for an app than conformance with web conventions because web conventions are functionally weaker, applying to fewer apps the user is likely to come across. One should not follow the conventions of a single app like Gmail at the expense of OS guidelines unless most of your users use the single app and the single app is used in conjunction with your app. For maximum usability, you should follow the guidelines of whatever OS your users are using. There is no substitute.

Michael Zuschlag