B, for security reasons.
However, how much you should care about an update breaking things in your app depends very much on the type of development you are doing. The usual case is almost like cross-compiling for a different target, since hardly any user machines are configured the way dev machines are.
If it's a web application, your 'platform' is the browser(s) plus whatever you use server side. For me that is Ruby, Rails, and plugins, plus Apache/MySQL etc. Client side, I have no control. Server side, I want to apply at least security patches (aside: a plea to server-side developers, please release security patches on a different cycle from functionality changes!). But OS updates usually have little effect on the dev server stack I'm using, and in any case my deployment environment uses a different OS (I develop on OS X and Ubuntu and deploy on Debian and Solaris).
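For example, on the Debian/Ubuntu side you can restrict automatic updates to the security archive, so functionality changes never ride along with the patches. A minimal sketch, assuming the unattended-upgrades package is installed (the file path and origin pattern below are that package's defaults; adjust for your release):

    // /etc/apt/apt.conf.d/50unattended-upgrades
    // Only pull packages from the security archive automatically;
    // schedule feature updates separately, by hand.
    Unattended-Upgrade::Allowed-Origins {
        "${distro_id}:${distro_codename}-security";
    };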
If it's a desktop application, then it depends on how tightly integrated you are with your platform and whether you control it in the target environment. As I generally use Java, I see Java as the platform, not the OS, though the app does need testing on different OS flavors and Java versions. If some patch to the OS breaks Java, that's a separate issue.
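On the testing point, it helps to record exactly which platform a test run actually saw, so a failure can be correlated with an OS patch or JRE update after the fact. A minimal sketch (the class name is mine; the system properties are standard Java):

    // PlatformReport.java: log the platform actually under test.
    public class PlatformReport {
        public static void main(String[] args) {
            // Standard system properties, available in every JRE.
            System.out.println("java.version = " + System.getProperty("java.version"));
            System.out.println("java.vendor  = " + System.getProperty("java.vendor"));
            System.out.println("os.name      = " + System.getProperty("os.name"));
            System.out.println("os.version   = " + System.getProperty("os.version"));
            System.out.println("os.arch      = " + System.getProperty("os.arch"));
        }
    }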
If you are very tightly coupled to the OS, then update dependencies are very difficult to test for, and doing so is almost impossible without heavy use of virtual machines, snapshots, etc. (see the sketch below). Also, if your app is that fragile with respect to OS changes, it will likely fail anyway whenever a target machine has a different software load or configuration, which is again very difficult to test for.
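What makes that workload at least tractable is scripting the snapshot cycle so every test run starts from the same clean image. A rough sketch driving VirtualBox's VBoxManage CLI from Java (the VM name "xp-target" and snapshot name "clean-install" are hypothetical, and the restore assumes the VM is powered off first):

    // SnapshotCycle.java: roll a VM back to a pristine snapshot and
    // boot it headless, ready for an OS-update test run.
    public class SnapshotCycle {
        private static void run(String... cmd) throws Exception {
            Process p = new ProcessBuilder(cmd).inheritIO().start();
            if (p.waitFor() != 0)
                throw new RuntimeException("command failed: " + String.join(" ", cmd));
        }

        public static void main(String[] args) throws Exception {
            // Restore the known-clean state (VM must be powered off).
            run("VBoxManage", "snapshot", "xp-target", "restore", "clean-install");
            // Boot headless; apply the OS update and run the tests inside.
            run("VBoxManage", "startvm", "xp-target", "--type", "headless");
        }
    }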