I've written an abstract base class for unit tests that sets up just enough of the environment for our tests to run. The class exposes some of the runtime environment bits as properties whose types vary from test to test (the property types are type arguments specified by the inheriting, concrete test class).
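For concreteness, here's a trimmed-down sketch of the shape I mean (the names TestBase, TService, Service, and so on are made up for this post):

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

public abstract class TestBase<TService, TContext>
    where TService : class, new()
    where TContext : class, new()
{
    // The runtime environment bits exposed to the concrete test class;
    // the concrete class picks the actual types via the type arguments.
    protected TService Service { get; private set; }
    protected TContext Context { get; private set; }

    [TestInitialize]
    public void BaseInitialize()
    {
        Service = new TService();
        Context = new TContext();
    }
}
```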
This is all well and good, except that a co-worker noticed he couldn't view any of the class's properties in the debugger. It turns out that he had no fields defined in his inheriting class, and the CLR optimized something or other away, so the debugger couldn't display the properties. Is it possible to prevent this in the base class somehow, or do I have to resort to telling everyone they need to define at least one field that is used somewhere during the tests?
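A concrete test class along the lines of my co-worker's looks roughly like this (again with made-up names); note that it declares no fields of its own:

```csharp
// Hypothetical stand-ins for the real environment types.
public class WidgetService { }
public class WidgetContext { }

[TestClass]
public class WidgetTests : TestBase<WidgetService, WidgetContext>
{
    // No fields here; only the inherited properties are used.
    [TestMethod]
    public void Service_IsAvailable()
    {
        // Breaking here and inspecting Service in the debugger is
        // where the properties refuse to show up.
        Assert.IsNotNull(Service);
    }
}
```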
Edit:
Sounds like the likely culprit is the optimization/debug settings. That said, I'm building the app from Visual Studio in Debug mode, I've double-checked that all the projects are set for a debug build, and none of the projects in this solution have the Optimize flag set.
Perhaps it would also be relevant to note that I'm using MSTest and the Visual Studio test runner.
Edit 2:
By "can't view properties" I'm referring to when I evaluate the property in Quickwatch and get a red exclamation mark and a text "Could not evaluate expression" error text. And lest you think I'm entirely off base with my suspicions, adding an instance field that gets initialized in the test initialize method makes the problem go away...
Edit 3:
I checked the build output and noticed that the compiler is invoked with these options:
```
/debug+
/debug:full
/optimize-
/define:DEBUG,TRACE
```
I should think that would be enough to stop this from happening, but there you go. :)