Since I started developing in a test-/behavior-driven style, I have come to appreciate the ability to mock out every dependency.
Since mocking frameworks like Moq work best when told to mock an interface, I now implement an interface for almost every class I create, because most likely I will have to mock it out in a test eventually. Besides, programming to an interface is good practice anyway.
At times, my classes take dependencies on .NET classes (e.g. FileSystemWatcher, DispatcherTimer). In those cases it would be great to have an interface, so I could depend on an IDispatcherTimer instead and pass in a mock to simulate its behavior and see whether my system under test reacts correctly.
Unfortunately, neither of the above-mentioned classes implements such an interface, so I have to resort to creating adapters that do nothing but inherit from the original class and conform to an interface that I can then use.
Here is such an adapter for the DispatcherTimer and the corresponding interface:
using System;
using System.Windows.Threading;
public interface IDispatcherTimer
{
#region Events
event EventHandler Tick;
#endregion
#region Properties
Dispatcher Dispatcher { get; }
TimeSpan Interval { get; set; }
bool IsEnabled { get; set; }
object Tag { get; set; }
#endregion
#region Public Methods
void Start();
void Stop();
#endregion
}
/// <summary>
/// Adapts the DispatcherTimer class to implement the <see cref="IDispatcherTimer"/> interface.
/// </summary>
public class DispatcherTimerAdapter : DispatcherTimer, IDispatcherTimer
{
}
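For completeness, here is a sketch of how the adapter pays off in a test. The AutoSaver class and its SaveCount property are hypothetical names purely for illustration, and I am assuming NUnit as the test framework; the point is that the consumer depends only on IDispatcherTimer, so the test can use Moq's Raise method to fire the Tick event without a real DispatcherTimer and without waiting:

using System;
using Moq;
using NUnit.Framework;

// Hypothetical consumer that depends on the interface instead of the concrete timer.
public class AutoSaver
{
    public int SaveCount { get; private set; }

    public AutoSaver(IDispatcherTimer timer)
    {
        timer.Interval = TimeSpan.FromSeconds(30);
        timer.Tick += (sender, e) => SaveCount++; // count every "save" triggered by a tick
        timer.Start();
    }
}

[TestFixture]
public class AutoSaverTests
{
    [Test]
    public void Ticking_the_timer_triggers_a_save()
    {
        var timerMock = new Mock<IDispatcherTimer>();
        var autoSaver = new AutoSaver(timerMock.Object);

        // Simulate two ticks; no real timer and no waiting involved.
        timerMock.Raise(t => t.Tick += null, EventArgs.Empty);
        timerMock.Raise(t => t.Tick += null, EventArgs.Empty);

        Assert.AreEqual(2, autoSaver.SaveCount);
        timerMock.Verify(t => t.Start(), Times.Once());
    }
}

In production code the real DispatcherTimerAdapter is passed in instead of the mock, and the consumer never notices the difference.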
Although this is not the end of the world, I wonder why the .NET developers didn't take the minute to make their classes implement such interfaces from the get-go. It puzzles me especially since there is now a big push for good practices from inside Microsoft.
Does anyone have any (perhaps insider) information on why this contradiction exists?