Reading this question, I wanted to test if I could demonstrate the non-atomicity of reads and writes on a type for which the atomicity of such operations is not guaranteed.
using System;
using System.Diagnostics;
using System.Threading;

class Program
{
    private static double _d;

    [STAThread]
    static void Main()
    {
        new Thread(KeepMutating).Start();
        KeepReading();
    }

    private static void KeepReading()
    {
        while (true)
        {
            double dCopy = _d;
            // In Release this was replaced with: if (...) throw ...
            Debug.Assert(dCopy == 0D || dCopy == double.MaxValue); // Never fails
        }
    }

    private static void KeepMutating()
    {
        Random rand = new Random();
        while (true)
        {
            // Unsynchronized 64-bit write from a second thread
            _d = rand.Next(2) == 0 ? 0D : double.MaxValue;
        }
    }
}
To my surprise, the assertion refused to fail even after a full three minutes of execution. What gives?
- The test is incorrect.
- The specific timing characteristics of the test make it unlikely/impossible that the assertion will fail.
- The probability is so low that I have to run the test for much longer to make it likely that it will trigger.
- The CLR provides stronger guarantees about atomicity than the C# spec.
- My OS/hardware provides stronger guarantees than the CLR.
- Something else?
Of course, I don't intend to rely on any behaviour that is not explicitly guaranteed by the spec, but I would like a deeper understanding of the issue.
FYI, I ran this on both Debug and Release profiles (changing Debug.Assert to if (..) throw for Release) in two separate environments:
- Windows 7 64-bit + .NET 3.5 SP1
- Windows XP 32-bit + .NET 2.0
EDIT: To exclude the possibility that John Kugelman's comment "the debugger is not Schrodinger-safe" was the problem, I added the line someList.Add(dCopy); to the KeepReading method and verified that this list was not seeing a single stale value from the cache.
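For reference, this is roughly what that modification looks like; the declaration of someList is my assumption (the edit above only mentions the added Add call), and it needs using System.Collections.Generic:

    private static readonly List<double> someList = new List<double>();

    private static void KeepReading()
    {
        while (true)
        {
            double dCopy = _d;
            // Record every observed value so it can be inspected outside
            // the debugger, rather than relying on Debug.Assert alone.
            someList.Add(dCopy);
            Debug.Assert(dCopy == 0D || dCopy == double.MaxValue);
        }
    }

(The list grows without bound, so this variant is only meant to run briefly.)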
EDIT: Based on Dan Bryant's suggestion, using long instead of double breaks it virtually instantly.
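The long version is just the same program with the field type swapped; a sketch of what that looks like (the field and method names here are illustrative, only the type change matters):

    private static long _l;

    private static void KeepReadingLong()
    {
        while (true)
        {
            long copy = _l;
            // With long, a torn read on a 32-bit runtime can mix the halves
            // of 0 and long.MaxValue, so this fails almost immediately.
            Debug.Assert(copy == 0L || copy == long.MaxValue);
        }
    }

    private static void KeepMutatingLong()
    {
        Random rand = new Random();
        while (true)
        {
            _l = rand.Next(2) == 0 ? 0L : long.MaxValue;
        }
    }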