views: 132

answers: 1

I was reading through the 2010 CWE/SANS Top 25 Most Dangerous Programming Errors, and one of the entries is Buffer Copy without Checking Size of Input. It suggests using a language with features to prevent or mitigate this problem, and says:

For example, many languages that perform their own memory management, such as Java and Perl, are not subject to buffer overflows. Other languages, such as Ada and C#, typically provide overflow protection, but the protection can be disabled by the programmer.

I was not aware that Java and C# differed in any meaningful way with regard to memory management. How is it that Java is not subject to buffer overflows, while C# only protects against overflows? And how is it possible to disable this protection in C#?

+3  A: 

Java does not expose raw pointers (strictly speaking, it does not support pointer arithmetic), and every array access is bounds-checked by the JVM, so an out-of-bounds write throws an exception instead of overrunning memory.

In C#, you can use unsafe code, pointers, and unmanaged memory, which makes buffer overruns possible. See the `unsafe` keyword:

To maintain type safety and security, C# does not support pointer arithmetic, by default. However, by using the unsafe keyword, you can define an unsafe context in which pointers can be used. For more information about pointers, see the topic Pointer types.
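As a rough sketch (not part of the documentation quoted above, and with made-up names), this is what disabling the protection looks like in practice. It compiles only when unsafe code is allowed for the project, and the off-by-one loop writes past the end of the array without any exception being raised:

    using System;

    class OverrunDemo
    {
        // Compiles only with the /unsafe compiler switch
        // (or AllowUnsafeBlocks=true in the project file).
        static unsafe void Main()
        {
            int[] buffer = new int[4];

            // Pin the array so its address is stable while we hold a pointer to it.
            fixed (int* p = buffer)
            {
                // Pointer arithmetic is not bounds-checked: the <= bound makes
                // the last iteration write one element past the array.
                for (int i = 0; i <= buffer.Length; i++)
                {
                    p[i] = i;   // p[4] overruns the buffer
                }
            }

            // The safe, indexed equivalent would throw IndexOutOfRangeException:
            // buffer[buffer.Length] = 0;
        }
    }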

Mitch Wheat
+1 The programmer perspective is that you disable the protection by using the `unsafe` keyword. It's also worth being aware of the IT administrator perspective, which is that you can **prevent** applications from disabling the protection via your security policy (a sketch of the compile-time gate follows below).
MarkJ
@MarkJ: good point.
Mitch Wheat
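To illustrate the two gates mentioned above: the programmer has to opt in at build time, and (in the .NET Framework of that era) an administrator could additionally refuse to run unverifiable code through Code Access Security policy. The project-file fragment below shows only the compile-time opt-in; AllowUnsafeBlocks is the standard MSBuild property, and the rest is a minimal illustrative example.

    <!-- .csproj fragment: without this property (or the /unsafe compiler
         switch) unsafe blocks like the one in the example above do not compile. -->
    <PropertyGroup>
      <AllowUnsafeBlocks>true</AllowUnsafeBlocks>
    </PropertyGroup>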