As a sort of follow-up to the question called "Differences between MSIL and Java bytecode?", what are the (major) differences or similarities in how the Java Virtual Machine works versus how the .NET Framework Common Language Runtime (CLR) works?

Also, is the .NET Framework CLR a "virtual machine", or does it not have the attributes of a virtual machine?

+3  A: 

The CLR and the JVM are both virtual machines.

The .NET Framework and the Java Runtime Environment are the bundling of the respective VMs and their libraries. Without libraries the VMs are pretty useless.

Allain Lalonde

It is not a virtual machine; the .NET Framework compiles the assemblies into native binaries at the time of the first run:

In computing, just-in-time compilation (JIT), also known as dynamic translation, is a technique for improving the runtime performance of a computer program. JIT builds upon two earlier ideas in run-time environments: bytecode compilation and dynamic compilation. It converts code at runtime prior to executing it natively, for example bytecode into native machine code. The performance improvement over interpreters originates from caching the results of translating blocks of code, and not simply reevaluating each line or operand each time it is met (see Interpreted language). It also has advantages over statically compiling the code at development time, as it can recompile the code if this is found to be advantageous, and may be able to enforce security guarantees. Thus JIT can combine some of the advantages of interpretation and static (ahead-of-time) compilation.

Several modern runtime environments, such as Microsoft's .NET Framework, most implementations of Java, and most recently Actionscript 3, rely on JIT compilation for high-speed code execution.


To sum up: the .NET Framework contains a virtual machine, just like Java.

Just because a virtual machine makes use of JIT compilation for performance optimization doesn't mean it's not a virtual machine anymore. When the programmer compiles, he compiles to the virtual machine, leaving it up to the implementation to perform the execution however it sees fit.
Allain Lalonde
+5  A: 

Your first question is comparing the JVM with the .NET Framework - I assume you actually meant to compare with the CLR instead. If so, I think you could write a small book on this (EDIT: looks like Benji already has :-)

One important difference is that the CLR is designed to be a language-neutral architecture, unlike the JVM.

Another important difference is that the CLR was specifically designed to allow for a high level of interoperability with native code. This means that the CLR must manage reliability and security when native memory is accessed and modified, and also manage marshalling between CLR-based data structures and native data structures.

To answer your second question, the term “virtual machine” is an older term from the hardware world (e.g. IBM’s virtualisation of the 360 in the 1960s) that used to mean a software/hardware emulation of the underlying machine to accomplish the same sort of stuff that VMWare does.

The CLR is often referred to as an "execution engine". In this context, that's an implementation of an IL Machine on top of an x86. This is also what the JVM does, although you can argue that there's an important difference between the CLR's polymorphic bytecodes and the JVM's typed bytecodes.

So the pedantic answer to your second question is "no". But it really comes down to how you define these two terms.

EDIT: One more difference between the JVM and the CLR is that the JVM (version 6) is very reluctant to release allocated memory back to the operating system, even where it can.

For example, let's say that a JVM process starts and allocates 25 MB of memory from the operating system initially. The app code then attempts allocations that require an additional 50 MB. The JVM will allocate an additional 50 MB from the operating system. Once the application code has stopped using that memory, it is garbage-collected and the JVM heap size will decrease. However, the JVM will only free the allocated operating system memory under certain very specific circumstances. Otherwise, for the rest of the process lifetime that memory will remain allocated.

The CLR, on the other hand, releases allocated memory back to the operating system if it's no longer needed. In the example above, the CLR would have released the memory once the heap had decreased.
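The heap-growth behaviour described above can be watched from inside a Java process. The sketch below (a hypothetical probe class, not a rigorous benchmark) uses the standard `Runtime` methods; whether the "after" figure shrinks back toward "before" depends heavily on the JVM version, the garbage collector, and flags such as `-Xms` and `-XX:MaxHeapFreeRatio`:

```java
public class HeapProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long before = rt.totalMemory();      // bytes currently claimed from the OS

        byte[][] blocks = new byte[50][];    // allocate roughly 50 MB
        for (int i = 0; i < blocks.length; i++) {
            blocks[i] = new byte[1024 * 1024];
        }
        long during = rt.totalMemory();

        blocks = null;                       // drop the references
        System.gc();                         // request (not force) a collection
        long after = rt.totalMemory();

        System.out.printf("before=%dMB during=%dMB after=%dMB%n",
                before >> 20, during >> 20, after >> 20);
        // On JVM 6 with default settings, 'after' typically stays close to
        // 'during' for the rest of the process lifetime.
    }
}
```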

It's absolutely not correct that the JVM will not free allocated memory. See my answer to this question for proof:
Michael Borgwardt
I have seen the JVM return memory back to Windows.
Steve Kuo
I've changed my answer to say that the JVM 6 is very reluctant to release memory, with links to Ran's and Michael's answers. I never saw this behaviour with JVM 5, so maybe that version was even more reluctant.
+26  A: 

There are a lot of similarities between both implementations (and in my opinion: yes, they're both "virtual machines").

For one thing, they're both stack-based VMs, with no notion of "registers" like we're used to seeing in a modern CPU like the x86 or PowerPC. The evaluation of all expressions ((1 + 1) / 2) is performed by pushing operands onto the "stack" and then popping those operands off the stack whenever an instruction (add, divide, etc.) needs to consume those operands. Each instruction pushes its results back onto the stack.
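To make the stack discipline concrete, here is a minimal sketch (a hypothetical simulation in plain Java, not either VM's actual implementation) of how a stack machine evaluates (1 + 1) / 2:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class StackEval {
    public static void main(String[] args) {
        // Stand-in for the VM's operand stack.
        Deque<Integer> stack = new ArrayDeque<>();

        stack.push(1);            // like the JVM's iconst_1 or the CLR's ldc.i4.1
        stack.push(1);            // push the second operand
        int b = stack.pop();      // an 'add' instruction pops two operands...
        int a = stack.pop();
        stack.push(a + b);        // ...and pushes the result (2)

        stack.push(2);            // push the divisor
        int d = stack.pop();      // a 'div' instruction pops divisor, then dividend
        int n = stack.pop();
        stack.push(n / d);        // pushes the final result

        System.out.println(stack.pop());  // prints 1
    }
}
```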

It's a convenient way to implement a virtual machine, because pretty much every CPU in the world has a stack, but the number of registers is often different (and some registers are special-purpose, and each instruction expects its operands in different registers, etc).

So, if you're going to model an abstract machine, a purely stack-based model is a pretty good way to go.

Of course, real machines don't operate that way. So the JIT compiler is responsible for performing "enregistration" of bytecode operations, essentially scheduling the actual CPU registers to contain operands and results whenever possible.

So, I think that's one of the biggest commonalities between the CLR and the JVM.

As for differences...

One interesting difference between the two implementations is that the CLR includes instructions for creating generic types, and then for applying parametric specializations to those types. So, at runtime, the CLR considers a List<int> to be a completely different type from a List<String>.

Under the covers, it uses the same MSIL for all reference-type specializations (so a List<String> uses the same implementation as a List<Object>, with different type-casts at the API boundaries), but each value-type uses its own unique implementation (List<int> generates completely different code from List<double>).

In Java, generic types are purely a compiler trick. The JVM has no notion of which classes have type-arguments, and it's unable to perform parametric specializations at runtime.

From a practical perspective, that means you can't overload Java methods on generic types. You can't have two different methods, with the same name, differing only on whether they accept a List<String> or a List<Date>. Of course, since the CLR knows about parametric types, it has no problem handling methods overloaded on generic type specializations.
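The erasure is directly observable at runtime: after compilation, both lists share a single `Class` object, which is exactly why the overloads won't compile. A small illustrative snippet (the class name is made up for the example):

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();

        // After erasure, both are just ArrayList; the type arguments are gone.
        System.out.println(strings.getClass() == ints.getClass());  // true

        // These two methods would NOT compile in one class, because both
        // erase to process(List):
        //   void process(List<String> l) { ... }
        //   void process(List<Date> l)   { ... }
        // The CLR, with reified generics, accepts the equivalent overloads.
    }
}
```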

On a day-to-day basis, that's the difference that I notice most between the CLR and the JVM.

Other important differences include:

  • The CLR has closures (implemented as C# delegates). The JVM does not.

  • The CLR has coroutines (implemented with the C# 'yield' keyword). The JVM does not.

  • The CLR allows user code to define new value types (structs), whereas the JVM provides a fixed collection of value types (byte, short, int, long, float, double, char, boolean) and only allows users to define new reference-types (classes).

  • The CLR provides support for declaring and manipulating pointers. This is especially interesting because both the JVM and the CLR employ strict generational compacting garbage collector implementations as their memory-management strategy. Under ordinary circumstances, a strict compacting GC has a really hard time with pointers, because when you move a value from one memory location to another, all of the pointers (and pointers to pointers) become invalid. But the CLR provides a "pinning" mechanism so that developers can declare a block of code within which the CLR is not allowed to move certain pointers. It's very convenient.

  • The largest unit of code in the JVM is a 'class'. In the CLR, classes are aggregated into 'assemblies', and the CLR provides logic for reasoning about and manipulating assemblies (which are loaded into "AppDomains", providing sub-application-level sandboxes for memory allocation and code execution).

  • The CLR bytecode format (composed of MSIL instructions and metadata) has fewer instruction types than the JVM. In the JVM, every unique operation (add two int values, add two float values, etc) has its own unique instruction. In the CLR, all of the MSIL instructions are polymorphic (add two values) and the JIT compiler is responsible for determining the types of the operands and creating appropriate machine code. I don't know which is the preferable strategy, though. Both have trade-offs. The HotSpot JIT compiler, for the JVM, can use a simpler code-generation mechanism (it doesn't need to determine operand types, because they're already encoded in the instruction), but that means it needs a more complex bytecode format, with more instruction types.
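The typed-instruction point in the last bullet is easy to see with `javap`. Compiling the methods below and running `javap -c` on the class shows a distinct opcode family per operand type (the opcode names in the comments reflect what `javap` typically prints for these methods); in MSIL, both additions would use the single polymorphic `add` instruction:

```java
public class TypedOps {
    // javap -c typically shows: iload_0, iload_1, iadd, ireturn
    static int addInts(int a, int b) {
        return a + b;
    }

    // javap -c typically shows: dload_0, dload_2, dadd, dreturn
    // (doubles occupy two local-variable slots, hence dload_2)
    static double addDoubles(double a, double b) {
        return a + b;
    }

    public static void main(String[] args) {
        System.out.println(addInts(1, 2));        // 3
        System.out.println(addDoubles(1.5, 2.5)); // 4.0
    }
}
```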

I've been using Java (and admiring the JVM) for about ten years now.

But, in my opinion, the CLR is now the superior implementation, in almost every way.

Wow. That is a way better answer than I could have ever asked for. Sincerely - Thank you.
Frank V
Closures and generators are implemented at a language level and are simply represented as classes on the CLR level.
Curt Hagenlocher
+2  A: 

More specifics on the differences can be found in various academic and private sources. One good example is CLR Design Choices.

Some specific examples include:

  • Some JVM low-level operations are typed, such as "add two ints", whereas the CLR uses a polymorphic operand (i.e. fadd/iadd/ladd vs. just add)
  • Currently, the JVM does more aggressive runtime profiling and optimization (e.g. HotSpot). The CLR currently does JIT optimizations, but not runtime optimization (i.e. replacing code while it's running).
  • The CLR doesn't inline virtual methods; the JVM does...
  • Support for value types in the CLR beyond just the "primitives".
James Schek