Java vs .NET runtime optimization
This article is yet another piece showing Java beating C/C++ in several benchmarks. There are quite a few of these around, and whether or not you take issue with their methodology, I think they provide cumulative evidence that using C++ for performance alone is pointless. It's likely that the only people remaining in the "Java must be slow" camp don't actually use Java.
Interestingly, it contains a discussion of why Java might be able to beat C++. There seem to be two main reasons:
- The Java JIT can make global optimizations at run time that aren't possible at compile time: this includes system classes.
- The removal of arbitrary pointer manipulation allows not only safe GC, but also safe inferences about aliasing.
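The first point is worth making concrete. A minimal sketch of the kind of call site HotSpot can optimize at run time (class names here are illustrative, not from the article): `area()` is a virtual call, but if only one implementation is loaded, the JIT can devirtualize and inline it, and deoptimize later if another subclass appears. A static C++ compiler can't make that bet across the whole program.

```java
// Illustrative sketch: a virtual call site a JIT can devirtualize.
abstract class Shape {
    abstract double area();
}

class Circle extends Shape {
    final double r;
    Circle(double r) { this.r = r; }
    @Override double area() { return Math.PI * r * r; }
}

public class Devirt {
    // area() is virtual, but while Circle is the only loaded Shape
    // subclass, HotSpot can inline Circle.area() here directly,
    // recompiling (deoptimizing) if that assumption is later broken.
    static double total(Shape[] shapes) {
        double sum = 0;
        for (Shape s : shapes) sum += s.area();
        return sum;
    }

    public static void main(String[] args) {
        Shape[] shapes = { new Circle(1), new Circle(2) };
        System.out.println(total(shapes)); // 5 * Math.PI
    }
}
```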
In fact, I believe the main reason C# defaults to non-virtual methods is the CLR's use of precompilation: making every method virtual would be a big performance hit, so in the CLR de-virtualization effectively has to be done by hand. C#'s designers make a virtue of necessity by claiming this is the better idea anyway: you should design explicitly for overridable methods. While I can agree with that, there's no reason not to make methods virtual by default and mark them "final" (C#'s "sealed") when you really don't want anyone to override them.
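Java's defaults work the way the paragraph above prefers, as this small sketch shows (the class names are made up for illustration): methods are overridable unless you opt out with `final`, whereas in C# you would have to write `virtual` explicitly on `withdraw` to allow the override at all.

```java
// Java defaults: overridable unless marked final.
class Account {
    private double balance;

    // Virtual by default: a subclass can change withdrawal policy.
    void withdraw(double amount) { balance -= amount; }

    // 'final' opts out where overriding would be unsafe.
    final double balance() { return balance; }
}

class OverdraftAccount extends Account {
    @Override
    void withdraw(double amount) {
        // Custom policy could go here; balance() cannot be overridden.
        super.withdraw(amount);
    }
}

public class VirtualDefault {
    public static void main(String[] args) {
        Account a = new OverdraftAccount();
        a.withdraw(10);
        System.out.println(a.balance()); // prints -10.0
    }
}
```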
In this MSDN article, MS actually recommends against static compilation, making the same claim that the JIT can produce much better performance. So why do they precompile the CLR framework classes and give up the opportunity for global optimizations?