How to measure code performance in .NET?
Solution 1:
The Stopwatch class, available since .NET 2.0, is the best way to go for this. It wraps a high-resolution performance counter and is accurate to fractions of a millisecond.
Take a look at the MSDN documentation, which is pretty clear.
EDIT: As previously suggested, it is also advisable to run your code a number of times in order to get a reasonable average time.
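For illustration, a minimal sketch of the usual pattern (DoWork is just a placeholder for the code you want to time):
var stopwatch = System.Diagnostics.Stopwatch.StartNew();
//call the code you want to measure
DoWork();
stopwatch.Stop();
Console.WriteLine("Elapsed: " + stopwatch.Elapsed.TotalMilliseconds + " ms");
Console.WriteLine("Elapsed ticks: " + stopwatch.ElapsedTicks);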
Solution 2:
Execute your code repeatedly. The problem seems to be that your code executes a lot faster than the granularity of your measuring instrument. The simplest solution to this is to execute your code many, many times (thousands, maybe millions) and then calculate the average execution time.
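A rough sketch of that approach (the iteration count and DoWork are placeholders):
const int iterations = 1000000;
var stopwatch = System.Diagnostics.Stopwatch.StartNew();
for (int i = 0; i < iterations; i++)
{
    //the code you want to measure
    DoWork();
}
stopwatch.Stop();
Console.WriteLine("Average: " + (stopwatch.Elapsed.TotalMilliseconds / iterations) + " ms per call");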
Edit: Also, due to the nature of current optimizing compilers (and virtual machines such as the CLR and the JVM), it can be very misleading to measure the execution speed of single lines of code, since the measurement can influence the speed quite a lot. A much better approach is to profile the entire system (or at least larger blocks) and check where the bottlenecks are.
Solution 3:
I find these useful:
http://accelero.codeplex.com/SourceControl/changeset/view/22633#290971
http://accelero.codeplex.com/SourceControl/changeset/view/22633#290973
http://accelero.codeplex.com/SourceControl/changeset/view/22633#290972
TickTimer is a cut-down copy of Stopwatch that starts when constructed and does not support restarting. It will also notify you if the current hardware does not support high-resolution timing (Stopwatch swallows this problem).
So this
var tickTimer = new TickTimer();
//call a method that takes some time
DoStuff();
tickTimer.Stop();
Debug.WriteLine("Elapsed HighResElapsedTicks " + tickTimer.HighResElapsedTicks);
Debug.WriteLine("Elapsed DateTimeElapsedTicks " + tickTimer.DateTimeElapsedTicks);
Debug.WriteLine("Elapsed ElapsedMilliseconds " + tickTimer.ElapsedMilliseconds);
Debug.WriteLine("Start Time " + new DateTime(tickTimer.DateTimeUtcStartTicks).ToLocalTime().ToLongTimeString());
will output this
Elapsed HighResElapsedTicks 10022886
Elapsed DateTimeElapsedTicks 41896
Elapsed ElapsedMilliseconds 4.18966178849554
Start Time 11:44:58
DebugTimer is a wrapper for TickTimer that writes the result to Debug. (Note: it supports the IDisposable pattern.)
So this
using (new DebugTimer("DoStuff"))
{
//call a method that takes some time
DoStuff();
}
will output this to the debug window
DoStuff: Total 3.6299 ms
IterationDebugTimer is for timing how long it takes to run an operation multiple times, writing the result to Debug. It also performs an initial run that is not included in the results, so that startup time is ignored. (Note: it supports the IDisposable pattern.)
So this
int x;
using (var iterationDebugTimer = new IterationDebugTimer("Add", 100000))
{
iterationDebugTimer.Run(() =>
{
x = 1+4;
});
}
will output this
Add: Iterations 100000
Total 1.198540 ms
Single 0.000012 ms
Solution 4:
Just to add to what others have already said about using Stopwatch and measuring averages:
Make sure you call your method once before measuring it. Otherwise you will also measure the time needed to JIT compile the code, which may skew your numbers significantly.
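A sketch of the warm-up idea (MethodUnderTest is a placeholder name):
//warm-up call: pays the JIT cost up front so it is not measured
MethodUnderTest();
var stopwatch = System.Diagnostics.Stopwatch.StartNew();
//only the already-compiled code is timed now
MethodUnderTest();
stopwatch.Stop();
Console.WriteLine(stopwatch.Elapsed.TotalMilliseconds + " ms");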
Also, make sure you measure release-mode code, as optimizations are turned off by default for debug builds. Tuning debug code is pointless imho.
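If you want a safety net, a small guard like this (just a sketch) will warn you when the benchmark was accidentally built in the Debug configuration:
#if DEBUG
Console.WriteLine("Warning: this is a Debug build; timings will not be representative.");
#endif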
And make sure you're measuring what you actually want to measure. When optimizations kick in, the compiler/JIT compiler may rearrange or remove code entirely, so you may end up measuring something slightly different from what you intended. At least take a look at the generated code to make sure it has not been stripped.
Depending on what you're trying to measure, keep in mind that a real system will stress the runtime differently than a typical test application does. Some performance problems are related to, e.g., how objects are garbage collected, and these will typically not show up in a simple test application.
Actually, the best advice is to measure real systems with real data, as sandbox tests may turn out to be highly inaccurate.