Solution 1:

A couple of important points should be made to address the OP's question:

  1. The .NET GC is non-deterministic: you never know when a collection will happen, and you should never depend on its timing.
  2. Dispose() is never called by the .NET Framework; you must call it yourself - preferably by wrapping the object's creation in a using() block (see the sketch after this list).
  3. Explicitly setting a disposable object to null without calling Dispose() on it is a bad thing to do. What happens is that you explicitly set the object's "root reference" to null. This effectively means that you cannot call Dispose() later AND, more importantly, it sends the object to the GC Finalization Queue for finalization. Causing finalization through bad programming practice should be avoided at all costs.
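To make point 2 concrete, here is a minimal sketch (using StreamWriter and a hypothetical log file path) contrasting the using() pattern with the null-assignment anti-pattern from point 3:

```csharp
using System.IO;

class Example
{
    static void Good()
    {
        // Preferred: Dispose() runs deterministically at the closing brace,
        // even if an exception is thrown inside the block.
        using (var writer = new StreamWriter("log.txt"))
        {
            writer.WriteLine("file handle released when the block exits");
        }
    }

    static void Bad()
    {
        // Anti-pattern: Dispose() is never called. Nulling the reference only
        // makes the object unreachable; the file handle stays open until the
        // finalizer eventually runs - if it ever does.
        var writer = new StreamWriter("log.txt");
        writer.WriteLine("this handle leaks until finalization");
        writer = null;
    }
}
```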

Finalizer: Some developers refer to it as a destructor. And in fact it is even called a Destructor in the C# 4.0 Language Spec (section 1.6.7.6) and in previous versions of the current ECMA-334 spec. Fortunately, the 4th Edition (June 2006) correctly defines Finalizers in Section 8.7.9 and attempts to clear up the confusion between the two in Section 17.12. It should be noted that there are important internal differences (no need to go into those gory details here) between what is traditionally known as a destructor and a Destructor/Finalizer in the .NET Framework.

  1. If a Finalizer is present, then it will be called by the .NET Framework if and only if GC.SuppressFinalize() is not called.
  2. You should NEVER call a finalizer explicitly. Fortunately, C# will not allow it (I don't know about other languages); though finalization can be forced by calling GC.Collect(2), which collects up to and including the 2nd generation. Both points are illustrated in the sketch below.
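As an illustration of both points, a minimal sketch (the HandleHolder class is hypothetical):

```csharp
using System;

class HandleHolder : IDisposable
{
    private bool _disposed;

    public void Dispose()
    {
        if (_disposed) return;
        _disposed = true;
        // ... release the object's resources here ...

        // Tell the GC that the finalizer below no longer needs to run,
        // so this object skips the Finalization Queue entirely.
        GC.SuppressFinalize(this);
    }

    // Finalizer ("destructor" syntax in C#). It cannot be called directly:
    // new HandleHolder().~HandleHolder() does not compile.
    ~HandleHolder()
    {
        // Runs on the finalizer thread only if Dispose() was never called.
    }
}
```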

Finalization: Finalization is the .NET Framework's way to deal with the 'graceful' cleanup and releasing of resources.

  1. It only occurs when there are objects in the Finalization Queue.
  2. It only occurs when a garbage collection occurs for Gen2 (which is approx 1 in every 100 collections for a well-written .NET app).
  3. Up to and including .NET 4, there is a single Finalization thread. If this thread becomes blocked for any reason, your app is screwed.
  4. Writing proper and safe finalization code is non-trivial, and mistakes are easy to make (e.g. accidentally allowing exceptions to be thrown from the finalizer, or taking dependencies on other objects that could already be finalized); see the sketch after this list.
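A sketch of the two classic mistakes from point 4, using a hypothetical NativeResource class that owns a block of unmanaged memory:

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;

class NativeResource
{
    private IntPtr _handle = Marshal.AllocHGlobal(64);         // unmanaged memory
    private StreamWriter _log = new StreamWriter("trace.txt"); // another finalizable object

    // (Dispose() omitted for brevity - a real class would implement IDisposable.)
    ~NativeResource()
    {
        // Safe: free only the unmanaged resource this object owns directly.
        if (_handle != IntPtr.Zero)
        {
            Marshal.FreeHGlobal(_handle);
            _handle = IntPtr.Zero;
        }

        // Mistake 1: _log is itself finalizable and may already have been
        // finalized - finalization order between objects is not guaranteed.
        // _log.WriteLine("cleaned up");   // <- do not do this

        // Mistake 2: letting an exception escape a finalizer tears down the
        // entire process (since .NET 2.0), so never let one propagate from here.
    }
}
```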

While this is certainly more info than you asked for, it provides background on how things work and why they work the way they do. Some people will argue that they shouldn't have to worry about managing memory and resources in .NET, but that doesn't change the fact that it needs to be done - and I don't see that going away in the near future.

Unfortunately, the examples above (mistakenly) imply that you need to implement a Finalizer as part of the standard Dispose pattern. However, you should not implement a Finalizer unless your class directly holds UNmanaged resources; otherwise there are negative performance implications.
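For a class that owns only managed disposables, a minimal sketch (the ReportWriter class is hypothetical) of Dispose without a finalizer might look like this:

```csharp
using System;
using System.IO;

// Owns only managed resources, so it needs Dispose() but NOT a finalizer.
sealed class ReportWriter : IDisposable
{
    private readonly StreamWriter _writer = new StreamWriter("report.txt");
    private bool _disposed;

    public void Write(string line)
    {
        if (_disposed) throw new ObjectDisposedException(nameof(ReportWriter));
        _writer.WriteLine(line);
    }

    public void Dispose()
    {
        if (_disposed) return;
        _disposed = true;
        _writer.Dispose(); // just forward the call to the owned resource
    }
}
```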

I have posted a template for implementing the Dispose pattern here: How do you properly implement the IDisposable pattern?

Solution 2:

I want to write a class that is straightforward and very easy to use, to make sure that every possible resource is cleaned up. I don't want to put that responsibility on the user of my class.

You can't do that. The memory management system is simply not built to accommodate resources other than memory.

The IDisposable pattern is intended as a way for developers to tell an object when they are done with it, instead of having the memory management system try to figure that out using techniques like reference counting.

You can use the Finalizer as a fallback for users who fail to dispose objects properly, but it doesn't work well as the primary method for cleaning up objects. To work smoothly, objects should be disposed properly, so that the more costly finalizer never needs to run.
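A minimal sketch of that fallback arrangement (the UnmanagedBuffer class is hypothetical; the finalizer path runs only when a caller forgets to dispose):

```csharp
using System;
using System.Runtime.InteropServices;

class UnmanagedBuffer : IDisposable
{
    private IntPtr _buffer = Marshal.AllocHGlobal(1024);
    private bool _disposed;

    // Primary path: callers dispose deterministically, ideally via using().
    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this); // the fallback below is no longer needed
    }

    // Fallback path: runs on the finalizer thread only if a caller forgot
    // to dispose this object.
    ~UnmanagedBuffer()
    {
        Dispose(false);
    }

    private void Dispose(bool disposing)
    {
        if (_disposed) return;
        _disposed = true;

        // Unmanaged cleanup is safe on both paths.
        if (_buffer != IntPtr.Zero)
        {
            Marshal.FreeHGlobal(_buffer);
            _buffer = IntPtr.Zero;
        }

        // Owned managed disposables would be released only when
        // disposing == true; on the finalizer path they may already
        // have been finalized.
    }
}
```

A caller who wraps the object in using (var buf = new UnmanagedBuffer()) { ... } never touches the finalizer thread; only a forgotten instance falls back to it.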