Should the hash code of null always be zero in .NET?

Given that collections like System.Collections.Generic.HashSet<> accept null as a set member, one can ask what the hash code of null should be. It looks like the framework uses 0:

// nullable struct type
int? i = null;
i.GetHashCode();  // gives 0
EqualityComparer<int?>.Default.GetHashCode(i);  // gives 0

// class type
CultureInfo c = null;
EqualityComparer<CultureInfo>.Default.GetHashCode(c);  // gives 0

This can be (a little) problematic with nullable enums. If we define

enum Season
{
  Spring,
  Summer,
  Autumn,
  Winter,
}

then Nullable<Season> (also written Season?) can take just five values, but two of them, namely null and Season.Spring, have the same hash code.
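This is easy to verify directly. A quick sketch, using the enum above and the default comparer:

Season? none = null;
Season? spring = Season.Spring;

none.GetHashCode();                                     // gives 0
spring.GetHashCode();                                   // gives 0
EqualityComparer<Season?>.Default.GetHashCode(none);    // gives 0
EqualityComparer<Season?>.Default.GetHashCode(spring);  // gives 0

All four calls give 0, so a hash-based collection cannot tell the two apart by hash alone.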

It is tempting to write a "better" equality comparer like this:

class NewNullEnumEqComp<T> : EqualityComparer<T?> where T : struct
{
  public override bool Equals(T? x, T? y)
  {
    return Default.Equals(x, y);
  }
  public override int GetHashCode(T? x)
  {
    return x.HasValue ? Default.GetHashCode(x) : -1;
  }
}
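Called directly, this comparer does hand out distinct hashes (a quick sketch):

var comparer = new NewNullEnumEqComp<Season>();
comparer.GetHashCode(Season.Spring);  // gives 0
comparer.GetHashCode(null);           // gives -1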

But is there any reason why the hash code of null should be 0?

EDIT/ADDITION:

Some people seem to think this is about overriding Object.GetHashCode(). It really is not, actually. (The authors of .NET did make an override of GetHashCode() in the Nullable<> struct which is relevant, though.) A user-written implementation of the parameterless GetHashCode() can never handle the situation where the object whose hash code we seek is null.

This is about implementing the abstract method EqualityComparer<T>.GetHashCode(T) or otherwise implementing the interface method IEqualityComparer<T>.GetHashCode(T). Now, while creating these links to MSDN, I see that it says there that these methods throw an ArgumentNullException if their sole argument is null. This must certainly be a mistake on MSDN? None of .NET's own implementations throw exceptions. Throwing in that case would effectively break any attempt to add null to a HashSet<>. Unless HashSet<> does something extraordinary when dealing with a null item (I will have to test that).
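A quick way to test the exception part, at least, would be something like this (a sketch; the expected results follow from the snippets above):

var set = new HashSet<CultureInfo>();
set.Add(null);       // expected: true, null is accepted as a member
set.Contains(null);  // expected: true

EqualityComparer<CultureInfo>.Default.GetHashCode(null);  // expected: 0, not an ArgumentNullException
EqualityComparer<int?>.Default.GetHashCode(null);         // expected: 0, not an ArgumentNullException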

NEW EDIT/ADDITION:

Now I tried debugging. With HashSet<>, I can confirm that with the default equality comparer, the values Season.Spring and null will end in the same bucket. This can be determined by very carefully inspecting the private array members m_buckets and m_slots. Note that the indices are always, by design, offset by one.

The code I gave above does not, however, fix this. As it turns out, HashSet<> will never even ask the equality comparer when the value is null. This is from the source code of HashSet<>:

    // Workaround Comparers that throw ArgumentNullException for GetHashCode(null).
    private int InternalGetHashCode(T item) {
        if (item == null) {
            return 0;
        }
        return m_comparer.GetHashCode(item) & Lower31BitMask;
    }

This means that, at least for HashSet<>, it is not even possible to change the hash of null. Instead, a solution is to change the hash of all the other values, like this:

class NewerNullEnumEqComp<T> : EqualityComparer<T?> where T : struct
{
  public override bool Equals(T? x, T? y)
  {
    return Default.Equals(x, y);
  }
  public override int GetHashCode(T? x)
  {
    return x.HasValue ? 1 + Default.GetHashCode(x) : /* not seen by HashSet: */ 0;
  }
}
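For illustration, a sketch of how this would be wired up:

var comparer = new NewerNullEnumEqComp<Season>();
comparer.GetHashCode(Season.Spring);  // gives 1
comparer.GetHashCode(null);           // gives 0, but HashSet<> computes this itself anyway

var set = new HashSet<Season?>(comparer);
set.Add(null);
set.Add(Season.Spring);
// set.Count is 2, and the two members now have different hashes (0 vs. 1),
// so they no longer share a bucket.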

So long as the hash code returned for nulls is consistent for the type, you should be fine. The only requirement for a hash code is that two objects that are considered equal share the same hash code.

Returning 0 or -1 for null, so long as you choose one and return it consistently, will work. Ideally, the hash codes of non-null values should not collide with whatever value you choose for null.

Similar questions:

GetHashCode on null fields?

What should GetHashCode return when object's identifier is null?

The "Remarks" of this MSDN entry goes into more detail around the hash code. Poignantly, the documentation does not provide any coverage or discussion of null values at all - not even in the community content.

To address your issue with the enum, either re-implement the hash code to return non-zero, add a default "unknown" enum entry equivalent to null, or simply don't use nullable enums.
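For example, the "unknown" entry approach might look like this (purely illustrative; the member name Unknown is made up):

enum Season
{
  Unknown,  // stands in for "no value"; hashes to 0, so Spring now hashes to 1
  Spring,
  Summer,
  Autumn,
  Winter,
}

// No Nullable<Season> needed, so there is no null to collide with Spring:
var seasons = new HashSet<Season> { Season.Unknown, Season.Spring };  // Count is 2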

Interesting find, by the way.

Another, more general problem is that a nullable type whose underlying type is 4 bytes or larger cannot be hashed without at least one collision (and more as the type size increases). For example, the hash code of an int is just the int itself, so int hash codes use the full int range. Which value in that range do you choose for null? Whatever you pick will collide with that value's own hash code.
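To make that concrete (a small sketch): for Int32 the hash code is simply the value, so every possible hash is already taken by some non-null int, and whatever is chosen for null doubles up with one of them:

int a = 5;
int b = -7;
a.GetHashCode();             // gives 5
b.GetHashCode();             // gives -7
int.MaxValue.GetHashCode();  // gives 2147483647

int? none = null;
int? zero = 0;
EqualityComparer<int?>.Default.GetHashCode(none);  // gives 0
EqualityComparer<int?>.Default.GetHashCode(zero);  // gives 0 - collides with null's hash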

Collisions in and of themselves are not necessarily a problem, but you need to know they are there. Hash codes are only used in some circumstances, and only as a first pass. As stated in the docs on MSDN, hash codes are not guaranteed to differ for unequal objects, so they should not be expected to.


It doesn't have to be zero -- you could make it 42 if you wanted to.

All that matters is consistency during the execution of the program.

It's just the most obvious representation, because null is often represented as a zero internally. That means that, while debugging, if you see a hash code of zero, it might prompt you to think, "Hmm... was this a null reference issue?"

Note that if you use a number like 0xDEADBEEF, then someone could say you're using a magic number... and you kind of would be. (You could say zero is a magic number too, and you'd be kind of right... except that it's so widely used as to be somewhat of an exception to the rule.)


Bear in mind that the hash code is used only as a first step in determining equality, and is never (and should never be) used as the de facto determination of whether two objects are equal.

If two objects' hash codes are not equal, then they are treated as not equal (because we assume that the underlying implementation is correct - i.e. we don't second-guess it). If they have the same hash code, then they should be checked for actual equality, a check which, in your case, null and the enum value will fail.
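In code, that two-step check looks roughly like this (a sketch using the enum from the question):

Season? none = null;
Season? spring = Season.Spring;
var cmp = EqualityComparer<Season?>.Default;

cmp.GetHashCode(none) == cmp.GetHashCode(spring);  // true - same bucket
cmp.Equals(none, spring);                          // false - the equality check tells them apart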

As a result - using zero is as good as any other value in the general case.

Sure, there will be situations, like your enum, where this zero is shared with a real value's hash code. The question is whether, for you, the minuscule overhead of an additional comparison causes problems.

If so, then define your own comparer for the nullable version of your particular type, and ensure that null always yields the same hash code (of course!) - ideally one that cannot be produced by the underlying type's own hash code algorithm. For your own types, this is doable. For others - good luck :)


Good question.

I just tried to code this:

enum Season
{
  Spring,
  Summer,
  Autumn,
  Winter,
}

and execute this like this:

Season? v = null;
Console.WriteLine(v);

it prints an empty line, since the value is null

If, instead, I do the normal:

Season? v = Season.Spring;
Console.WriteLine((int)v);

it returns 0, as expected, or simply Spring if we avoid casting to int.

So.. if you do the following:

Season? v = Season.Spring;  
Season? vnull = null;   
if(vnull == v) // never TRUE

EDIT

From MSDN

If two objects compare as equal, the GetHashCode method for each object must return the same value. However, if two objects do not compare as equal, the GetHashCode methods for the two objects do not have to return different values.

In other words: if two objects have the same hash code, that doesn't mean they are equal, because real equality is determined by Equals.

From MSDN again:

The GetHashCode method for an object must consistently return the same hash code as long as there is no modification to the object state that determines the return value of the object's Equals method. Note that this is true only for the current execution of an application, and that a different hash code can be returned if the application is run again.


But is there any reason why the hash code of null should be 0?

It could have been anything at all. I tend to agree that 0 wasn't necessarily the best choice, but it's probably the one that leads to the fewest bugs.

A hash function absolutely must return the same hash for the same value. Once any component exists that returns 0 for the hash of null, 0 is effectively the only valid value everyone else can use for it, too. If there were a constant for this, like, hm, object.HashOfNull, then someone implementing an IEqualityComparer would have to know to use that value. If they don't think about it, the chance they'll use 0 is slightly higher than any other value, I reckon.
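For instance, a typical hand-written comparer (a hypothetical example) tends to fall back to 0 for null without much thought, which happens to keep it consistent with the rest of the framework:

class LengthComparer : IEqualityComparer<string>
{
  public bool Equals(string x, string y)
  {
    if (x == null || y == null) return x == null && y == null;
    return x.Length == y.Length;
  }

  public int GetHashCode(string obj)
  {
    return obj == null ? 0 : obj.Length;  // 0 for null: the de facto convention
  }
}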

at least for HashSet<>, it is not even possible to change the hash of null

As mentioned above, I think it's completely impossible, full stop, simply because there already exist types which follow the convention that the hash of null is 0.