Why is "null" present in C# and Java?

Solution 1:

Anders Hejlsberg, the "father of C#", spoke about exactly this point in his Computerworld interview:

For example, in the type system we do not have separation between value and reference types and nullability of types. This may sound a little wonky or a little technical, but in C# reference types can be null, such as strings, but value types cannot be null. It sure would be nice to have had non-nullable reference types, so you could declare that ‘this string can never be null, and I want you, compiler, to check that I can never hit a null pointer here’.

50% of the bugs that people run into today, coding with C# in our platform, and the same is true of Java for that matter, are probably null reference exceptions. If we had had a stronger type system that would allow you to say that ‘this parameter may never be null, and you, compiler, please check that at every call, by doing static analysis of the code’, then we could have stamped out classes of bugs.

Cyrus Najmabadi, a former software design engineer on the C# team (now working at Google), has discussed this subject on his blog (1st, 2nd, 3rd, 4th). It seems that the biggest hindrance to the adoption of non-nullable types is that the notation would disrupt programmers' habits and existing code bases. Something like 70% of the references in C# programs would likely end up being non-nullable.

If you really want non-nullable reference types in C#, you should try Spec#, a C# extension that allows the use of "!" as a non-null annotation:

static string AcceptNotNullObject(object! s)
{
    // Spec# statically guarantees that s is non-null here,
    // so no runtime check is needed before dereferencing it.
    return s.ToString();
}
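
With that annotation, Spec#'s static checker rejects call sites that cannot prove the argument is non-null (GetPossiblyNullObject below is a hypothetical helper, used only for illustration):

    AcceptNotNullObject("hello");         // fine: a literal is never null
    AcceptNotNullObject(null);            // compile-time error in Spec#
    object o = GetPossiblyNullObject();
    AcceptNotNullObject(o);               // error unless o is first proven non-null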

Solution 2:

Nullity is a natural consequence of reference types. If you have a reference, it has to refer to some object - or be null. If you were to prohibit nullity, you would always have to make sure that every variable was initialized with some non-null expression - and even then you'd have issues if variables were read during the initialization phase.
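
To make the initialization problem concrete, here is a minimal C# sketch (the Person class is hypothetical): fields and array elements exist before any code has assigned to them, so the language must give them some value, and that value is null.

    using System;

    class Person
    {
        private readonly string name;

        public Person()
        {
            // C# allows a field to be read before the constructor assigns it;
            // at this point 'name' still holds its default value: null.
            Console.WriteLine(name == null);   // True
            name = "Ada";
        }
    }

    static class Program
    {
        static void Main()
        {
            new Person();

            // Arrays have the same problem: all ten slots exist immediately,
            // before any non-null string has been chosen for them.
            string[] labels = new string[10];
            Console.WriteLine(labels[0] == null);   // True
        }
    }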

How would you propose removing the concept of nullity?

Solution 3:

Like many things in object-oriented programming, it all goes back to ALGOL. Tony Hoare just called it his "billion-dollar mistake." If anything, that's an understatement.

Here is a really interesting thesis on how to make nullability not the default in Java. The parallels to C# are obvious.

Solution 4:

Null in C# is mostly a carry-over from C++, which had pointers that didn't point to anything in memory (or rather, to address 0x00). In this interview, Anders Hejlsberg says that he would have liked to add non-nullable reference types to C#.

Null also has a legitimate place in a type system, however, as something akin to the bottom type (where object is the top type). In Lisp, the bottom type is NIL, and in Scala it is Nothing (Scala's null itself has type Null, a subtype of every reference type).
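
You can see this "bottom-ish" behaviour directly in C#: the null literal converts to every reference type but to no plain value type. A small illustration:

    static class Demo
    {
        static void Main()
        {
            string name = null;     // ok: string is a reference type
            object thing = null;    // ok: object is the top type
            int[] numbers = null;   // ok: arrays are reference types
            // int count = null;    // compile error: int is a value type
            int? maybe = null;      // ok: Nullable<int> reintroduces null explicitly
        }
    }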

It would have been possible to design C# without any nulls, but then you'd have to come up with an acceptable solution for the things people usually use null for, such as uninitialized-value, not-found, default-value, undefined-value, and None<T>. There would probably have been less adoption among C++ and Java programmers if they had succeeded in that anyhow, at least until they saw that C# programs never had any null pointer exceptions.
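
As a rough sketch of the None<T> route (the Option<T> type below is hypothetical, not part of the BCL), "not found" would be expressed in the method signature instead of as a null return:

    using System;

    public readonly struct Option<T>
    {
        private readonly T value;
        public bool HasValue { get; }

        private Option(T value) { this.value = value; HasValue = true; }

        public static Option<T> Some(T value) => new Option<T>(value);
        public static Option<T> None => default;

        // The caller must supply a fallback, so the "missing" case
        // cannot be forgotten the way a null check can.
        public T GetValueOr(T fallback) => HasValue ? value : fallback;
    }

    static class Program
    {
        // "Not found" is visible in the signature, not hidden in a null return.
        static Option<string> FindUser(int id) =>
            id == 1 ? Option<string>.Some("Ada") : Option<string>.None;

        static void Main()
        {
            Console.WriteLine(FindUser(1).GetValueOr("<unknown>"));   // Ada
            Console.WriteLine(FindUser(2).GetValueOr("<unknown>"));   // <unknown>
        }
    }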

Solution 5:

Removing null wouldn't solve much. You would need a default reference for most variables, set at initialization. Instead of null-reference exceptions you would get unexpected behaviour, because the variable would be pointing to the wrong object. At least null references fail fast instead of silently doing the wrong thing.

You can look at the null-object pattern for a way to solve part of this problem.
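
A minimal sketch of that pattern (ILogger and NullLogger are illustrative names, not a specific library's API): instead of returning null, return a harmless do-nothing implementation. Note that it exhibits exactly the trade-off warned about above: the call silently does nothing instead of failing fast.

    using System;

    public interface ILogger
    {
        void Log(string message);
    }

    public sealed class ConsoleLogger : ILogger
    {
        public void Log(string message) => Console.WriteLine(message);
    }

    // The "null object": always safe to call, never does anything.
    public sealed class NullLogger : ILogger
    {
        public static readonly NullLogger Instance = new NullLogger();
        private NullLogger() { }
        public void Log(string message) { }
    }

    static class Program
    {
        static ILogger GetLogger(bool verbose) =>
            verbose ? (ILogger)new ConsoleLogger() : NullLogger.Instance;

        static void Main()
        {
            // No null checks needed: both branches return a usable object.
            GetLogger(false).Log("silently dropped");
            GetLogger(true).Log("printed to the console");
        }
    }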