Is there a real difference between "null" and "zero"?

Are zero and null perfect synonyms?


'null' is qualitative, representing the absence of quantity. It is closer to the word 'void' than to the number 'zero'. Example: he reduced it to nil.

'zero' is quantitative. Example: he got zero on his exam.


No, they are not the same.

In an everyday language context, 'null' can mean that something is invalid or without force, as in:

The agreement became null when Sam failed to fulfill his side.

In a programming/data context (though I still think this is a language question, rather than a programming question), 'null' can mean the absence of information. If you are wondering how many apples there are, 'null' means 'I don't know'. 'Zero' means that you know that there aren't any apples.
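For what it's worth, here is a minimal sketch of that apples example in Java (the class and method names are my own, for illustration): the boxed Integer type can be null, while a primitive int cannot.

    public class AppleCount {
        public static void main(String[] args) {
            Integer unknown = null; // null: we don't know how many apples there are
            Integer none = 0;       // zero: we know there are no apples
            System.out.println(describe(unknown)); // prints "unknown"
            System.out.println(describe(none));    // prints "no apples"
        }

        static String describe(Integer count) {
            if (count == null) return "unknown";  // absence of information
            if (count == 0) return "no apples";   // a known quantity of zero
            return count + " apples";             // a known positive quantity
        }
    }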

Zero always refers to a quantity.


Is there a difference between a cheque (check) for $0.00 and a NULL (VOID) check?

Yes.

A $0.00 cheque will put exactly $0.00 into your bank account.

A check with VOID written on it will not be processed.

The difference may be subtle, but there is a difference.

0 represents an integer in the set of all integers (called the set Z in mathematics). NULL is not an integer, and it can represent the absence of things that aren't even numbers.

A NULL and VOID check, for example.

A NULL marriage.

A NULL agreement.


Null and zero are used in many contexts where they have different meanings. In math you can have a set with no items in it (a null set) or you can have a set containing a zero ({ 0 }), and the two are not the same.
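A quick sketch of that set distinction in Java, using the standard java.util.Set (the class name is mine):

    import java.util.Set;

    public class NullSetVsZeroSet {
        public static void main(String[] args) {
            Set<Integer> nullSet = Set.of();  // the empty (null) set: no members
            Set<Integer> zeroSet = Set.of(0); // a set whose one member is the number 0
            System.out.println(nullSet.size());          // 0
            System.out.println(zeroSet.size());          // 1
            System.out.println(nullSet.equals(zeroSet)); // false: not the same set
        }
    }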

In programming, some languages make null the same as zero (in C++ the NULL macro has traditionally expanded to 0), but some don't (Java treats null as distinct from any number).
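A small Java sketch of that difference (the class name is mine): a null reference and the number 0 are not interchangeable, and forcing a comparison between them fails at runtime.

    public class NullVsZero {
        public static void main(String[] args) {
            Integer boxed = null; // a reference to no value at all
            int zero = 0;         // the number zero
            System.out.println(boxed == null); // true
            // Comparing the null reference with 0 forces unboxing and
            // throws a NullPointerException at runtime:
            // System.out.println(boxed == zero);
        }
    }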

In databases, a record might have a null value in one field or it might have a zero, and these are not the same.
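As a sketch of how that surfaces in code: with JDBC, ResultSet.getInt returns 0 for a SQL NULL, so a separate wasNull() call is needed to tell a stored zero from a missing value (the column name here is hypothetical).

    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class NullableField {
        // Returns null for a SQL NULL, 0 for a stored zero.
        // "apple_count" is a hypothetical column name.
        static Integer readAppleCount(ResultSet rs) throws SQLException {
            int value = rs.getInt("apple_count"); // getInt yields 0 for NULL too...
            return rs.wasNull() ? null : value;   // ...so wasNull() disambiguates
        }
    }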

Zero can be used to indicate a counted quantity, whereas null cannot.


It's not just null and zero, either. What about "nothing", "naught", "none", etc.?

Zero is usually a noun. Zero is a number. Zero can refer to the symbol "0".

Null is usually an adjective (null set, null argument, null pointer; as a noun, a nullity). Null is not a number, and it usually has a different symbol in each context.

Programmers are forced to make these distinctions all the time (although 'null' is often modelled as the number 0). While the distinction is important in programming, it has more to do with logic and mathematics.

They do have quite different roles in language, although I guess in a few cases they are the same.

History of Zero