Do you prefer "if (var)" or "if (var != 0)"?

I've been programming in C-derived languages for a couple of decades now. Somewhere along the line, I decided that I no longer wanted to write:

if (var)    // in C
if ($var)   # in Perl

when what I meant was:

if (var != 0)
if (defined $var and $var ne '')

I think part of it is that I have a strongly-typed brain and in my mind, "if" requires a boolean expression.

Or maybe it's because I use Perl so much, and truth and falsehood in Perl are such a minefield.
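For the record, the rule itself is small but easy to trip over: a Perl scalar is false exactly when it is undef, the number 0, the empty string "", or the string "0"; everything else, including "0.0" and "00", is true. A quick sketch (my illustration, not part of the original question):

```perl
use strict;
use warnings;

# Perl's false scalars: undef, 0, "", and "0".  Everything else is true,
# including surprising cases like "0.0", "00", and " ".
my @cases = ( undef, 0, "", "0", "0.0", "00", " " );

for my $v (@cases) {
    my $label = defined $v ? qq{"$v"} : 'undef';
    printf "%-6s => %s\n", $label, $v ? "true" : "false";
}
```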

Or maybe it's just because these days, I'm mainly a Java programmer.

What are your preferences and why?


Solution 1:

I like my ifs to make sense when read aloud:

if (is_it_happening) ...
if (number_of_sheep != 0) ...
if (pointer_to_something != NULL) ...

Solution 2:

I prefer

if (var != 0)

It's easier to read and understand. Since you write code once but read it many times, easy reading matters more than easy writing.

Solution 3:

It's quite simple. if( var ) tests for truthiness. if( var != 0 ) tests that it's not the number 0. THEY ARE NOT INTERCHANGEABLE! There are three reasons.

First, using if( var != 0 ) to test for truth is more complicated. There's simply more to read and understand. You have to grok that != 0 is the idiom for "is true". Lacking a distinctive visual pattern, you have to study a little harder to see that it's not the same as if( var == 0 ). It's a thin distinction, but worth mentioning; the fact that the if( 0 != var ) style exists lends it some credence. Better to eliminate the problem entirely and use if( var ) for truthiness.

Second, and more importantly, the intent must be clear. Are you testing for truth, or are you testing for a number (or the lack of one)? if( var ) tests for truth; if( var != 0 ) tests a number. Determining anything else requires knowledge of the author's style, which we must assume the maintenance programmer does not have.
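Here's a concrete case where the two tests actually disagree (my illustration, with a made-up value): in Perl, the string "0.0" is true, but it numifies to 0, so if( $var ) and if( $var != 0 ) give opposite answers.

```perl
use strict;
use warnings;

my $var = "0.0";    # hypothetical value: true as a string, zero as a number

my $truth_test   = $var ? 1 : 0;          # 1: "0.0" is a true string
my $numeric_test = $var != 0 ? 1 : 0;     # 0: "0.0" numifies to 0

print "truth test: $truth_test, numeric test: $numeric_test\n";
```

So which one you write isn't just style; it depends on whether the value is conceptually a flag or a count.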

Third, there is an assumption here about the values of true and false and how numeric operators treat them, which might work out in some languages and not in others. In Perl (and, I believe, JavaScript too) the empty string is false, and a lot of operators return the empty string for false. So testing truth with if( var != 0 ) triggers a warning. It becomes starker when you do something more naive like if( var == 1 ) to mean truth, a clearly dangerous assumption. I have seen many junior programmers write that, and have in turn written functions that return odd, but true, numbers to punish this sort of thing. Or, when I'm in a giving mood, I've sanitized my return value with return var ? 1 : 0.
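To make the warning concrete (the lookup function is hypothetical, but the warning text is what Perl actually emits under use warnings): comparing an empty-string false value numerically complains that the argument isn't numeric, while the plain truth test stays silent.

```perl
use strict;
use warnings;

# Hypothetical helper that follows the common Perl convention of
# returning the empty string for false.
sub lookup { return "" }

my $result = lookup();

# Capture the warning that the numeric comparison triggers.
my $warning = '';
{
    local $SIG{__WARN__} = sub { $warning = shift };
    if ( $result != 0 ) { }   # warns: Argument "" isn't numeric in numeric ne (!=)
}
print $warning;

if ( $result ) { }            # the truth test is silent and correctly false
```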

On a related note, Perl will return the last evaluated expression from a subroutine, so it's not necessary to write return explicitly. Combine this with the folklore that an explicit return is slower, and you get a lot of people abusing this fact.

sub set {
    my( $self, $key, $value ) = @_;

    $self->{$key} = $value;
}

set will return $value. Was that intended? I don't know. What I do know is that someone will start relying on it, and the maintenance programmer won't know whether they can change it. So I like to put an explicit return in every non-trivial subroutine.

sub set {
    my( $self, $key, $value ) = @_;

    $self->{$key} = $value;
    return;
}

In that case, I've decided that set will return nothing for the time being, and that decision is quite clear to both the reader and the user.
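One subtlety of that bare return worth spelling out (my addition, with a hypothetical Widget class wrapped around the set above): it yields undef in scalar context and the empty list in list context, which is usually exactly what "returns nothing" should mean.

```perl
use strict;
use warnings;

package Widget;    # hypothetical class, just to hold set()
sub new { return bless {}, shift }
sub set {
    my( $self, $key, $value ) = @_;
    $self->{$key} = $value;
    return;        # bare return: undef in scalar context, () in list context
}

package main;
my $w = Widget->new;
my $scalar = $w->set( color => 'red' );   # undef
my @list   = $w->set( color => 'red' );   # empty list

print defined $scalar ? "defined\n" : "undef\n";
print scalar(@list), "\n";
```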