Is there actually a reason why overloaded && and || don't short circuit?

The short circuiting behaviour of the operators && and || is an amazing tool for programmers.

But why do they lose this behaviour when overloaded? I understand that operators are merely syntactic sugar for functions, but the operators for bool have this behaviour, so why should it be restricted to this single type? Is there any technical reasoning behind this?


All design processes result in compromises between mutually incompatible goals. Unfortunately, the design process for the overloaded && operator in C++ produced a confusing end result: that the very feature you want from && -- its short-circuiting behavior -- is omitted.

The details of how that design process ended up in this unfortunate place, those I don't know. It is however relevant to see how a later design process took this unpleasant outcome into account. In C#, the overloaded && operator is short circuiting. How did the designers of C# achieve that?

One of the other answers suggests "lambda lifting". That is:

A && B

could be realized as something morally equivalent to:

operator_&& ( A, ()=> B )

where the second argument uses some mechanism for lazy evaluation so that when evaluated, the side effects and value of the expression are produced. The implementation of the overloaded operator would only do the lazy evaluation when necessary.
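
For illustration, here is a minimal hand-written sketch of that idea, in C++ for concreteness (the names Tracer and logical_and are made up; the point is only that the callee, not the caller, decides whether the right-hand operand is ever evaluated):

#include <iostream>

struct Tracer {
    bool value;
    explicit operator bool() const { return value; }
};

// Hypothetical "lambda-lifted" &&: the right-hand side arrives as a
// callable, so it is only evaluated if the left-hand side is true-ish.
template <typename Rhs>
Tracer logical_and(Tracer lhs, Rhs rhs_thunk) {
    if (!lhs) return lhs;     // short circuit: rhs_thunk is never called
    return rhs_thunk();       // evaluate the right-hand side lazily
}

int main() {
    Tracer a{false};
    // The caller must wrap the right-hand operand in a lambda by hand;
    // the language feature under discussion would do this wrapping silently.
    Tracer r = logical_and(a, [] {
        std::cout << "rhs evaluated\n";   // never printed here
        return Tracer{true};
    });
    std::cout << std::boolalpha << bool(r) << '\n';   // prints: false
}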

This is not what the C# design team did. (Aside: though lambda lifting is what I did when it came time to do expression tree representation of the ?? operator, which requires certain conversion operations to be performed lazily. Describing that in detail would however be a major digression. Suffice to say: lambda lifting works but is sufficiently heavyweight that we wished to avoid it.)

Rather, the C# solution breaks the problem down into two separate problems:

  • should we evaluate the right-hand operand?
  • if the answer to the above was "yes", then how do we combine the two operands?

Therefore the problem is solved by making it illegal to overload && directly. Rather, in C# you must overload two operators, each of which answers one of those two questions.

class C
{
    // Is this thing "false-ish"? If yes, we can skip computing the right
    // hand side of an &&
    public static bool operator false (C c) { whatever }

    // If we didn't skip the RHS, how do we combine them?
    public static C operator & (C left, C right) { whatever }
    ...
}

(Aside: actually, three. C# requires that if operator false is provided then operator true must also be provided, which answers the question: is this thing "true-ish"? Typically there would be no reason to provide only one such operator, so C# requires both.)

Consider a statement of the form:

C cresult = cleft && cright;

The compiler generates code for this as though you had written this pseudo-C#:

C cresult;
C tempLeft = cleft;
cresult = C.false(tempLeft) ? tempLeft : C.&(tempLeft, cright);

As you can see, the left hand side is always evaluated. If it is determined to be "false-ish" then it is the result. Otherwise, the right hand side is evaluated, and the eager user-defined operator & is invoked.

The || operator is defined in the analogous way, as an invocation of operator true and the eager | operator:

cresult = C.true(tempLeft) ? tempLeft : C.|(tempLeft , cright);

By defining all four operators -- true, false, & and | -- C# allows you to not only say cleft && cright but also non-short-circuiting cleft & cright, and also if (cleft) if (cright) ..., and c ? consequence : alternative and while(c), and so on.

Now, I said that all design processes are the result of compromise. Here the C# language designers managed to get short-circuiting && and || right, but doing so requires overloading four operators instead of two, which some people find confusing. The operator true/false feature is one of the least well understood features in C#. The goal of having a sensible and straightforward language that is familiar to C++ users was opposed by the desire to have short circuiting and the desire not to implement lambda lifting or other forms of lazy evaluation. I think that was a reasonable compromise position, but it is important to realize that it is a compromise position. Just a different compromise position than the one the designers of C++ landed on.

If the subject of language design for such operators interests you, consider reading my series on why C# does not define these operators on nullable Booleans:

http://ericlippert.com/2012/03/26/null-is-not-false-part-one/


The point is that (within the bounds of C++98) the right-hand operand would be passed to the overloaded operator function as an argument. By the time the function runs, that operand has already been evaluated. There is nothing the operator||() or operator&&() code could do to avoid this.

The original operator is different, because it's not a function, but implemented at a lower level of the language.
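
A small example makes the difference visible (the types Noisy and noisy_rhs are made up for demonstration): with an overloaded operator&&, the right-hand operand is evaluated as an ordinary function argument before the operator body ever runs, whereas the built-in operator on bool never touches it.

#include <iostream>

struct Noisy {
    bool value;
    explicit Noisy(bool v) : value(v) {}
};

// User-defined &&: an ordinary function, so both arguments are evaluated
// (with all their side effects) before the body runs.
bool operator&&(Noisy const& lhs, Noisy const& rhs) {
    return lhs.value && rhs.value;
}

Noisy noisy_rhs() {
    std::cout << "rhs evaluated\n";
    return Noisy(true);
}

int main() {
    std::cout << "overloaded:\n";
    bool r1 = Noisy(false) && noisy_rhs();   // "rhs evaluated" is printed

    std::cout << "built-in bool:\n";
    bool r2 = false && (std::cout << "rhs evaluated\n", true);  // nothing printed

    std::cout << std::boolalpha << r1 << ' ' << r2 << '\n';     // false false
}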

Additional language features could have made non-evaluation of the right-hand operand syntactically possible. However, they didn't bother, because there are only a select few cases where this would be semantically useful. (Just like ? :, which is not available for overloading at all.)

(It took them 16 years to get lambdas into the standard...)

As for the semantic use, consider:

objectA && objectB

This boils down to:

template< typename T >
ClassA::operator&&( T const & objectB )

Think about what exactly you'd like to do with objectB (of unknown type) here, other than calling a conversion operator to bool, and how you'd put that into words for the language definition.

And if you are calling conversion to bool, well...

objectA && objectB

does the same thing, doesn't it? So why overload in the first place?
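
To make that concrete, about the only "sensible" overload is something like the following (ClassA, ClassB and the conversion operators are made up), and it buys you nothing over just relying on conversion to bool, while losing the built-in short circuit:

#include <iostream>

struct ClassB {
    explicit operator bool() const { return true; }
};

struct ClassA {
    explicit operator bool() const { return false; }

    // About the only reasonable body: convert both operands to bool.
    // The value is the same as the non-overloaded expression would give,
    // but objectB has already been evaluated by the time we get here.
    template <typename T>
    bool operator&&(T const& objectB) const {
        return bool(*this) && bool(objectB);
    }
};

int main() {
    ClassA a;
    ClassB b;
    std::cout << std::boolalpha << (a && b) << '\n';   // prints: false
}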


A feature has to be thought of, designed, implemented, documented and shipped.

Now that we've thought of it, let's see why it might be easy now (and hard to do back then). Also keep in mind that there's only a limited amount of resources, so adding it might have chopped something else (what would you like to forgo for it?).


In theory, all operators could allow short-circuiting behavior with only one "minor" additional language-feature, as of C++11 (when lambdas were introduced, 32 years after "C with classes" started in 1979, a still respectable 13 after C++98):

C++ would just need a way to annotate an argument as lazy-evaluated - a hidden lambda - to avoid the evaluation until necessary and allowed (pre-conditions met).


What would that theoretical feature look like? (Remember that any new feature should be widely usable.)

An annotation __lazy which, applied to a function argument, makes the function a template expecting a functor, and makes the compiler pack the argument expression into such a functor:

A operator&&(B b, __lazy C c) {return c;}

// And be called like
exp_b && exp_c;
// or
operator&&(exp_b, exp_c);

Under the covers, it would look like:

template<class Func> A operator&&(B b, Func&& f) {auto&& c = f(); return c;}
// With `f` restricted to no-argument functors returning a `C` (which the
// sketch assumes is convertible to `A`).

// And the call:
operator&&(exp_b, [&]{return exp_c;});

Take special note that the lambda stays hidden, and will be called at most once.
There should be no performance-degradation due to this, aside from reduced chances of common-subexpression-elimination.
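
To see what such a lowering could buy in practice, here is a hand-written stand-in with made-up names (lazy_and, exp_c_with_side_effect) and minimal A, B, C types; the pre-condition check decides whether the hidden functor is ever invoked:

#include <iostream>

struct B { bool ok; };
struct C { bool ok; };
struct A { bool ok; };

// A hand-written stand-in for the lowered `operator&&(B, __lazy C)`:
// the right-hand expression arrives as a functor and is called at most
// once, and only if the left-hand side allows it.
template <class Func>
A lazy_and(B b, Func&& f) {
    if (!b.ok) return A{false};   // short circuit: f() is never evaluated
    C c = f();                    // evaluate exp_c exactly once
    return A{c.ok};
}

C exp_c_with_side_effect() {
    std::cout << "exp_c evaluated\n";
    return C{true};
}

int main() {
    // What `exp_b && exp_c` would be lowered to under the proposed feature:
    A r1 = lazy_and(B{false}, [&] { return exp_c_with_side_effect(); }); // prints nothing
    A r2 = lazy_and(B{true},  [&] { return exp_c_with_side_effect(); }); // prints once
    std::cout << std::boolalpha << r1.ok << ' ' << r2.ok << '\n';        // false true
}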


Besides implementation complexity and conceptual complexity (every feature increases both, unless it sufficiently eases those complexities for some other features), let's look at another important consideration: backwards compatibility.

While this language feature would not break any existing code by itself, any API that started taking advantage of it would subtly change its behaviour, which means adopting it in existing libraries would be a silent breaking change for their users.

BTW: This feature, while easier to use, is strictly stronger than the C# solution of splitting && and || into two functions each for separate definition.