Is there a platform or situation where dereferencing (but not using) a null pointer to make a null reference will behave badly?

I'm currently using a library that uses code like

T& being_a_bad_boy()
{
    return *reinterpret_cast<T*>(0);
}

to make a reference to a T without there actually being a T. This is undefined behavior, specifically called out as unsupported by the standard, but it's not an unheard-of pattern.

I am curious whether there are any examples, platforms, or usages showing that this can cause problems in practice. Can anyone provide some?


Classically, compilers treated "undefined behavior" as simply an excuse not to check for various types of errors and merely "let it happen anyway." But contemporary compilers are starting to use undefined behavior to guide optimizations.

Consider this code:

int table[5];
bool does_table_contain(int v)
{
    for (int i = 0; i <= 5; i++) {
        if (table[i] == v) return true;
    }
    return false;
}

A classical compiler wouldn't notice that the loop limit was written incorrectly and that the last iteration reads off the end of the array. It would just read off the end of the array anyway, and return true if the value one past the end of the array happened to match.

A post-classical compiler, on the other hand, might perform the following analysis:

  • The first five times through the loop, the function might return true.
  • When i = 5, the code performs undefined behavior. Therefore, the case i = 5 can be treated as unreachable.
  • The case i = 6 (loop runs to completion) is also unreachable, because in order to get there, you first have to do i = 5, which we have already shown was unreachable.
  • Therefore, all reachable code paths return true.

The compiler would then simplify this function to

bool does_table_contain(int v)
{
    return true;
}

Another way of looking at this optimization is that the compiler mentally unrolled the loop:

bool does_table_contain(int v)
{
    if (table[0] == v) return true;
    if (table[1] == v) return true;
    if (table[2] == v) return true;
    if (table[3] == v) return true;
    if (table[4] == v) return true;
    if (table[5] == v) return true;
    return false;
}

And then it realized that the evaluation of table[5] is undefined, so everything past that point is unreachable:

bool does_table_contain(int v)
{
    if (table[0] == v) return true;
    if (table[1] == v) return true;
    if (table[2] == v) return true;
    if (table[3] == v) return true;
    if (table[4] == v) return true;
    /* unreachable due to undefined behavior */
}

and then observed that all reachable code paths return true.

A compiler which uses undefined behavior to guide optimizations would see that every code path through the being_a_bad_boy function invokes undefined behavior, and therefore the being_a_bad_boy function can be reduced to

T& being_a_bad_boy()
{
    /* unreachable due to undefined behavior */
}

This analysis can then back-propagate into all callers of being_a_bad_boy:

void playing_with_fire(bool match_lit, T& t)
{
    kindle(match_lit ? being_a_bad_boy() : t);
} 

Since we know that being_a_bad_boy is unreachable due to undefined behavior, the compiler can conclude that match_lit must never be true, resulting in

void playing_with_fire(bool match_lit, T& t)
{
    kindle(t);
} 

And now everything is catching fire regardless of whether the match is lit.

You may not see this type of undefined-behavior-guided optimization in current-generation compilers much, but like hardware acceleration in Web browsers, it's only a matter of time before it starts becoming more mainstream.


The biggest problem with this code isn't that it's likely to break - it's that it defies an implicit assumption programmers make about references: that they are always valid. This is just asking for trouble when someone unfamiliar with the "convention" runs into this code.

There's a potential technical glitch too. Since a reference may only be bound to a valid object without undefined behavior, and no valid object has the address NULL, an optimizing compiler is entitled to remove any check of a reference's address against NULL. I haven't actually seen this done, but it is permitted.

T &bad = being_a_bad_boy();
if (&bad == NULL)  // this could be optimized away!

Edit: I'm going to shamelessly steal from a comment by @mcmcc and point out that this common idiom is likely to crash because it copies through an invalid reference. According to Murphy's Law, it will crash at the worst possible moment, and of course never during testing.

T bad2 = being_a_bad_boy();

I also know from personal experience that the effects of an invalid reference can propagate far from where the reference was generated, making debugging pure hell.

T &bad3 = being_a_bad_boy();
bad3.do_something();

void T::do_something()
{
    use_a_member_of_T();
}

void T::use_a_member_of_T()
{
    member = get_unrelated_value(); // crash occurs here, leaving you wondering what happened in get_unrelated_value
}

Use the NullObject pattern.

class Null_T : public T
{
public:
    // implement virtual functions to do whatever
    // you'd expect in the null situation
};

T& doing_the_right_thing()
{
    static Null_T null;
    return null;
}