Why does the statement "2i;" NOT cause a compiler error?

Instead of 2*i, I carelessly wrote 2i:

int foo(int i)
{
    2i;
    return 2i;
}

I expected the compiler to catch the error, but it did not. So is 2i a valid statement in C? If so, what does it do? Puzzled!

I compiled using gcc version 5.3.0 and here is the assembly output:

    .file   "strange.c"
    .text
    .globl  foo
    .type   foo, @function
foo:
.LFB0:
    .cfi_startproc
    pushq   %rbp
    .cfi_def_cfa_offset 16
    .cfi_offset 6, -16
    movq    %rsp, %rbp
    .cfi_def_cfa_register 6
    movl    %edi, -4(%rbp)
    nop
    popq    %rbp
    .cfi_def_cfa 7, 8
    ret
    .cfi_endproc
.LFE0:
    .size   foo, .-foo
    .ident  "GCC: (GNU) 5.3.0"
    .section    .note.GNU-stack,"",@progbits

Solution 1:

This is a gcc extension, and 2i is an imaginary constant (2 times the imaginary unit i). So you can write a complex number like so:

#include <complex.h>

double _Complex x = 4 + 5i;
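
For example, here is a minimal sketch of my own (compiled with gcc, since the i suffix is a GNU extension) that reads the two parts back with the standard creal() and cimag() accessors from <complex.h>:

#include <complex.h>
#include <stdio.h>

int main(void) {
    double _Complex x = 4 + 5i;   /* 5i is a GNU imaginary constant */

    /* creal()/cimag() are the standard C99 accessors for the real and imaginary parts */
    printf("x = %g + %gi\n", creal(x), cimag(x));   /* prints: x = 4 + 5i */
    return 0;
}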

Solution 2:

2i is a gcc extension for a complex integer literal, a pure imaginary number twice the square root of -1. This extension is supported by clang as well.
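
To make that concrete, here is a small sketch of my own using gcc's __real__ and __imag__ operators (these, like complex integer types themselves, are GNU extensions):

#include <stdio.h>

int main(void) {
    __complex__ int z = 2i;   /* complex int: real part 0, imaginary part 2 */

    printf("real = %d, imag = %d\n", __real__ z, __imag__ z);   /* real = 0, imag = 2 */
    printf("2i * 2i = %d\n", (int)(2i * 2i));                   /* -4, because i*i == -1 */
    return 0;
}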

It is somewhat surprising that compiling with gcc 5.3.0 produces the posted assembly output:

  • Compiling on http://gcc.godbolt.org/ with gcc 5.3.0, I get a compilation error: error: cannot convert '__complex__ int' to 'int' in return.
  • The posted assembly code for function foo is incorrect: it does not return 0. Converting the complex integer constant 2i to int should return its real part 0.

Conversely, with clang 3.7, it compiles without a warning and generates optimum code, but of course not what you expect:

foo(int):                       # @foo(int)
    xorl    %eax, %eax
    retq
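
A quick way to check that return value is a small test driver of my own; as noted above, clang accepts the conversion, so that is the safer compiler to try this with:

#include <stdio.h>

int foo(int i)
{
    2i;           /* expression statement: evaluated and discarded */
    return 2i;    /* complex int converted to int: imaginary part dropped */
}

int main(void) {
    printf("foo(42) = %d\n", foo(42));   /* prints: foo(42) = 0 */
    return 0;
}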

This syntax can be combined with other suffixes in any order. Compiling the code below with clang -Weverything gives the appropriate warning for each of these constants: warning: imaginary constants are a GNU extension [-Wgnu-imaginary-constant]:

#include <stdio.h>

int main() {
    /* complex integer literals */
    printf("sizeof(2i) = %zd\n", sizeof(2i));
    printf("sizeof(2ui) = %zd\n", sizeof(2ui));
    printf("sizeof(2li) = %zd\n", sizeof(2li));
    printf("sizeof(2lli) = %zd\n", sizeof(2lli));
    /* complex floating point literals */
    printf("sizeof(2.i) = %zd\n", sizeof(2.i));
    printf("sizeof(2.fi) = %zd\n", sizeof(2.fi));
    printf("sizeof(2e0fi) = %zd\n", sizeof(2e0fi));
    printf("sizeof(2e0i) = %zd\n", sizeof(2e0i));
    /* alternate order */
    printf("sizeof(2il) = %zd\n", sizeof(2il));
    printf("sizeof(2ill) = %zd\n", sizeof(2ill));
    printf("sizeof(2.if) = %zd\n", sizeof(2.if));

    return 0;
}

It produces this output in my environment:

sizeof(2i) = 8
sizeof(2ui) = 8
sizeof(2li) = 16
sizeof(2lli) = 16
sizeof(2.i) = 16
sizeof(2.fi) = 8
sizeof(2e0fi) = 8
sizeof(2e0i) = 16
sizeof(2il) = 16
sizeof(2ill) = 16
sizeof(2.if) = 8
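
As one would expect, each of these is twice the size of the corresponding real type in my 64-bit environment: the 4-byte int, unsigned and float give 8-byte complex types, while the 8-byte long, long long and double give 16-byte ones.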

Try the last one with your syntax coloring editor ;-)