Why does the C standard leave use of indeterminate variables undefined?
C leaves automatic (local) variables uninitialized for efficiency reasons: initializing that data requires extra instructions. Here's an example:
int main(int argc, const char *argv[])
{
    int x;
    return x;
}
generates:
pushl %ebp
movl %esp, %ebp
subl $16, %esp
movl -4(%ebp), %eax
leave
ret
While this code:
int main(int argc, const char *argv[])
{
    int x = 1;
    return x;
}
generates:
pushl %ebp
movl %esp, %ebp
subl $16, %esp
movl $1, -4(%ebp)
movl -4(%ebp), %eax
leave
ret
As you can see, a full extra instruction (movl $1, -4(%ebp)) is needed to store 1 into x. This used to matter, and it still does on embedded systems.
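If you want to reproduce listings like these yourself, something along these lines should work (a sketch: exact flags and output vary by compiler version and target; this assumes gcc targeting 32-bit x86, with optimization off so the store isn't eliminated):

gcc -S -m32 -O0 test.c   # writes AT&T-syntax assembly to test.s

With optimization enabled the comparison typically breaks down, since x is kept in a register and the stack traffic disappears.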
Garbage values are not really stored anywhere. In fact, garbage values do not really exist, as far as the abstract language is concerned.
You see, in order to generate the most efficient code it is not sufficient for the compiler to operate in terms of lifetimes of objects (variables). It must operate at a much finer level: it must "think" in terms of lifetimes of values. This is absolutely necessary for efficient scheduling of the CPU registers, for one example.
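As a hypothetical illustration of the difference (the function below is my own sketch, not taken from any particular compiler): both objects t and u exist until the end of the function, but the value of t dies at its last read, so a register allocator tracking value lifetimes is free to give u the register that t occupied:

int f(int a, int b)
{
    int t = a + b;  /* the value of t becomes live here */
    int u = t * 2;  /* last read of t: its value is now dead, even though */
                    /* the object t exists to the end of the scope, so u  */
                    /* may be placed in the same register t occupied      */
    return u + 1;
}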
The abstract language has no such concept as "lifetime of a value". However, the language authors recognized the importance of that concept to optimizing compilers. In order to give compilers enough freedom to perform efficient optimizations, the language is intentionally specified so that it doesn't interfere with important optimizations. This is where the "garbage values" come into the picture. The language does not state that garbage values are stored anywhere, and it does not guarantee that garbage values are stable (i.e. repeated attempts to read the same uninitialized variable might easily produce different "garbage values"). This is done specifically to allow optimizing compilers to implement the vital concept of "lifetime of a value" and thus perform more efficient variable manipulation than would be dictated by the language-level concept of "object lifetime".
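Here is a minimal sketch of what that freedom permits (this program has undefined behavior, so no particular output is guaranteed; the point is only that the two reads are not required to agree). An optimizer may materialize x in a register at one read, on the stack at another, or nowhere at all:

#include <stdio.h>

int main(void)
{
    int x;              /* indeterminate: never initialized */
    printf("%d\n", x);  /* reading x is undefined behavior; the compiler */
    printf("%d\n", x);  /* is free to make the two reads print different values */
    return 0;
}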
IIRC, Thompson or Ritchie gave an interview some years ago in which they said the language definition purposely left things vague in some places so that implementers on specific platforms had leeway to do things that made sense (cycles, memory, etc.) on that platform. Sorry, I don't have a reference to link to.