Why does printf print wrong values?
Why do I get the wrong values when I print an int using printf("%f\n", myNumber)?
I don't understand why it prints fine with %d, but not with %f. Shouldn't it just add extra zeros?
int a = 1;
int b = 10;
int c = 100;
int d = 1000;
int e = 10000;
printf("%d %d %d %d %d\n", a, b, c, d, e); //prints fine
printf("%f %f %f %f %f\n", a, b, c, d, e); //prints weird stuff
Solution 1:
Well, of course it prints the "weird" stuff. You are passing in ints, but telling printf that you passed in floating-point values (%f actually expects a double). Since these two data types have different and incompatible internal representations, you will get "gibberish".
There is no "automatic cast" when you pass variables to a variadic function like printf; the values are passed into the function as the data type they actually are (or promoted to a larger compatible type in some cases, as the promotion sketch after the union example below illustrates).
What you have done is somewhat similar to this:
union {
    int n;
    float f;
} x;

x.n = 10;
printf("%f\n", x.f); /* pass in the binary representation of 10,
                        but treat that same bit pattern as a float,
                        even though the two are incompatible */
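To make the "promoted to a larger compatible type" point concrete, here is a minimal sketch (the variable names are purely illustrative, not from the question) of the default argument promotions that do happen in a variadic call, and why none of them turn an int into the double that %f expects:

#include <stdio.h>

int main(void)
{
    char  c = 'A';    /* promoted to int when passed to a variadic function  */
    short s = 7;      /* promoted to int as well                             */
    float f = 2.5f;   /* promoted to double, which is what %f really expects */

    printf("%d %d %f\n", c, s, f);   /* prints: 65 7 2.500000 */

    /* No promotion turns an int into a double, so handing an int to a %f
       specifier (as in the question) is undefined behavior. */
    return 0;
}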
Solution 2:
If you want to print them as floats, you can cast them to float before passing them to printf.
printf("%f %f %f %f %f\n", (float)a, (float)b, (float)c, (float)d, (float)e);
Solution 3:
a, b, c, d, and e aren't floats. With %f, printf() tries to read each argument as a double anyway, which is why weird values end up on your screen (formally, this mismatch is undefined behavior).
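A minimal sketch of the two ways to resolve the mismatch (using only the question's variable a, for brevity):

#include <stdio.h>

int main(void)
{
    int a = 1;

    /* Option 1: keep the int and use the matching specifier. */
    printf("%d\n", a);          /* prints: 1 */

    /* Option 2: convert the value to match the %f specifier. */
    printf("%f\n", (double)a);  /* prints: 1.000000 */

    return 0;
}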