Does 17% have to be equal to 0.17?

I have a friend who believes that 17% doesn't have to be equal to 0.17. He agrees that 17% on its own is equal to 0.17, but claims that in any other context it is not, pointing to the fact that $17\%x \neq 0.17$ in general. No matter how I try to explain it to him, he won't believe me when I say that 17% is always equal to 0.17, no matter what. Does anyone have a good explanation for this?


Solution 1:

"17 per cent" on its own is $\frac{17}{100} = 0.17$. That's what it means in English language and I'm pretty sure it's the same in most languages.

However, $17\%$ of something, say $x$, will be $\frac{17}{100}x = 0.17x$, which of course isn't $0.17$ except in the special case $x = 1$, but that's not very interesting.

If this still doesn't convince your friend, you could try an example:
Say we have an object with a certain price $x$. Then $1\%$ of the price is $1$ hundredth of the price, which is: $$\frac{x}{100} = \frac{1}{100}x = 0.01x$$

$17\%$ of the price of the object is $17$ times greater than $1\%$ of the price, therefore it is: $$17\times\frac{1}{100}x = \frac{17}{100}x = 0.17x$$
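The steps above can also be checked numerically; here is a minimal sketch, where the price of 200 is an arbitrary made-up value:

```python
def percent_of(p, x):
    """Return p percent of x, i.e. (p / 100) * x."""
    return p / 100 * x

price = 200

# 1% of the price is one hundredth of it.
print(percent_of(1, price))   # 2.0

# 17% of the price is 17 times 1% of it.
print(percent_of(17, price))  # 34.0
print(17 * percent_of(1, price))  # 34.0 as well

# "17% of x" equals 0.17 only in the special case x = 1.
print(percent_of(17, 1))  # 0.17
```

This makes the distinction concrete: the number $0.17$ is fixed, but "17% of $x$" scales with $x$.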

Solution 2:

The term percent comes from the Latin per centum, meaning "per hundred". 17 per 100 is 0.17, so 17 percent is most definitely 0.17.