In grade school we learn to rationalize denominators of fractions when possible. We are taught that $\frac{\sqrt{2}}{2}$ is simpler than $\frac{1}{\sqrt{2}}$. An answer on this site says that "there is a bias against roots in the denominator of a fraction". But such fractions are well-defined, and I fail to see anything wrong with $\frac{1}{\sqrt{2}}$; in fact, IMO it is simpler than $\frac{\sqrt{2}}{2}$ because 1 is simpler than 2 (or similarly, because the former can trivially be rewritten without a fraction).

So why does this bias against roots in the denominator exist and what is its justification? The only reason I can think of is that the bias is a relic of a time before the reals were understood well enough for mathematicians to be comfortable dividing by irrationals, but I have been unable to find a source to corroborate or contradict this guess.


This was very important before computers, in problems where you had to do something further with an answer after computing it.

One simple example is the following: when you calculate the angle between two vectors, you often get a fraction containing roots. In order to recognize the angle whenever possible, it is good to have a standard form for these fractions [side note: I have often seen students unable to find the angle $\theta$ such that $\cos(\theta)=\frac{1}{\sqrt{2}}$]. The simplest way to define a standard form is to make either the denominator or the numerator an integer.
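To illustrate the recognition point (using nothing beyond the standard table of cosine values): once the fraction is written in the conventional form, the angle is immediate,
$$\cos(\theta)=\frac{1}{\sqrt{2}}=\frac{\sqrt{2}}{2} \quad\Longrightarrow\quad \theta=\frac{\pi}{4},$$
because $\frac{\sqrt{2}}{2}$ is exactly how the value appears in the usual table of special angles.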

If you wonder why the denominator is the choice, it is the natural one: as I said, you often need to do further computations with these fractions. What is easier to add: $$\frac{1}{\sqrt{3}}+\frac{1}{\sqrt{6}+\sqrt{3}} \, \mbox{ or }\, \frac{\sqrt{3}}{3}+\frac{\sqrt{6}-\sqrt{3}}{3} \,?$$
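Just to make the comparison explicit (this is only the algebra behind the second form, spelled out): rationalizing the second summand gives
$$\frac{1}{\sqrt{6}+\sqrt{3}}=\frac{\sqrt{6}-\sqrt{3}}{(\sqrt{6}+\sqrt{3})(\sqrt{6}-\sqrt{3})}=\frac{\sqrt{6}-\sqrt{3}}{6-3}=\frac{\sqrt{6}-\sqrt{3}}{3},$$
after which the sum collapses at a glance:
$$\frac{\sqrt{3}}{3}+\frac{\sqrt{6}-\sqrt{3}}{3}=\frac{\sqrt{6}}{3}.$$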

Note that bringing fractions to a common denominator is usually easier if the denominators are integers. And keep in mind that in many problems you start with quantities which need to be replaced by fractions in standard form [for example, in trigonometry, problems are stated in terms of $\cos(\theta)$ where $\theta$ is some angle].

But at the end of the day, it is just a convention. And while you think that $\frac{1}{\sqrt{2}}$ looks simpler, and you may well be right, the point of a convention is that it is applied consistently in the cases where you need to recognize a quantity. Which form looks simpler is often relative...


The historical reason for rationalizing the denominator is that before calculators were invented, square roots had to be approximated by hand.

To approximate $\sqrt{n}$, where $n \in \mathbb{N}$, the ancient Babylonians used the following method:

  1. Make an initial guess, $x_0$.

  2. Let $$x_{k + 1} = \frac{x_k + \dfrac{n}{x_k}}{2}$$ and repeat until the approximation is as accurate as you need.

If you use this method, which is equivalent to applying Newton's Method to the function $f(x) = x^2 - n$, to approximate the square root of $2$ by hand with $x_0 = 3/2$, you will see that while the sequence converges quickly, the calculations become onerous after a few steps. However, once an approximation of $\sqrt{2}$ was known, it was easy to calculate $\frac{1}{\sqrt{2}}$ quickly: rationalizing the denominator gives $\frac{\sqrt{2}}{2}$, and dividing the approximation by $2$ is a single short division, whereas dividing $1$ by the decimal approximation $1.41421\ldots$ directly would require a laborious long division.
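For concreteness, here is a sketch of the first two hand iterations starting from $x_0 = 3/2$ (just the arithmetic described above, worked out):
$$x_1 = \frac{1}{2}\left(\frac{3}{2} + \frac{2}{3/2}\right) = \frac{1}{2}\left(\frac{3}{2} + \frac{4}{3}\right) = \frac{17}{12} \approx 1.41667,$$
$$x_2 = \frac{1}{2}\left(\frac{17}{12} + \frac{24}{17}\right) = \frac{1}{2}\cdot\frac{577}{204} = \frac{577}{408} \approx 1.41422.$$
Already $x_2$ matches $\sqrt{2} = 1.41421\ldots$ in its first five decimal digits, and from there $\frac{\sqrt{2}}{2} \approx \frac{1.41421}{2} \approx 0.70711$ takes only one easy division by an integer.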