How to find the distance between a point and line joining two points on a sphere?

This assumes that everything is on the surface of the sphere. Furthermore, I assume the sphere has radius $1$.

Change the coordinates so that $A$ and $B$ are both on the equator of the sphere. For definiteness, move $A$ to $(1,0,0)$ and move $B$ to $(\cos \theta, \sin \theta, 0)$, where $\theta$ is the angle between the vectors from the center of the sphere to $A$ and to $B$. This change of coordinates is a rotation (an orthogonal linear transformation), so it preserves distances on the sphere.

Then what you care about is the latitude of $X$. If $X$ is in the sector of the sphere immediately north or south of the arc $AB$, then the answer is just $|\phi|$, where $\phi$ is the latitude of $X$ in radians (on a unit sphere, arc length equals angle). If $X$ is not in that sector, then the answer is the great-circle distance to whichever of $A$ and $B$ is closer.
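
For concreteness, here is a minimal numerical sketch of this approach (Python with NumPy). The function name and the use of an orthonormal frame in place of an explicit coordinate change are my own choices; it assumes the sphere has radius $1$ and that $A$ and $B$ are neither identical nor antipodal.

```python
import numpy as np

def dist_to_segment_via_rotation(A, B, X):
    """Spherical distance from X to the geodesic segment AB on the unit sphere.

    Instead of rotating the whole sphere, build the frame (u, v, w) in which
    A and B sit on the equator and w points at the resulting north pole.
    """
    A, B, X = (np.asarray(p, dtype=float) for p in (A, B, X))
    w = np.cross(A, B)
    w /= np.linalg.norm(w)           # north pole of the rotated coordinates
    u = A                            # A gets longitude 0
    v = np.cross(w, u)               # completes a right-handed frame

    theta = np.arctan2(np.dot(B, v), np.dot(B, u))       # longitude of B (0 < theta < pi)
    lam   = np.arctan2(np.dot(X, v), np.dot(X, u))       # longitude of X
    phi   = np.arcsin(np.clip(np.dot(X, w), -1.0, 1.0))  # latitude of X

    if 0.0 <= lam <= theta:          # X is in the sector directly north/south of AB
        return abs(phi)
    # otherwise the nearest point of the segment is an endpoint
    dA = np.arccos(np.clip(np.dot(X, A), -1.0, 1.0))
    dB = np.arccos(np.clip(np.dot(X, B), -1.0, 1.0))
    return min(dA, dB)
```

For a sphere of radius $R$, multiply the returned angle by $R$.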


The question is a little ambiguous: the three previous answers used three different interpretations. If the OP wants the surface distance from the point $X$ to the geodesic line $\overleftrightarrow{AB}$, the answer is straightforward. If the desired distance is between $X$ and the segment $\overline{AB}$, a bit more work is required.

Using longitude ($\theta$) and latitude ($\phi$), let $A=(\theta_A, \phi_A)$, $B=(\theta_B, \phi_B)$, and $X=(\theta_X, \phi_X)$. The direction vectors for these points are $$\hat A = (\cos \phi_A \cos \theta_A, \cos \phi_A \sin \theta_A, \sin \phi_A),$$ $$ \hat B = (\cos \phi_B \cos \theta_B, \cos \phi_B \sin \theta_B, \sin \phi_B), $$ $$\hat X = (\cos \phi_X \cos \theta_X, \cos \phi_X \sin \theta_X, \sin \phi_X).$$
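
As a quick sanity check, this conversion is only a few lines of NumPy (the function name is mine; angles are assumed to be in radians):

```python
import numpy as np

def to_unit_vector(lon, lat):
    """Unit direction vector for a point given as (longitude, latitude) in radians."""
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])
```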

Let $\Phi$ be the distance on the unit sphere between $\hat X$ and the geodesic line passing through $\hat A$ and $\hat B$. Imagine the plane $\mathcal{P}$ passing through $\hat A$, $\hat B$, and the origin, which cuts the unit sphere in half. Then the Euclidean distance of $\hat X$ from plane $\mathcal{P}$ is $\sin \Phi$. Now let $\hat n$ be a unit normal vector for $\mathcal{P}$, and we have

$$\hat n = \frac{\hat A \times \hat B}{|\hat A \times \hat B|}, \qquad \sin \Phi = | \hat n \cdot \hat X |.$$

So, if the radius of the original sphere is $R$, then the surface distance from the point $X$ to the geodesic line $\overleftrightarrow{AB}$ is $R \Phi$.
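
A direct transcription of the two formulas above might look like this. It is a sketch, not a hardened implementation: it assumes unit input vectors (such as those produced by the `to_unit_vector` helper sketched earlier) and that $\hat A$ and $\hat B$ are neither parallel nor antiparallel.

```python
import numpy as np

def dist_to_geodesic_line(A_hat, B_hat, X_hat, R=1.0):
    """Surface distance from X to the full geodesic line through A and B."""
    n = np.cross(A_hat, B_hat)
    n /= np.linalg.norm(n)                               # unit normal of plane OAB
    Phi = np.arcsin(np.clip(abs(np.dot(n, X_hat)), 0.0, 1.0))
    return R * Phi
```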

To determine the distance to the segment $\overline{AB}$, we need to determine whether the point of line $\overleftrightarrow{AB}$ closest to $X$ lies between $A$ and $B$. If it does, the surface distance to the segment is $R \Phi$. Otherwise, the distance to the segment is the distance to the closer endpoint, which is best computed through the methods described in the Wikipedia article referenced by Ross Millikan. One way to make this determination is to find the point $\hat{X}_{\textrm{proj}}$, the projection of $\hat X$ onto plane $\mathcal{P}$,

$$\hat{X}_{\textrm{proj}} = \hat X - (\hat n \cdot \hat X) \hat n,$$

and then normalize $\hat{X}_{\textrm{proj}}$,

$$\hat x = \frac{\hat{X}_{\textrm{proj}} }{| \hat{X}_{\textrm{proj}} |}.$$

Determining whether the point of line $\overleftrightarrow{AB}$ closest to $X$ lies between $A$ and $B$ thus reduces to determining whether $\hat x$ lies between $\hat A$ and $\hat B$.
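
In code, the projection and normalization are one step each (again a sketch; it breaks down if $\hat X$ is a pole of the great circle through $\hat A$ and $\hat B$, where $\hat{X}_{\textrm{proj}}$ is the zero vector):

```python
import numpy as np

def closest_point_on_great_circle(A_hat, B_hat, X_hat):
    """Unit vector x_hat on the great circle through A and B nearest to X."""
    n = np.cross(A_hat, B_hat)
    n /= np.linalg.norm(n)
    X_proj = X_hat - np.dot(n, X_hat) * n    # drop X onto the plane OAB
    return X_proj / np.linalg.norm(X_proj)   # rescale back onto the sphere
```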

Now consider the midpoint of $\hat A$ and $\hat B$ (a point inside the sphere, not on it),

$$M=\frac{\hat A + \hat B}{2}$$

If the projection of $\hat x$ onto the ray $\overrightarrow{OM}$ is farther along than the projection of $\hat A$ (equivalently, of $\hat B$), then $\hat x$ is between $\hat A$ and $\hat B$. That is, $\hat x$ lies between $\hat A$ and $\hat B$ if and only if $\hat x \cdot M > \hat A \cdot M \; (= \hat B \cdot M)$.
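
Putting the pieces together, a sketch of the full segment-distance computation, using the hypothetical helpers from the earlier sketches, could read:

```python
import numpy as np

def dist_to_segment(A_hat, B_hat, X_hat, R=1.0):
    """Surface distance from X to the geodesic segment AB (unit input vectors)."""
    x_hat = closest_point_on_great_circle(A_hat, B_hat, X_hat)
    M = 0.5 * (A_hat + B_hat)                 # midpoint of the chord AB
    if np.dot(x_hat, M) > np.dot(A_hat, M):   # x_hat lies between A and B
        return dist_to_geodesic_line(A_hat, B_hat, X_hat, R)
    # otherwise the nearest point of the segment is an endpoint
    dA = np.arccos(np.clip(np.dot(X_hat, A_hat), -1.0, 1.0))
    dB = np.arccos(np.clip(np.dot(X_hat, B_hat), -1.0, 1.0))
    return R * min(dA, dB)

# Example: A on the equator at longitude 0, B at longitude 90 degrees,
# X at longitude 45 degrees and latitude 30 degrees.
A = to_unit_vector(0.0, 0.0)
B = to_unit_vector(np.pi / 2, 0.0)
X = to_unit_vector(np.pi / 4, np.pi / 6)
print(dist_to_segment(A, B, X))   # pi/6, about 0.5236
```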