Geospatial coordinates and distance in kilometers

Solution 1:

You've got a serious case of the magic numbers in the code. The expression:

 (60 * 1.1515 * 1.609344 * 1000)

appears twice, but there's not much explanation of it. With some help: 1.609344 is the number of kilometres in a statute mile; 60 is the number of minutes of arc in a degree; 1000 is the number of metres in a kilometre; and 1.1515 is the number of statute miles in a nautical mile (thanks, DanM). One nautical mile is the length of one minute of latitude at the equator.
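
If it helps to see them spelled out, here's a sketch with the magic numbers given names (JavaScript; the constant names are mine, not from your code):

var MINUTES_PER_DEGREE = 60;          // minutes of arc in one degree
var STATUTE_MILES_PER_NM = 1.1515;    // old-definition nautical mile, in statute miles
var KM_PER_STATUTE_MILE = 1.609344;   // exact, by definition
var METRES_PER_KM = 1000;

// metres per degree of arc along a great circle
var METRES_PER_DEGREE = MINUTES_PER_DEGREE * STATUTE_MILES_PER_NM
                      * KM_PER_STATUTE_MILE * METRES_PER_KM;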

I assume you are using a spherical earth model, rather than a spheroidal earth? The algebra isn't complex enough to be spheroidal.

The first formula - the distance between two latitude and longitude pairs - is odd. You need both delta-lat (Δφ) and delta-lon (Δλ) to sort out the answer. Further, the distance between the pairs:

(60° N, 30° W), (60° N, 60° W)
(60° N, 60° W), (60° N, 90° W)

should be the same (each pair is separated by 30° of longitude at the same latitude, so one is just the other rotated about the pole) - but I'm pretty sure your code produces different answers.

So, I think you need to go back to your spherical trigonometry reference materials and see what you're doing wrong. (It would take me a while to find my book on the subject - it would need to be unpacked from whichever box it is in.)

[...time passes...unpacking done...]

Given a spherical triangle with angles A, B, C at the vertices and sides a, b, c opposite those vertices (that is, side a is from B to C, etc.), the Cosine Formula is:

cos a = cos b . cos c + sin b . sin c . cos A

Applying this to the problem, we can call the two points given B and C, and we create a right spherical triangle with a right angle at A.

ASCII art at its worst:

                  + C
                 /|
                / |
            a  /  | b
              /   |
             /    |
            /     |
         B +------+ A
              c

The side c is equal to the difference in longitude; the side b is equal to the difference in latitude; the angle A is 90°, so cos A = 0. Therefore, I believe an equation for a is:

cos a = cos Δφ . cos Δλ + sin Δφ . sin Δλ . cos 90°

a = arccos (cos Δφ . cos Δλ)

The angle a in radians is then converted to a distance by multiplying by the radius of the Earth. Alternatively, given a in degrees (and fractions of a degree), there are 60 nautical miles to one degree, hence 60 * 1.1515 statute miles and 60 * 1.1515 * 1.609344 kilometres per degree. Unless you want the distance in metres, I don't see a need for the factor of 1000.
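
Put into code, a minimal sketch of that calculation (JavaScript; the function name and the 6371 km mean radius are my own choices, not from your code) would be:

function distanceKm(lat1, lon1, lat2, lon2) {
    var toRad = Math.PI / 180;
    var b = (lat2 - lat1) * toRad;                 // side b: difference in latitude, in radians
    var c = (lon2 - lon1) * toRad;                 // side c: difference in longitude, in radians
    var a = Math.acos(Math.cos(b) * Math.cos(c));  // cos a = cos b . cos c  (A = 90°)
    return 6371 * a;                               // angle a in radians times earth radius in km
}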

Paul Tomblin points to Aviation Formulary v1.44 as a source of the equation - and indeed, it is there, together with a more numerically stable version for when the difference in position is small.
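
From memory (so check the Formulary itself), the formula given there is the spherical law of cosines:

d = arccos (sin lat1 . sin lat2 + cos lat1 . cos lat2 . cos (lon1 - lon2))

with d in radians, to be multiplied by the radius of the Earth; the more numerically stable alternative it offers for small separations is the haversine form.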

Going to basic trigonometry, we also know that:

cos (A - B) = cos A . cos B + sin A . sin B

Applying that twice in the equation I gave might well end up at the formula in the Aviation Formulary.

(My reference: "Astronomy: Principles and Practice, Fourth Edition" by A E Roy and D Clarke (2003); my copy is the first edition from 1977, Adam Hilger, ISBN 0-85274-346-7.)


NB Check out (Google) 'define:"nautical mile"'; it appears that a nautical mile is now 1852 m (1.852 km) by definition. The multiplier 1.1515 corresponds to the old definition of the nautical mile as approximately 6080 ft. Using bc with a scale of 10, I get:

(1852/(3*0.3048))/1760
1.1507794480

Which factor works for you depends on what your basis is.
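
In concrete terms (my arithmetic): the old factor gives 60 * 1.1515 * 1.609344 ≈ 111.19 km per degree of arc, while the modern definition gives 60 * 1.852 = 111.12 km per degree - a difference of well under 0.1%.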


Looking at the second problem from first principles, we have a slightly different setup, and we need the 'other' spherical trigonometry equation, the Sine Formula:

sin A   sin B   sin C
----- = ----- = -----
sin a   sin b   sin c

Adapting the previous diagram:

                  + C
                 /|
                / |
            a  /  | b
           |  /   |
           |X/    |
           |/     |
         B +------+ A
              c

You are given starting point B, angle X = 90° - B, length (angle) a, and angle A = 90°. What you are after is b (the delta in latitude) and c (the delta in longitude).

So, we have:

sin a   sin b
----- = -----
sin A   sin B

Or

        sin a . sin B
sin b = -------------
            sin A

Or, since A = 90°, sin A = 1, and sin B = sin (90° - X) = cos X:

sin b = sin a . cos X

That means you convert the distance travelled into an angle a, take the sine of that, multiply by the cosine of the course direction, and take the arcsine of the result.

Given a, b (just calculated) and A and B, we can apply the cosine formula to get c. Note that we cannot simply re-apply the sine formula to get c since we don't have the value of C and, because we're playing with spherical trigonometry, there is no convenient rule that C = 90° - B (the sum of the angles in a spherical triangle can be greater than 180°; consider an equilateral spherical triangle with all angles equal to 90°, which is perfectly feasible).
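
In code, a sketch of that second calculation (JavaScript again; the function name and the fixed 6371 km radius are mine, not your endOfCourse() signature):

function endPoint(latDeg, lonDeg, courseDeg, distKm) {
    var toRad = Math.PI / 180, toDeg = 180 / Math.PI;
    var a = distKm / 6371;                          // distance travelled expressed as an angle, in radians
    var X = courseDeg * toRad;                      // course direction, measured from north
    var b = Math.asin(Math.sin(a) * Math.cos(X));   // sin b = sin a . cos X  => delta in latitude
    var c = Math.acos(Math.cos(a) / Math.cos(b));   // cos a = cos b . cos c  => delta in longitude
    if (Math.sin(X) < 0) c = -c;                    // a westward course decreases the longitude
    return [latDeg + b * toDeg, lonDeg + c * toDeg];
}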


Solution 2:

Check out http://www.movable-type.co.uk/scripts/latlong.html

That site has a lot of different formulas and JavaScript code that should help you out. I've successfully translated it into both C# and a SQL Server UDF, and I use them all over the place.

For example, the JavaScript to calculate the distance between two points (the haversine formula):

// Number.prototype.toRadians is not built in; the site defines it, roughly:
Number.prototype.toRadians = function() { return this * Math.PI / 180; };

var R = 6371; // mean earth radius in km
var φ1 = lat1.toRadians();          // lat1, lon1, lat2, lon2 are in degrees
var φ2 = lat2.toRadians();
var Δφ = (lat2 - lat1).toRadians();
var Δλ = (lon2 - lon1).toRadians();

// haversine formula
var a = Math.sin(Δφ/2) * Math.sin(Δφ/2) +
        Math.cos(φ1) * Math.cos(φ2) *
        Math.sin(Δλ/2) * Math.sin(Δλ/2);
var c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1-a));

var d = R * c; // great-circle distance in km
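
As a quick check (my own example, not from the site): with lat1 = 50, lon1 = 0, lat2 = 51, lon2 = 0, this gives d ≈ 111.2 km, i.e. one degree of latitude on a 6371 km sphere (6371 × π / 180 ≈ 111.19).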

Enjoy!

Solution 3:

Your conversion between km and radians is wrong. A nautical mile is 1/60th of a degree (one minute of arc), so assuming that 1.15... is your conversion from miles to nautical miles, and 1.6... is your conversion from km to statute miles,

   nm  = km / (1.1515 * 1.609344);   // kilometres -> nautical miles
   deg = nm / 60;                    // nautical miles -> degrees
   rad = toRadians(deg);             // degrees -> radians

In other words, I think you're off by a factor of 1000.
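
As a sanity check (my own numbers): 100 km / (1.1515 * 1.609344) ≈ 53.96 nm, which is 53.96 / 60 ≈ 0.90°, or about 0.0157 radians.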

Solution 4:

Regarding your updated question: Shouldn't

double lon1 = endOfCourse(lat1, 0.0, 90.0, distEast)[0];

be

double lon1 = endOfCourse(lat1, 0.0, 90.0, distEast)[1];