Date difference in JavaScript (ignoring time of day)

I'm writing an equipment rental application where clients are charged a fee for renting equipment based on the duration (in days) of the rental. So, basically, (daily fee * number of days) = total charge.

For instant feedback on the client side, I'm trying to use JavaScript to figure out the difference between two calendar dates. I've searched around, but nothing I've found is quite what I'm looking for. Most solutions I've seen are of the form:

function dateDiff1(startDate, endDate) {
    return (endDate.getTime() - startDate.getTime()) / (1000 * 60 * 60 * 24);
}

My problem is that equipment can be checked out and returned at any time of day on those two dates with no additional charge. The above code calculates the number of 24-hour periods between the two timestamps, when I'm really interested in the number of calendar days.

For example, if someone checked out equipment at 6am on July 6th and returned it at 10pm on July 7th, the above code would calculate that more than one 24-hour period had passed (40 hours, or about 1.67 days) and, rounded up, would return 2. The desired result is 1, since only one calendar date has elapsed (i.e. the 6th to the 7th).
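A quick sanity check of that behavior (the concrete dates below are hypothetical stand-ins; note that JavaScript months are 0-based):

var pickup = new Date(2012, 6, 6, 6, 0);   // July 6th, 6:00am
var dropoff = new Date(2012, 6, 7, 22, 0); // July 7th, 10:00pm

dateDiff1(pickup, dropoff); // 40 hours / 24 = ~1.67, i.e. "more than one 24-hour period"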

The closest solution I've found is this function:

function dateDiff2(startDate, endDate) {
    return endDate.getDate() - startDate.getDate();
}

which does exactly what I want, as long as the two dates are within the same month. However, since getDate() only returns the day of the month (i.e. 1-31), it doesn't work when the dates span multiple months (e.g. July 31 to August 1 is 1 day, but the above calculates 1 - 31, or -30).

On the backend, in PHP, I'm using gregoriantojd(), which seems to work just fine (see this post for an example). I just can't find an equivalent solution in JavaScript.

Anyone have any ideas?


Solution 1:

If you want whole days for your equipment rental example ...

function daysBetween(first, second) {

    // Copy date parts of the timestamps, discarding the time parts.
    var one = new Date(first.getFullYear(), first.getMonth(), first.getDate());
    var two = new Date(second.getFullYear(), second.getMonth(), second.getDate());

    // Do the math.
    var millisecondsPerDay = 1000 * 60 * 60 * 24;
    var millisBetween = two.getTime() - one.getTime();
    var days = millisBetween / millisecondsPerDay;

    // Round down.
    return Math.floor(days);
}
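For the question's scenario this gives the expected answer, since both timestamps collapse to midnight before the subtraction (sample dates are hypothetical):

var pickup = new Date(2012, 6, 6, 6, 0);   // July 6th, 6:00am
var dropoff = new Date(2012, 6, 7, 22, 0); // July 7th, 10:00pm

daysBetween(pickup, dropoff); // 1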

Solution 2:

I just had this problem and solved it after finding this question, so I came back to post this. It gets the total number of calendar days regardless of the time of day, and DST doesn't mess it up:

function daysBetweenUTC(date1, date2) {
    // Convert each date to midnight UTC, discarding the original times of day.
    var utc1 = Date.UTC(date1.getFullYear(), date1.getMonth(), date1.getDate());
    var utc2 = Date.UTC(date2.getFullYear(), date2.getMonth(), date2.getDate());
    var ms = Math.abs(utc1 - utc2);
    return Math.floor(ms / (1000 * 60 * 60 * 24)); // floor should be unnecessary, but just in case
}

The trick is converting to UTC timestamps that don't even know about the times of the original dates.
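With the wrapper above (the name daysBetweenUTC is mine, added to make the snippet runnable), this also handles the month-boundary case from the question:

daysBetweenUTC(new Date(2012, 6, 31), new Date(2012, 7, 1)); // July 31st to August 1st: 1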

Solution 3:

What I would do is set both dates' times to the same time. For example, set endDate's time to 12:00am and startDate's time to 12:00am as well, then get the difference between them.
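A minimal sketch of that idea (the function name and the use of Math.round are my choices, not the answerer's): the dates are copied first because setHours() mutates in place, and Math.round absorbs the hour-sized error a DST transition can introduce.

function calendarDaysBetween(startDate, endDate) {
    // Copy so the caller's Date objects aren't mutated.
    var start = new Date(startDate.getTime());
    var end = new Date(endDate.getTime());

    // Set both times to 12:00am.
    start.setHours(0, 0, 0, 0);
    end.setHours(0, 0, 0, 0);

    // Difference in whole days; round() soaks up any DST offset.
    return Math.round((end - start) / (1000 * 60 * 60 * 24));
}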

On a side note, since I too am in the rental equipment software industry, it seems like you're losing rental revenue by not counting the hours. Per your example, if someone picked up the equipment on July 6th at 6am and returned it on July 7th at 10pm, they had two full days to use the equipment and possibly incurred an excess meter charge too...