SqlDateTime.MinValue != DateTime.MinValue, why?
I wonder why SqlDateTime.MinValue is not the same as DateTime.MinValue. Why do they differ?
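A minimal sketch that shows the two values side by side (nothing assumed beyond System.Data.SqlTypes):

```csharp
using System;
using System.Data.SqlTypes;

class MinValueDemo
{
    static void Main()
    {
        // .NET's DateTime starts at year 1 of the proleptic Gregorian calendar.
        Console.WriteLine(DateTime.MinValue);          // 1/1/0001 (exact format depends on culture)

        // SqlDateTime mirrors SQL Server's datetime type, which starts at 1753.
        Console.WriteLine(SqlDateTime.MinValue.Value); // 1/1/1753
    }
}
```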
Solution 1:
I think the difference between SQL Server's and .NET's date/time data types stems from the fact that SQL Server's datetime data type, its minimum and maximum values, and its precision are much older than .NET's DateTime type.
With the advent of .NET, the team decided that the DateTime data type should have a more natural minimum value, and 01/01/0001 is a fairly logical choice. From a programming-language perspective, rather than a database one, this value is certainly more natural.
Incidentally, SQL Server 2008 introduces a number of new date-based data types (Date, Time, DateTime2, DateTimeOffset) that offer increased range and precision and map closely to the DateTime type in .NET. For example, the DateTime2 data type has a date range from 0001-01-01 through 9999-12-31.
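If you can target SQL Server 2008 or later, a DateTime value outside the old datetime range can be passed through ADO.NET by declaring the parameter as SqlDbType.DateTime2. A minimal sketch, assuming a hypothetical Events table with an OccurredAt datetime2 column (the connection string and names are placeholders):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class DateTime2ParameterDemo
{
    static void Main()
    {
        // Placeholder connection string and table; adjust for your environment.
        using (var conn = new SqlConnection(@"Server=.;Database=Demo;Integrated Security=true"))
        using (var cmd = new SqlCommand("INSERT INTO Events (OccurredAt) VALUES (@occurredAt)", conn))
        {
            // datetime2 covers 0001-01-01 through 9999-12-31, so this value does not overflow.
            cmd.Parameters.Add("@occurredAt", SqlDbType.DateTime2).Value = DateTime.MinValue;

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```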
The standard "datetime" data type of SQL Server always has had a minimum value of 01/01/1753 (and indeed still does have!). I must admit, I too was curious as to the significance of this value, so did some digging.. What I found was as follows:
During the period between 1 A.D. and today, the Western world has actually used two main calendars: the Julian calendar of Julius Caesar and the Gregorian calendar of Pope Gregory XIII. The two calendars differ with respect to only one rule: the rule for deciding what a leap year is. In the Julian calendar, all years divisible by four are leap years. In the Gregorian calendar, all years divisible by four are leap years, except that years divisible by 100 (but not divisible by 400) are not leap years. Thus, the years 1700, 1800, and 1900 are leap years in the Julian calendar but not in the Gregorian calendar, while the years 1600 and 2000 are leap years in both calendars.
When Pope Gregory XIII introduced his calendar in 1582, he also directed that the days between October 4, 1582, and October 15, 1582, should be skipped—that is, he said that the day after October 4 should be October 15. Many countries delayed changing over, though. England and her colonies didn't switch from Julian to Gregorian reckoning until 1752, so for them, the skipped dates were between September 4 and September 14, 1752. Other countries switched at other times, but 1582 and 1752 are the relevant dates for the DBMSs that we're discussing.
Thus, two problems arise with date arithmetic when one goes back many years. The first is, should leap years before the switch be calculated according to the Julian or the Gregorian rules? The second problem is, when and how should the skipped days be handled?
This is how the Big Eight DBMSs handle these questions:
- Pretend there was no switch. This is what the SQL Standard seems to require, although the standard document is unclear: It just says that dates are "constrained by the natural rules for dates using the Gregorian calendar"—whatever "natural rules" are. This is the option that DB2 chose. When there is a pretence that a single calendar's rules have always applied even to times when nobody heard of the calendar, the technical term is that a "proleptic" calendar is in force. So, for example, we could say that DB2 follows a proleptic Gregorian calendar.
- Avoid the problem entirely. Microsoft and Sybase set their minimum date values at January 1, 1753, safely past the time that America switched calendars. This is defendable, but from time to time complaints surface that these two DBMSs lack a useful functionality that the other DBMSs have and that the SQL Standard requires.
- Pick 1582. This is what Oracle did. An Oracle user would find that the date-arithmetic expression October 15 1582 minus October 4 1582 yields a value of 1 day (because October 5–14 don't exist) and that the date February 29 1300 is valid (because the Julian leap-year rule applies). Why did Oracle go to extra trouble when the SQL Standard doesn't seem to require it? The answer is that users might require it. Historians and astronomers use this hybrid system instead of a proleptic Gregorian calendar. (This is also the default option that Sun picked when implementing the GregorianCalendar class for Java—despite the name, GregorianCalendar is a hybrid calendar.)
The above quotation is taken from the following link:
SQL Performance Tuning: Dates in SQL
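The leap-year difference described in the quote is easy to verify against .NET's own calendar classes. A small sketch using System.Globalization:

```csharp
using System;
using System.Globalization;

class CalendarRulesDemo
{
    static void Main()
    {
        var julian = new JulianCalendar();
        var gregorian = new GregorianCalendar();

        // 1900 is divisible by 100 but not by 400: a leap year under Julian rules only.
        Console.WriteLine(julian.IsLeapYear(1900));     // True
        Console.WriteLine(gregorian.IsLeapYear(1900));  // False

        // February 29, 1300 is a valid date in the Julian calendar; DateTime stores it
        // converted to its internal proleptic Gregorian representation.
        var julianLeapDay = new DateTime(1300, 2, 29, julian);
        Console.WriteLine(julianLeapDay);

        // The same date expressed directly in the Gregorian calendar does not exist:
        // new DateTime(1300, 2, 29) throws an ArgumentOutOfRangeException.
    }
}
```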
Solution 2:
In SQL Server, the minimum date that can be stored in a datetime field is 1753-01-01, which is not the same as the MinValue of the .NET DateTime type (0001-01-01).
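This mismatch is usually what surfaces in practice: an uninitialized DateTime defaults to DateTime.MinValue, and converting it to SqlDateTime fails at runtime. A minimal sketch of the failure mode:

```csharp
using System;
using System.Data.SqlTypes;

class SqlDateTimeOverflowDemo
{
    static void Main()
    {
        DateTime unset = default(DateTime);   // 0001-01-01, i.e. DateTime.MinValue

        try
        {
            // SqlDateTime only accepts values from 1753-01-01 through 9999-12-31.
            SqlDateTime sqlValue = new SqlDateTime(unset);
            Console.WriteLine(sqlValue);
        }
        catch (SqlTypeException ex)
        {
            // "SqlDateTime overflow" - the value is below SqlDateTime.MinValue.
            Console.WriteLine(ex.Message);
        }
    }
}
```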
Solution 3:
1753 reflects England's adoption of the Gregorian calendar (England and her colonies switched in 1752, so 1753 was the first full Gregorian year). As to why this was chosen over 01/01/0001 - it is no doubt legacy from when SQL Server was Sybase back in the 1990s. They must have made the design decision early on, and the Microsoft SQL team haven't seen a reason to change it.
Since the explosion of .NET and its integration into SQL Server, there is now the DateTime2 type for compatibility. If you're an NHibernate user, you can specify this type in your type mappings to avoid DateTime.MinValue problems.
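For illustration, a mapping-by-code sketch (assuming NHibernate 3.2+); the entity and property names are made up, and the exact Type(...) call should be checked against your NHibernate version:

```csharp
using System;
using NHibernate;
using NHibernate.Mapping.ByCode.Conformist;

// Hypothetical entity, for illustration only.
public class AuditEntry
{
    public virtual int Id { get; set; }
    public virtual DateTime OccurredAt { get; set; }
}

public class AuditEntryMap : ClassMapping<AuditEntry>
{
    public AuditEntryMap()
    {
        Id(x => x.Id);

        // Map the property with NHibernate's DateTime2 type so the column is datetime2
        // and values earlier than 1753-01-01 are not rejected by SQL Server.
        Property(x => x.OccurredAt, m => m.Type(NHibernateUtil.DateTime2));
    }
}
```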
.NET Dates cater for other calendars besides the Gregorian one:
- Calendar
- ChineseLunisolarCalendar
- EastAsianLunisolarCalendar
- GregorianCalendar
- HebrewCalendar
- HijriCalendar
- JapaneseCalendar
- JapaneseLunisolarCalendar
- JulianCalendar
- KoreanCalendar
- KoreanLunisolarCalendar
- PersianCalendar
- TaiwanCalendar
- TaiwanLunisolarCalendar
- ThaiBuddhistCalendar
- UmAlQuraCalendar
The Julian calendar, in fact, pre-dates DateTime.MinValue (it was introduced in 45 BC).
Solution 4:
Two different groups decided what "minimum" means to them with regard to date/time.