What is the difference between DirectX 11 and 12?
I'm asking because older graphics hardware that only supported DirectX 11 now supports DirectX 12 as well. This was not the case between DirectX 10 and DirectX 11 (i.e. DirectX 10 hardware could not run DirectX 11 games).
What is the difference between DirectX 11 and 12? How is it possible that older hardware can now use DirectX 12?
As a gamer it doesn't mean too much. The most obvious difference is that DirectX 12 requires Windows 10, while DirectX 11 works on Windows 7 or later. DirectX 12 also requires a video card driver that supports it, which means you need a relatively recent AMD, NVIDIA, or Intel video card with updated drivers.
In terms of its effect on games, DirectX 12 doesn't really change what can be displayed; it just allows for more efficient rendering. Its main improvement is that it allows more than one CPU core to submit commands to the graphics card at the same time. With DirectX 11 or earlier, games were effectively limited to accessing the video card from only one core of a multicore CPU at a time.
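To give a rough idea of what that looks like for a developer, here is a minimal C++ sketch of the Direct3D 12 pattern: each worker thread records into its own command list, and the finished lists are submitted together. The function names are mine, and it assumes the device, command queue, and per-thread command lists were already created during initialization; real code would also deal with command allocators, fences, pipeline state, and resource barriers.

```cpp
#include <d3d12.h>
#include <thread>
#include <vector>

// Each worker thread records draw commands into its own command list.
void RecordScenePart(ID3D12GraphicsCommandList* cmdList)
{
    // cmdList->SetPipelineState(...), cmdList->DrawInstanced(...), etc.
    cmdList->Close();  // finish recording; safe to do on this worker thread
}

// The main thread kicks off parallel recording, then submits everything at once.
void SubmitFrame(ID3D12CommandQueue* queue,
                 const std::vector<ID3D12GraphicsCommandList*>& lists)
{
    std::vector<std::thread> workers;
    for (ID3D12GraphicsCommandList* list : lists)
        workers.emplace_back(RecordScenePart, list);   // recording happens in parallel
    for (std::thread& t : workers)
        t.join();

    // Submission itself still happens from one thread, but the expensive
    // recording work above was spread across multiple CPU cores --
    // which is exactly what DirectX 11 could not do.
    std::vector<ID3D12CommandList*> raw(lists.begin(), lists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```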
However, the advantages of DirectX 12 aren't easy for developers to exploit in practice. At this point, I don't expect that many games will be able to make effective use of it. For the most part, only AAA games will have both the resources and the need to usefully exploit DirectX 12.
Since DirectX 12 doesn't really add new rendering functionality and just changes how games access the video card, it's possible to support it on older hardware simply by updating the drivers.
(To be a bit more technical, Direct3D 12 requires that the driver be updated to use WDDM 2.0 and that the hardware support at least feature level 11_0. The newer feature levels 12_0 and 12_1 mostly affect how games can access graphics resources. Because these levels add relatively few hardware requirements, some older "DirectX 11" hardware is even able to support the newer 12_0 level.)
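To make that concrete, here's a minimal C++ sketch of how a program asks Direct3D 12 for a device with feature level 11_0 as the minimum; on a Windows 10 machine with a WDDM 2.0 driver, this can succeed even on many older "DirectX 11" cards. It's illustrative only and skips adapter enumeration and proper error handling.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

int main()
{
    Microsoft::WRL::ComPtr<ID3D12Device> device;

    // Ask for a device with feature level 11_0 as the *minimum* -- the same
    // requirement mentioned above. nullptr means "use the default adapter".
    HRESULT hr = D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                   IID_PPV_ARGS(&device));

    if (SUCCEEDED(hr))
        std::puts("Direct3D 12 is available (feature level 11_0 or higher).");
    else
        std::puts("No Direct3D 12 capable device/driver on this system.");
    return 0;
}
```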
The difference between DirectX 11 and DirectX 12 is a very broad topic that you can read about in various sources (e.g. here and here), so I won't cover it here.
I'm assuming your real question is:
How is it possible that older hardware can now use DirectX 12?
It's quite simple, really: it's because of something called Feature Levels, introduced in DirectX 11.
DirectX 10 had a fixed set of mandatory hardware requirements: any hardware that wanted to be DirectX 10 compatible needed to implement all of its features, so only new hardware could support it.
With the introduction of Feature Levels, the hardware no longer needed to implement the full feature set.
For example, a graphics card compatible with DirectX 12 at Feature Level 11_0 is essentially a DirectX 11 card that can still take advantage of the DirectX 12 features that don't require specific graphics hardware to run (and a lot of DirectX 12 features fall into this category).
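As a hypothetical illustration (the function name is mine, and it assumes a Direct3D 12 device has already been created), a game can ask the device which feature level the hardware actually supports and only enable the 12_0/12_1 extras when they're reported:

```cpp
#include <d3d12.h>

// Illustrative only: given an already-created Direct3D 12 device, report the
// highest feature level the hardware supports. A DirectX 11-era card will
// typically report 11_0 or 11_1 here, while newer GPUs report 12_0 or 12_1.
D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    static const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };

    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = static_cast<UINT>(sizeof(requested) / sizeof(requested[0]));
    levels.pFeatureLevelsRequested = requested;

    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                           &levels, sizeof(levels))))
        return D3D_FEATURE_LEVEL_11_0;  // conservative fallback

    return levels.MaxSupportedFeatureLevel;
}
```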
You can read more on feature levels and find a complete list of them here.