Converting Color to ConsoleColor?
Here are the console color hex values, as converted by .NET 4.5. First the program:
using System;
using System.Drawing;

class Program
{
    static void Main(string[] args)
    {
        foreach (var n in Enum.GetNames(typeof(ConsoleColor)))
            Console.WriteLine("{0,-12} #{1:X6}", n, Color.FromName(n).ToArgb() & 0xFFFFFF);
    }
}
And here's the output. As you can see, there's a problem with the reporting for DarkYellow: its full 32 bits come out as zero, while all the others have 0xFF in the alpha channel (see the short check after the listing).
Black        #000000
DarkBlue     #00008B
DarkGreen    #006400
DarkCyan     #008B8B
DarkRed      #8B0000
DarkMagenta  #8B008B
DarkYellow   #000000  <-- see the check below
Gray         #808080
DarkGray     #A9A9A9
Blue         #0000FF
Green        #008000
Cyan         #00FFFF
Red          #FF0000
Magenta      #FF00FF
Yellow       #FFFF00
White        #FFFFFF
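For what it's worth, the zero comes from the fact that DarkYellow is the one ConsoleColor name with no matching KnownColor, so Color.FromName hands back a named but value-less color. A minimal check of my own (not part of the original program) makes that visible:
using System;
using System.Drawing;

class DarkYellowCheck
{
    static void Main()
    {
        var c = Color.FromName("DarkYellow");
        Console.WriteLine(c.IsKnownColor);  // False - no KnownColor entry exists
        Console.WriteLine(c.ToArgb());      // 0 - hence the #000000 above
        Console.WriteLine(Color.FromName("DarkCyan").IsKnownColor); // True
    }
}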
Edit: I got a little more carried away just now, so here's a converter from RGB to the nearest ConsoleColor value. Note that the dependency on System.Windows.Media is only for the demonstration harness; the actual function itself only references System.Drawing.
using System;
using System.Windows.Media;

class NearestConsoleColor
{
    static ConsoleColor ClosestConsoleColor(byte r, byte g, byte b)
    {
        ConsoleColor ret = 0;
        double rr = r, gg = g, bb = b, delta = double.MaxValue;

        foreach (ConsoleColor cc in Enum.GetValues(typeof(ConsoleColor)))
        {
            var n = Enum.GetName(typeof(ConsoleColor), cc);
            var c = System.Drawing.Color.FromName(n == "DarkYellow" ? "Orange" : n); // bug fix: DarkYellow has no KnownColor entry
            var t = Math.Pow(c.R - rr, 2.0) + Math.Pow(c.G - gg, 2.0) + Math.Pow(c.B - bb, 2.0); // squared distance in RGB space
            if (t == 0.0)
                return cc;
            if (t < delta)
            {
                delta = t; // closest match so far
                ret = cc;
            }
        }
        return ret;
    }

    static void Main()
    {
        foreach (var pi in typeof(Colors).GetProperties())
        {
            var c = (Color)ColorConverter.ConvertFromString(pi.Name);
            var cc = ClosestConsoleColor(c.R, c.G, c.B);

            Console.ForegroundColor = cc;
            Console.WriteLine("{0,-20} {1} {2}", pi.Name, c, Enum.GetName(typeof(ConsoleColor), cc));
        }
    }
}
The output (partial)...
public static System.ConsoleColor FromColor(System.Drawing.Color c) {
    int index = (c.R > 128 | c.G > 128 | c.B > 128) ? 8 : 0; // Bright bit
    index |= (c.R > 64) ? 4 : 0; // Red bit
    index |= (c.G > 64) ? 2 : 0; // Green bit
    index |= (c.B > 64) ? 1 : 0; // Blue bit
    return (System.ConsoleColor)index;
}
The ConsoleColor enumeration seems to use the EGA-style palette ordering, which is:
index  Brgb  colour
  0    0000  dark black
  1    0001  dark blue
  2    0010  dark green
  3    0011  dark cyan
  4    0100  dark red
  5    0101  dark purple
  6    0110  dark yellow (brown)
  7    0111  dark white (light grey)
  8    1000  bright black (dark grey)
  9    1001  bright blue
 10    1010  bright green
 11    1011  bright cyan
 12    1100  bright red
 13    1101  bright purple
 14    1110  bright yellow
 15    1111  bright white
You can roughly map a 24-bit colour (or a 32-bit colour, by ignoring the alpha channel) to what is essentially 3-bit colour plus a brightness bit. In this case, the 'bright' bit is set if any of the System.Drawing.Color's red, green or blue bytes is greater than 128, and the red, green and blue bits are set if the corresponding source bytes are greater than 64.
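As a quick illustration (my own harness, not part of the answer, and the colour names are arbitrary picks), the threshold mapping above can drive Console.ForegroundColor directly:
using System;
using System.Drawing;

class ThresholdDemo
{
    // Same mapping as above: one bright bit plus red/green/blue bits.
    static ConsoleColor FromColor(Color c)
    {
        int index = (c.R > 128 | c.G > 128 | c.B > 128) ? 8 : 0;
        index |= (c.R > 64) ? 4 : 0;
        index |= (c.G > 64) ? 2 : 0;
        index |= (c.B > 64) ? 1 : 0;
        return (ConsoleColor)index;
    }

    static void Main()
    {
        foreach (var name in new[] { "Orange", "SteelBlue", "ForestGreen", "Crimson", "Gold" })
        {
            var c = Color.FromName(name);
            Console.ForegroundColor = FromColor(c);
            Console.WriteLine("{0,-12} -> {1}", name, Console.ForegroundColor);
        }
        Console.ResetColor();
    }
}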
Unfortunately, even though the Windows console can support RGB colors, the Console class only exposes the ConsoleColor enumeration, which greatly limits the possible colors you can use. If you want a Color structure to be mapped to the "closest" ConsoleColor, that will be tricky.
But if you want the named Color to match a corresponding ConsoleColor you can make a map such as:
var map = new Dictionary<Color, ConsoleColor>();
map[Color.Red] = ConsoleColor.Red;
map[Color.Blue] = ConsoleColor.Blue;
etc...
Or, if performance is not that important, you can round-trip through the color's name (this only works for colors whose names match a ConsoleColor name):
var consoleColor = (ConsoleColor)Enum.Parse(typeof(ConsoleColor), color.Name);
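A slightly more defensive version of that round trip (a sketch of mine, not from the original answer) uses Enum.TryParse so that a color without a ConsoleColor counterpart is reported rather than throwing:
using System;
using System.Drawing;

class NameRoundTrip
{
    // Returns false for colors such as Orange whose Name is not a ConsoleColor name.
    static bool TryToConsoleColor(Color c, out ConsoleColor cc)
    {
        return Enum.TryParse(c.Name, true, out cc);
    }

    static void Main()
    {
        ConsoleColor cc;
        Console.WriteLine(TryToConsoleColor(Color.Cyan, out cc) ? cc.ToString() : "no match");   // Cyan
        Console.WriteLine(TryToConsoleColor(Color.Orange, out cc) ? cc.ToString() : "no match"); // no match
    }
}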
EDIT: Here's a link to a question about finding color "closeness".
On Vista and later, see the SetConsoleScreenBufferInfoEx API function, which lets you redefine the 16 entries of the console palette. For an example of usage, refer to my answer to another very similar StackOverflow question. (Thanks to Hans Passant for the original answer.)
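For orientation only, here is a rough, untested P/Invoke sketch of that approach (my own outline, not the code from the linked answer); the struct layout follows the Win32 documentation, and the srWindow adjustment works around a documented quirk where the console window otherwise shrinks by one row and column after the call:
using System;
using System.Runtime.InteropServices;

static class ConsolePalette
{
    [StructLayout(LayoutKind.Sequential)]
    struct COORD { public short X, Y; }

    [StructLayout(LayoutKind.Sequential)]
    struct SMALL_RECT { public short Left, Top, Right, Bottom; }

    [StructLayout(LayoutKind.Sequential)]
    struct CONSOLE_SCREEN_BUFFER_INFO_EX
    {
        public uint cbSize;
        public COORD dwSize;
        public COORD dwCursorPosition;
        public ushort wAttributes;
        public SMALL_RECT srWindow;
        public COORD dwMaximumWindowSize;
        public ushort wPopupAttributes;
        public bool bFullscreenSupported;
        [MarshalAs(UnmanagedType.ByValArray, SizeConst = 16)]
        public uint[] ColorTable; // 16 COLORREF entries, byte order 0x00BBGGRR
    }

    const int STD_OUTPUT_HANDLE = -11;

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern IntPtr GetStdHandle(int nStdHandle);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool GetConsoleScreenBufferInfoEx(IntPtr hConsoleOutput, ref CONSOLE_SCREEN_BUFFER_INFO_EX info);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetConsoleScreenBufferInfoEx(IntPtr hConsoleOutput, ref CONSOLE_SCREEN_BUFFER_INFO_EX info);

    // Replaces the palette slot behind a ConsoleColor with an arbitrary RGB value.
    public static void Remap(ConsoleColor slot, byte r, byte g, byte b)
    {
        var handle = GetStdHandle(STD_OUTPUT_HANDLE);
        var info = new CONSOLE_SCREEN_BUFFER_INFO_EX();
        info.cbSize = (uint)Marshal.SizeOf(typeof(CONSOLE_SCREEN_BUFFER_INFO_EX));
        if (!GetConsoleScreenBufferInfoEx(handle, ref info))
            throw new System.ComponentModel.Win32Exception();

        info.ColorTable[(int)slot] = (uint)(r | (g << 8) | (b << 16)); // pack as COLORREF

        // Set... treats the window rectangle as exclusive, so grow it to keep the window size.
        info.srWindow.Right++;
        info.srWindow.Bottom++;
        if (!SetConsoleScreenBufferInfoEx(handle, ref info))
            throw new System.ComponentModel.Win32Exception();
    }
}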