Why should I make the underlying type of an Enum Int32 instead of byte?

Solution 1:

Have a look at MSDN for the reason.

Here is an excerpt:

An enumeration is a value type that defines a set of related named constants. By default, the System.Int32 data type is used to store the constant value. Even though you can change this underlying type, it is not necessary or recommended for most scenarios. Note that no significant performance gain is achieved by using a data type that is smaller than Int32. If you cannot use the default data type, you should use one of the Common Language System (CLS)-compliant integral types, Byte, Int16, Int32, or Int64 to make sure that all values of the enumeration can be represented in CLS-compliant programming languages.
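
To make the excerpt concrete, the underlying type is specified with a colon after the enum name. Here is a minimal sketch (the enum names are invented for illustration) of what the CLS-compliant choices look like in practice:

using System;

[assembly: CLSCompliant(true)]

// Default underlying type; identical to "enum Color : int".
public enum Color { Red, Green, Blue }

// Widening to another CLS-compliant integral type is allowed.
public enum BigRange : long { Min = long.MinValue, Max = long.MaxValue }

// This compiles, but in a CLS-compliant assembly the compiler warns
// that the base type uint is not CLS-compliant.
public enum Mask : uint { None = 0, All = 0xFFFFFFFF }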

Solution 2:

There are specific situations where narrowing the underlying type brings some advantages, for example performance-related ones, or forcing a particular memory layout when interfacing with unmanaged code (a sketch of the interop case follows the sample below).

Consider this sample:

using System;

public enum Operations_PerHourType //   : byte
{
    Holes = 1,
    Pieces = 2,
    Sheets = 3,
    Strips = 4,
    Studs = 5
}

class Program
{
    static void Main()
    {
        // Measure heap growth caused by allocating the array.
        long before = GC.GetTotalMemory(false);
        var enums = new Operations_PerHourType[10000];
        long after = GC.GetTotalMemory(false);

        Console.WriteLine(after - before);
        // output  (byte): 12218 (I'm using Mono 2.8)
        // output (Int32): 40960

        // Keep the array reachable so it cannot be collected before
        // the second measurement.
        GC.KeepAlive(enums);
    }
}

This code consumes roughly 40 KB of the heap. Now specify (uncomment) the byte underlying type and recompile. Wow. Suddenly we only need roughly 12 KB.
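
The other advantage mentioned above, matching a memory layout expected by unmanaged code, might look like this sketch (the native struct and all names here are hypothetical):

using System;
using System.Runtime.InteropServices;

// Must match a hypothetical C definition:
//   #pragma pack(1)
//   struct device_state { uint8_t status; uint32_t error_code; };
public enum DeviceStatus : byte
{
    Idle = 0,
    Busy = 1,
    Error = 2
}

[StructLayout(LayoutKind.Sequential, Pack = 1)]
public struct DeviceState
{
    public DeviceStatus Status;   // 1 byte, because the underlying type is byte
    public uint ErrorCode;        // 4 bytes, no padding thanks to Pack = 1
}

With the default Int32 underlying type, Status would occupy four bytes and the managed struct would no longer line up with the native one.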

Compacting memory like this may sometimes make a program slower, not faster, depending on particular access patterns and data sizes. There is no way to know for sure other than to make some measurements and attempt to generalize to other possible circumstances. Sequential traversal of smaller data is usually faster.
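
If you want to measure it, here is a rough sketch (the enum and method names are invented; the numbers will vary by machine and runtime) that times sequential traversal of a byte-backed versus an int-backed enum array:

using System;
using System.Diagnostics;

enum NarrowOp : byte { Holes = 1 }
enum WideOp { Holes = 1 }   // default Int32 underlying type

class TraversalBenchmark
{
    static void Main()
    {
        var narrow = new NarrowOp[50000000];
        var wide = new WideOp[50000000];

        // Warm-up pass so JIT compilation is not part of the timing.
        SumNarrow(narrow);
        SumWide(wide);

        var sw = Stopwatch.StartNew();
        long a = SumNarrow(narrow);
        Console.WriteLine("byte-backed: " + sw.ElapsedMilliseconds + " ms");

        sw.Restart();
        long b = SumWide(wide);
        Console.WriteLine("int-backed:  " + sw.ElapsedMilliseconds + " ms");

        // Use the results so the loops are not optimized away.
        Console.WriteLine(a + b);
    }

    static long SumNarrow(NarrowOp[] data)
    {
        long sum = 0;
        foreach (var v in data) sum += (byte)v;
        return sum;
    }

    static long SumWide(WideOp[] data)
    {
        long sum = 0;
        foreach (var v in data) sum += (int)v;
        return sum;
    }
}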

However, making a habit of specifying narrow types just because it is usually possible and occasionally crucial is not a good idea. Memory savings rarely materialize, because of the alignment of surrounding wider data types; performance is then either the same or slightly worse, because of the extra instructions needed to mask away padding bytes.
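
For example (a sketch; the struct names are invented, and exact sizes are platform-dependent), a byte-backed enum sitting next to an int field typically buys nothing, because the struct is padded back up to the int's alignment:

using System;
using System.Runtime.InteropServices;

enum NarrowStatus : byte { Ok = 0 }
enum WideStatus { Ok = 0 }   // default Int32 underlying type

struct NarrowRecord
{
    public NarrowStatus Status;   // 1 byte...
    public int Count;             // ...then 3 bytes of padding, then 4 bytes
}

struct WideRecord
{
    public WideStatus Status;     // 4 bytes
    public int Count;             // 4 bytes
}

class PaddingDemo
{
    static void Main()
    {
        // Both typically print 8: alignment eats the byte-sized saving.
        Console.WriteLine(Marshal.SizeOf(typeof(NarrowRecord)));
        Console.WriteLine(Marshal.SizeOf(typeof(WideRecord)));
    }
}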

As another answer has already put it well, follow the Int32 crowd that the runtime is optimized for, until you have to start profiling and addressing real memory hogs in your application.

Solution 3:

According to the documentation, there is no performance gain from using a byte instead of Int32. Unless there is a reason to do so, the documentation recommends not changing it. The underlying idea is that .NET is optimized for using Int32 in many scenarios, which is why it was chosen for enums. You don't gain anything in your scenario by changing it, so why bother?

http://msdn.microsoft.com/en-us/library/ms182147.aspx

This link also discusses how .NET is optimized to use 32-bit integers: .NET Optimized Int32