Calculate System.Decimal Precision and Scale

Yes, you'd need to use Decimal.GetBits. Unfortunately, you then have to work with a 96-bit integer, and there is no simple integer type in .NET which copes with 96 bits. On the other hand, it's possible that you could use Decimal itself...
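For reference, Decimal.GetBits returns four ints: the low, middle and high 32 bits of the 96-bit mantissa, plus a flags word with the scale in bits 16-23 and the sign in the top bit. For one of your examples:

int[] bits = decimal.GetBits(123.4500m);
// bits[0] == 1234500     (low 32 bits of the mantissa)
// bits[1] == 0           (middle 32 bits)
// bits[2] == 0           (high 32 bits)
// bits[3] == 0x00040000  (scale 4 in bits 16-23; sign bit clear)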

Here's some code which produces the same numbers as your examples. Hope you find it useful :)

using System;

public class Test
{
    public static void Main(string[] args)
    {
        ShowInfo(123.4500m);
        ShowInfo(0m);
        ShowInfo(0.0m);
        ShowInfo(12.45m);
        ShowInfo(12.4500m);
        ShowInfo(770m);
    }

    static void ShowInfo(decimal dec)
    {
        // We want the integer parts as uint
        // C# doesn't permit int[] to uint[] conversion,
        // but .NET does. This is somewhat evil...
        uint[] bits = (uint[])(object)decimal.GetBits(dec);

        // Reassemble the 96-bit mantissa from its three
        // 32-bit chunks; 4294967296m is 2^32
        decimal mantissa =
            (bits[2] * 4294967296m * 4294967296m) +
            (bits[1] * 4294967296m) +
            bits[0];

        // The scale lives in bits 16-23 of the flags word;
        // valid scales are 0-28, so masking with 31 suffices
        uint scale = (bits[3] >> 16) & 31;

        // Precision: the number of digits in the mantissa,
        // i.e. how many times we can divide it by 10
        // before dropping below 1
        uint precision = 0;
        if (dec != 0m)
        {
            for (decimal tmp = mantissa; tmp >= 1; tmp /= 10)
            {
                precision++;
            }
        }
        else
        {
            // Handle zero differently: its mantissa is 0, so
            // count the displayed digits instead (0.0m shows two)
            precision = scale + 1;
        }

        // Trailing zeros: factors of 10 in the mantissa,
        // capped at the scale
        uint trailingZeros = 0;
        for (decimal tmp = mantissa;
             tmp % 10m == 0 && trailingZeros < scale;
             tmp /= 10)
        {
            trailingZeros++;
        }

        Console.WriteLine("Example: {0}", dec);
        Console.WriteLine("Precision: {0}", precision);
        Console.WriteLine("Scale: {0}", scale);
        Console.WriteLine("EffectivePrecision: {0}",
                          precision - trailingZeros);
        Console.WriteLine("EffectiveScale: {0}", scale - trailingZeros);
        Console.WriteLine();
    }
}
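Running this, the first example prints:

Example: 123.4500
Precision: 7
Scale: 4
EffectivePrecision: 5
EffectiveScale: 2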

I came across this article when I needed to validate precision and scale before writing a decimal value to a database. I had actually come up with a different way to achieve this using System.Data.SqlTypes.SqlDecimal, which turned out to be faster than the other two methods discussed here.

static DecimalInfo SQLInfo(decimal dec)
{
    var x = new System.Data.SqlTypes.SqlDecimal(dec);
    return new DecimalInfo((int)x.Precision, (int)x.Scale, 0);
}