Have you ever had to use bit shifting in real projects? [closed]

Have you ever had to use bit shifting in real programming projects? Most (if not all) high-level languages have shift operators, but when would you actually need to use them?


Solution 1:

I still write code for systems that do not have floating point support in hardware. In these systems you need bit-shifting for nearly all your arithmetic.
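
For instance, a fixed-point format replaces floating point on such systems, and every scale operation becomes a shift. A minimal sketch in C, assuming a 16.16 format (the type and helper names are illustrative, not from any particular project):

    #include <stdint.h>

    /* Minimal 16.16 fixed-point sketch: upper 16 bits hold the integer
     * part, lower 16 bits the fraction. Names and format are illustrative. */
    typedef int32_t fix16;
    #define FIX_SHIFT 16

    static fix16 fix_from_int(int n)  { return (fix16)n << FIX_SHIFT; }
    static int   fix_to_int(fix16 a)  { return (int)(a >> FIX_SHIFT); }

    /* Multiply: widen to 64 bits, then shift the extra fraction bits away. */
    static fix16 fix_mul(fix16 a, fix16 b)
    {
        return (fix16)(((int64_t)a * b) >> FIX_SHIFT);
    }

    /* Divide: pre-shift the dividend so the quotient keeps its fraction. */
    static fix16 fix_div(fix16 a, fix16 b)
    {
        return (fix16)(((int64_t)a << FIX_SHIFT) / b);
    }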

You also need shifts to generate hashes. Polynomial arithmetic (CRC and Reed-Solomon codes are the mainstream applications) uses shifts as well.
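
As an illustration of the polynomial-arithmetic case, here is a bit-by-bit CRC-32 sketch in C using the reflected polynomial 0xEDB88320; production code usually uses a lookup table, but the per-bit shifts show where the operator earns its keep:

    #include <stdint.h>
    #include <stddef.h>

    /* Bit-by-bit CRC-32 (reflected polynomial 0xEDB88320). Table-driven
     * versions are faster, but the shifts are the heart of the algorithm. */
    uint32_t crc32(const uint8_t *data, size_t len)
    {
        uint32_t crc = 0xFFFFFFFFu;
        for (size_t i = 0; i < len; i++) {
            crc ^= data[i];
            for (int bit = 0; bit < 8; bit++) {
                if (crc & 1u)
                    crc = (crc >> 1) ^ 0xEDB88320u;
                else
                    crc >>= 1;
            }
        }
        return crc ^ 0xFFFFFFFFu;
    }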

However, shifts are mostly used because they are handy and express exactly what the writer intended. You can emulate every bit shift with multiplication and division if you want to, but that would be harder to write, less readable, and sometimes slower.

Compilers detect cases where a multiplication can be reduced to a shift.
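
A trivial illustration, assuming a typical optimizing compiler:

    /* These typically compile to the same shift instruction: the multiply
     * form states the arithmetic intent, the shift form the bit-level one. */
    unsigned times_eight_mul(unsigned x)   { return x * 8;  }
    unsigned times_eight_shift(unsigned x) { return x << 3; }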

Solution 2:

Yes, I've used them a lot of times. Bit twiddling is important on embedded hardware where bit-masks are very common. It's also important in games programming, when you need every last bit of performance.
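
A typical embedded pattern looks like this sketch; the register address, bit position, and field layout are made up for illustration:

    #include <stdint.h>

    /* Hypothetical memory-mapped control register; address and bit
     * positions are invented for this example. */
    #define CTRL_REG   (*(volatile uint32_t *)0x40021000u)
    #define ENABLE_BIT 3u
    #define MODE_SHIFT 8u
    #define MODE_MASK  (0x7u << MODE_SHIFT)

    static void device_enable(void)
    {
        CTRL_REG |= (1u << ENABLE_BIT);                 /* set one bit */
    }

    static void device_set_mode(uint32_t mode)
    {
        CTRL_REG = (CTRL_REG & ~MODE_MASK)              /* clear the field */
                 | ((mode << MODE_SHIFT) & MODE_MASK);  /* insert new value */
    }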

Edit: Also, I use them a lot for manipulating bitmaps, for example changing the colour depth, or converting RGB <-> BGR.
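
For the pixel case, a small sketch that swaps the red and blue channels of a packed 32-bit pixel, assuming an 0xAARRGGBB layout:

    #include <stdint.h>

    /* Swap R and B in a packed 0xAARRGGBB pixel (layout assumed). */
    static uint32_t argb_to_abgr(uint32_t px)
    {
        uint32_t a = (px >> 24) & 0xFFu;
        uint32_t r = (px >> 16) & 0xFFu;
        uint32_t g = (px >>  8) & 0xFFu;
        uint32_t b =  px        & 0xFFu;
        return (a << 24) | (b << 16) | (g << 8) | r;
    }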

Solution 3:

  • Creating flag values for enums rather than manually typing 1, 2, 4, ... (see the sketch after this list)
  • Unpacking data from bit-fields (many network protocols use them)
  • Z-curve traversal
  • Performance hacks
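
A sketch of the first two bullets, with illustrative names (the first byte of an IPv4 header really does pack the version and header length into one octet):

    #include <stdint.h>

    /* Flag values built with shifts instead of hand-typed 1, 2, 4, 8... */
    enum open_flags {
        FLAG_READ   = 1 << 0,
        FLAG_WRITE  = 1 << 1,
        FLAG_APPEND = 1 << 2,
        FLAG_CREATE = 1 << 3,
    };

    /* Unpacking a bit-field: the first byte of an IPv4 header holds the
     * version in the top 4 bits and the header length in the bottom 4. */
    static void parse_ipv4_first_byte(uint8_t b, int *version, int *ihl)
    {
        *version = (b >> 4) & 0x0F;
        *ihl     =  b       & 0x0F;
    }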

Beyond those, I cannot think of many cases where they come up on their own. It's usually the other way around: there is some specific problem, and it turns out that employing bit operations yields the best results (usually in terms of performance, whether time or space).

Solution 4:

One place I use them all the time is when swapping the endianness of integers for cross-platform applications. They also sometimes come in handy (along with other bit-manipulation operators) when blitting 2D graphics.
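
A minimal byte-swap sketch for a 32-bit value (many compilers also offer builtins for this, but the shift-and-mask form is the portable fallback):

    #include <stdint.h>

    /* Reverse the byte order of a 32-bit value with shifts and masks. */
    static uint32_t swap32(uint32_t v)
    {
        return ((v & 0x000000FFu) << 24) |
               ((v & 0x0000FF00u) <<  8) |
               ((v & 0x00FF0000u) >>  8) |
               ((v & 0xFF000000u) >> 24);
    }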

Solution 5:

I've used them a few times, but pretty much always for parsing a binary file format.
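
For example, assembling a little-endian 32-bit field from a byte buffer with shifts avoids any dependence on host endianness or alignment; the function name here is illustrative:

    #include <stdint.h>

    /* Assemble a little-endian 32-bit value from 4 bytes in a buffer. */
    static uint32_t read_le32(const uint8_t *p)
    {
        return (uint32_t)p[0]
             | ((uint32_t)p[1] << 8)
             | ((uint32_t)p[2] << 16)
             | ((uint32_t)p[3] << 24);
    }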