first_bit_set(i)
Return the index of the least significant bit set in the integer i. Return -1 if no bits are set.
The input is treated as if represented in two's complement, even if it is not represented that way internally.
first_bit_set(0b00000)
// -1
first_bit_set(0b00001)
// 0
first_bit_set(0b01100)
// 2
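The semantics can be sketched in a few lines. The following is an illustrative Rust sketch, not this library's implementation (the host language here is an assumption): it reinterprets the input's two's-complement bit pattern and counts the trailing zero bits, which is exactly the index of the lowest set bit, matching the examples above.

// Illustrative sketch only: index of the least significant set bit, or -1 if none.
// Casting to u32 reinterprets the two's-complement bit pattern, so negative
// inputs behave as the description above specifies.
fn first_bit_set(i: i32) -> i32 {
    let bits = i as u32;              // two's-complement bit pattern of the input
    if bits == 0 {
        return -1;                    // no bits set
    }
    bits.trailing_zeros() as i32      // number of low zero bits == index of lowest set bit
}

fn main() {
    assert_eq!(first_bit_set(0b00000), -1);
    assert_eq!(first_bit_set(0b00001), 0);
    assert_eq!(first_bit_set(0b01100), 2);
    assert_eq!(first_bit_set(-2), 1); // ...11110 in two's complement
}

The -2 case shows why the two's-complement rule matters: its bit pattern ends in ...11110, so the lowest set bit is at index 1 even though the value is negative.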