Talk:Find first and last set bit of a long integer


The task is unclear about which numbers should be computed. It mentions the "first" and "last" set bits, but are the bits read in most-significant to least-significant order, or the other way around? (i.e. which of "upb" and "lwb" should compute the least significant set bit, and which the most significant?) For a 32-bit integer, should the least significant bit be numbered 0, 1, 31, or 32? Also, what values should be returned if no bits are set? And for negative integers, should the most significant set bit be the sign bit?

The ALGOL example seems to indicate that the least significant bit is numbered 32 (and the most significant bit is numbered 1). This is the reverse of most people's bit numbering (where the least significant bit is numbered 0 and the most significant 31), and it also wouldn't make sense for arbitrary-precision integers (where would you start counting?). It also appears that "lwb" computes the most significant set bit, while "upb" computes the least significant one. Is this the example the other languages should follow? --208.80.119.67 23:38, 7 December 2011 (UTC)
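
For reference, here is a minimal C sketch of the conventional 0-based numbering mentioned above (bit 0 = least significant, bit 31 = most significant of a 32-bit word). The function names and the choice of -1 for "no bits set" are only illustrative, not something the task specifies:

#include <stdint.h>
#include <stdio.h>

/* Index of the most significant set bit, counting the least
 * significant bit as 0. Returns -1 if no bit is set (one possible
 * answer to the "no set bits" question raised above). */
static int msb_index(uint32_t x)
{
    int i = -1;
    while (x) { x >>= 1; ++i; }
    return i;
}

/* Index of the least significant set bit, same numbering.
 * Returns -1 if no bit is set. */
static int lsb_index(uint32_t x)
{
    if (x == 0) return -1;
    int i = 0;
    while (!(x & 1)) { x >>= 1; ++i; }
    return i;
}

int main(void)
{
    uint32_t v = 0x00012340;  /* arbitrary example value */
    printf("msb = %d, lsb = %d\n", msb_index(v), lsb_index(v));
    /* prints: msb = 16, lsb = 6 */
    return 0;
}

Under the ALGOL example's numbering the same value would instead give lwb = 16 and upb = 26 (counting from 1 at the most significant end of a 32-bit word), which is why it would help if the task stated which convention is intended.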