Mail Archives: djgpp/1998/01/14/23:32:57
At 02:03 1/14/1998 +0200, Eli Zaretskii wrote:
>
>On Wed, 14 Jan 1998, I wrote:
>
>> On Tue, 13 Jan 1998, Nate Eldredge wrote:
>>
>> > Solution: Declare the bitfields as `unsigned'.
>>
>> I'm not sure this solution is indeed required. I think what Noam
>> reported was due to the effect of `printf', and the actual value
>> stored inside the variable was correct.
>
>Sorry, I was wrong. The ANSI C Standard explicitly says that signed bit
>fields of size N can be used to represent values in the range [0, 2^(N-1))
>so when N is 1, you cannot represent 1. You need to make it unsigned, as
>Nate suggested.
Oughtn't that to be:
[0..(2^(N-1))-1] ?
Because, for instance, a 16-bit signed value can only go up to 32767,
which is 2^15 - 1, i.e. 2^(N-1) - 1 for N = 16.
Also, 2^(N-1) for N=1 equals 2^0 = 1.
Or is the upper bound of the range not inclusive?
Or am I just confused?
Nate Eldredge
eldredge AT ap DOT net