Mail Archives: djgpp-workers/1998/11/17/10:55:20

Message-ID: <19981117160610.A17410@duna58>
Date: Tue, 17 Nov 1998 16:06:10 +0100
From: Laszlo Molnar <laszlo DOT molnar AT eth DOT ericsson DOT se>
To: djgpp-workers AT delorie DOT com
Subject: Re: gcc: feature or bug?
References: <19981117131553 DOT B17146 AT duna58> <Pine DOT LNX DOT 3 DOT 93 DOT 981117142107 DOT 7061M-100000 AT acp3bf>
Mime-Version: 1.0
X-Mailer: Mutt 0.93.2
In-Reply-To: <Pine.LNX.3.93.981117142107.7061M-100000@acp3bf>; from Hans-Bernhard Broeker on Tue, Nov 17, 1998 at 02:22:16PM +0100
Reply-To: djgpp-workers AT delorie DOT com

On Tue, Nov 17, 1998 at 02:22:16PM +0100, Hans-Bernhard Broeker wrote:
> > Let's say I have a 32-bit unsigned integer, and I want to clear its
> > most significant bit. I wrote this code:
    ^^^^

> > 	(x*2)/2
> > And of course(?) it doesn't work. Is it a bug in gcc or a bug in me?
> A bug in you :-) To get what you want, you should have written either
> 	(x / 2) * 2
> or 	(x >> 1) << 1
> or      x & (~ 0x1U)

I want to clear the most significant bit, not the least one. The
question is why the construct I used gave the wrong(?) result.

Laszlo


