Date: Wed, 24 Nov 1999 09:39:09 -0500
Message-Id: <199911241439.JAA25063@envy.delorie.com>
From: DJ Delorie
To: djgpp-workers AT delorie DOT com
CC: rainer AT mathematik DOT uni-bielefeld DOT de
In-reply-to: (message from Eli Zaretskii on Wed, 24 Nov 1999 15:52:43 +0200 (IST))
Subject: Re: AW: ANNOUNCE: rsxntdj 1.6 BETA
References:
Reply-To: djgpp-workers AT delorie DOT com
X-Mailing-List: djgpp-workers AT delorie DOT com
X-Unsubscribes-To: listserv AT delorie DOT com
Precedence: bulk

> If other versions of GCC do the right thing, it is probably a
> configuration problem.

It wasn't a "problem" back when I set it up that way.  DJGPP didn't
support any platform that supported wide characters, so when I got to
that part of libc I just picked a number, and I picked 4 bytes because
I figured by the time we got around to doing anything about it, we'd
be using the 32-bit unicode set.

When you set up gcc for a given target, you have to configure it to
match the target.  If we're supporting windows, we should reconfigure
gcc to use 16-bit wide characters.  However, you'll have to rebuild
*all* of libc to support them properly (after fixing the wchar type in
the system headers).
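
For concreteness, something like the following (just an illustrative
sketch, not anything from our sources) shows which wide-character
width a given compiler/header combination was configured for.  A
toolchain set up the way DJGPP currently is would report 4; a
16-bit configuration aimed at Windows would report 2.

/* Illustrative sketch only: report the wchar_t width the toolchain
   was built with.  Nothing here is taken from the DJGPP libc. */
#include <stdio.h>
#include <wchar.h>   /* wchar_t, WCHAR_MAX */

int main(void)
{
    /* The size of this object is exactly what changes when the
       target's wide-character type is reconfigured. */
    wchar_t wc = L'A';

    printf("sizeof(wchar_t) = %u\n", (unsigned)sizeof wc);
    printf("WCHAR_MAX       = %lu\n", (unsigned long)WCHAR_MAX);
    return 0;
}

(If the GCC in use has a -fshort-wchar switch, it can force the
2-byte type per compilation, but that by itself doesn't help: libc's
headers and wide-character routines would still assume 4 bytes, which
is why the whole library has to be rebuilt after the header change.)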