From: sam DOT ravnborg AT image DOT dk (Sam Ravnborg)
Newsgroups: comp.os.msdos.djgpp
Subject: GCC: Is it possible to set int=16bit
Date: Wed, 05 Nov 1997 18:45:18 GMT
Organization: -
Message-ID: <3460bcb1.8980759@news.image.dk>
MIME-Version: 1.0
NNTP-Posting-Host: pm7-33.image.dk
Lines: 22
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp
I have downloaded DJGPP and like the look and feel of RHIDE (yes, I used Borland in the past). I am looking for an easy way to test my programs, which in the end will run on an embedded target. I would like to use RHIDE/GCC to compile my programs if possible. At first GCC fails because sizeof(int) == 4 bytes == 32 bits. The requirements I have are:

- sizeof(int) == 2 bytes
- little endian
- alignment within records: ints are aligned to an even address
- enums with fewer than 128 members are one byte in size
- one byte equals 8 bits

My question is: Is there any easy way to set up GCC to meet these requirements? I know I have to figure out how to compile GCC etc., but I want to hear any suggestions on the above first.

--
/Sam Ravnborg - sam DOT ravnborg AT image DOT dk
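[A minimal sketch, not part of the original post: a small C program that checks each of the requirements listed above on whatever compiler/target it is built with. On DJGPP's gcc the sizeof(int) check will report 4 rather than 2, since int is 32 bits there; GCC's -fshort-enums flag can shrink enums, but whether a small enum becomes exactly one byte is target-dependent.]

/* req_check.c - print the properties the poster's embedded target requires */
#include <stdio.h>
#include <stddef.h>
#include <limits.h>

enum small_enum { E_A, E_B, E_C };      /* an enum with fewer than 128 members */

struct rec { char c; int i; };          /* used to inspect int alignment in a record */

int main(void)
{
    unsigned int probe = 1;
    int little_endian = (*(unsigned char *)&probe == 1);

    printf("CHAR_BIT           : %d (want 8)\n", CHAR_BIT);
    printf("sizeof(int)        : %u (want 2)\n", (unsigned)sizeof(int));
    printf("little endian      : %s (want yes)\n", little_endian ? "yes" : "no");
    printf("offsetof(rec, i)   : %u (want an even address, e.g. 2)\n",
           (unsigned)offsetof(struct rec, i));
    printf("sizeof(small enum) : %u (want 1)\n", (unsigned)sizeof(enum small_enum));
    return 0;
}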