Mail Archives: djgpp/1996/02/11/18:17:05
Jurgen Wenzel <rocwiz AT solace DOT mh DOT se> wrote:
[...]
>According to the GRXlib documentation each pixel in 32K RGB mode is
>represented as xrrrrrgggggbbbbb (it never mentions the actual type).
It's an unsigned short (-> 16 bits).
[...]
>Then there is the issue of setting the unsigned int colour value into
>my unsigned char screen buffer. (When passing a buffer in graphic mode
>initialization, it is supposed to be a char pointer.) This is easily
>solved by getting the highbyte and the lowbyte and then set them both
>sequentially in the screen buffer. Thus the colour values end up
>in xrrrrrgggggbbbbb in the screen buffer. When I plot each character
>pair in the screen buffer using GrPlot(x, y, (*buf << 8) | *(buf + 1))
>it plots the colours correctly through the entire screen buffer. I
>therefore draw the conclusion that the colour values have been properly
>set in the screen buffer. Finally I try to blit the screen context the
>same way I did in 256 colour mode only to see that it -fails-, producing
>colours that aren't the ones I wanted.
It looks like you just swapped the high and low byte. With Intel x86 series
processors, the memory representation of a multi-byte variable looks like
base+0 .... base+n
low .... high byte
If you change the byte order, the color bits will move:
xrrrrrgg gggbbbbb
-> gggbbbbb xrrrrrgg
This will look quite different ;)
Hartmut
--
Hartmut Schirmer | Phone: +49-431-77572-709 FAX: -703
Automatisierungs- u. Regelungstechnik, | hsc AT techfak DOT uni-kiel DOT d400 DOT de
Technische Fakultät, Universität Kiel, | http://www.techfak.uni-kiel.de/~hsc
Kaiserstr. 2, D-24143 Kiel, Germany | PGP key via WWW, Key ID:6D84AEC1