From: Mike Darrett
Newsgroups: comp.os.msdos.djgpp
Subject: software interrupts
Date: Fri, 29 Aug 1997 01:11:25 -0700
Organization: University of California, Davis
Lines: 76
Message-ID:
NNTP-Posting-Host: dilbert.ucdavis.edu
Mime-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp
Precedence: bulk

Hi Guys,

I'm sure there is a very simple solution to this problem, but I've been
working on it for the last three days and it has me stumped. I had a PCX
decoding routine I was porting from Turbo C++ to DJGPP, and I found that
the source of the bug was in the palette-updating routine. In a nutshell,
it goes like the code below, but the colors are the WRONG ONES. I know it
has something to do with the way I'm issuing a software interrupt from
32-bit code, but I've tried everything, from __dpmi_int() to int86(), for
the int 10h function (this function takes a pointer in ES:DX pointing to
palette data and updates the video palette). The FAQ DJ wrote said to just
use int86() and forget about ES, ...

Any help would be appreciated.

- Mike

#include <stdio.h>
#include <dos.h>
#include <dpmi.h>
#include <conio.h>

typedef unsigned char byte;

typedef struct {
    byte r, g, b;
} point;

point palette[256];

void SetPalette(void)
{
    union REGS r;
    int i;

    r.x.dx = r.x.di = (unsigned long)palette;
    r.x.ax = 0x1012;
    r.x.bx = 0;
    r.x.cx = 256;
    int86(0x10, &r, &r);
}

void SetMode(byte m)
{
    __dpmi_regs r;

    r.h.al = m;
    r.h.ah = 0;
    r.h.bl = 0;
    __dpmi_int(0x10, &r);
}

void main()
{
    int i;
    __dpmi_regs r;

    SetMode(0x13);
    for (i = 0; i < 256; i++) {
        palette[i].r = 0;
        palette[i].g = 0;
        palette[i].b = 60;
    }
    SetPalette();
    getch();
    SetMode(3);
}
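[For context: the usual protected-mode pattern for an ES:DX service like
int 10h AX=1012h is to copy the data into conventional memory and pass a
real-mode segment:offset pair, where segment = linear >> 4 and
offset = linear & 0x0F -- a flat 32-bit pointer stuffed into DX, as in the
code above, points nowhere useful in real mode. Below is a minimal,
portable sketch of just that address arithmetic; the 0x12345 linear
address is only illustrative, and the DJGPP-specific pieces (__tb from
<go32.h>, dosmemput(), __dpmi_int()) are named in comments rather than
called, so this is a sketch of the idea, not a drop-in fix.]

#include <assert.h>
#include <stdio.h>

int main(void)
{
    /* Illustrative linear address below 1 MB -- NOT a real __tb value.
     * Under DJGPP the buffer would be the transfer buffer __tb from
     * <go32.h>, filled with the 768 palette bytes via dosmemput()
     * before issuing __dpmi_int(0x10, &r).                            */
    unsigned long linear = 0x12345UL;

    /* Real-mode addressing: segment * 16 + offset == linear address. */
    unsigned int seg = (unsigned int)(linear >> 4);   /* would go in r.x.es */
    unsigned int ofs = (unsigned int)(linear & 0x0F); /* would go in r.x.dx */

    /* seg:ofs must reconstruct the same linear address. */
    assert(((unsigned long)seg << 4) + ofs == linear);
    printf("%04X:%04X\n", seg, ofs);
    return 0;
}

/* prints 1234:0005 */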