From: greve AT rs1 DOT thch DOT uni-bonn DOT de (Thomas Greve)
Subject: Re: malloc() doesn't return a null pointer when out of memory
To: daniel AT asd470 DOT dseg DOT ti DOT com (Daniel Lemon)
Date: Fri, 5 Jun 92 12:39:17 NFT
Cc: djgpp AT sun DOT soe DOT clarkson DOT edu

> It is supposed to keep allocating memory until there is no more memory
> left, at which point I expected to get a NULL pointer returned from
> malloc().

This is really nasty: allocating more and more memory. You will make real
enemies if you try this on a Unix machine... Since gcc is a Unix compiler,
it is designed to give you as much memory as you need. It should *never*
let you run out of memory. In particular, malloc cannot tell when your
disk is full...

> Instead, I got this error:
>
>   Fatal! disk full writing to swap file

Oh yes. This is the real limit of virtual memory: the swap file (or
partition, whatever) will fill up some day. If you really need to process
that much memory, get a bigger hard disk. ;-)

> When I increased the size of the array item.string to 1 000 000 or
> 10 000 000, my PC froze and I had to reboot. When I increased it to
> 100 000 000, I got the error
>
>   Segmentation violation in pointer 0xe8421000 at 40:1318
>   Exception 14 at eip=1318

Yes, I've seen this too. But this is not a question of having enough
memory; it is a question of access permissions. Maybe go32 has some
internal `segment sizes'? (But this time at 64M instead of 64K ;-)

> Shouldn't malloc just return a NULL pointer? I have some
> memory-hogging programs that depend on that for error checking.

Returning NULL is the usual behaviour on DOS machines. You are on the way
to Unix :-) :-)

BTW: keep your memory requirements in a reasonable proportion to your
available RAM. I'm trying to work with 1024x1024 double matrices on an
8M machine (a single such matrix is already 8 MB, the whole RAM), and
things have to be thought out carefully so as not to blow up the LRU
paging algorithm.

- Thomas

greve AT rs1 DOT thch DOT uni-bonn DOT de
unt145 AT dbnrhrz1
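
P.S. A minimal sketch of the allocate-until-NULL loop Daniel describes,
purely as an illustration (the chunk size and the running total are my
own choices here, not anything from his program):

    #include <stdio.h>
    #include <stdlib.h>

    #define CHUNK (64 * 1024L)   /* arbitrary chunk size for this sketch */

    int main(void)
    {
        unsigned long total = 0;
        void *p;

        /* Keep allocating until malloc() reports failure by returning
           NULL.  The blocks are deliberately never freed: the point is
           to exhaust memory.  On plain DOS this happens when real memory
           runs out; under a paging system (go32, Unix) the swap space
           may fill up first and the program can be killed before NULL
           is ever returned. */
        while ((p = malloc(CHUNK)) != NULL)
            total += CHUNK;

        printf("malloc() returned NULL after %lu bytes\n", total);
        return 0;
    }

On go32 a loop like this tends to die with the `disk full writing to swap
file' message instead of ever reaching the printf, which is exactly what
Daniel saw.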