Mail Archives: cygwin/2005/07/27/12:58:33

Mailing-List: contact cygwin-help AT cygwin DOT com; run by ezmlm
List-Subscribe: <mailto:cygwin-subscribe AT cygwin DOT com>
List-Archive: <http://sourceware.org/ml/cygwin/>
List-Post: <mailto:cygwin AT cygwin DOT com>
List-Help: <mailto:cygwin-help AT cygwin DOT com>, <http://sourceware.org/ml/#faqs>
Sender: cygwin-owner AT cygwin DOT com
Mail-Followup-To: cygwin AT cygwin DOT com
Delivered-To: mailing list cygwin AT cygwin DOT com
To: cygwin AT cygwin DOT com
From: Krzysztof Duleba <krzysan AT skrzynka DOT pl>
Subject: Re: perl - segfault on "free unused scalar"
Date: Wed, 27 Jul 2005 18:56:19 +0200
Lines: 44
Message-ID: <dc8ebq$36m$1@sea.gmane.org>
References: <dc7n8k$m72$1 AT sea DOT gmane DOT org> <42E76865 DOT 4000301 AT familiehaase DOT de> <dc82t0$r0a$1 AT sea DOT gmane DOT org> <42E7B413 DOT 8040203 AT familiehaase DOT de>
Mime-Version: 1.0
User-Agent: Mozilla Thunderbird 1.0.2 (Windows/20050317)
In-Reply-To: <42E7B413.8040203@familiehaase.de>
X-IsSubscribed: yes

Gerrit P. Haase wrote:

>>> $ ./inter.pl
>>> perl> sub foo($){$a=shift;foo($a+1);}
>>> perl> foo 1
>>> Out of memory during request for 4040 bytes, total sbrk() is 
>>> 402624512 bytes!
>>> Segmentation fault (core dumped)
>>
>> Another version (with "my $a"):
>>
>> perl> sub foo($){my $a=shift;foo($a+1);}
>> perl> foo 1
>> Out of memory during "large" request for 134221824 bytes, total sbrk() 
>> is 304633856 bytes at (eval 19) line 1.
>> perl> foo 1
>> Bad realloc() ignored at (eval 19) line 1.
>> Segmentation fault (core dumped)
>>
>> Is this a perl bug, Cygwin bug, or just a feature?
> 
> I don't know.  Maybe it is a Windows feature that applications running
> out of memory are crashing?

But there's plenty of memory left when perl crashes. I have 1 GB of RAM and a 
1 GB swap file.

I've simplified the test case. It seems that Cygwin perl can't allocate more 
than a few hundred MB, well short of the memory actually available. For instance:

$ perl -e '$a="a"x(200 * 1024 * 1024); sleep 9'

OK, this could have failed because $a might require 200 MB of contiguous 
space. But hashes don't, do they? Then why does the following code fail?

$ perl -e '$a="a"x(1024 * 1024);my %b; $b{$_}=$a for(1..400);sleep 9'

Or that one?

$ perl -e '$a="a"x(50 * 1024 * 1024);$b=$a;$c=$a;$d=$a;$e=$a;sleep 10'

On Linux there's no such problem - perl can use all available memory.

Krzysztof Duleba


--
Unsubscribe info:      http://cygwin.com/ml/#unsubscribe-simple
Problem reports:       http://cygwin.com/problems.html
Documentation:         http://cygwin.com/docs.html
FAQ:                   http://cygwin.com/faq/
