Mailing-List: contact cygwin-help AT cygwin DOT com; run by ezmlm
To: cygwin AT cygwin DOT com
From: Krzysztof Duleba
Subject: Re: perl - segfault on "free unused scalar"
Date: Thu, 28 Jul 2005 00:14:02 +0200
References: <42E76865 DOT 4000301 AT familiehaase DOT de> <42E7B413 DOT 8040203 AT familiehaase DOT de>

Igor Pechtchanski wrote:

>>>>> $ ./inter.pl
>>>>> perl> sub foo($){$a=shift;foo($a+1);}
>
> You do realize you have infinite recursion here, right?

Sure.

>>>> Segmentation fault (core dumped)
>
> And this is Windows saying "I don't think so". :-)

:-)

>>> I don't know. Maybe it is a Windows feature that applications running
>>> out of memory are crashing?
>>
>> But there's plenty of memory left when perl crashes. I have 1 GB of RAM
>> and a 1 GB swap file.
>
> IIRC, unless you specifically increase heap_chunk_in_mb, Cygwin will only
> use 384M of address space (which seems consistent with the sbrk() and the
> request size above).

I thought of that. However:

$ cat foo.c
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int main(int argc, char * argv[]){
    int i;
    char * ptrs[1024];
    for(i = 0; i < atoi(argv[2]); ++i){
        ptrs[i] = malloc(1024 * 1024 * atoi(argv[1]));
        memset(ptrs[i], 'a', 1024 * 1024 * atoi(argv[1]));
    }
    sleep(10);
    return 0;
}

$ ./foo 200 5
$ ./foo 800 1
$ ./foo 2 500

I've been using more than 384 MB in C and C++ under Cygwin for a long
time. Why would heap_chunk_in_mb affect Perl, but not C?

>> I've simplified the test case. It seems that Cygwin perl can't handle
>> too much memory. For instance:
>>
>> $ perl -e '$a="a"x(200 * 1024 * 1024); sleep 9'
>>
>> OK, this could have failed because $a might require 200 MB of
>> contiguous space.
>
> Actually, $a requires *more* than 200MB of contiguous space. Perl
> characters are 2 bytes, so you're allocating at least 400MB of space!

Right, UTF. I completely forgot about that.

> FWIW, the above doesn't fail for me, but then, I have heap_chunk_in_mb
> set to 1024. :-)

I'll try that in a while.
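(Archive note: heap_chunk_in_mb is a DWORD value in the Windows
registry. Assuming the registry layout documented in the Cygwin
1.5-era User's Guide, raising the limit from a Cygwin shell would look
roughly like this:

$ # set the limit to 1024 MB; key path per the 1.5-era User's Guide,
$ # may differ on other setups
$ regtool -i set /HKLM/Software/Cygnus\ Solutions/Cygwin/heap_chunk_in_mb 1024
$ # verify the value
$ regtool -v list /HKLM/Software/Cygnus\ Solutions/Cygwin

Only Cygwin processes started after the change see the larger heap.)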
>> But hashes don't, do they? Then why does the following code fail?
>>
>> $ perl -e '$a="a"x(1024 * 1024);my %b; $b{$_}=$a for(1..400);sleep 9'
>
> Wow. You're copying a 2MB string 400 times. No wonder this fails. It
> would fail with larger heap sizes as well. :-)
>
> This works with no problems and very little memory usage, FWIW:
>
> $ perl -e '$a="a"x(1024 * 1024);my %b; $b{$_}=\$a for(1..400);sleep 9'

I didn't use references on purpose. I wanted to avoid the problem that
arrays require contiguous space, which makes an array an inaccurate way
to measure how much memory the system can provide. A hash, on the other
hand, is a pointer structure (at least I think so), so it should cope
with fragmented memory. I don't see why it's "no wonder it fails",
unless that refers to the aforementioned heap_chunk_in_mb.

>> Or that one?
>>
>> $ perl -e '$a="a"x(50 * 1024 * 1024);$b=$a;$c=$a;$d=$a;$e=$a;sleep 10'
>
> Yep, let's see. 100MB * 5 = 500MB. Since Cygwin perl by default can
> only use 384MB, the result is pretty predictable. Perl shouldn't
> segfault, though -- that's a bug, IMO.

Should I do anything about it?

>> On Linux there's no such problem - perl can use all available memory.
>
> Yeah. Set heap_chunk_in_mb to include all available memory, and I'm
> sure you'll find that Cygwin perl works the same too. However, you
> might want to read some Perl documentation too, to make sure your data
> structure size calculations are correct, and that your expectations
> are reasonable.

Thanks for being so helpful. That really explains a lot. Thanks to Dave
and Gerrit, too.

Krzysztof Duleba
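(Archive note on checking data structure sizes rather than guessing:
assuming the CPAN module Devel::Size is installed, its total_size
function reports how many bytes a scalar, array, or hash actually
occupies, which would settle the size questions above directly:

$ # how big is the 1 MB string really, including perl's overhead?
$ # (Devel::Size is a CPAN module, not part of the core distribution)
$ perl -MDevel::Size=total_size -e '$a = "a" x (1024 * 1024); print total_size($a), "\n"'
$ # and the hash holding 400 references to one shared string
$ perl -MDevel::Size=total_size -e '$a = "a" x (1024 * 1024); my %b; $b{$_} = \$a for (1..400); print total_size(\%b), "\n"'

Devel::Size's figures are estimates, but far closer than counting
characters by hand.)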