Try executing:
find -exec echo {} \;
A simple command, but this one leaks memory at about 5 kB/s. I then tried the
following:
find|xargs echo
This one didn't appear to leak, but then I tried this one:
find|xargs -n 1 echo
This also leaked at around the same rate. Then I tried the following:
COUNTER=1
while [ $COUNTER -lt 123456 ]; do echo $COUNTER; let COUNTER=$COUNTER+1;
done
This one did not leak, which makes me believe the problem is related to
spawning new processes. I tried this:
COUNTER=1
while [ $COUNTER -lt 123456 ]; do (echo $COUNTER); let COUNTER=$COUNTER+1;
done
and it started leaking pretty fast (maybe 4 kB/s).
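Since "xargs echo" batches many filenames into a few exec calls while "-exec"
and "xargs -n 1" spawn one process per file, the leak rate seems to track the
number of processes created. Here is a minimal sketch for estimating the loss
per process; it assumes Cygwin's /proc/meminfo is present and that its MemFree
line is a usable (if noisy) proxy for free system memory, so treat the numbers
as rough:

#!/bin/bash
# Estimate memory lost per spawned process (rough sketch).
mem_free() {
    awk '/^MemFree:/ { print $2 }' /proc/meminfo    # free memory in kB
}

N=1000
before=$(mem_free)

# Fork N short-lived subshells, like the (echo $COUNTER) loop above.
i=0
while [ $i -lt $N ]; do
    ( : )
    i=$((i + 1))
done

after=$(mem_free)
echo "MemFree before: $before kB, after: $after kB"
echo "Approx. loss per process: $(( (before - after) * 1024 / N )) bytes"

Running this a few times and averaging would smooth out background
fluctuation caused by other processes.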
I searched for this but couldn't find anything useful (or couldn't come up
with good search queries). A few questions:
- Is this a bug, or am I wrong about part or all of the above?
- If it is a bug, is there a way to prevent or work around it?
- Again, if it is a bug, is there a way to free this memory without
restarting the computer (or anything that amounts to closing most
applications, such as logging out)? No process appears to "own" this memory,
yet it keeps disappearing (a true leak), so a simple taskkill is not a
solution; a rough way to check this is sketched below.
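On that last point, one crude way to check whether any process actually owns
the missing memory is to compare the sum of all working sets against what the
system reports as free. The sketch below is only an illustration: it assumes
Windows' tasklist.exe is on the PATH, that the fifth column of its CSV output
is the working-set size in kB, and that Cygwin's /proc/meminfo is available.

#!/bin/bash
# Compare the total working set of all processes with reported free memory.

# Sum the "Mem Usage" column of tasklist's CSV output (values like "1,234 K").
total_ws=$(tasklist /FO CSV /NH | \
    awk -F'","' '{ gsub(/[^0-9]/, "", $5); sum += $5 } END { print sum }')

# Free memory as reported by Cygwin's /proc/meminfo, in kB.
free_kb=$(awk '/^MemFree:/ { print $2 }' /proc/meminfo)

echo "Sum of process working sets:  $total_ws kB"
echo "MemFree (from /proc/meminfo): $free_kb kB"

The comparison is crude (caches, kernel pools, and shared pages all muddy
it), but a large and growing gap between these two numbers and physical RAM
would at least be consistent with memory that no user process accounts for.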