From: "Tim Van Holder" To: Cc: "Mark E." Subject: Possible bash issue Date: Fri, 18 May 2001 19:51:43 +0200 Message-ID: MIME-Version: 1.0 Content-Type: text/plain; charset="iso-8859-1" Content-Transfer-Encoding: 7bit X-Priority: 3 (Normal) X-MSMail-Priority: Normal X-Mailer: Microsoft Outlook IMO, Build 9.0.2416 (9.0.2910.0) Importance: Normal X-MimeOLE: Produced By Microsoft MimeOLE V5.50.4133.2400 Reply-To: djgpp-workers AT delorie DOT com Ran into an odd thing. The ltconfig used by cvs binutils/gdb/gcc has a test for the maximum command-line length. With the previous bash 2.04 (beta 3), this always resulted in 147457. While I thought this a bit high, I didn't see a problem, as no error was reported. I now use the current bash beta. When setting up cvs gcc3 to configure with cvs autoconf/automake though, I had bash crash on me, during this test, resulting in a long (>40 lines) traceback. So I built a bash with debugging enabled to try and track down the problem. This time, however, the test did take a LONG time, but did not trigger a crash. The test yielded 1179649, which is definitely too high. I removed the result from config.cache and tried again; this time I had to interrupt configure as it had not yet produced a result after 2 minutes. So is this likely to be a bug in bash, or a libc problem (my bash is linked against stock 2.03)? The test used by ltconfig follows: # find the maximum length of command line arguments echo "$progname:780: finding the maximum length of command line arguments" 1>&5 echo $ac_n "finding the maximum length of command line arguments... $ac_c" 1>&6 if test "${lt_cv_sys_max_cmd_len+set}" = set; then echo $ac_n "(cached) $ac_c" 1>&6 else i=0 testring="ABCDEF" while test `$CONFIG_SHELL $0 --fallback-echo "X$testring" >/dev/null 2>&1` == `echo "X$testring" >/dev/null 2>&1` && new_result=`expr "X$testring" : ".*" 2>&1` && lt_cv_sys_max_cmd_len=$new_result && test $i != 32 # 1 MB should be enough do i=`expr $i + 1` testring=$testring$testring done testring= # add a significant safety factor because C++ compilers can tack on massive amounts # of additional arguments before passing them to the linker. 1/4 should be good. len=`expr $lt_cv_sys_max_cmd_len \/ 4` lt_cv_sys_max_cmd_len=`expr $lt_cv_sys_max_cmd_len - $len` fi echo "$progname:@lineno@: result: $lt_cv_sys_max_cmd_len" 1>&5 echo "${ac_t}$lt_cv_sys_max_cmd_len" 1>&6 if test -n $lt_cv_sys_max_cmd_len ; then max_cmd_len=$lt_cv_sys_max_cmd_len else max_cmd_len=none fi I tried lowering the loop limit to 18, as it only got really slow from i=20. Oddly this yielded 1179649 again, which made me suspicious. i=10 yielded 4609, and i=12 18433, which is about 4 times larger, so that seems ok. But 18433 * 2^8 = 4718848, not 1179649. So maybe some overflow is wreaking havoc here? Also, isn't the transfer buffer supposed to be an upper limit to the number of arguments, or is that only when invoking DOS apps?