Process hangs on large pipe writes
Kevin Oberman
kob6558 at gmail.com
Fri Jan 11 07:32:38 UTC 2013
I have run into a problem after upgrading a system from FreeBSD 7.2 to
8.3 with a script that has worked correctly for about a decade. The
script, written in Perl, starts about 25 processes to gather
information from a number of other systems. I create an array of
filehandles, one per remote system, and then open each with an ssh
command:
$pid = open $fh, "ssh -xa $_ <command> 2>&1 |";
I then set up a 30-second watchdog via SIGALRM that kills any
process that has not completed and exited, and I wait for each
process to complete with waitpid. (The processes normally all
complete in about 2 seconds.)
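For reference, the pattern described above looks roughly like the
sketch below. This is not the original script: the ssh invocations are
replaced with local echo commands so the example is self-contained,
and the host list and command are placeholders.

```perl
#!/usr/bin/env perl
# Sketch of the fan-out pattern: one piped open per job, a SIGALRM
# watchdog, and a waitpid-style reap of each child. The real script
# runs "ssh -xa $host <command> 2>&1 |" per remote host; local echo
# commands stand in here so the sketch runs anywhere.
use strict;
use warnings;

my @cmds = map { "echo line from job $_" } 1 .. 3;   # placeholder jobs
my @children;

for my $cmd (@cmds) {
    my $pid = open my $fh, '-|', "$cmd 2>&1"
        or die "cannot fork: $!";
    push @children, { pid => $pid, fh => $fh };
}

# 30-second watchdog: kill any child still running when the alarm fires.
$SIG{ALRM} = sub { kill 'TERM', $_->{pid} for @children };
alarm 30;

my @results;
for my $c (@children) {
    my @output = readline $c->{fh};   # reads until the child closes the pipe
    close $c->{fh};                   # for a piped open, close also reaps the child
    push @results, @output;
}
alarm 0;

print @results;
```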
Each ssh command returns between about 14 and 32 KB of ASCII data,
written a line at a time, with no line longer than 80 characters. On
each run since the upgrade, between 8 and 15 of the processes hang,
and only a portion of the data is available when I read the
filehandle.
Does anyone have any idea what could have changed between FreeBSD 7.2
and 8.3? It looks like a bug or a new limitation on pipe operations. I
don't know much about Perl internals, but I assume that no read is
done on the pipe until I attempt to read the filehandle, so I believe
it is some limit on writes to a pipe on which no reads have been done.
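That suspicion is consistent with how pipes work in general: a pipe
can absorb only a limited amount of data before further writes block,
and the exact capacity varies by OS and release. The sketch below (my
own illustration, not from the original script) measures how much a
pipe accepts before a writer would block, by making the write end
non-blocking and writing until EAGAIN.

```perl
#!/usr/bin/env perl
# Measure how many bytes a pipe accepts before the writer would block.
# A blocking writer (such as the remote ssh's output feeding the pipe)
# would hang at the point where this loop sees EAGAIN.
use strict;
use warnings;
use Fcntl;                   # F_GETFL, F_SETFL, O_NONBLOCK
use POSIX qw(EAGAIN);

pipe(my $r, my $w) or die "pipe: $!";

# Put the write end in non-blocking mode so a full buffer returns
# EAGAIN instead of blocking.
my $flags = fcntl($w, F_GETFL, 0) or die "fcntl F_GETFL: $!";
fcntl($w, F_SETFL, $flags | O_NONBLOCK) or die "fcntl F_SETFL: $!";

my $total = 0;
my $chunk = "x" x 1024;
while (1) {
    my $n = syswrite($w, $chunk);
    if (!defined $n) {
        last if $! == EAGAIN;    # buffer full: a blocking writer hangs here
        die "syswrite: $!";
    }
    $total += $n;
    last if $n < length $chunk;  # partial write: buffer is effectively full
}
print "pipe accepted $total bytes with no reader draining it\n";
```

If the pipe capacity shrank between releases (or the script's children
started writing more data), children that previously fit their whole
output into the buffer would now block in write() and appear hung.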
I have modified the script to write to files instead of pipes, and it
is working again, but I would still like to understand what broke the
original approach.
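An alternative to the temp-file workaround, regardless of the pipe
capacity on any given release, is to drain all the pipes as data
arrives rather than reading each one to completion in turn. A sketch
using IO::Select (again with placeholder local commands standing in
for the ssh invocations):

```perl
#!/usr/bin/env perl
# Drain all child pipes concurrently with IO::Select, so no child can
# fill its pipe buffer and block while the parent is busy reading a
# different filehandle. Local echo commands stand in for ssh here.
use strict;
use warnings;
use IO::Select;

my @cmds = map { "echo output from job $_" } 1 .. 3;   # placeholder jobs
my $sel  = IO::Select->new;
my %buf;                                 # filehandle -> accumulated output

for my $cmd (@cmds) {
    open my $fh, '-|', "$cmd 2>&1" or die "cannot fork: $!";
    $sel->add($fh);
    $buf{$fh} = '';
}

while ($sel->count) {
    for my $fh ($sel->can_read) {        # blocks until some pipe has data
        my $n = sysread $fh, my $chunk, 4096;
        if ($n) {
            $buf{$fh} .= $chunk;
        } else {                         # EOF: child closed its end
            $sel->remove($fh);
            close $fh;                   # for a piped open, close reaps the child
        }
    }
}

print $buf{$_} for sort keys %buf;
```

Because every pipe is read as soon as it has data, no writer ever
waits on a full buffer, so the children cannot hang this way.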
--
R. Kevin Oberman, Network Engineer
E-mail: kob6558 at gmail.com