> Some compilations of perl, especially Debian builds,
> show strange <DATA> behaviour.
> Even if you start reading DATA in a bunch of forked processes
> AFTER the fork, the process which first starts to read
> "slurps" a buffer from <DATA>, so the other processes
> see DATA not from the beginning but from some offset.
> That is why Constant::import() sometimes cannot find
> some definitions in the "pod" part.
> seek() does not help here!

This actually looks like correct perl behaviour. The fork entry in
man perlfunc says:
"File descriptors (and sometimes locks on
those descriptors) are shared, while everything else is copied."
SHARED.
I also ran tests on other filehandles (files opened from disk),
and every perl version I tested shows the same mess when a
forked handle is read in more than one process.
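A minimal demo of the effect described above (my own sketch, not from the original report; it uses sysread to bypass PerlIO buffering so the shared kernel offset is visible directly, and assumes a POSIX system with a writable /tmp):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Create a small file with known contents (hypothetical path).
my $tmp = "/tmp/fork_offset_demo.$$";
open my $out, '>', $tmp or die "open: $!";
print $out "line$_\n" for 1 .. 4;
close $out;

# Open it BEFORE forking, like a DATA handle opened at module load.
open my $fh, '<', $tmp or die "open: $!";

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # Child: consume the first 6 bytes ("line1\n") through the
    # descriptor whose file offset is SHARED with the parent.
    sysread $fh, my $buf, 6;
    exit 0;
}
waitpid $pid, 0;

# Parent: its next read continues at offset 6, i.e. at "line2",
# not at the beginning of the file.
sysread $fh, my $parent_saw, 6;
print "parent sees: $parent_saw";

close $fh;
unlink $tmp;
```

With buffered reads via <$fh> the interleaving is even worse, because whichever process reads first slurps a whole buffer past the shared offset, which is exactly the <DATA> symptom quoted above.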

Thus using the already-open default DATA handle in a process that
may fork, with the module loaded before the fork, is not a good idea.

I suggest that Const.pm parse the data once and load it into hashes
(%const_sym2value, %const_value2sym) on the first request for
constants, e.g. load the data unless keys(%const_sym2value).
Do not use the default DATA handle; open an explicit one instead,
with a filename obtained via sub myfn {  #  use caller() here  }
(better than $0, which can be changed).
Skip until __DATA__ and then parse the POD.
After the first import the data remains in the hashes, where it can
be used whenever needed, and subsequent parsing of DATA (or its
equivalent) is unnecessary.
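A rough sketch of that parse-once approach (my own illustration: the hash names follow the suggestion above, but the "=item NAME VALUE" pod format and the stand-in source file are assumptions; in the real module the filename would come from caller()/__FILE__ rather than a temp file):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stand-in for the module's own source file, so the sketch runs
# self-contained. Real code would locate the module via caller().
my $src = "/tmp/const_demo.$$";
open my $w, '>', $src or die "open: $!";
print $w <<'EOF';
package Fake;
1;
__DATA__

=over

=item FOO 1

=item BAR 2

=back
EOF
close $w;

our (%const_sym2value, %const_value2sym);

sub load_constants {
    my ($file) = @_;
    return if keys %const_sym2value;        # parse only once
    open my $fh, '<', $file or die "open $file: $!";
    while (<$fh>) { last if /^__DATA__\s*$/ }   # skip to __DATA__
    while (<$fh>) {                              # parse the pod part
        if (/^=item\s+(\w+)\s+(\S+)/) {          # assumed format
            $const_sym2value{$1} = $2;
            $const_value2sym{$2} = $1;
        }
    }
    close $fh;
}

load_constants($src);
print "FOO=$const_sym2value{FOO}\n";
unlink $src;
```

Because the handle is opened privately inside load_constants() on first use, a fork before the first import never shares it, and forks after the first import never need it: they inherit copies of the already-filled hashes.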
