Hello to all,

Before jumping into this, I wish to express my thanks to all of you who have provided help and comments on my little problems in the past. This has been the best resource I have encountered.
Now for my current problem:
I have a project which opens a telnet session via Net::Telnet to a particular IP address and port combination. As soon as that IP and port are opened, data begins to pour out (not a bug, a feature, they tell me). My task is to capture this data into files on a daily basis.
I have the following base code:
use Net::Telnet;

$t = Net::Telnet->new(Timeout => 10);
$t->open(Host => $host, Port => $port);

# the square brackets in the docs just mean "optional" - they are not part of the call
@lines = $t->getlines(Timeout => $secs);

$t->close;
My question has to do with the best way to handle this without losing any data. I could set $secs to twelve hours' worth of seconds and slurp all of the data into @lines. My question is: can I do something like this:
$t->getlines(........., \&my_reader_routine);

sub my_reader_routine {
    my ($line) = @_;
    print OUTFILE $line;
}
Is this a way? Is there a better way? With any method, how do I avoid losing data while I close one file and open a new one?
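To make the question concrete, here is the rough shape of what I am imagining. This is only an untested sketch: the host/port values, the 60-second read timeout, and the capture_YYYYMMDD.log file names are placeholders I made up, and since I am not sure Net::Telnet will actually take a callback, the sketch just calls getline in a loop instead:

use strict;
use Net::Telnet;
use IO::Handle;

# placeholders - substitute the real ip/port
my ($host, $port) = ('10.1.2.3', 2001);

my $t = Net::Telnet->new(
    Timeout => 10,
    Errmode => 'return',    # return undef on timeout/error instead of dying
);
$t->open(Host => $host, Port => $port)
    or die "can't connect to $host:$port: ", $t->errmsg;

my ($out, $current_day) = (undef, '');

while (1) {
    my $line = $t->getline(Timeout => 60);    # wait up to 60s for the next line
    unless (defined $line) {
        last if $t->eof;    # remote end closed the connection
        next;               # just a read timeout - keep waiting
    }

    # Name the file after today's date; when the date rolls over, close the
    # old file and open a new one. The telnet session itself stays open the
    # whole time, so nothing should be dropped during the switch.
    my @lt  = localtime;
    my $day = sprintf '%04d%02d%02d', $lt[5] + 1900, $lt[4] + 1, $lt[3];
    if ($day ne $current_day) {
        close $out if defined $out;
        open $out, '>>', "capture_$day.log"
            or die "can't open capture_$day.log: $!";
        $out->autoflush(1);    # flush each line so a crash loses as little as possible
        $current_day = $day;
    }
    print $out $line;
}

$t->close;

The idea is that only the output file is rotated at midnight while the connection stays up, so the stream itself never has to be interrupted.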
On Unix, with fork, this would be a piece of cake, but I am on Windows 2000 and that's all I have to work with.
Thanks to all.
#Joseph Norris (Perl - what else is there?/Linux/CGI/Mysql)
print @c=map chr $_+100,(6,17,15,16,-68,-3,10,11,16,4,1,14,-68,12,1,14,8,
-68,4,-3,-1,7,1,14,-68,-26,11,15,1,12,4,-68,-22,11,14,14,5,15,-90);