> My situation is as follows:
>
> I perform some text manipulation on files that are 30 MB in size.
> The newly formatted files get pushed to another script that can only handle
> files of 5MB maximum.

So I would like to be able to limit the file size and start a new one when
it reaches (or comes close to) this limit. This would allow me to automate
it rather than having to manually break the big files up before continuing.
Thanks!
James.
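One way to sketch this is a splitter that rotates to a new output file whenever the next line would push the current one past the limit. This is a minimal sketch: the `.partNNN` naming is an assumption, and a single line longer than the limit is still written whole.

```perl
use strict;
use warnings;

# Sketch: split $src into sequential parts of at most $limit bytes,
# breaking only at line boundaries. Returns the names of the parts.
sub split_file {
    my ($src, $limit) = @_;
    open my $in, '<', $src or die "Cannot open $src: $!";
    my ($part, $bytes, $out, @parts) = (0, 0);
    while (my $line = <$in>) {
        if (!$out or $bytes + length($line) > $limit) {
            close $out if $out;
            my $name = sprintf '%s.part%03d', $src, ++$part;
            open $out, '>', $name or die "Cannot open $name: $!";
            push @parts, $name;
            $bytes = 0;
        }
        print {$out} $line;
        $bytes += length $line;
    }
    close $out if $out;
    close $in;
    return @parts;
}
```

Calling `split_file('big.txt', 5 * 1024 * 1024)` would then yield `big.txt.part001`, `big.txt.part002`, and so on.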
In article
1388676082.98276.yahoomail...@web193403.mail.sg3.yahoo.com,
mani_nm...@yahoo.com (mani kandan) wrote:
Hi,
We have a file of huge size (500MB) and need to manipulate it: some
replacements, then write the file back. I have used File::Slurp, which works
for a file size of 300MB
Hi List,
On Friday, January 03, 2014 10:57:13 AM kurtz le pirate wrote:
have you tried this kind of command:
perl -p -i -e 's/oneThing/otherThing/g' yourFile
I was about to post the same thing. My suggestion: create a backup file just
in case something goes wrong:
perl -p -i.bak -e 's/oneThing/otherThing/g' yourFile
Am 02.01.2014 18:08, schrieb David Precious:
Oh, I was thinking of a wrapper that would:
(a) open a new temp file
(b) iterate over the source file, line-by-line, calling the provided
coderef for each line
(c) write $_ (potentially modified by the coderef) to the temp file
(d) finally, rename the temp file over the original
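Those four steps can be sketched as follows. The name `process_file_lines` and the coderef-modifies-`$_` interface are assumptions for illustration, not an existing File::Slurp API:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Hypothetical wrapper: apply $code to each line of $file (via $_),
# writing results to a temp file, then rename it over the original.
sub process_file_lines {
    my ($file, $code) = @_;
    open my $in, '<', $file or die "Cannot read $file: $!";
    my ($out, $tmp) = tempfile(DIR => '.', UNLINK => 0);
    while (my $line = <$in>) {
        local $_ = $line;
        $code->();              # coderef may modify $_
        print {$out} $_;
    }
    close $in;
    close $out or die "Cannot write $tmp: $!";
    rename $tmp, $file or die "Cannot rename $tmp to $file: $!";
}
```

Usage would look like `process_file_lines('big.txt', sub { s/oneThing/otherThing/g });` - only one line is ever held in memory.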
Hi,
Thanks for all your guidance. The error was "Perl Command Line Interpreter
has encountered a problem and needs to close".
I also increased the virtual memory, but no use. My system configuration:
Windows XP SP3, Intel Core 2 Duo with 2 GB RAM.
regards
Manikandan N
On 01/03/2014 10:22 AM, Janek Schleicher wrote:
A short look at CPAN brings up https://metacpan.org/pod/File::Inplace,
which looks to do what the OP wants.
Honestly I have never used it, and it may have the same performance
problem, but at least I looked at its source code and it implements
On 01/03/2014 12:10 PM, mani kandan wrote:
Hi,
Thanks for all your guidance. The error was "Perl Command Line
Interpreter has encountered a problem and needs to close".
that isn't the real error. You need to run this in a command window that
won't close after it fails so you can see the real
On Fri, 03 Jan 2014 12:22:48 -0500
Uri Guttman u...@stemsystems.com wrote:
i haven't seen that before but it was last touched in 2005.
That means it has no bugs. A better metric of a module's quality is how
many outstanding bugs it has. See
https://rt.cpan.org/Dist/Display.html?Queue=File-Inplace
On 01/03/2014 12:48 PM, Shawn H Corey wrote:
On Fri, 03 Jan 2014 12:22:48 -0500
Uri Guttman u...@stemsystems.com wrote:
i haven't seen that before but it was last touched in 2005.
That means it has no bugs. A better metric of a module's quality is how
many outstanding bugs it has. See
On 02/01/2014 15:21, mani kandan wrote:
Hi,
We have a file of huge size (500MB) and need to manipulate it: some
replacements, then write the file back. I have used File::Slurp, which works
for a file size of 300MB (Thanks Uri), but for this huge 500MB file it does
not finish and comes out
On 01/03/2014 02:28 PM, Rob Dixon wrote:
On 02/01/2014 15:21, mani kandan wrote:
Hi,
We have a file of huge size (500MB) and need to manipulate it: some
replacements, then write the file back. I have used File::Slurp, which works
for a file size of 300MB (Thanks Uri), but for this huge 500MB file
Hi,
We have a file of huge size (500MB) and need to manipulate it: some
replacements, then write the file back. I have used File::Slurp, which works for a file
size of 300MB (Thanks Uri), but for this huge 500MB file it does not finish
and comes out with an error. I have also used the Tie::File module
On Thu, 2 Jan 2014 23:21:22 +0800 (SGT)
mani kandan mani_nm...@yahoo.com wrote:
Hi,
We have a file of huge size (500MB) and need to manipulate it:
some replacements, then write the file back. I have used File::Slurp,
which works for a file size of 300MB (Thanks Uri), but for this huge size
On 01/02/2014 10:39 AM, David Precious wrote:
On Thu, 2 Jan 2014 23:21:22 +0800 (SGT)
mani kandan mani_nm...@yahoo.com wrote:
Hi,
We have file size of huge size 500MB, Need to Manipulate the file,
some replacement and then write the file, I have used File::slurp and
works for file size
On Thu, 02 Jan 2014 11:18:31 -0500
Uri Guttman u...@stemsystems.com wrote:
On 01/02/2014 10:39 AM, David Precious wrote:
Secondly - do you need to work on the file as a whole, or can you
just loop over it, making changes, and writing them back out? In
other words, do you *need* to hold the whole file in memory?
On 01/02/2014 11:48 AM, David Precious wrote:
On Thu, 02 Jan 2014 11:18:31 -0500
Uri Guttman u...@stemsystems.com wrote:
On 01/02/2014 10:39 AM, David Precious wrote:
Secondly - do you need to work on the file as a whole, or can you
just loop over it, making changes, and writing them back
On Thu, 02 Jan 2014 11:56:26 -0500
Uri Guttman u...@stemsystems.com wrote:
Part of me wonders if File::Slurp should provide an in-place (not
slurping into RAM) editing feature which works like edit_file_lines
but line-by-line using a temp file, but that's probably feature
creep :)
On 01/02/2014 12:08 PM, David Precious wrote:
On Thu, 02 Jan 2014 11:56:26 -0500
Uri Guttman u...@stemsystems.com wrote:
Part of me wonders if File::Slurp should provide an in-place (not
slurping into RAM) editing feature which works like edit_file_lines
but line-by-line using a temp file, but
On 01/02/2014 12:33 PM, David Precious wrote:
On Thu, 02 Jan 2014 12:19:16 -0500
Uri Guttman u...@stemsystems.com wrote:
On 01/02/2014 12:08 PM, David Precious wrote:
Oh, I was thinking of a wrapper that would:
(a) open a new temp file
(b) iterate over the source file, line-by-line, calling
On Thu, 02 Jan 2014 12:19:16 -0500
Uri Guttman u...@stemsystems.com wrote:
On 01/02/2014 12:08 PM, David Precious wrote:
Oh, I was thinking of a wrapper that would:
(a) open a new temp file
(b) iterate over the source file, line-by-line, calling the provided
coderef for each line
(c)
Hi folks, happy new year to everyone. :)
John, you're right, of course. :) The filenames in nested directories could
well overlap, and using $File::Find::name would be safer.
I didn't think of that as a big problem, though, as the original script (with
'opendir') ignored all the nested folders anyway.
Hello:
On Sat, Dec 31, 2011 at 02:56:50AM +0200, Igor Dovgiy wrote:
$filedata{$_} = [$filesize, $filemd5];
*snip*
my ($size, $md5) = @{ $filedata{$filename} };
Alternatively, store a nested hash-reference:
$filedata{$File::Find::name} = {
    md5  => $file_md5,
    size =>
in a given directory _have_ to be unique.
I think that we can all be in agreement then that these entries should be
guaranteed to have unique keys and can have non-unique data such as file
size attributed to them- therefore:
push @{$files{$filename}}, $File::Find::name;
When sorting the hash
Hi Jonathan,
Argh, really stupid mistake by me. ) But let's use it to explain some
points a bit further, shall we?
A skilled craftsman knows his tools well, and Perl programmer (with CPAN as
THE collection of tools of all sizes and meanings) has an advantage here: even
if documentation is a bit
Hi John, yes, good point! Totally forgot this. ) Adding new files to a
directory as you browse it is just not right, of course. Possible, but not
right. )
I'd solve this by using hash with filenames as keys and collected 'result'
strings (with md5 and filesizes) as values, filled by File::Find
On Fri, Dec 30, 2011 at 11:58 AM, Igor Dovgiy ivd.pri...@gmail.com wrote:
Hi John, yes, good point! Totally forgot this. ) Adding new files to a
directory as you browse it is just not right, of course. Possible, but not
right. )
I'd solve this by using hash with filenames as keys and
On Thu, Dec 29, 2011 at 03:43:19PM +, Jonathan Harris wrote:
Hi All
Hello Jonathan:
(Disclaimer: I stayed up all night playing Skyrim and am running
on about 4.5 hours of sleep.. ^_^)
I think most things have already been addressed, but I think Igor
might have had a bit of trouble making
sub wanted {
    my $filesize = (stat($_))[7];
    push @{ $files{$filesize} }, $File::Find::name;
}
find(\&wanted, $path);
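Putting the thread's pieces together, here is a self-contained sketch. Grouping by size first (as suggested above) means MD5 only needs computing for files that could possibly be duplicates; the function names are illustrative:

```perl
use strict;
use warnings;
use File::Find;
use Digest::MD5;

# Sketch: group files under $path by size, keyed on the unique
# $File::Find::name path (avoids basename collisions across dirs).
sub scan {
    my ($path) = @_;
    my %files;
    find(sub {
        return unless -f $_;
        push @{ $files{ (stat($_))[7] } }, $File::Find::name;
    }, $path);
    return \%files;
}

# Hex MD5 digest of one file, read through a filehandle (not slurped).
sub file_md5 {
    my ($name) = @_;
    open my $fh, '<', $name or die "Cannot open $name: $!";
    binmode $fh;
    my $md5 = Digest::MD5->new->addfile($fh)->hexdigest;
    close $fh;
    return $md5;
}
```

A caller would then hash only the size-groups with more than one member, e.g. `next unless @{ $h->{$size} } > 1;`.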
to hash files and file size results together - then process after
And yep, Igor has been thorough and very helpful
Thanks again for your input on this - hope you manage to get some sleep!
All the best
Jonathan
Igor Dovgiy wrote:
Great work, Jonathan!
Notice how simple your script has become - and that's a good sign as well
in Perl. :) We can make it even simpler, however.
As you probably know, Perl has two fundamental types of collections: arrays
(where data is stored as a sequence of elements, data
On Thu, Dec 29, 2011 at 5:08 PM, Igor Dovgiy ivd.pri...@gmail.com wrote:
Hi Jonathan,
Let's review your script a bit, shall we? )
It's definitely good for a starter, but still has some rough places.
#!/usr/bin/perl
# md5-test.plx
use warnings;
use strict;
use File::Find;
use
Hi Jonathan,
Let's review your script a bit, shall we? )
It's definitely good for a starter, but still has some rough places.
#!/usr/bin/perl
# md5-test.plx
use warnings;
use strict;
use File::Find;
use Digest::MD5;
use File::Spec;
So far, so good. )
my $dir = shift ||
entry
to a separate file?
There are clearly better ways to achieve this result - all suggestions are
gratefully received!
Thanks again
Jon
Here's the script:
#
# This program reads in files from a directory, produces a hex digest and
writes the hex, along with
# the file size
Jonathan Harris wrote:
Hi Igor
Many thanks for your response
I have started reviewing the things you said
There are some silly mistakes in there - eg not using closedir
It's a good lesson in script vigilance
I found the part about opening the file handle particularly interesting
I had no
On Thu, Dec 29, 2011 at 6:39 PM, John W. Krahn jwkr...@shaw.ca wrote:
Jonathan Harris wrote:
Hi Igor
Many thanks for your response
I have started reviewing the things you said
There are some silly mistakes in there - eg not using closedir
It's a good lesson in script vigilance
I found
On Fri, Dec 30, 2011 at 12:33 AM, Jonathan Harris
jtnhar...@googlemail.comwrote:
On Thu, Dec 29, 2011 at 6:39 PM, John W. Krahn jwkr...@shaw.ca wrote:
Jonathan Harris wrote:
Hi Igor
Many thanks for your response
I have started reviewing the things you said
There are some silly
Jonathan Harris wrote:
On Thu, Dec 29, 2011 at 6:39 PM, John W. Krahnjwkr...@shaw.ca wrote:
Igor made a lot of good points. Here are my two cents worth. You are
using the File::Find module to traverse the file system and add new files
along the way. This _may_ cause problems on some file
Jonathan Harris wrote:
Finally, I was advised by a C programmer to declare all variables at the
start of a program to avoid memory issues
Is this not necessary in Perl?
It is not really necessary in C either.
John
--
Any intelligent fool can make things bigger and
more complex... It takes
of this, except for analysing the file
size - I think that the file size being analysed may be the md5 object
result as the same value is printed to each file
I am running out of ideas and would appreciate any help you could give!
I have tried using File::stat::OO and File::stat - but to no avail - I
of this, except for analysing the file
size - I think that the file size being analysed may be the md5 object
result as the same value is printed to each file
Print out the file size returned by stat. Check if it is the same displayed
by the ls command.
I am running out of ideas and would
name and a .md5 extension
4) Check the original file for its size
5) Add this data to the newly created file on a new line (in bytes)
I have a script that will do most of this, except for analysing the file
size - I think that the file size being analysed may be the md5 object
result
Will this file contain information for one file or many files?
I have a script that will do most of this, except for analysing the file
size - I think that the file size being analysed may be the md5 object
result as the same value is printed to each file
Print out the file size returned by stat
the file
size - I think that the file size being analysed may be the md5 object
result as the same value is printed to each file
I am running out of ideas and would appreciate any help you could give!
I have tried using File::stat::OO and File::stat - but to no avail - I
could be using them
Hi,
I am writing a small script to download files off the web.
How can I get the file size without downloading the file?
use LWP::Simple;
my $file = 'http://www.abc.com/file.mp3';
my @array = head($file);
print "$array[1]\n";
head() doesn't always return all the values - why?
Sometimes there are all
) {
    my $headers = $res->headers;
    return $headers;
  }
  return 0;
}
$link = 'http://www.abc.com/file.mp3';
$header = GetFileSize($link);
print "File size: " . $header->content_length . " bytes\n";
exit;
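A complete version of the same idea, as a minimal sketch: a HEAD request fetches only the headers, and Content-Length (which servers are not required to send) gives the size. The URL is the illustrative one from the thread:

```perl
use strict;
use warnings;
use LWP::UserAgent;

# Sketch: read a remote file's size from the HEAD response headers
# without downloading the body.
my $ua  = LWP::UserAgent->new(timeout => 10);
my $res = $ua->head('http://www.abc.com/file.mp3');
if ($res->is_success) {
    my $size = $res->header('Content-Length');
    print defined $size ? "File size: $size bytes\n"
                        : "Server did not report a length\n";
}
else {
    print "HEAD failed: ", $res->status_line, "\n";
}
```

This explains the "head() doesn't always return all values" observation: the length comes from the server, and some servers simply omit it.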
On Thu, Nov 26, 2009 at 12:28 PM, raphael() raphael.j...@gmail.com wrote:
Hi
Hi All,
Is there any way to limit the file size while zipping using
Archive::Zip so that it will stop processing a zip operation on a file
list when it crosses the maximum file size.
Thanks in advance.
-A
--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL
San wrote:
Is there any way to limit the file size while zipping using
Archive::Zip so that it will stop processing a zip operation on a file
list when it crosses the maximum file size.
Hey San
Unfortunately Archive::Zip requires that an archive be written to disk
before the compression
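One workable approximation is to stop adding members once their combined *uncompressed* size would cross the limit, since the compressed size is only known after the archive is written. A sketch, with the file list, limit, and output name as assumptions:

```perl
use strict;
use warnings;
use Archive::Zip qw(:ERROR_CODES);

# Sketch: cap an archive by the total uncompressed size of its members.
# The real on-disk zip will be smaller; adjust the limit accordingly.
my @files = glob '*.log';
my $limit = 5 * 1024 * 1024;
my $total = 0;

my $zip = Archive::Zip->new;
for my $file (@files) {
    my $size = -s $file;
    last if $total + $size > $limit;   # stop before crossing the cap
    $zip->addFile($file);
    $total += $size;
}
die "write error" unless $zip->writeToFileNamed('out.zip') == AZ_OK;
```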
On Sat, 2007-01-20 at 09:31 +1100, Ken Foskey wrote:
What's exactly the difference between:
++$lines;
and
$lines++; ?
Nothing in this context.
What about other contexts?
David.
--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL
David Moreno Garza am Sonntag, 21. Januar 2007 07:50:
On Sat, 2007-01-20 at 09:31 +1100, Ken Foskey wrote:
What's exactly the difference between:
++$lines and $lines++; ?
Nothing in this context.
What about other contexts?
Hi David
#!/usr/bin/perl
use strict;
use warnings;
{ #
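The difference appears as soon as the expression's value is used. A quick sketch:

```perl
use strict;
use warnings;

my $lines = 5;
my $post = $lines++;   # post-increment: yields the OLD value, then bumps
my $pre  = ++$lines;   # pre-increment: bumps first, then yields the NEW value
print "$post $pre $lines\n";   # prints "5 7 7"
```

In void context, a bare `$lines++;` or `++$lines;` on its own line, there is no difference at all, which is the case being discussed here.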
On 1/19/07, Bertrand Baesjou [EMAIL PROTECTED] wrote:
While running my script it seems to use around a gigabyte of memory
(there is 1GB of RAM and 1GB of swap in the system), might this be the
problem?
If you're running low on memory, unless you're working on an
inherently large problem, your
Hi,
I am trying to read data from a file, I do this by using the
while (<FILE>) { $line } construction.
However with files with a size of roughly bigger than 430MB it seems to
crash the script :S Syntax seems all fine (perl -wc - syntax OK).
I was thinking that maybe it was running to the
On Fri, 2007-01-19 at 13:16 +0100, Bertrand Baesjou wrote:
Hi,
I am trying to read data from a file, I do this by using the
while (<FILE>) { $line } construction.
However with files with a size of roughly bigger than 430MB it seems to
crash the script :S Syntax seems all fine (perl -wc -
Bertrand Baesjou wrote:
Hi,
I am trying to read data from a file, I do this by using the
while (<FILE>) { $line } construction.
However with files with a size of roughly bigger than 430MB it seems to
crash the script :S Syntax seems all fine (perl -wc - syntax OK).
How does your script
Ken Foskey wrote:
On Fri, 2007-01-19 at 13:16 +0100, Bertrand Baesjou wrote:
Hi,
I am trying to read data from a file, I do this by using the
while (<FILE>) { $line } construction.
However with files with a size of roughly bigger than 430MB it seems to
crash the script :S Syntax seems all
On Fri, Jan 19, 2007 at 03:17:19PM +0100, Bertrand Baesjou wrote:
foreach $line (<INFILE>) {
See, this isn't a while loop, as you have in the subject.
That is the cause of your problems.
--
Paul Johnson - [EMAIL PROTECTED]
http://www.pjcj.net
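Paul's point: `foreach` evaluates `<INFILE>` in list context, expanding the entire file into a list in memory before the loop starts, while `while` reads one line at a time in scalar context. A minimal sketch (the line-counting task and file name are illustrative):

```perl
use strict;
use warnings;

# Count lines without loading the whole file: while (<$fh>) reads in
# scalar context, one line at a time.
sub count_lines {
    my ($path) = @_;
    open my $in, '<', $path or die "Cannot open $path: $!";
    my $n = 0;
    $n++ while <$in>;
    close $in;
    return $n;
}

# By contrast, `foreach my $line (<$in>) { ... }` would first expand
# <$in> into a list of every line - which is what crashes on 430MB files.
```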
On Fri, 2007-01-19 at 13:24 +, Rob Dixon wrote:
++$lines;
What's exactly the difference between:
++$lines;
and
$lines++; ?
David.
On Fri, 2007-01-19 at 16:21 -0600, David Moreno Garza wrote:
On Fri, 2007-01-19 at 13:24 +, Rob Dixon wrote:
++$lines;
What's exactly the difference between:
++$lines;
and
$lines++; ?
Nothing in this context.
It does make a difference if you are 'using' the value. See
David Moreno Garza wrote:
On Fri, 2007-01-19 at 13:24 +, Rob Dixon wrote:
++$lines;
What's exactly the difference between:
++$lines;
and
$lines++; ?
In void context they are both the same because perl optimizes $lines++ to
++$lines.
John
--
Perl isn't a toolbox, but a
Is it possible to calculate the size of a file served over HTTP? I.e. if I
wanted to know the
file size of http://www.yahoo.com/images/a.gif from Perl.
Thanks
Anish
Anish Kumar K. [EMAIL PROTECTED] asked:
Is it possible to calculate the File SIZE which is from HTTP.
i.e if I wanted to know
file size of http://www.yahoo.com/images/a.gif from PERL..
Send a HEAD request for the URI and look at the
Content-Length header of the response object:
#!/usr/bin
From the $ENV{CONTENT_LENGTH}
-Original Message-
From: Anish Kumar K. [EMAIL PROTECTED]
Sent: Oct 11, 2006 2:17 AM
To: beginners@perl.org
Subject: getting the File size of image URL
Is it possible to calculate the File SIZE which is from HTTP. i.e if I
wanted to know
file size of http
Jose Alves De Castro wrote:
On Mon, 2004-08-09 at 14:53, David Dorward wrote:
On 9 Aug 2004, at 14:34, SilverFox wrote:
Hi all, I'm trying to write a script that will allow a user to enter a
number and that number will be converted into KB, MB or GB depending on the
size of the
The quickest way would be to get the leftover from the beginning:
...
print "Please enter your number:\n";
chomp(my $num = <STDIN>);
$bytes = $num % $kilo;
$num -= $bytes;
...
HTH,
Mark G.
- Original Message -
From: SilverFox [EMAIL PROTECTED]
Date: Monday, August 9, 2004 12:06 pm
Subject: Re: File
And the clouds parted, and SilverFox said...
Hi all, I'm trying to write a script that will allow a user to enter a
number, and that number will be converted into KB, MB or GB depending on the
size of the number. Can someone point me in the right direction?
Example:
user enter: 59443
SilverFox [EMAIL PROTECTED] wrote:
:
: I haven't put anything together as yet. Putting
: some if/elsif statement together would be the
: easiest way I can think off. Something like:
We can see a few problems right off. All scripts
should start with 'strict' and 'warnings'. We need a
And the clouds parted, and Brian Gerard said...
[1] http://www.alcyone.com/max/reference/physics/binary.html
-anyone remember offhand the URL to the /. story on these, btw?
...never mind. Found it. (uncaught typo on my first google query... DOH!)
Hi all, I'm trying to write a script that will allow a user to enter a
number, and that number will be converted into KB, MB or GB depending on the
size of the number. Can someone point me in the right direction?
Example:
user enter: 59443
Script will output: 58M
SilverFox
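One way to sketch the conversion is to repeatedly divide by 1024 until the number fits the unit. The unit labels and whole-number rounding are choices, not requirements (note that 59443 bytes is 58K; the "58M" in the example would apply if the input were in KB):

```perl
use strict;
use warnings;

# Sketch: convert a byte count into a short human-readable string
# using 1024-based units, rounded to whole units.
sub humanize {
    my ($n) = @_;
    my @units = ('', 'K', 'M', 'G', 'T');
    my $i = 0;
    while ($n >= 1024 && $i < $#units) {
        $n /= 1024;
        $i++;
    }
    return sprintf '%.0f%s', $n, $units[$i];
}

print humanize(59443), "\n";   # prints "58K"
```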
On 9 Aug 2004, at 14:34, SilverFox wrote:
Hi all, I'm trying to write a script that will allow a user to enter a
number and that number will be converted into KB, MB or GB depending on the
size of the number. Can someone point me in the right direction?
What have you got so far? Where are you
On Mon, 2004-08-09 at 14:53, David Dorward wrote:
On 9 Aug 2004, at 14:34, SilverFox wrote:
Hi all, I'm trying to write a script that will allow a user to enter a
number and that number will be converted into KB, MB or GB depending on
the size of the number. Can someone point me in
On Mon, 9 Aug 2004, SilverFox wrote:
Example:
user enter: 59443
Script will output: 58M
I know this isn't getting into the spirit of things, but have you
considered simply using the `units` program?
% units
500 units, 54 prefixes
You have: 59443 bytes
You want: megabytes
Hi,
Is there any way to get the size of a file without downloading it?
I want to write a program using LWP to download a file only if it is bigger
than 3K but smaller than 500K.
So I need to know the file size in the first place.
Thank you.
-u
usef wrote:
Hi,
Is there any way to get the size of a file without downloading it?
I want to write a program using LWP to download a file only
if it is bigger
than 3K but smaller than 500K.
So I need to know the file size in the first place.
You issue a HEAD request to the server and look
Hi,
FTP or HTTP?
HTTP, but I want to know the method for FTP as well.
Thanks -u
PS.
Sorry Rus for multiple copy *smacks forehead*
usef wrote:
Hi,
FTP or HTTP?
HTTP, but I want to know the method for FTP as well. Thanks -u
I think that will work for FTP as well. Give it a try.
On Wed, 10 Dec 2003, usef wrote:
Hi,
Is there any way to get the size of a file without downloading it?
I want to write a program using LWP to download a file only if it is bigger
than 3K but smaller than 500K.
So I need to know the file size in the first place.
Hi,
FTP or HTTP?
Rgds
Rus
Hello,
usef [EMAIL PROTECTED] asked:
Is there any way to get the size of a file without downloading it?
I want to write a program using LWP to download a file only
if it is bigger than 3K but smaller than 500K.
So I need to know the file size in the first place.
Try making a HEAD request
Is there any way to get the size of a file without downloading it?
I want to write a program using LWP to download a file only
if it is bigger than 3K but smaller than 500K.
So I need to know the file size in the first place.
Try making a HEAD request - that should return
file size and last
On Wed, 2003-12-10 at 09:42, Bob Showalter wrote:
usef wrote:
Hi,
FTP or HTTP?
HTTP, but I want to know the method for FTP as well. Thanks -u
I think that will work for FTP as well. Give it a try.
If I type ls when I FTP into somewhere I get a listing of files and
size. I
Dan Anderson [EMAIL PROTECTED] wrote:
On Wed, 2003-12-10 at 09:42, Bob Showalter wrote:
usef wrote:
Hi,
FTP or HTTP?
HTTP, but I want to know the method for FTP as well. Thanks -u
I think that will work for FTP as well. Give it a try.
If I type ls when I FTP into
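For the FTP case specifically, Net::FTP can ask the server for a file's size directly (the SIZE command) without listing or downloading. A sketch, with the host, credentials, and path as illustrative assumptions:

```perl
use strict;
use warnings;
use Net::FTP;

# Sketch: query a remote file's size over FTP without downloading it.
my $ftp = Net::FTP->new('ftp.example.com', Timeout => 10)
    or die "Cannot connect: $@";
$ftp->login('anonymous', 'anon@example.com')
    or die "Login failed: ", $ftp->message;
$ftp->binary;   # SIZE is only well-defined in binary mode
my $size = $ftp->size('/pub/file.mp3');
print defined $size ? "$size bytes\n" : "Server does not support SIZE\n";
$ftp->quit;
```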
relatively few colors and/or large blocks of the same color).
When attempting to compress a cartoon, for example, you'll find that
JPEG/JFIF will give *lower* quality and a *larger* file size than GIF.
For this type of image, PNG-8 would be a better choice than GIF, and a
much better choice than JPEG
Kevin Goodsell wrote:
Third, only in relatively bad cases will GIF require a byte for every
pixel. For example, I just created a solid white 200 by 200 image.
That's 40,000 pixels. The file size is 345 bytes. One byte per pixel is
what you would get if no compression was used at all (probably
Eamon Daly wrote:
Hi, all. I'm using Imager to create gifs, but the resultant
file sizes are /huge/. I'm writing the files out like so:
Are you doing animations? If not, skip the GIFs. You can get much
better depth [16 million] in a lot less space with JPEG files.
Some of the compression
Hi, all. I'm using Imager to create gifs, but the resultant
file sizes are /huge/. I'm writing the files out like so:
$img->write(type => 'gif',
            max_colors => 16,
            gif_eliminate_unused => 1,
            data => \$data) or die $img->errstr;
I've verified that the
Hi,
This is what I'm doing... but it's not working :-(
find(\&wanted, $Root);
print OUT '</table>
<p>&nbsp;</p>
</body>
</html>';
sub wanted {
    if (-d $File::Find::name) {
        return;
    }
    $file = $File::Find::name;
    $file =~ s/\//\\/g;
    $st = stat($file);    # $st->size needs File::stat's stat()
    $size = $st->size;
    $size
Vasudev.K. wrote:
Hi,
I have this rather critical problem, I am trying to download quite huge
files from a remote server through ftp. (The file being in a zipped
format). I have an array which stores the names of the files to be
downloaded. I am opening each of them up at my end and
Hi,
I have this rather critical problem, I am trying to download quite huge
files from a remote server through ftp. (The file being in a zipped
format). I have an array which stores the names of the files to be
downloaded. I am opening each of them up at my end and extracting data
out of it and
-Original Message-
Vasudev.K. wrote:
.
Q1. After unzipping, the file is huge (even the zipped one is :(( )...
almost 5GB. The system throws the error "File too large" and exits.
How do I get around this? One way I want to do it is to split the unzipped
file into many parts and process
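If the files are gzip-compressed (an assumption; zip archives would need a different module), the uncompressed 5GB never has to exist on disk at all: the archive can be streamed and processed line by line. A sketch:

```perl
use strict;
use warnings;
use IO::Uncompress::Gunzip qw($GunzipError);

# Sketch: stream a gzipped file line by line, calling $code on each
# line, so the full uncompressed file is never materialised.
sub process_gz {
    my ($path, $code) = @_;
    my $z = IO::Uncompress::Gunzip->new($path)
        or die "gunzip failed: $GunzipError";
    while (my $line = $z->getline) {
        $code->($line);
    }
    $z->close;
}

# usage: process_gz('huge.gz', sub { my ($line) = @_; ... });
```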
]
Subject: File size problem
Hi,
I have this rather critical problem, I am trying to download quite huge
files from a remote server through ftp. (The file being in a zipped
format). I have an array which stores the names of the files to be
downloaded. I am opening each of them up at my end and extracting
Hello,
I have to get the size and last modified date of a remote file via URL without
reading in the whole file. I have gone through LWP::UserAgent but couldn't make much
headway. Any pointers on how to do it would be appreciated.
TIA
Shishir
Hi,
I have an upload script, and i want to check the file size before it
uploads.
Any suggestion is appreciated
Anthony
Upload via FTP? Via a web based form? 18 questions left...
John
-Original Message-
From: anthony [mailto:[EMAIL PROTECTED]]
Sent: 22 February 2002 14:22
To: [EMAIL PROTECTED]
Subject: file size
Hi,
I have an upload script, and i want to check the file size before it
uploads.
Any
anthony wrote:
Hi,
I have an upload script, and i want to check the file size before it
uploads.
Any suggestion is appreciated
Anthony
here's some old code that does that, might be something built-in in
CGI.pm as well:
my $tempFile = CGI::tmpFileName($img_filename);
my
Anthony == awards anthony writes:
Anthony Hi, I have an upload script, and i want to check the file
Anthony size before it uploads.
The stat() function returns a list that includes file size as the
seventh element. You can use:
$size = (stat($filename))[7];
... to retrieve
On Feb 22, Chris Ball said:
Anthony == awards anthony writes:
Anthony Hi, I have an upload script, and i want to check the file
Anthony size before it uploads.
The stat() function returns a list that includes file size as the
seventh element. You can use:
$size = (stat($filename))[7];
Hi,
But I tried this: $size = -s $filename; and it didn't work. Anyway, I want
my upload script not to accept files that are bigger than 250Kb.
Anthony
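If this is a web-form upload, `-s $filename` can't work on the server side: the name the browser submits refers to a file on the client's disk. For CGI.pm scripts, one approach is to cap the size of the whole POST before the upload is processed. A sketch (response text is illustrative):

```perl
use strict;
use warnings;
use CGI;

# Sketch: refuse any POST (including file uploads) larger than 250 KB.
$CGI::POST_MAX = 250 * 1024;

my $q = CGI->new;
if (my $err = $q->cgi_error) {
    # cgi_error is set, e.g. "413 Request entity too large", when the
    # incoming request exceeded POST_MAX.
    print $q->header(-status => $err), "Upload too large\n";
    exit;
}
# ... normal upload handling continues here ...
```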
hello--
I was wondering if anyone knew what the file size limitation is for Perl and
if there are any workarounds. I tried to compile it with the gcc flags that
are necessary for LFS (large file support), but to no avail. I don't have a
problem with large files under other programs. Any ideas?
I am running Red Hat
Hi,
I would like to know if, with a Perl script, you can get the size of a file.
I need to get the sizes of 250 files on 250 computers...
thanx
-s
as in:
perl -e 'print "$_: " . (-s $_) . "\n" for glob "*.*"'
Hi,
I would like to know if with a perl script you can get the size of a
file ?
I need to get all the size of 250 files on 250 computers ...
thanx