>  .------[ Dan Muey wrote (2003/03/04 at 15:04:30) ]------
>  | 
>  |  > Long ago in GW Basic there were sequential files and random 
>  |  > access files. Does perl have the latter? I only need to get (and 
>  |  > then put) certain info in a certain place every time. If I 
>  |  > can avoid writing the whole file every time I bet my web site 
>  |  Every time what??
>  |  
>  |  
>  |  I'm not familiar with GW Basic but there are lots of ways 
>  |  to modify files.
>  |  
>  |  How big is your file? What format?
>  |  
>  |  I'd use File::Slurp, although you may still be writing 
>  |  the entire file.
>  |  
>  |  If you just did 
>  |  
>  |  use File::Slurp;
>  |  
>  |  $contents = read_file("myfile.txt");
>  |  $contents .= "Here is a new line for my file!\n";
>  |  write_file("myfile.txt", $contents);
>  |  
>  |  That script takes not quite one second for a 2.5 megabyte 
>  |  file for me. If that puts your hosting provider in a 
>  |  pinch then I'd get another hosting provider!!!
>  |  
>  |  DMuey
>  |  
>  `-------------------------------------------------
> 
>     No offense, but this isn't very efficient. It might be 
>     fine on a 2.5 MB file, but what about a 2,500 MB one? I bet your 
>     hosting company would freak out then.  Try this instead: 
> 
>     open(OUT, ">>myfile.txt"); 
>     print OUT "Here is a new line for my file!\n"; 
>     close(OUT); 

None taken since you missed my point.

Very true, it's not efficient, but I was doing that for illustrative purposes since I 
didn't know exactly what he was trying to do. Perhaps I should have labeled it as such 
instead of assuming it'd be obvious.

Do this with a 1,000,000,000,000 MB file if you want, and it's only one line:
( yes, I know it basically does the exact same thing as the code you have above, but it's
modular boogy woogy woogy :D )

append_file("myfile.txt", "Here is a new line for my file!");
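For anyone curious what append_file() is doing under the hood, it's roughly equivalent to this core-Perl sketch ( the sub name and filename here are just for illustration, not part of File::Slurp ):

```perl
# Rough equivalent of File::Slurp's append_file(): open the file in
# append mode, write, close. The existing contents are never read,
# so the cost doesn't grow with the size of the file.
sub my_append_file {
    my ( $file, @lines ) = @_;
    open( my $fh, '>>', $file ) or die "Can't append to $file: $!";
    print {$fh} @lines;
    close($fh);
}

my_append_file( "myfile.txt", "Here is a new line for my file!\n" );
```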

The point was basically that there are lots of ways of doing things quickly and efficiently
without getting bogged down trying to use obscure methods from other
languages that will end up taking you lots of time to write/test/debug and run.

I was assuming ( again, we all know what happens when we assume ;p )
he'd want to do something with the contents first and then
modify the file, rather than just appending one line. So sorry if I misled, confused, or
otherwise incited fury.

Basically, File::Slurp handles the messy details of opening files.
sysopen() just calls open with special settings, and that still leaves you with
reading, writing, or appending the data yourself.
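To show what "special settings" means, here's a small sketch of sysopen() with explicit Fcntl flags ( the filename is just an example ):

```perl
use Fcntl qw(O_WRONLY O_APPEND O_CREAT);

# sysopen() is open() with explicit flag control: here write-only,
# append-at-end, create-if-missing. 0666 is the creation mode,
# which gets filtered through the process umask.
sysopen( my $fh, "myfile.txt", O_WRONLY | O_APPEND | O_CREAT, 0666 )
    or die "Can't sysopen myfile.txt: $!";
print {$fh} "Appended via sysopen\n";
close($fh);
```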

So with all of this typing we probably haven't helped this guy very much.

If you want to read and modify something in the middle of a file, you can use 
seek() and related functions if you want to get really tricky.

perldoc -f seek

That should also mention the similar/related functions.
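To sketch the idea ( record length and filename here are assumptions, just for illustration ): with fixed-length records you can seek() straight to a record and overwrite it in place, much like GW Basic random access files.

```perl
# Random access with fixed-length records: seek() to record N and
# overwrite it in place, without rewriting the rest of the file.
my $RECLEN = 16;              # assumed fixed record length
my $file   = "records.dat";   # illustrative filename

# Create a file containing three 16-byte records.
open( my $fh, '+>', $file ) or die "Can't open $file: $!";
print {$fh} sprintf( "%-${RECLEN}s", "record $_" ) for 0 .. 2;

# Jump to record 1 and replace it; records 0 and 2 are untouched.
seek( $fh, 1 * $RECLEN, 0 ) or die "seek failed: $!";
print {$fh} sprintf( "%-${RECLEN}s", "UPDATED" );

# Read record 1 back to confirm.
seek( $fh, 1 * $RECLEN, 0 );
read( $fh, my $rec, $RECLEN );
close($fh);
```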

Hope that helps!

DMuey

> 
>     The above will append a line of text onto the end of the file
>     without the need to "slurp" the entire thing into RAM.  
> 
>     I'm of the opinion that you shouldn't slurp an entire file into 
>     RAM ever. Even if you have the RAM to spare, even if the file is 
>     small, etc., it's just not a good practice to get into. 
> 
>     As to the original question, Perl does have similar concepts to
>     sequential and random access files. They aren't much like 
>     GW Basic, however. 
> 
>     I would suggest reading up on your perldoc for open() and 
>     sysopen(); probably reading the Perl Cookbook would be a good 
>     idea to get a better handle on this stuff. 
> 
>  ---------------------------------
>    Frank Wiles <[EMAIL PROTECTED]>
>    http://frank.wiles.org
>  ---------------------------------
> 
> 

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
