Giulio wrote:
Hi,
I'm developing an application that uploads files to a server using HTTP.
The app calls a PHP script on the server, sending all the data just like a web form POST.


It works just fine for uploading small files; now I'm thinking about having the application split large files into small chunks (about 256 KB or so) and sending them with multiple POSTs. I want to do that to avoid possible problems with PHP's maximum temporary file size.

No problem on the client side (it sends a form with some info and the first chunk, waits for the server's response, then posts another form with the next chunk of the file, and so on), and on the PHP side it should also be quite easy to store the file from the first post and keep appending the subsequent chunks to it, at least using the filesystem functions.
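For instance, the receiving script could be a minimal sketch along these lines (the 'chunk' and 'filename' form field names and the uploads/ path are just placeholders, not tested code):

    <?php
    // receive-chunk.php - append each posted chunk to the target file
    $target = 'uploads/' . basename($_POST['filename']);   // final file on the server
    $chunk  = $_FILES['chunk']['tmp_name'];                 // this POST's piece of the file

    // 'ab' creates the file on the first chunk and appends on the later ones
    $out = fopen($target, 'ab');
    fwrite($out, file_get_contents($chunk));
    fclose($out);
    ?>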

My problem is that, to make the system as general as possible and have it work even on servers where PHP doesn't have write privileges, I'm using the FTP functions instead of the filesystem functions, and with FTP it seems it isn't possible to append to a file. I also thought of using fopen() with an ftp:// address, but the docs say that fseek() (which I'd need to position the pointer at EOF to keep appending) may not work when a file is opened over FTP or HTTP.

Any suggestions about this issue?

ftp_put() has a startpos parameter. I think you need to turn FTP_AUTOSEEK off, because you only have a partial file.
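Roughly something like this (hostname, credentials and paths are placeholders; with autoseek left on, ftp_put() would also seek inside the local file, which is wrong here since the local file is only the chunk):

    <?php
    $chunk_tmp = $_FILES['chunk']['tmp_name'];   // the uploaded chunk
    $remote    = '/upload/bigfile.dat';          // target file on the FTP server

    $conn = ftp_connect('ftp.example.com');
    ftp_login($conn, 'user', 'password');

    // Disable autoseek so the local chunk is read from its beginning
    ftp_set_option($conn, FTP_AUTOSEEK, false);

    // Start writing at the current end of the remote file (-1 = doesn't exist yet)
    $size   = ftp_size($conn, $remote);
    $offset = ($size > 0) ? $size : 0;

    ftp_put($conn, $remote, $chunk_tmp, FTP_BINARY, $offset);
    ftp_close($conn);
    ?>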





