Rather than using the callback, why couldn't reading the log tell you the 'success' after each file upload? Then you could just add those names one at a time, including the elapsed time and upload rate.
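For what it's worth, libUrl can route its FTP transcript into a field with libUrlSetLogField, so the log-reading idea might be sketched like this (field and handler names are made up, and the 226 reply code is the usual but not guaranteed server response):

```
-- sketch only: route libUrl's log into a field, then scan it
-- after each transfer for the server's "transfer complete" reply
on openStack
  libUrlSetLogField the long id of field "ftpLog"
end openStack

on checkLog
  repeat for each line tLine in field "ftpLog"
    -- most FTP servers report a completed upload with a 226 reply
    if char 1 to 3 of tLine is "226" then
      put tLine & cr after field "uploadedFiles"
    end if
  end repeat
end checkLog
```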
Jim Ault
Las Vegas

On 8/2/06 2:39 AM, "Mark Schonewille" <[EMAIL PROTECTED]> wrote:

> Hi Dave,
>
> Thanks for taking this up.
>
> I can't use the callback message to get a file listing, because the
> user of my software may wish to upload hundreds, if not thousands, of
> files at once. The callback message would be called hundreds of times,
> and if the callback message caused the directory listing to update, I
> would probably run into serious trouble, calling getURL hundreds of
> times.
>
> So, I check whether each file has been handled, and if none of the
> files is still being uploaded, downloaded, queued, etc., I get the url
> of the directory twice, to get one listing with and one without
> additional information. This works perfectly well if only one file is
> uploaded, but the "wait" commands in the library block the whole
> thing if I want to get a directory listing after uploading more than
> one file. So, yes, I am using getURL twice, and I use the
> libUrlFtpCommand command to set the type of listing I get back from
> the server, but this always worked fine until I started uploading
> multiple files.
>
> All my repeat loops now contain "with messages", and the repeat loop
> that calls the libUrlFtpUpload handler has been changed into a
> regular handler with "send xxxxx in yy millisecs to me" at the end.
> Although this seems to help for a small number of small files, I
> still have problems if I upload two large files, for example.
>
> Currently, I am handling only one batch at a time, but I intend to
> create a programme that uploads and downloads files completely
> asynchronously, refreshes directory lists, deletes files, etc., just
> like any other ftp programme. This means that all libURLftpUpload and
> -Download commands, get URL commands, and libUrlFtpCommand commands
> should be sent independently of each other. Are you implying this is
> impossible with the libUrl library?
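For reference, the checking step Mark describes could be done without "wait" by polling libUrl's URLStatus function and deferring with "send in time"; a rough sketch only, with invented variable names and urls, using libUrlSetFTPListCommand to switch between the short (NLST) and long (LIST) listing forms:

```
-- sketch: poll until no upload url is still in flight, then
-- fetch the directory listing twice (short and long form)
local sUploadUrls  -- one ftp url per line, filled when uploads start

on pollUploads
  repeat for each line tUrl in sUploadUrls
    -- URLStatus returns e.g. "queued", "uploading 1024,2048",
    -- "uploaded", "error ...", "timeout"
    if word 1 of URLStatus(tUrl) is among the words of \
        "queued contacting requested uploading loading" then
      -- something is still busy; check again shortly
      send "pollUploads" to me in 500 milliseconds
      exit pollUploads
    end if
  end repeat
  -- nothing in flight any more: get both listing styles
  libUrlSetFTPListCommand "NLST"
  put url "ftp://user:pass@host/folder/" into tShortList
  libUrlSetFTPListCommand "LIST"
  put url "ftp://user:pass@host/folder/" into tLongList
end pollUploads
```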
> Can you explain what exactly causes the handlers in the library to
> block each other? Maybe we can change it?
>
> If you want, you can download ecxFTP from the Economy-x-Talk
> homepage. If you drag a bunch of files into the main window, wait
> until they have been uploaded, and then try to navigate to a
> different directory, you will get an error similar to "URL 'ftp://
> blabla' is blocking. Do you want to reset?". The url points to a
> directory on the server, not to a file. So, it is the GetURL handler
> that is blocking here. (Since I am still working on the application,
> tomorrow's version may not display this error anymore.)
>
> Thanks for your time and best regards,
>
> Mark
>
> --
>
> Economy-x-Talk
> Consultancy and Software Engineering
> http://economy-x-talk.com
> http://www.salery.biz
>
> Download ErrorLib at http://economy-x-talk.com/developers.html and
> get full control of error handling in Revolution.
>
>
> On 2 Aug 2006, at 9:00, Dave Cragg wrote:
>
>> On 1 Aug 2006, at 23:42, Mark Schonewille wrote:
>>
>>> I'll probably post an enhancement request to bugzilla regarding
>>> this problem, if there isn't an entry in BZ yet.
>>
>> Mark, I didn't see the earlier post on this. (I was away.)
>>
>> From your earlier post:
>>
>>> Right after uploading a dozen files to my ftp account
>>> asynchronously using the libUrlFtpUploadFile command, I can't get
>>> a directory listing using the syntax: get url "ftp://
>>> name:[EMAIL PROTECTED]/folder/" (note the trailing slash, as I
>>> want a listing). Most of the time I receive the error "Error
>>> Previous request not completed".
>>>
>>> It appears the GetURL handler in the revLibURL script doesn't let
>>> me get the url because the variable lvBlockingUrl is not empty.
>>> Although everything has been uploaded/downloaded already,
>>> lvBlockingUrl indicates that one or more transfers are still in
>>> progress.
>>> So, in the GetURL handler, I changed the line which
>>> checks whether any urls are currently being handled and added a
>>> variable to force the handler to run normally:
>>
>> Are you doing anything else other than libUrlFtpUploadFile followed
>> by "get url" for the listing? For example, is there any chance your
>> script is calling "get url" twice? Or do you make any other url
>> calls during this process?
>>
>> The reason I'm asking is that you pointed out a possible problem
>> with the lvBlockingUrl variable in the libUrl script. But this is
>> not set by libUrlFtpUploadFile, so it is unlikely to be the issue here
>> if you are only using libUrlFtpUploadFile followed by a single "get
>> url" call.
>>
>> Another possible source of a "Previous request not completed"
>> message is the libUrlFtpCommand handler. Are you using that anywhere?
>>
>> How are you checking that the uploads are complete before getting
>> the directory listing? The "normal" way would be to use the
>> callback message from libUrlFtpUploadFile. If you could let me see
>> the script you are using, I'll give it a run.
>>
>> Cheers
>> Dave

_______________________________________________
use-revolution mailing list
[email protected]
Please visit this url to subscribe, unsubscribe and manage your subscription preferences:
http://lists.runrev.com/mailman/listinfo/use-revolution
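The callback approach Dave mentions need not trigger hundreds of getURL calls: the callbacks can simply be counted, and the listing requested once, after the last one arrives. A hedged sketch (handler names, variable names, and urls are all invented; libUrl sends the callback message with the url and its final status as parameters):

```
local sPending  -- number of uploads still awaiting a callback

on uploadAll pFileList, pDirUrl
  put the number of lines of pFileList into sPending
  set the itemDelimiter to "/"
  repeat for each line tFile in pFileList
    -- item -1 of the path is the bare file name
    libUrlFtpUploadFile tFile, pDirUrl & item -1 of tFile, "uploadDone"
  end repeat
end uploadAll

on uploadDone pUrl, pStatus
  subtract 1 from sPending
  if sPending is 0 then
    -- every upload has reported in; one listing request is enough
    get url "ftp://user:pass@host/folder/"
  end if
end uploadDone
```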
