ls will list the files in the folder - do it before and after the ftp command, and if there's more afterwards then you have something. A more intelligent (and repeatable) way is storing the filenames/dates in a table - which allows better tracking of what, when & why.
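For example, something along these lines (a rough sketch only - the table name, columns, credentials and filename are placeholders, and it uses plain PDO rather than Doctrine):

<?php
// Sketch of a download-tracking table (MySQL):
//
//   CREATE TABLE download_log (
//     id            INT AUTO_INCREMENT PRIMARY KEY,
//     filename      VARCHAR(255) NOT NULL UNIQUE,
//     downloaded_at DATETIME NULL,  -- stamped when the transfer finishes
//     processed_at  DATETIME NULL   -- stamped when the import finishes
//   );

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$filename = 'listings-2009-09-30.zip'; // example value

// Register the file before fetching it (INSERT IGNORE makes reruns harmless).
$stmt = $pdo->prepare('INSERT IGNORE INTO download_log (filename) VALUES (?)');
$stmt->execute(array($filename));

// ...transfer the file here...

// Stamp it complete only after the transfer returns successfully, so a
// half-downloaded file is never picked up for processing.
$stmt = $pdo->prepare('UPDATE download_log SET downloaded_at = NOW() WHERE filename = ?');
$stmt->execute(array($filename));

That way "what have I downloaded and what still needs processing" becomes a single query instead of a directory diff.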
As was pointed out earlier - you can reconnect - but I would imagine that it would complicate matters down the line. If a download takes 14.5 mins and you then start processing the files with 30s on the clock - what happens when it times out in the middle of processing? You may also end up with corrupted files, cut off when the 15 mins are up. Is the data in the files atomic? Does the order of files matter? If not, then have 2 cron jobs - one for just downloading the files, the other for processing them. It's usually better to divide & conquer than to ball-of-wax it with these things. Also plan for things going wrong - with these types of problems, things do go wrong - so having a split approach allows for recovery later.
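To make the split concrete, it could look something like this (an untested sketch - host, credentials and paths are all made up). The .part-then-rename trick is the key bit: rename() within one filesystem is atomic, so the processing job can never see a half-written file.

<?php
// download.php - job #1: fetch only, never process.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'pass');
ftp_pasv($conn, true);

$files = ftp_nlist($conn, '/listings');
foreach ($files ? $files : array() as $remote) {
    $name  = basename($remote);            // some servers return full paths
    $local = '/tmp/incoming/' . $name;
    if (file_exists($local)) {
        continue; // fetched on an earlier run
    }
    // Download under a temporary name; rename only on success.
    if (ftp_get($conn, $local . '.part', '/listings/' . $name, FTP_BINARY)) {
        rename($local . '.part', $local);
    }
}
ftp_close($conn);

And the second job:

<?php
// process.php - job #2: only ever sees fully renamed files.
foreach (glob('/tmp/incoming/*.zip') as $file) {
    // unzip, insert the rows into the property table, then move $file
    // to an archive directory so it is not processed twice
}

Each run picks up wherever the last one got cut off, so the 15-minute limit just means more runs, not lost work.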
On Wed, 30 Sep 2009 16:45:53 +0200, Tom Haskins-Vaughan <[email protected]> wrote:

> Thanks, David,
>
> Unfortunately, I don't really have access to cron jobs. Well, I can
> schedule jobs, but I can't run any shell commands. I can just call php
> files.
>
> The problem I have is knowing if a file that is downloaded has
> finished downloading. Because I have a 15 minute max execution time,
> a file could be partially downloaded. There may be another way of
> checking to see if a file is complete or not, but I can't think of it.
>
> On Wed, Sep 30, 2009 at 9:30 AM, david <[email protected]> wrote:
>>
>> Tom,
>>
>> Have you tried tackling this problem from another direction?
>> If you can set up cron jobs - then couldn't you:
>> 1) Use standard ftp commands (rather than via PHP) to download the
>> files in the cron job itself.
>> 2) In the same cron job - after the files have been transferred -
>> start the PHP side to process the files.
>>
>> On Wed, 30 Sep 2009 15:16:02 +0200, Tom Haskins-Vaughan
>> <[email protected]> wrote:
>>
>>> Sorry to bump this, guys, but it's holding up my development. Can
>>> anyone even just confirm or deny whether it's possible to reconnect
>>> to a Doctrine database in the middle of a script? If it's
>>> impossible, I'll just use a native mysql connection for this part.
>>>
>>> Thanks,
>>>
>>> Tom
>>>
>>> On Mon, Sep 28, 2009 at 8:59 AM, Tom Haskins-Vaughan
>>> <[email protected]> wrote:
>>>>
>>>> OK, so assuming I have in fact lost a connection during a script
>>>> due to inactivity, is there a way to open a Doctrine connection in
>>>> the middle of a script, or is that just all taken care of at
>>>> initialization?
>>>>
>>>> TIA,
>>>>
>>>> Tom
>>>>
>>>> On Sep 24, 8:58 am, Tom Haskins-Vaughan <[email protected]> wrote:
>>>>> I'm running the script as a task with a cron job. I'm not
>>>>> generating the file; the files are held in a remote ftp directory,
>>>>> posted by a real estate listing service. They're basically
>>>>> pipe-delimited files of new and modified properties for sale.
>>>>>
>>>>> So my flow is:
>>>>>
>>>>> 1. download file, creating log as I go
>>>>> 2. unzip file if download was completed
>>>>> 3. process file and insert data into property table
>>>>>
>>>>> As I say, I use the log to keep track of the status of each file.
>>>>>
>>>>> Jeremy Thomerson wrote:
>>>>> > Are you generating this file during an HTTP request that is also
>>>>> > downloading it?
>>>>>
>>>>> > Although I still don't understand what you're doing, try
>>>>> > changing it to a command line task rather than something that is
>>>>> > triggered by HTTP. Then execute it from the shell or cron. Then
>>>>> > you can just rsync the files.
>>>>>
>>>>> > Jeremy
>>>>>
>>>>> > On Thu, Sep 24, 2009 at 7:39 AM, Tom Haskins-Vaughan
>>>>> > <[email protected]> wrote:
>>>>>
>>>>> > Well, basically, I have to download loads of files, unzip them
>>>>> > and process the content. But I only get 15 minutes of execution
>>>>> > time with my current host, so I have a log table which tells me
>>>>> > where I am and where to pick up from the next time the script
>>>>> > runs (every 20 minutes). I need to know that the file has
>>>>> > downloaded correctly before I unzip it, so after it's downloaded
>>>>> > I enter the download_completed_at field. Then I know that I can
>>>>> > unzip it the next time around.
>>>>>
>>>>> > The problem is, I'm losing the connection during the time it
>>>>> > takes to download the file.
>>>>>
>>>>> > Eno wrote:
>>>>> > > On Wed, 23 Sep 2009, Tom Haskins-Vaughan wrote:
>>>>>
>>>>> > >> But the problem is, by the time I've downloaded the file,
>>>>> > >> I've lost the connection (SQLSTATE[HY000]: General error:
>>>>> > >> 2006 MySQL server has gone away)
>>>>>
>>>>> > >> Is there any way to reconnect the database?
>>>>>
>>>>> > > What does a file download have to do with your database?
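PS - re: the question below about knowing whether a download actually finished: if your host's PHP has the ftp extension enabled, you can also compare the size the server reports against what's on disk before trusting the file (sketch only - $conn, $remoteFile and $localFile are placeholders you'd already have in your script):

<?php
// Assumes $conn is an open ftp handle and the two paths are set.
$remoteSize = ftp_size($conn, $remoteFile); // returns -1 if the server won't say
$localSize  = file_exists($localFile) ? filesize($localFile) : 0;

if ($remoteSize > 0 && $remoteSize === $localSize) {
    // sizes match - safe to set download_completed_at and unzip next run
} elseif (file_exists($localFile)) {
    // partial download - remove it and let the next cron run retry
    unlink($localFile);
}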

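PS2 - re: reconnecting Doctrine in the middle of a script: with Doctrine 1.x (what symfony 1.x ships with) you should be able to close the current connection and open it again before the next query. Untested, so verify against your version:

<?php
// After the long download, throw away the handle MySQL has already
// timed out and establish a fresh one before the next query.
$conn = Doctrine_Manager::getInstance()->getCurrentConnection();
$conn->close();
$conn->connect();

Failing that, your fallback of a plain PDO connection just for this step is perfectly reasonable.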