[PHP] huge time/memory consuming script

2001-03-25 Thread Christian Dechery

I'm thinking about building a PHP script that loads an entire database on a 
weekly basis, based on a text catalog.

The reason I'm doing it is that the catalog is too generic and has lots of 
stuff that needs to be done to it (such as lots of string replacements and 
'fixing') before I can really parse it and create the INSERT statements.
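
The kind of 'fixing' I mean is roughly this (the filename, the replacements 
and the code;description;price layout are just examples, not the real 
catalog format):

<?php
// open a connection for the INSERTs (connection details made up)
mysql_connect("localhost", "user", "pass");
mysql_select_db("catalog_db");

// read the raw catalog line by line and build the INSERT statements
$fp = fopen("catalog.txt", "r");
while (!feof($fp)) {
    $line = trim(fgets($fp, 4096));
    if ($line == "") continue;

    // the 'fixing': odd delimiters, doubled quotes, that kind of thing
    $line = str_replace("|#|", ";", $line);
    $line = str_replace("''", "'", $line);

    // example layout: code;description;price
    list($code, $desc, $price) = explode(";", $line);
    mysql_query("INSERT INTO products (code, description, price) VALUES ('"
        . addslashes($code) . "', '" . addslashes($desc) . "', "
        . (float) $price . ")");
}
fclose($fp);
?>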

The doubt I have is this: it's a pretty BIG catalog, I'm talking about more 
than 20MB of text files, and I have to 'clean' the whole DB and load it 
entirely again (I don't make the rules, this is how it works)...
I'm worried about the time it will take and the memory it'll cost, so I was 
thinking of a step-by-step solution, like breaking the process into several 
'steps' that would, of course, be carried out automatically.

Like I'd have
<?php
// do this, this, this and that
// reload the page to go to the next step
?>

So... just by reloading the page, do I get a good 'memory cleanup', so I can 
be sure everything is going to get carried out?... There are steps that 
I'm worried will take over 30 minutes to finish, like loading the main table 
with 140,000 rows...
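
To be clearer, this is roughly the kind of thing I have in mind (the step 
functions, the 'step' URL parameter and the load_catalog.php filename are 
just placeholders, not code I actually have yet):

<?php
// keep PHP from killing the script after max_execution_time (default 30 seconds)
set_time_limit(0);

// which step to run (passed in the URL)
$step = isset($HTTP_GET_VARS['step']) ? (int) $HTTP_GET_VARS['step'] : 1;

switch ($step) {
    case 1:
        clean_database();     // wipe the old data
        break;
    case 2:
        fix_catalog_files();  // all the string replacements and 'fixing'
        break;
    case 3:
        load_main_table();    // the big 140,000-row load
        break;
    default:
        die("all steps done");
}

// send the browser on to the next step, so it runs as a fresh request
header("Location: load_catalog.php?step=" . ($step + 1));
?>

The idea being that since every step is a brand new request, PHP would start 
each one with fresh memory.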

thanks...

. Christian Dechery (lemming)
. http://www.tanamesa.com.br
. Gaita-L Owner / Web Developer


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]




RE: [PHP] huge time/memory consuming script

2001-03-25 Thread Brooks, Ken

I don't know exactly what you are asking.

But I'm running a script as we speak that is pulling 75,000+ rows from
an Oracle database over ODBC (from a remote server), creating a table
in a local MySQL database, and dumping all the records into it.
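
The guts of it are pretty simple, roughly like this (the DSN, credentials 
and table layout here are made up, not what I'm actually running):

<?php
set_time_limit(0);

// remote Oracle via ODBC
$ora = odbc_connect("oracle_dsn", "user", "pass");

// local MySQL
$my = mysql_connect("localhost", "user", "pass");
mysql_select_db("localdb", $my);

mysql_query("DROP TABLE IF EXISTS items", $my);
mysql_query("CREATE TABLE items (id INT, name VARCHAR(100), price DECIMAL(10,2))", $my);

// loop over the ODBC result set and fire one INSERT per row into MySQL
$res = odbc_exec($ora, "SELECT id, name, price FROM items");
while (odbc_fetch_row($res)) {
    $id    = odbc_result($res, 1);
    $name  = addslashes(odbc_result($res, 2));
    $price = odbc_result($res, 3);
    mysql_query("INSERT INTO items VALUES ($id, '$name', $price)", $my);
}

odbc_close($ora);
mysql_close($my);
?>

Nothing fancy, just one INSERT per row.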

-ken

-Original Message-
From: Christian Dechery [mailto:[EMAIL PROTECTED]]
Sent: Sunday, March 25, 2001 10:53 AM
To: [EMAIL PROTECTED]
Subject: [PHP] huge time/memory consuming script


I'm thinking about building a PHP script that loads an entire database on a 
weekly basis, based on a text catalog.

The reason I'm doing it, it's because the catalog is too generic and has 
lots off stuff to be done in it (such as lots of string replacements and 
'fixing') before I can really parse it and create the insert statements.

The doubt I have is this: it's kinda of BIG catalog, I'm talking about more 
than 20MB of text files, and I have to 'clean' the whole DB and load it 
entirely again (I don't make the rules, this is how it works)...
I'm worried about the time it will take and the memory it'll cost, so I was 
thinking of a step-by-step solution, like break the process into several 
'steps' that of course would be carried out automatically.

Like I'd have
?php
do this this this and that
reload the page to go to the next step
?

So... just by reloading the page, I get a good 'memory cleaning' so I can 
be sure everything is going to get carried out??... there are steps that 
I'm worried about taking over 30min to finish, like loading the main table 
with 140.000 rows...

thanks...

. Christian Dechery (lemming)
. http://www.tanamesa.com.br
. Gaita-L Owner / Web Developer


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]