On Wed, 2008-04-30 at 12:30 +1000, Matthew Hannigan wrote:
> On Wed, Apr 30, 2008 at 12:08:50PM +1000, Howard Lowndes wrote:
> > 
> > The process is:
> > Read a line from the file.
> > Decompose the line into the data elements.
> > For each data element, do a select on the database to see whether it
>                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
> > already exists, if not then do an insert into the database.
> > Rinse and repeat...
> > 
> > It certainly has the smell of a PHP memory leak, but I am just not
> > sure how to work around it.
> 
> You do a select before every insert?!  Is the table indexed?
> If not, that might explain the slowdown; a select WILL take
> longer the bigger the table.
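
For reference, the pattern described above is roughly this (a sketch
only, assuming PDO and made-up table/column names, since the original
code isn't shown):

    <?php
    $db = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

    $select = $db->prepare('SELECT 1 FROM items WHERE name = ?');
    $insert = $db->prepare('INSERT INTO items (name) VALUES (?)');

    $fh = fopen('data.txt', 'r');
    while (($line = fgets($fh)) !== false) {
        foreach (explode(',', trim($line)) as $element) {
            // One SELECT per element: without an index on name, each
            // lookup scans the whole table, so it gets slower as the
            // table grows.
            $select->execute(array($element));
            if ($select->fetchColumn() === false) {
                $insert->execute(array($element));
            }
            $select->closeCursor();
        }
    }
    fclose($fh);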

Perhaps better to put a unique index on the appropriate column(s), then
just do an insert and throw away the error if the data is already there.
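
Something like this (again only a sketch, assuming PDO, the same
made-up table, and a database that reports duplicate keys as
SQLSTATE 23000):

    <?php
    // Run once beforehand, e.g. in the database client:
    //   CREATE UNIQUE INDEX items_name_idx ON items (name);
    $db = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
    $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $insert = $db->prepare('INSERT INTO items (name) VALUES (?)');

    $fh = fopen('data.txt', 'r');
    while (($line = fgets($fh)) !== false) {
        foreach (explode(',', trim($line)) as $element) {
            try {
                $insert->execute(array($element));
            } catch (PDOException $e) {
                // SQLSTATE 23000 = integrity constraint violation,
                // i.e. the row is already there: throw the error away.
                if ($e->getCode() != '23000') {
                    throw $e;
                }
            }
        }
    }
    fclose($fh);

On MySQL you can get the same effect with INSERT IGNORE and skip the
try/catch entirely. Either way the duplicate check is done inside the
database via the index, instead of a separate SELECT round trip for
every element.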

-- 
Thanks,
Sonia Hamilton
http://www.snowfrog.net
http://training.snowfrog.net
http://www.linkedin.com/in/soniahamilton
Your manuscript is both good and original; but the part that is good is
not original and the part that is original is not good - Samuel Johnson.

-- 
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
