RE: [PHP-DB] Copying large volumes of data to a DB

2005-07-04 Thread Bastien Koert

How is the data coming down? Text?

Bastien


From: Alan Lord [EMAIL PROTECTED]
To: php-db@lists.php.net
Subject: [PHP-DB] Copying large volumes of data to a DB
Date: Mon, 04 Jul 2005 22:45:04 +0100

Hi All,

I'm starting to build an app to manage my eBay sales. Now one of the things 
I need to do is to write the Category information into a local DB.


Once the initial download is done I'm O.K. about updating the content, but my question relates more to this:


What is the best way to manage the megabytes of data which will be coming down 
a relatively slow line (DSL) and then write them into my DB?


Should I have some sort of buffer where I read n bytes or records, write those 
to the DB, and repeat until the process is finished? Or do I wait until the lot 
has downloaded and then write it to my DB in one big gulp?


I've done plenty of reading/writing to/from DBs, but it has always been single 
records in the past, and I'm just wondering about the best way to handle 
this.


Thanks in advance

Alan

--
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php






RE: [PHP-DB] Copying large volumes of data to a DB

2005-07-04 Thread Alan Lord
HI,

Sort of yes.

It is actually transmitted as SOAP but I'm using a PHP PEAR package
called Services_Ebay which manages the SOAP/XML interface and presents
the data to me in UTF-8 text.
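Since the records arrive as UTF-8 text, the buffered approach from the original question (collect n records, write them, repeat) could be sketched roughly as below. The table name, columns, and batch size are all illustrative assumptions, not anything from this thread, and the actual query call is stubbed out:

```php
<?php
// Sketch only: ebay_categories, its columns, and the batch size of 500
// are made-up assumptions for illustration.

$batchSize = 500;

// Stand-in for the category records Services_Ebay would hand back.
$records = [];
for ($i = 1; $i <= 1200; $i++) {
    $records[] = ['id' => $i, 'name' => "Category $i"];
}

$chunkSizes = [];
foreach (array_chunk($records, $batchSize) as $chunk) {
    // Build one multi-row INSERT per chunk (MySQL-style syntax), so each
    // round-trip to the DB carries many records instead of one.
    $rows = [];
    foreach ($chunk as $r) {
        $rows[] = sprintf("(%d, '%s')", $r['id'], addslashes($r['name']));
    }
    $sql = 'INSERT INTO ebay_categories (id, name) VALUES '
         . implode(', ', $rows);
    // $db->query($sql);  // would execute here against a real connection
    $chunkSizes[] = count($chunk);
}
```

This keeps memory use bounded while the slow download is still in progress. Depending on the database, wrapping each chunk in a transaction, or using a bulk-load facility such as MySQL's LOAD DATA INFILE, may be faster still.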

Thanks

Alan 

-Original Message-
From: Bastien Koert [mailto:[EMAIL PROTECTED] 
Sent: 04 July 2005 23:39
To: [EMAIL PROTECTED]; php-db@lists.php.net
Subject: RE: [PHP-DB] Copying large volumes of data to a DB

How is the data coming down? Text?

Bastien
