Re: [PHP-DB] SELECT string
Correct me if I'm wrong; I can't quite picture what you describe below.

----- Original Message -----
From: "Ron Piggott" <[EMAIL PROTECTED]>
To: "PHP DB"
Sent: Tuesday, April 24, 2007 11:31 AM
Subject: [PHP-DB] SELECT string

> I am looking for help to write a SELECT syntax to help me process a
> directory searching query tool I am developing.

Do you have a directory tree like this?

root
-include
-main
-body
--admin
--user

And you want to search for a file inside that directory? If so, why not write a function that reads the directory and turns it into database INSERT queries:

1. read every file in the directory
2. build an INSERT query for each one
3. refresh the data (repair the table?) as needed

Then you can point your SELECT at the database instead of at the directory itself; a rough sketch of that idea follows below.

> If you start at
> http://www.actsministrieschristianevangelism.org/ministrydirectory/ and
> under 'Step 1:' click Business a form is displayed.
>
> My question is how would you generate the SELECT syntax for the search
> results "Could Include" a given category and "Must Include" a given
> category based on what the user has inputted through this form?

By "Must Include", do you mean "require"?

> Ron
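As a rough illustration of the index-then-query idea above (not Ron's actual schema: the table name `directory_files`, the column names, and the connection details are all made up for the example):

<?php
// Hypothetical sketch: walk a directory once, store what we find in MySQL,
// and let later searches hit the table instead of the filesystem.
$link = mysql_connect('localhost', 'user', 'password') or die(mysql_error());
mysql_select_db('testdb', $link) or die(mysql_error());

// 1. Read every file under the root directory.
$root = '/path/to/root';
$iterator = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($root));

foreach ($iterator as $file) {
    if (!$file->isFile()) {
        continue;
    }
    // 2. Build an INSERT for each file (escaping the values).
    $path = mysql_real_escape_string($file->getPathname(), $link);
    $name = mysql_real_escape_string($file->getFilename(), $link);
    mysql_query("INSERT INTO `directory_files` (`path`, `name`)
                 VALUES ('$path', '$name')", $link) or die(mysql_error());
}

// 3. Later searches target the table, not the directory.
$result = mysql_query("SELECT `path` FROM `directory_files`
                       WHERE `name` LIKE '%report%'", $link);
while ($row = mysql_fetch_assoc($result)) {
    echo $row['path'], "\n";
}
?>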
[PHP-DB] SELECT string
I am looking for help to write a SELECT syntax to help me process a directory searching query tool I am developing.

If you start at http://www.actsministrieschristianevangelism.org/ministrydirectory/ and under 'Step 1:' click Business, a form is displayed.

My question is how would you generate the SELECT syntax for the search results: "Could Include" a given category and "Must Include" a given category, based on what the user has inputted through this form?

Ron
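One common way to translate that kind of form into SQL (purely illustrative; the table names `listing` and `listing_category` and the category-id arrays are assumptions, not Ron's real schema): treat the "Could Include" boxes as an OR (an IN list) and the "Must Include" boxes as an AND, for example by requiring the listing to match every must-have category.

<?php
// In practice these would be built from the submitted form, e.g.
// $could = array_map('intval', $_POST['could']);
$could = array(3, 7, 9);   // "could include" category ids
$must  = array(2, 5);      // "must include" category ids

$sql = "SELECT l.* FROM `listing` l
        JOIN `listing_category` lc ON lc.listing_id = l.id
        WHERE 1=1";

if (count($could) > 0) {
    // Any one of these categories is enough.
    $sql .= " AND lc.category_id IN (" . implode(',', $could) . ")";
}

foreach ($must as $categoryId) {
    // Each must-have category gets its own EXISTS test, so all are required.
    $sql .= " AND EXISTS (SELECT 1 FROM `listing_category` m
                          WHERE m.listing_id = l.id
                          AND m.category_id = " . (int)$categoryId . ")";
}

// Collapse duplicate rows caused by the join.
$sql .= " GROUP BY l.id";
?>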
Re: [PHP-DB] Is there any I can do to copy data from a mysql database to access database?
There is a "tool" in Access that can connect an Access database to any other type of database via ODBC. The remote data can then be used as a table, so you can run queries against it just as you would against any other table. I'm sorry, I don't remember the "tool" name. Hope this helps; a rough PHP alternative is sketched below.

2007/4/23, Steve Smith <[EMAIL PROTECTED]>:
> I have a mysql database and would like to be able to copy the data from
> that to my access database. Is there something that is freeware to do
> this? Or maybe someone has some code that might do this for me?
>
> Thanks in advance.
> Steve

--
Samel alias Luca
"Close the world, txen eht nepo!"
"You will never break my mind!"
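If you would rather do the copy from PHP itself instead of from inside Access, something along these lines could work (a sketch only; the ODBC DSN name "access_db", the table name `customers`, and the column list are all assumptions you would replace with your own):

<?php
// Read from MySQL...
$mysql = mysql_connect('localhost', 'user', 'password') or die(mysql_error());
mysql_select_db('testdb', $mysql) or die(mysql_error());
$result = mysql_query("SELECT `id`, `name` FROM `customers`", $mysql);

// ...and write into Access through an ODBC DSN pointing at the .mdb file.
$access = odbc_connect('access_db', '', '') or die('Cannot open Access DSN');

while ($row = mysql_fetch_assoc($result)) {
    $id   = (int)$row['id'];
    $name = str_replace("'", "''", $row['name']);   // crude quoting, for the example only
    odbc_exec($access, "INSERT INTO customers (id, name) VALUES ($id, '$name')")
        or die('Insert failed');
}

odbc_close($access);
mysql_close($mysql);
?>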
RE: [PHP-DB] Forking and database connections
Glad to help. I was actually interested in your post because of the forking-versus-threading question. I still haven't found anything except for PHP-GTK, but threading may help your database-related performance if it is ever added to (or already exists in) PHP.

Regards,
Dwight

> -----Original Message-----
> From: Chris Verges [mailto:[EMAIL PROTECTED]
>
> Again, thanks for your help! It was really beneficial to have a second
> set of eyes and a sounding board for getting through this problem.
Re: [PHP-DB] Forking and database connections
For anyone interested in following the progress of this forking/PEAR::DB issue, see http://pear.php.net/bugs/bug.php?id=10813.

Thanks,
Chris
Re: [PHP-DB] Forking and database connections
Hey Dwight,

After getting your first e-mail, I started adding the PEAR::DB persistent connection code. Unfortunately, it yielded the same results that I was getting before.

On a hunch, I created a second proof-of-concept script that uses the mysql_* functions in the PHP base. For each connection, I set the $new_link parameter of mysql_connect() to "true". As it turns out, the primary $db link that was originally created in the parent process *is* destroyed when the first child process closes -- this makes sense, however. If I re-create that link after spawning all the child processes, though, it works beautifully!

So it looks like this is a bug with PEAR::DB in handling persistent connections across forked processes. I'll go ahead and file it with the PEAR::DB folks and include the two scripts in the bug report.

Again, thanks for your help! It was really beneficial to have a second set of eyes and a sounding board for getting through this problem.

In case anyone wants the second proof-of-concept script:

Thanks!
Chris

On 4/23/07 10:43 AM, "Dwight Altman" <[EMAIL PROTECTED]> wrote:

> To use a PEAR::DB persistent connection, try
> $db = DB::connect($dsn, TRUE);
> or
> $db = DB::connect($dsn, true);
>
> Googled for "pear::db persistent connection" and got
> http://vulcanonet.com/soft/?pack=pear_tut
>
> Regards,
> Dwight
>
>> -----Original Message-----
>> From: Dwight Altman [mailto:[EMAIL PROTECTED]
>>
>> Where in your code does it say you are using persistent connections?
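The script itself didn't make it into the archive, but the pattern Chris describes (forking, then opening a fresh mysql_* link in each process with $new_link set to true) would look roughly like this. This is an illustrative sketch, not Chris's original script; the credentials and the `logs` table are placeholders.

<?php
$host = 'localhost'; $user = 'test'; $pass = 'test'; $dbname = 'testdb';

$childPids = array();
for ($i = 0; $i < 5; $i++) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        die("Unable to fork!\n");
    } else if ($pid) {
        /* Parent: remember the child pid */
        $childPids[] = $pid;
    } else {
        /* Child: open its OWN link. $new_link = true forces a new connection
           instead of reusing the one inherited from the parent. */
        $db = mysql_connect($host, $user, $pass, true) or die(mysql_error());
        mysql_select_db($dbname, $db);
        mysql_query("INSERT INTO `logs` (`message`) VALUES ('child " . posix_getpid() . "')", $db);
        sleep(5);
        exit;
    }
}

/* Parent: (re)connect AFTER spawning, so a child closing its inherited copy
   of the old descriptor cannot take this link down with it. */
$db = mysql_connect($host, $user, $pass, true) or die(mysql_error());
mysql_select_db($dbname, $db);

foreach ($childPids as $pid) {
    pcntl_waitpid($pid, $status);
    mysql_query("INSERT INTO `logs` (`message`) VALUES ('child $pid finished')", $db);
}
?>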
RE: [PHP-DB] Forking and database connections
To use a PEAR::DB persistent connection, try

$db = DB::connect($dsn, TRUE);

or

$db = DB::connect($dsn, true);

Googled for "pear::db persistent connection" and got
http://vulcanonet.com/soft/?pack=pear_tut

Regards,
Dwight

> -----Original Message-----
> From: Dwight Altman [mailto:[EMAIL PROTECTED]
>
> Where in your code does it say you are using persistent connections?
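For what it's worth, the same thing can also be spelled with an options array, assuming the PEAR::DB release in use accepts one (the DSN below is a placeholder):

<?php
require_once 'DB.php';

$dsn = 'mysql://user:password@localhost/testdb';
$db  = DB::connect($dsn, array('persistent' => true));
if (PEAR::isError($db)) {
    die($db->getMessage() . "\n");
}
?>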
[PHP-DB] Is there any I can do to copy data from a mysql database to access database?
I have a mysql database and would like to be able to copy the data from that to my access database. Is there something that is freeware to do this? Or maybe someone has some code that might do this for me?

Thanks in advance.
Steve
RE: [PHP-DB] Forking and database connections
Good point. Anything (children or parent) sharing the connection will get it closed on them.

I was looking up persistent connections, but then I thought you could simply change the way you use the pcntl_* functions. I use mysql_* directly (where mysql_connect "Opens or reuses a connection to a MySQL server."), so you may need to look into PEAR::DB for persistent connections. Where in your code does it say you are using persistent connections? I haven't had the need to use them yet. For instance, mysql_pconnect (the persistent counterpart) says "Second, the connection to the SQL server will not be closed when the execution of the script ends."

If you are using the PEAR::DB persistent connection as documented, then I'd open a bug report. Again, I haven't used them.

Regards,
Dwight

> -----Original Message-----
> From: Chris Verges [mailto:[EMAIL PROTECTED]
> Sent: Monday, April 23, 2007 12:01 PM
> To: Dwight Altman; php-db@lists.php.net
> Subject: Re: [PHP-DB] Forking and database connections
>
> Hey Dwight,
>
> Thanks for the replies! When the first child thread closes, all of the DB
> connections seem to close. That is, the $db->execute() statements will
> fail in the child threads after the first child thread closes. It's almost
> as though separate connections are not being made, but they are instead
> sharing the same connection. This would make sense if persistent
> connection pooling was enabled, but I assumed there to be a "garbage
> collection" mechanism in place to avoid prematurely closing a persistent
> connection if multiple resources are using it.
>
> Again, thanks for the help! Any other ideas on what might be wrong, or
> should I open a bug against it?
>
> Thanks!
> Chris
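A minimal sketch of the distinction Dwight quotes from the manual above (the connection details are placeholders):

<?php
// Regular link: closed when the script ends or mysql_close() is called.
$link = mysql_connect('localhost', 'user', 'password');

// Persistent link: PHP looks for an already-open connection with the same
// host/user/password and reuses it; it is NOT closed when the script ends.
$plink = mysql_pconnect('localhost', 'user', 'password');
?>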
[PHP-DB] PDO prepared statements sometimes returns empty resultset
Hi,

I'm trying to get some data from a MySQL db using PDO and a prepared statement, but I noticed that sometimes it fetches an empty result set, and sometimes it does what it's supposed to do and returns the whole result set. The environment is WinXP SP2 Home, Apache 2.2.4, PHP 5.2.1 and MySQL 5.0.27.

Here is some code:

$db = new PDO(
    'mysql:host=localhost;dbname=name_of_db',
    'user_name',
    'password',
    array(PDO::ATTR_PERSISTENT => true)
);
$db->query('SET CHARACTER SET utf8');

$stmt = $db->prepare("CALL sp_get_all_currencies(:lang_id)");
$stmt->bindValue(':lang_id', $currenctLanguageId, PDO::PARAM_INT);
$stmt->execute();
$currencies = $stmt->fetchAll();

So here, $currencies is sometimes an empty array, and after a refresh of the browser it populates. I also noticed that it works OK with $db->query(), but query() doesn't have that fancy param binding (I think). Is there some caching or something? Is this a bug in PHP, or am I not getting the picture right?

Regards,
Emil Ivanov
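One thing worth ruling out (a guess, not a confirmed diagnosis of this particular setup): a stored procedure call produces an extra result set, and if a statement on a persistent connection is never fully drained, the reused connection can come back in an out-of-sync state and quietly return nothing on the next request. Turning on exceptions and closing the cursor makes that kind of failure visible:

<?php
$db = new PDO(
    'mysql:host=localhost;dbname=name_of_db',
    'user_name',
    'password',
    array(PDO::ATTR_PERSISTENT => true)
);
// Make PDO throw instead of failing silently.
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->query('SET CHARACTER SET utf8');

$stmt = $db->prepare("CALL sp_get_all_currencies(:lang_id)");
$stmt->bindValue(':lang_id', $currenctLanguageId, PDO::PARAM_INT);
$stmt->execute();
$currencies = $stmt->fetchAll();

// CALL produces a second result set; release it so the connection is
// clean for whoever reuses it next.
$stmt->closeCursor();
?>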
Re: [PHP-DB] Forking and database connections
Hey Dwight,

Thanks for the replies! When the first child thread closes, all of the DB connections seem to close. That is, the $db->execute() statements will fail in the child threads after the first child thread closes. It's almost as though separate connections are not being made, but they are instead sharing the same connection. This would make sense if persistent connection pooling was enabled, but I assumed there to be a "garbage collection" mechanism in place to avoid prematurely closing a persistent connection if multiple resources are using it.

Again, thanks for the help! Any other ideas on what might be wrong, or should I open a bug against it?

Thanks!
Chris

On 4/23/07 8:12 AM, "Dwight Altman" <[EMAIL PROTECTED]> wrote:

> Actually I suppose you need to loop with foreach to wait on all children
> before attempting any $db->execute, then after your foreach loop, get your
> $db and execute.
>
> Regards,
> Dwight
RE: [PHP-DB] Forking and database connections
Actually, I suppose you need to loop with foreach to wait on all children before attempting any $db->execute, and then, after your foreach loop, get your $db and execute. A rough sketch of what I mean is below.

Regards,
Dwight
RE: [PHP-DB] Forking and database connections
http://php.he.net/manual/en/function.pcntl-fork.php says 'The reason for the MySQL "Lost Connection during query"...' like what you concluded, although they grab a new $db connection in the first for loop, "} else if ( $pid ) {". I bet a child closes the one you create after the for loop while the parent is waiting in the foreach loop.

In your foreach, when you wait for each child, can you $db->execute($stmt, $data) AFTER you pcntl_waitpid($pid, $status)? Just reverse the lines?

I think after the wait is when the child closes the connection (since I suppose you are reusing the same connection), so it is already closed.

Regards,
Dwight

> -----Original Message-----
> From: Chris Verges [mailto:[EMAIL PROTECTED]
> Sent: Saturday, April 21, 2007 12:21 PM
> To: php-db@lists.php.net
> Subject: [PHP-DB] Forking and database connections
>
> Hey all,
>
> I'm writing a PHP script that queries a database for some records, splits
> the records into N number of groups of equal size, and then creates N
> number of forks where each child handles one of the groups. During the
> execution of each of the child processes, I'd like the parent process to
> update the status of the job in the database.
>
> The problem is regarding my database connection pre- and post-fork. After
> reading the pcntl_fork() page on the PHP manual, I realize that the child
> process inherits the file descriptor, and if the child process closes the
> connection, then it is closed in the parent process. So for each child
> process (because I have more than one), I reinitialize the database link.
> I also reinitialize the database link for the parent process immediately
> after the fork.
>
> However, when a child process finishes, it seems like the database link
> that I reinitialized in the parent process also disconnects. I thought a
> fork copied the entire heap, and therefore would make two copies of the
> object instances that would remain segmented for the life of the
> processes. Changes made to one copy of the heap wouldn't affect others.
> However, this doesn't seem to be the case.
>
> So at this point, my workaround is to wait until all of the child
> processes are finished, then re-initialize the database link, and give an
> updated status message at the end rather than incrementally as child
> processes finish.
>
> Here's some proof-of-concept code that explains what I mean:
>
> <?php
>
> /* Include PEAR::DB */
> require_once('DB.php');
>
> # Database table definition
> # -------------------------
> # CREATE TABLE `logs` (
> #     `message` VARCHAR(128) NOT NULL
> # );
>
> /* Create the initial database connection for the parent process */
> $dsn = 'mysql://test:[EMAIL PROTECTED]/testdb';
> $db = DB::connect($dsn);
> if ( PEAR::isError($db) ) {
>     die($db->getMessage() . "\n");
> }
>
> /* This will be the common SQL statement for all inserts */
> $sql = "INSERT INTO `logs` (`message`) VALUES (?);";
> $stmt = $db->prepare($sql);
>
> /* Perform a DB update */
> $data = array('Started parent process');
> $db->execute($stmt, $data);
>
> /* Create the child processes */
> $childPids = array();
> for ( $i = 0; $i < 5; $i++ ) {
>     $pid = pcntl_fork();
>     if ( $pid == -1 ) {
>         die("\nUnable to fork!\n");
>     } else if ( $pid ) {
>         /* Parent process */
>         echo "Child process $pid created\n";
>         array_push($childPids, $pid);
>     } else {
>         /* Child process */
>         $myPid = posix_getpid();
>
>         /* Create a new database connection for the child process */
>         $db = DB::connect($dsn);
>         if ( PEAR::isError($db) ) {
>             die("\nChild process $myPid: " . $db->getMessage() .
>                 "\n" . $db->getDebugInfo() . "\n");
>         }
>
>         $data = array("Child process $myPid");
>         $stmt = $db->prepare($sql);
>         $db->execute($stmt, $data);
>
>         /* Add some latency for testing purposes */
>         sleep(5);
>         exit;
>     }
> }
>
> /* Create a new database connection for the parent process */
> $db = DB::connect($dsn);
> if ( PEAR::isError($db) ) {
>     die("\nParent process: " . $db->getMessage() . "\n" .
>         $db->getDebugInfo() . "\n");
> }
>
> /* Wait for the children to finish */
> foreach ( $childPids as $pid ) {
>     $data = array("Parent process waiting on child process $pid");
>     $db->execute($stmt, $data);
>     pcntl_waitpid($pid, $status);
>     $data = array("Child process $pid is finished");
>     $db->execute($stmt, $data);
> }
>
> $data = array("Parent process is finished");
> $db->execute($stmt, $data);
>
> ?>
>
> The command-line output of this code:
>
> $ php forking-proof-of-concept.php
> Child process 27012 created
> Child process 27013 created
> Child process