Re: [PHP] parallel execution of php code?

2005-08-09 Thread Marcus Bointon

On 8 Aug 2005, at 16:07, Martin van den Berg wrote:


$query = "SELECT id FROM mytable WHERE bla LIKE '" . $x . "'";
$rows = execute( $query );
if ( $rows == 0 )
{
   /* some more processing */
  $query = "INSERT INTO mytable .. etc etc";
  execute( $query );
}


I have lots of places where I used to do this. Since MySQL 4.1 I've  
changed it to something like:


INSERT INTO mytable SET id = 123, bla = '$x' ON DUPLICATE KEY UPDATE  
bla='$x'


This saves you a query, and makes it easier to isolate the insert/update, 
as it will play nicely with autocommit.
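For completeness, a minimal sketch of that suggestion from PHP (assumptions: the old mysql_* extension with an open connection, and that mytable has a primary or unique key the new row would collide with):

```php
<?php
// Sketch only: assumes an open mysql_* connection and that mytable has a
// PRIMARY KEY or UNIQUE index which makes "duplicate" meaningful.
$x   = mysql_real_escape_string($x);
$sql = "INSERT INTO mytable SET id = 123, bla = '$x'
        ON DUPLICATE KEY UPDATE bla = '$x'";
mysql_query($sql) or die(mysql_error());
```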


Marcus
--
Marcus Bointon
Synchromedia Limited: Putting you in the picture
[EMAIL PROTECTED] | http://www.synchromedia.co.uk

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] parallel execution of php code?

2005-08-09 Thread Martin van den Berg
 I have lots of places where I used to do this. Since MySQL 4.1 I've
 changed it to something like:

 INSERT INTO mytable SET id = 123, bla = '$x' ON DUPLICATE KEY UPDATE
 bla='$x'

 This saves you a query, and makes it easier to isolate the insert/
 update as it will play nicely with autocommit.


Neat construction, but in my case id is an auto-increment field... so
I guess your example won't work, since I never get a duplicate key.
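One hedged workaround, if it's bla itself that must be unique: put a UNIQUE index on that column (assuming it is indexable), and the duplicate-key path fires on the bla collision regardless of the auto-increment id. Something like:

```php
<?php
// One-time schema change (assumption: bla is short enough to index):
//   ALTER TABLE mytable ADD UNIQUE (bla);
// After that, ON DUPLICATE KEY triggers on the bla collision, not on id.
$x   = mysql_real_escape_string($x);
$sql = "INSERT INTO mytable SET bla = '$x'
        ON DUPLICATE KEY UPDATE bla = bla"; // no-op update: keep the old row
mysql_query($sql) or die(mysql_error());
```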

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] parallel execution of php code?

2005-08-08 Thread Edward Vermillion

Martin van den Berg wrote:

I have this piece of php-code which inserts data into a database.
Before inserting it must verify if the data is unique. The php code
looks something like:

$query = "SELECT id FROM mytable WHERE bla LIKE '" . $x . "'";
$rows = execute( $query );
if ( $rows == 0 )
{
   /* some more processing */
  $query = "INSERT INTO mytable .. etc etc";
  execute( $query );
}

Now here is the problem: when the user enters the page and directly
refreshes, the record is inserted twice. Is it possible that both
requests are processed simultaneously by the server (Apache on Linux)?
And can I add something like a critical section or semaphore to
overcome this problem?

Thanx,

Martin

It's my understanding that a SELECT has a higher priority than an INSERT 
on most MySQL setups.


I've run into this problem on a site I hobby-code on also. It's my guess 
that the write isn't hitting the DB in time for the second read to pick 
it up, but that's a guess. As far as what to do about it if that's the 
problem, I'll let someone else come up with that answer. ;) It's not 
*that* critical in my app and it doesn't happen often enough to be a 
real problem for me, but I would like to know if there's a way around it.


--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] parallel execution of php code?

2005-08-08 Thread Richard Davey
Hello Martin,

Monday, August 8, 2005, 4:07:47 PM, you wrote:

MvdB Now here is the problem: when the user enters the page, and
MvdB directly refreshes, the record is inserted twice. Is it
MvdB possible that both requests are processed simultaneously by the
MvdB server (Apache on Linux)? And can I add something like a
MvdB critical section or semaphore to overcome this problem?

What data type does the execute() function return? You're doing a
loose type comparison between $rows and zero (== rather than ===);
because the comparison is loose, a number of different results could
equal zero in this case. For example, if execute() returned false, your
code would assume an insert is required, which might not be the case.
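That point can be seen in isolation, with no database involved:

```php
<?php
// Several distinct values pass a loose == 0 test; only === tells them apart.
var_dump(false == 0);   // bool(true)  - a failed query looks like "no rows"
var_dump(0 == "0");     // bool(true)
var_dump(0 === false);  // bool(false) - strict comparison distinguishes them

$rows = false;          // pretend execute() failed
if ($rows === 0) {      // strict: only a genuine zero row count means "insert"
    // safe to insert
}
```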

If you're using MySQL then you may want to look at using a different
method for checking / inserting this data. Rather than a SELECT
followed by an INSERT you could use INSERT IGNORE, which won't
duplicate data if it already exists. Or possibly REPLACE INTO -
depends how you need this to work (i.e. retain the oldest copy of the
data, or keep the newest). Look in the MySQL manual for those two
statements for more info.
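Sketches of both statements (assumption: a UNIQUE index on bla is what defines a duplicate; neither statement helps without one):

```php
<?php
$x = mysql_real_escape_string($x);

// INSERT IGNORE keeps the oldest copy: a duplicate insert becomes a no-op.
mysql_query("INSERT IGNORE INTO mytable (bla) VALUES ('$x')");

// REPLACE INTO keeps the newest copy: the old row is deleted and a new one
// inserted (note: this gives the row a fresh auto-increment id).
mysql_query("REPLACE INTO mytable (bla) VALUES ('$x')");
```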

Best regards,

Richard Davey
-- 
 http://www.launchcode.co.uk - PHP Development Services
 Zend Certified Engineer
 I do not fear computers. I fear the lack of them. - Isaac Asimov

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] parallel execution of php code?

2005-08-08 Thread Michael Sims
Martin van den Berg wrote:
 I have this piece of php-code which inserts data into a database.
 Before inserting it must verify if the data is unique. The php code
 looks something like:

 $query = SELECT id FROM mytable WHERE bla LIKE  . $x .;
 $rows = execute( $query )
 if ( $rows == 0 )
 {
/* some more processing */
   $query = INSERT INTO mytable .. etc etc
   execute( $query )
 }

 Now here is the problem: when the user enters the page, and directly
 refreshes the record is inserted twice Is is possible that both
 requests are processed simulatiounsly by the server (apache on linux)?
 And can I add something like a critical section or semaphore to
 overcome this problem.

The problem with the approach above is that a race condition exists between
the check for the existence of the row in question and the insertion of that
row. It's possible that the two requests can come so close together that both
of them execute their selects before either does its insert. It's not very
likely in the simplest cases, but as the amount of traffic (or the number of
users you have who like to quickly click refresh) increases, there is a
greater chance that this race condition will cause a problem.

In my opinion it's best to let your RDBMS handle this concurrency problem,
since it's best equipped to do that. Ideally you would be using some sort of
constraint to prevent duplicate rows in your table... whether this is a
primary key, unique index, foreign key, etc. Inserting a duplicate row should
result in an error from the database. In that case you can trap for the error
in your PHP code (using functions like mysql_error()) and handle it
appropriately (for example, displaying a friendly error message, or simply
ignoring the query).
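A hedged sketch of that trap-the-error approach (assumptions: the mysql_* extension and a UNIQUE constraint already in place; 1062 is MySQL's ER_DUP_ENTRY error code):

```php
<?php
$x = mysql_real_escape_string($x);
if (!mysql_query("INSERT INTO mytable (bla) VALUES ('$x')")) {
    if (mysql_errno() == 1062) {
        // Duplicate key: another request won the race - ignore, or tell
        // the user the record already exists.
    } else {
        die(mysql_error()); // some other failure
    }
}
```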

Another approach would be to start a transaction with a high isolation level
before executing the select, but to me this is less desirable because,
depending on your database system, it may cause contention problems if the
entire table has to be locked. Simply attempting the insert and catching the
error should be much lighter, assuming it's possible to create the
appropriate constraint in your database.
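For comparison, the locking alternative might look like this (sketch only: requires InnoDB tables, and how completely the gap is closed depends on the isolation level, so this is not a drop-in recipe):

```php
<?php
$x = mysql_real_escape_string($x);
mysql_query("BEGIN");
// With InnoDB, FOR UPDATE makes a concurrent request's identical SELECT
// block here until we commit, serializing the check-then-insert.
$result = mysql_query("SELECT id FROM mytable WHERE bla = '$x' FOR UPDATE");
if (mysql_num_rows($result) == 0) {
    mysql_query("INSERT INTO mytable (bla) VALUES ('$x')");
}
mysql_query("COMMIT");
```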

HTH

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] parallel execution of php code?

2005-08-08 Thread M Saleh EG
Check if you're using MySQL 4.1. If yes, use the subquery functionality.
So you could have your query as following:
Insert into sometable where not id=NULL and id=Select id from mytable where
bla like 'some pattern';
 Not really sure if it would work though. Haven't tried it yet.
 HTH.
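For what it's worth, one way the same insert-only-if-absent idea is usually written (hedged: assumes MySQL 4.1's subquery support and that duplicates are defined by bla; a unique index is still a sensible backstop, since two truly simultaneous requests could both pass the NOT EXISTS check):

```php
<?php
// The existence check and the insert happen in one statement on the server.
$x   = mysql_real_escape_string($x);
$sql = "INSERT INTO mytable (bla)
        SELECT '$x' FROM DUAL
        WHERE NOT EXISTS (SELECT id FROM mytable WHERE bla = '$x')";
mysql_query($sql) or die(mysql_error());
```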
 On 8/8/05, Michael Sims [EMAIL PROTECTED] wrote: 
 
 [quoted text of Michael Sims's message snipped]
 


-- 
M.Saleh.E.G
97150-4779817