Richard Davey wrote:
Hi Burn,

Wednesday, June 6, 2007, 3:55:21 PM, you wrote:

Here's how I do it, for a single table though; you'll have to write the modification yourself to check against more tables. If you plan to have more than 10 million records, make sure $maxrand is higher.

When the "do" loop exits you have a unique id.

function generateID() {
        $minrand = 1;
        $maxrand = 99999999;
        // Pick a random candidate ID in the configured range
        return mt_rand($minrand, $maxrand);
}
// Connect once, outside the loop
$conn = mysql_connect($db_host, $db_username, $db_password);
mysql_select_db($db_name, $conn);

do {
        // New ID generation
        $newid = generateID();
        // Database check: does this ID already exist?
        $sqlcheck = "SELECT youruniqueidhere FROM yourtablehere WHERE youruniqueidhere = '".$newid."'";
        $rs = mysql_query($sqlcheck, $conn);
        $rscount = mysql_num_rows($rs);
} while ($rscount != 0);

Not that it is likely to occur on a low-traffic site, but the above
approach is a race condition waiting to happen, I'm afraid. It will,
however, almost certainly satisfy the needs of the OP as long as his
site stays off Digg / Slashdot :)

Cheers,

Rich

Thanks for pointing it out,

I'm well aware of it, but I use it in a back-end facility managed by a handful of administrators, with an expected user base of around 100,000. The hit rate is 1:10,000, which is perfectly reasonable with 5 possible concurrent threads running the code :)

The worst thing that can happen is MySQL throwing an error while trying to insert an ID that's already present in the database. It gets more complicated and unpredictable if the field isn't required to be unique at the database design level.
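
As a sketch, using the placeholder table and column names from the code above, that uniqueness requirement would be declared at the schema level with something like:

ALTER TABLE yourtablehere ADD UNIQUE (youruniqueidhere);

With that in place, a colliding INSERT fails cleanly instead of silently creating a duplicate row.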

To make it perfectly safe, the check should include the INSERT statement as well, and keep looping until the insert succeeds.
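
For example, a minimal sketch of that insert-and-retry pattern, assuming the same placeholder table/column names as the code above and a UNIQUE index on the ID column (1062 is MySQL's "duplicate entry" error code):

do {
        $newid = generateID();
        // Attempt the insert directly; the UNIQUE index rejects duplicates atomically
        $sqlinsert = "INSERT INTO yourtablehere (youruniqueidhere) VALUES ('".$newid."')";
        $ok = mysql_query($sqlinsert, $conn);
        // On a duplicate-key collision (1062), loop and try a new random ID
} while (!$ok && mysql_errno($conn) == 1062);

if (!$ok) {
        // Any other failure (lost connection, bad SQL, ...) should be handled, not retried
        die('Insert failed: ' . mysql_error($conn));
}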

Cheers!
