php-general Digest 20 Jul 2010 11:06:38 -0000 Issue 6855

Topics (messages 307023 through 307027):

Re: enabling domdocument
        307023 by: Michael A. Peters

Re: MySQL Query Puzzle
        307024 by: Jim Lucas
        307027 by: Shreyas Agasthya

socket problem
        307025 by: Ümit CAN

Re: Determining the similarity between a user supplied short piece of text 
(between 5 and 15 characters) and a list of similar length text items.
        307026 by: Richard Quadling

Administrivia:

To subscribe to the digest, e-mail:
        php-general-digest-subscr...@lists.php.net

To unsubscribe from the digest, e-mail:
        php-general-digest-unsubscr...@lists.php.net

To post to the list, e-mail:
        php-gene...@lists.php.net


----------------------------------------------------------------------
--- Begin Message ---
Ashley Sheridan wrote:



OK, I seem to have answered my own question!

It seems that even though PHP had the XML module enabled, I still needed
to run 'yum update php-xml' in order for it to load the DOM module.
It's now working fine, and for those of you interested, the ./configure
line in phpinfo() still says --disable-dom!

Yes.
I'm quite familiar with the rpm build process for php.

The initial build of php is done without support for any of the modules; the modules are then built separately. That is why the configure command for the core php Apache DSO has just about everything disabled.

A nasty side effect of doing it this way: if you ever need to rebuild the src.rpm, either do it in a chroot build environment (such as mock) or be sure to remove all the old php packages first. Otherwise what can happen is that the new php is built, but when the build then goes on to compile the modules, it links them against the installed php instead of the php it just built.

They may have fixed that in the Makefile, I don't know, but the net result can be a set of rpms that are broken.
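
For anyone who wants to confirm the fix, here is a minimal sketch (nothing distribution-specific assumed) that checks at runtime whether the DOM extension actually loaded, regardless of what the core ./configure line in phpinfo() claims:

<?php
// Report whether the DOM extension is loaded and usable.
if (extension_loaded('dom') && class_exists('DOMDocument')) {
    $doc = new DOMDocument('1.0', 'UTF-8');
    echo "DOM is available: " . get_class($doc) . "\n";
} else {
    echo "DOM extension is not loaded.\n";
}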
--- End Message ---
--- Begin Message ---
Peter wrote:
Hi All,

I have a table which contains some duplicate rows. I just want to delete the duplicate records,
not the original records.

Assume my table looks as below:

column1   column2
1         a
1         a
2         b
3         c
3         c



I want the above table to look as below after executing the MySQL query:

column1   column2
1         a
2         b
3         c




Thanks in advance..

Regards
Peter


Use the SQL ALTER command with the IGNORE flag.

ALTER IGNORE TABLE `your_table` ADD UNIQUE ( `column1` , `column2` )

I tested this on my test DB and it worked fine. It erased all the duplicates and left one instance of each duplicated value.

This will add a permanent unique constraint to the table, so you will never have duplicates again.

Jim Lucas
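
If you want to run this from PHP, here is a minimal sketch using mysqli; the host, credentials, database, and table name are placeholders for your own:

<?php
// Remove duplicates and add a unique constraint in one statement.
$db = new mysqli('localhost', 'user', 'password', 'test');
if ($db->connect_error) {
    die('Connect failed: ' . $db->connect_error . "\n");
}
$sql = 'ALTER IGNORE TABLE `your_table` ADD UNIQUE (`column1`, `column2`)';
if ($db->query($sql)) {
    echo "Duplicates removed; unique constraint added.\n";
} else {
    echo 'ALTER failed: ' . $db->error . "\n";
}
$db->close();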

--- End Message ---
--- Begin Message ---
I am very keen to see closure on this thread so that I can add to my
snippets. Let us all know which of the many proposed solutions worked
best.

--Shreyas

On Tue, Jul 20, 2010 at 10:07 AM, Jim Lucas <li...@cmsws.com> wrote:

> Peter wrote:
>
>> Hi All,
>>
>> I have a table which contains some duplicate rows. I just want to delete
>> the duplicate records, not the original records.
>>
>> Assume my table looks as below:
>>
>> column1   column2
>> 1         a
>> 1         a
>> 2         b
>> 3         c
>> 3         c
>>
>>
>>
>> I want the above table to look as below after executing the MySQL
>> query:
>>
>> column1   column2
>> 1         a
>> 2         b
>> 3         c
>>
>>
>>
>>
>> Thanks in advance..
>>
>> Regards
>> Peter
>>
>>
> Use the SQL ALTER command with the IGNORE flag.
>
> ALTER IGNORE TABLE `your_table` ADD UNIQUE ( `column1` , `column2` )
>
> I tested this on my test DB and it worked fine.  It erased all the
> duplicates and left one instance of each duplicated value.
>
> This will add a permanent unique constraint to the table.  So, you will
> never have duplicates again.
>
> Jim Lucas
>
> --
> PHP General Mailing List (http://www.php.net/)
> To unsubscribe, visit: http://www.php.net/unsub.php
>
>


-- 
Regards,
Shreyas Agasthya

--- End Message ---
--- Begin Message ---
Hi All;

    I have a problem with sockets. The first client sends a query over the
socket, and while its request is being processed the socket blocks. If a
second client sends a query at the same time, its connection is accepted
but its request is not processed; it has to wait until the first client's
request finishes. How can I process multiple clients at the same time, the
way servers such as Apache or telnet do? Is this possible with PHP?
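
One common approach (a minimal, untested sketch; the port and the echo behaviour here are just illustrative) is to multiplex every connection in a single process with stream_select(), so that a slow client never blocks the accept loop:

<?php
// Minimal multi-client echo server using stream_select().
$server = stream_socket_server('tcp://0.0.0.0:8000', $errno, $errstr);
if (!$server) {
    die("Could not start server: $errstr ($errno)\n");
}
$clients = array();
while (true) {
    // Watch the listening socket plus every connected client for activity.
    $read = array_merge(array($server), $clients);
    $write = $except = null;
    if (stream_select($read, $write, $except, null) < 1) {
        continue;
    }
    foreach ($read as $stream) {
        if ($stream === $server) {
            // New connection: accept it without blocking the others.
            $clients[] = stream_socket_accept($server);
        } else {
            $data = fread($stream, 1024);
            if ($data === false || $data === '') {
                // Client disconnected: drop it from the watch list.
                unset($clients[array_search($stream, $clients, true)]);
                fclose($stream);
            } else {
                fwrite($stream, $data); // Echo the request back.
            }
        }
    }
}

The other usual approach is to hand each connection to its own process with pcntl_fork(), which is closer to Apache's prefork model.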


--- End Message ---
--- Begin Message ---
On 19 July 2010 19:46, tedd <tedd.sperl...@gmail.com> wrote:
> At 12:39 PM +0100 7/19/10, Richard Quadling wrote:
>>
>> I'm using MS SQL, not mySQL.
>>
>> Found an extended stored procedure with a UDF.
>>
>> Testing it looks excellent.
>>
>> Searching for a match on 30,000 vehicles takes next to no additional
>> time - a few seconds in total, compared to the over 3 minutes it takes
>> to search using SQL code.
>
> That seems a bit slow.
>
> For example, currently I'm searching over 4,000 records (which contains
> 4,000 paragraphs taken from the text of the King James version of the Bible)
> for matching words, such as %created% and the times are typically around
> 0.009 seconds.
>
> As such, searching ten times that amount should be in the range of tenths of
> a second and not seconds -- so taking a few seconds to search 30,000 records
> seems excessive to me.


Tedd,

I'm not looking for a "word". I'm looking for similar "wrds".

'Word' is closer to the misspelled 'wrds' than it is to 'wars'.

select dbo.DamerauLevenshteinDistance('words', 'wars'),
dbo.DamerauLevenshteinDistance('words', 'wrds')

(No column name)        (No column name)
2       1

Lower is better.

Also, I have to compare every row in the set and then sort it to find
the lowest values for the Damerau-Levenshtein or the highest for the
Jaro–Winkler distance.

As the value entered is always going to be the unknown, I can't
pre-calculate the distances.

I do an exact match test first.
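
For anyone following along in PHP, here is a rough sketch of the same ranking idea using the built-in levenshtein() (the candidate list is made up; note that levenshtein() does not count transpositions the way Damerau-Levenshtein does):

<?php
// Rank candidate strings by edit distance to the user-supplied text.
$input = 'wrds';
$candidates = array('words', 'wars', 'cards', 'word');
$distances = array();
foreach ($candidates as $candidate) {
    $distances[$candidate] = levenshtein($input, $candidate);
}
asort($distances);   // Lowest distance first, i.e. closest match on top.
print_r($distances); // 'words' (distance 1) is the best match here.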

--- End Message ---
