What I have done is add a field to the table that contains the keys that
make the record unique, with all spaces, special characters, and vowels
removed, and any consecutive consonants collapsed. See my example below. So
far this has worked pretty well for me, and I guess it would fall under David's
category of "Carefully program your system to detect and prevent duplicate rows."

John

------------ Example ------------
$firstName:=[Contacts]firstName (John)
$lastName:=[Contacts]lastName (Baughman)
$company:=[Contacts]company (BY'te DESIGN Hawwaii)  // notice I have mistakenly
put more than one w in Hawaii
$DupeCheck:=$firstName+$lastName+$company
[Contacts]DupeCheck:=AlphaOnlyNoVowels($DupeCheck;"*")  // AlphaOnlyNoVowels
does the heavy lifting. The asterisk tells the method to also remove
consecutive consonants.

[Contacts]DupeCheck now contains "JHNBGHMNBYTDSGNHW"
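AlphaOnlyNoVowels is a custom 4D method, so for anyone curious, here is a minimal Python sketch of the same key-building logic as I understand it (the function name and the `collapse` flag are my own stand-ins for the method's "*" option):

```python
import re

def alpha_only_no_vowels(text: str, collapse: bool = True) -> str:
    """Build a fuzzy dedup key: letters only, uppercase, vowels dropped,
    and (with collapse=True, the "*" option) runs of the same consonant
    squeezed down to one."""
    s = re.sub(r"[^A-Za-z]", "", text).upper()  # strip spaces & special characters
    s = re.sub(r"[AEIOU]", "", s)               # remove vowels (Y is kept)
    if collapse:
        s = re.sub(r"(.)\1+", r"\1", s)         # squeeze "WW" -> "W", etc.
    return s

print(alpha_only_no_vowels("JohnBaughmanBY'te DESIGN Hawwaii"))  # JHNBGHMNBYTDSGNHW
```

Note how the consonant squeeze is what makes the misspelled "Hawwaii" produce the same key as the correct "Hawaii".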

I wrap the above in a duplicate-checking method for the [Contacts] table called
ContactsDuplicateManager.

Whenever a record is updated or created in the Contacts table…

$DupeCheck:=ContactsDuplicateManager("isDuplicate";[Contacts]ID)

The ContactsDuplicateManager method creates the check string as above and
searches the Contacts table for duplicates using the [Contacts]DupeCheck field.
If no duplicates are found, it returns the check string. If a duplicate is
found, it returns the check string with a prepended asterisk. The contact ID,
if passed, prevents the dupe check from matching the record currently being
updated. For a new record, 0 is passed for the contact ID. So…

If ($DupeCheck="*@")  // "@" is the 4D wildcard, so this matches a leading asterisk
   // Handle the duplicate in context. If, for example, this is a user
   // updating or creating a contact record, warn the user of the possible
   // duplicate and present the available options.
Else
   [Contacts]DupeCheck:=$DupeCheck
   SAVE RECORD([Contacts])
End if
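For readers outside 4D, the manager's search-and-flag flow might be sketched like this in Python; the dict is a stand-in for the [Contacts] table, and all names here are my own, not the actual method:

```python
import re

def alpha_only_no_vowels(text: str, collapse: bool = True) -> str:
    """Assumed key builder: letters only, no vowels, squeezed consonants."""
    s = re.sub(r"[^A-Za-z]", "", text).upper()
    s = re.sub(r"[AEIOU]", "", s)
    return re.sub(r"(.)\1+", r"\1", s) if collapse else s

def contacts_duplicate_manager(contact_id, first, last, company, table):
    """Sketch of the described flow: build the check string, scan other
    records' stored DupeCheck values, and prepend '*' on a match.
    `table` maps record ID -> stored DupeCheck key."""
    key = alpha_only_no_vowels(first + last + company)
    for rec_id, stored in table.items():
        if rec_id != contact_id and stored == key:  # skip the record being updated
            return "*" + key                        # flag as a possible duplicate
    return key

table = {42: "JHNBGHMNBYTDSGNHW"}  # pretend record 42 is already saved
# A new record (ID 0) with the correctly spelled company still collides:
print(contacts_duplicate_manager(0, "John", "Baughman", "BY'te DESIGN Hawaii", table))
```

Passing the record's own ID (here, 42) is what keeps an update from flagging itself as its own duplicate.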

-------------------------------------





> On Aug 7, 2017, at 6:56 AM, Dennis, Neil via 4D_Tech <[email protected]> 
> wrote:
> 
>> How do you deal with that problem (Preventing duplicate data)
> 
> When unique data is required because of a business need, I do implement one 
> of your suggested methods: "Carefully program your system to detect and 
> prevent duplicate rows."
> 
> I would suggest not doing this in a trigger, but instead on data entry 
> (imports, user entry). The 4D command "Find in Field" works in many of these 
> cases.
> 
> Neil
> 
> 
> 
> 
> **********************************************************************
> 4D Internet Users Group (4D iNUG)
> FAQ:  http://lists.4d.com/faqnug.html
> Archive:  http://lists.4d.com/archives.html
> Options: http://lists.4d.com/mailman/options/4d_tech
> Unsub:  mailto:[email protected]
> **********************************************************************

John Baughman
Kailua, Hawaii
(808) 262-0328
[email protected]





