Hi
I have been working on a Perl script to do just that, but after a conversation
with Alex, he let me know that this approach will break the referential
integrity between tables. I could share most of the code, but I would suggest
coming up with a system that integrates into the search engine, instead of
building a parallel system and ending up reverse engineering a lot of things.
At this point my script can read from a blob, locate offending URLs in the database,
delete all citations to a URL, delete the blob for the specific bad word, and delete
the URL itself. But that leaves behind the index entries for all the other words in
that document, stored in other blobs, which still refer to the deleted urlid (this is
what would break the search).
So any ideas are appreciated. If anyone knows of a way NOT to break the existing code
and still hide/delete bad URLs from the results, then I am happy to make those
modifications.
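To make the stale-urlid problem concrete, here is a minimal toy sketch (in Python,
just as an illustration; the actual script is Perl and the real index lives in
database blobs, so every name below is hypothetical). It shows why the urlid has to
be stripped from every word's posting list, not only from the bad word's blob:

```python
# Toy in-memory model of the inverted index described above (hypothetical
# names; the real engine stores each word's posting list as a blob in the
# database, with entries that refer to a urlid).

def remove_url(index, urls, bad_urlid):
    """Delete a URL everywhere so no blob keeps a stale urlid.

    Deleting only the bad word's blob and the url row (what the script
    does now) would leave other words' posting lists pointing at a urlid
    that no longer exists -- the breakage described above.
    """
    urls.pop(bad_urlid, None)               # drop the url row itself
    for word in list(index):
        index[word].discard(bad_urlid)      # strip the urlid from every posting list
        if not index[word]:                 # delete blobs that became empty
            del index[word]


index = {"badword": {1, 2}, "other": {1, 3}}
urls = {1: "http://bad.example", 2: "http://ok.example", 3: "http://fine.example"}
remove_url(index, urls, 1)
print(index)   # no posting list refers to urlid 1 anymore
```

In the real database the same idea would mean rewriting each affected blob after
filtering out the deleted urlid, which is more work but keeps the search consistent.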
best regards
Adonis
[EMAIL PROTECTED] wrote:
>
hello !
I received your message.
I would like to know where you stand, so I can tell whether I am able to help
you with the programming.
thank you.
[EMAIL PROTECTED]
Adonis El Fakih wrote:
> I am working on a Perl script that will do that. It will take me a while to finish
>it since I am busy, and I have to do a lot of testing to make sure it works right.
>
> If anyone wants to help, I would welcome his/her help.
>
> best regards
> adonis
>
>
> [EMAIL PROTECTED] wrote:
>
>> hi everybody
>
>
> Because reindexing all the URLs takes a long time,
> I want to know whether it is possible to remove a particular URL?
>
>
>
______________________________________________________________
Ayna, The Arabic Internet Starts Here. http://www.ayna.com