Doğacan Güney wrote:
> On 6/22/07, Andrzej Bialecki <[EMAIL PROTECTED]> wrote:
>> Doğacan Güney wrote:
>>
>> > These 'urls' most likely come from parse-js plugin. Can you disable it
>> > and see if they disappear? To extract links from js code, parse-js
>> > uses a heuristic that unfortunately also may extract garbage urls.
>> >
>>
>> Improvements to this heuristic are welcome ;) This plugin was a quick
>> hack to get going on sites that provide javascript-only navigation, but
>> a better regex is definitely needed.
>
> Perhaps we can use something like commons-validator to filter garbage
> urls in ParseOutputFormat? Or would that be too slow?
Hmm, I'd rather not bring another dependency into this plugin if a simple
regex suffices ... I think the challenge is to modify the regex pattern so
that it excludes forbidden characters, such as < >, space etc.

--
Best regards,
Andrzej Bialecki <><
Information Retrieval, Semantic Web
Embedded Unix, System Integration
http://www.sigram.com   Contact: info at sigram dot com
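[Editor's note: for illustration only, a minimal sketch of what such a pattern could look like. This is not the actual parse-js plugin code; the class name, the example input and the exact character class are assumptions, shown only to illustrate excluding whitespace, quotes and angle brackets from extracted URL candidates.]

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class JSLinkExtractorSketch {
      // Hypothetical pattern: require an http(s) scheme or a "www." prefix,
      // then accept a run of characters that excludes whitespace, quotes,
      // angle brackets and parentheses, so HTML/JS fragments around string
      // literals are not swallowed into the extracted URL.
      private static final Pattern URL_PATTERN = Pattern.compile(
          "(?:https?://|www\\.)[^\\s\"'<>()]+",
          Pattern.CASE_INSENSITIVE);

      public static void main(String[] args) {
        String js = "var a = \"http://www.example.com/page.html\"; "
                  + "document.write('<a href=' + a + '>link</a>');";
        Matcher m = URL_PATTERN.matcher(js);
        while (m.find()) {
          // Prints only the quoted URL, without the surrounding markup.
          System.out.println(m.group());
        }
      }
    }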
