Re: Long urls - update
Dear John,

On Sat, Mar 30, 2013 at 05:15:28PM +0100, John Niendorf wrote:
> This looks really great, but where do I put the script? I made it
> executable and put it in my path and I got the error that it wasn't
> there. I copied it to ~/ and got the same error.
> John
>
> > http://www.memoryhole.net/~kyle/extract_url/
> > Try this. It's brilliant.

I put it in one of the subdirectories of my $PATH (in ~/bin) and made
the program executable (chmod +x ~/bin/extract_url.pl). I made a
configuration file (~/.extract_urlview) with a single line (COMMAND
/etc/urlview/url_handler.sh), which is the COMMAND used by default by
urlview, the program I employed previously. I installed the
prerequisite Perl packages and it worked nicely (under Debian/testing).
I hope it helps.

In particular, I liked the fact that I can see the context of the URLs,
so I no longer have to guess which URL to choose.

Best regards,
Luis

-- 
   o      W. Luis Mochán,                     | tel:(52)(777)329-1734
  /(*)    Instituto de Ciencias Físicas, UNAM | fax:(52)(777)317-5388
 `/ /\    Apdo. Postal 48-3, 62251            |
(*)/\/ \  Cuernavaca, Morelos, México         | moc...@fis.unam.mx
 /\_/\__/ GPG: DD344B85, 2ADC B65A 5499 C2D3 4A3B 93F3 AE20 0F5E DD34 4B85
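The setup steps above can be sketched as a short shell session. This is a hypothetical walkthrough pointed at a scratch directory so it is safe to try; substitute $HOME (and the real extract_url.pl) for actual use:

```shell
# Scratch directory standing in for $HOME (an assumption for this demo).
DEST=$(mktemp -d)

# 1. Put the script in a directory on your $PATH (here, bin/) and make
#    it executable. The one-line file below is a stand-in for the real
#    extract_url.pl downloaded from the project page.
mkdir -p "$DEST/bin"
printf '#!/usr/bin/perl\n' > "$DEST/bin/extract_url.pl"
chmod +x "$DEST/bin/extract_url.pl"

# 2. Create the configuration file with the single COMMAND line,
#    pointing at the handler that urlview uses by default.
printf 'COMMAND /etc/urlview/url_handler.sh\n' > "$DEST/.extract_urlview"
```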
Re: Long urls - update
I found a mistake in the extract_url.pl program: it doesn't escape
ampersands when present in the URL, so when the command to actually
view the URL is invoked, the shell gets confused. I made a quick fix by
substituting $command =~ s/&/\\&/g before running the command.

> http://www.memoryhole.net/~kyle/extract_url/
> Try this. It's brilliant.
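To see why the shell "gets confused", here is a small demonstration (with a made-up example URL) of what an unquoted ampersand does and how escaping it, in the spirit of the quick fix above, keeps the URL in one piece:

```shell
# An unquoted '&' means "run the preceding command in the background",
# so everything after it is cut off from the URL the viewer receives.
url='http://example.com/a&b'

# Escape each ampersand with a backslash (same effect as the Perl
# substitution s/&/\\&/g in the message above).
escaped=$(printf '%s' "$url" | sed 's/&/\\&/g')
echo "$escaped"    # the command line now carries a\&b

# When the shell parses the escaped command line, the backslash is
# consumed and the viewer sees the original, intact URL.
seen=$(eval "printf '%s' $escaped")
echo "$seen"
```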
Re: Long urls - update
Incoming from Luis Mochan:
> I found a mistake in the extract_url.pl program: it doesn't escape
> ampersands when present in the url, so when the command to actually
> view the url is invoked, the shell gets confused. I made a quick fix
> by substituting $command =~ s/&/\\&/g before running command.

Line 633? 634? So:

    # $command =~ s/%s/'$url'/g;
    $command =~ s/&/\\&/g;

I'm a perl guy, yet that's non-trivial here. Thx. :-)

-- 
Any technology distinguishable from magic is insufficiently advanced.
(*) :(){ :|: };:
Re: Long urls - update
> Line 633? 634? So:
>
>     # $command =~ s/%s/'$url'/g;
>     $command =~ s/&/\\&/g;

Sorry for not having given the line numbers, etc. I actually made
changes around lines 522 and 647, and defined a new subroutine (I named
it wlmsanitize) which modifies the command to run. A patch follows.

> I'm a perl guy, yet that's non-trivial here. Thx. :-)

You're welcome. I don't know if there are other characters that appear
in a URL and need to be escaped for the shell ([;]?); they could easily
be accommodated by modifying 'wlmsanitize'. The page for the
extract_url project (http://www.memoryhole.net/~kyle/extract_url/)
mentions that the program already transforms characters dangerous to
the shell, but then it only explicitly mentions single quotes and
dollar signs.

Best regards,
Luis

-- 
Patch to fix ampersands in URLs:

--- extract_url.pl~	2013-03-31 12:35:39.303174972 -0600
+++ extract_url.pl	2013-03-31 15:10:47.822005282 -0600
@@ -519,7 +519,7 @@
 	} else {
 		$urlviewcommand .= $url;
 	}
-	system $urlviewcommand;
+	system wlmsanitize($urlviewcommand);
 	exit 0;
 }
@@ -644,7 +644,7 @@
 		);
 	}
 	if ($return) {
-		system $command;
+		system wlmsanitize($command);
 		if ($stayopen == 0) {
 			exit 0 if ($persist == 0);
 		} else {
@@ -689,6 +689,14 @@
 		print "$value\n";
 	}
 }
+
+sub wlmsanitize {
+	my $cmd = shift @_;
+	$cmd =~ s/&/\\&/g;
+	return $cmd;
+}
+
+
 =head1 NAME
 
 B<extract_url.pl> -- extract URLs from email messages
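On the question of which other characters ([;]?) would need escaping: a hypothetical alternative to escaping them one by one is to single-quote the whole URL and escape only embedded single quotes. This is a sketch of that idea, not code from the patch; the shquote helper name is made up:

```shell
# Wrap a string in single quotes for safe use on a shell command line.
# Inside single quotes every character is literal, so only embedded
# single quotes need special handling (each ' becomes '\'' ).
shquote() {
    printf "'%s'" "$(printf '%s' "$1" | sed "s/'/'\\\\''/g")"
}

# A URL containing both of the troublesome characters discussed above.
url='http://example.com/a&b;c=1'
quoted=$(shquote "$url")

# What the viewer command would receive after the shell parses it:
seen=$(eval "printf '%s' $quoted")
echo "$seen"
```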