Re: Long urls - update

2013-04-04 Thread Tom
On Tue, Apr 02, 2013 at 09:27:17AM +0100, James Griffin wrote:
 Mon  1.Apr'13 at  8:14:35 -0600 Luis Mochan
   By the way, the author of the program, Kyle Wheeler, wrote to me that
   he expects that adding the line
COMMAND /etc/urlview/url_handler.sh '%s'
   to the configuration file ~/.extract_urlview would be enough to solve
   the problem (with %s between quotes). I believe I had tried that and
   that it didn't work, but now I'm not completely sure. You could try it. 
  
  Well I tried it and it doesn't work without the patch; it is not
  enough to add '%s' to COMMAND.  
  Regards,
  Luis
 
 You are not using the program correctly. extract_urlview has worked
 perfectly with mutt, for me, for probably about 2 years now. Why bother
 trying to integrate it into your shell script, just use it as a
 stand-alone program and don't use urlview at all.
 
 the contents of my ~/.extract_urlview:
 
 
 COMMAND firefox %s &
 SHORTCUT
 PERSISTENT
 
 That's it! The ampersand ensures Firefox runs in the background. Nothing
 else needed.
 
 -- 
 James Griffin:jmz at kontrol.kode5.net 
   jmzgriffin at gmail.com
 
Thanks, mine never worked right until I used your line for
COMMAND.  With the example COMMAND given on the extract_url
website, Mozilla would go to Google; if the URL was in Google
I could reach it from there, but if not I couldn't get to it
with extract_urlview at all.

Tom


Re: Long urls - update

2013-04-03 Thread James Griffin
Tue  2.Apr'13 at 11:51:15 -0600 Luis Mochan
 Hello James,
  You are not using the program correctly. extract_urlview has worked
  perfectly with mutt, for me, for probably about 2 years now. Why bother
  trying to integrate it into your shell script, just use it as a
  stand-alone program and don't use urlview at all.
  
  the contents of my ~/.extract_urlview:
  COMMAND firefox %s &
  ...
 As there are so many types of URLs, and I am not sure whether my
 browser can handle all of them correctly, I wanted to continue using
 the same handler that urlview, the program I used until a few days
 ago, used. I guess your suggestion would work for most, or maybe all
 of the cases, and the complexity of /etc/urlview/url_handler.sh may be
 unnecessary, but now it is working and, in the end, the fix was
 trivial, i.e. changing 'COMMAND /etc/urlview/url_handler.sh' into
 'COMMAND /etc/urlview/url_handler.sh %s'.
 Thanks and regards,
 Luis

Ah that's great news. Glad you got it working. I like Kyle's program, so
I'm glad you have found it useful too. It's typical isn't it, that
there's always a simple explanation for things not working at first. :-)

Cheers, Jamie.


-- 
James Griffin:  jmz at kontrol.kode5.net 
jmzgriffin at gmail.com

A4B9 E875 A18C 6E11 F46D  B788 BEE6 1251 1D31 DC38


Re: Long urls - update

2013-04-02 Thread James Griffin
Mon  1.Apr'13 at  8:14:35 -0600 Luis Mochan
  By the way, the author of the program, Kyle Wheeler, wrote to me that
  he expects that adding the line
   COMMAND /etc/urlview/url_handler.sh '%s'
  to the configuration file ~/.extract_urlview would be enough to solve
  the problem (with %s between quotes). I believe I had tried that and
  that it didn't work, but now I'm not completely sure. You could try it. 
 
 Well I tried it and it doesn't work without the patch; it is not
 enough to add '%s' to COMMAND.  
 Regards,
 Luis

You are not using the program correctly. extract_urlview has worked
perfectly with mutt, for me, for probably about 2 years now. Why bother
trying to integrate it into your shell script, just use it as a
stand-alone program and don't use urlview at all.

the contents of my ~/.extract_urlview:


COMMAND firefox %s &
SHORTCUT
PERSISTENT

That's it! The ampersand ensures Firefox runs in the background. Nothing
else needed.



Re: Long urls - update

2013-04-02 Thread Kyle Wheeler

On Monday, April  1 at 07:30 PM, quoth Luis Mochan:
 I tried now your fix, and it didn't work for me; my browser doesn't 
 find the resulting pages when the url has ampersands that are 
 converted to %26 (probably because the % itself is further encoded 
 as %25 before being sent to the server by the browser (?))

What?!? That's *really* strange. Is extract_url.pl double-encoding the 
percent, or is some script in between doing that?

 I don't know much about shell programming, but I found that 
 /etc/urlhandler/url_handler.sh is a shell script that obtains its 
 url doing '$url=$1'.

Ahh, indeed, that could cause a problem. Variables are substituted 
simply, in an "as if typed" manner. Take this simple example:

   # foo=bar
   # echo $foo
   bar

Fairly straightforward. BUT, if we use an ampersand:

   # foo=bar&baz
   [1] 20113
   baz: command not found
   [1]+  Done                    foo=bar
   # echo $foo

   #

You see, the ampersand did two things there: first, it created a 
sub-shell to execute 'foo=bar' in the background, and then attempted 
to execute 'baz' as if it were a command. What that shell script 
*should* have done is quote the argument, like this:

   url="$1"

To return to my example:

   # foo="bar&baz"
   # echo $foo
   bar&baz

 I don't understand why echo by itself yields the correct result 
 (above) while echo through a bash script yields the truncated 
 result.

It's because of the incorrectly quoted variable assignment, which is 
cutting off the URL at the ampersand.
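The failure mode described above is easy to reproduce; here is a minimal sketch, assuming a POSIX shell. The URL is a made-up example, and the `sh -c` indirection stands in for any handler that re-parses its command line:

```shell
# Made-up URL containing an ampersand (example value only)
url='http://url.with/an&ampersand'

# Unsafe: the URL is interpolated into a string that a shell re-parses.
# The & acts as a control operator: everything after it is cut off, and
# the shell even tries to run "ampersand" as a command (stderr hidden).
unsafe=$(sh -c "echo $url" 2>/dev/null)
echo "$unsafe"    # http://url.with/an

# Safe: pass the URL as a positional argument and quote the expansion,
# so the inner shell never interprets the URL's metacharacters.
safe=$(sh -c 'echo "$1"' _ "$url")
echo "$safe"      # http://url.with/an&ampersand
```

The same principle applies to any wrapper script: quote every expansion of a variable that may hold shell metacharacters.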

~Kyle
-- 
Always forgive your enemies; nothing annoys them so much.
 -- Oscar Wilde


Re: Long urls - update

2013-04-02 Thread Luis Mochan
Dear Kyle,
 ...
  /etc/urlhandler/url_handler.sh is a shell script that obtains its 
  url doing '$url=$1'.
 
 Ahh, indeed, that could cause a problem. Variables are substituted 
 simply, in an as if typed manner. Take this simple example:
 ...
Thanks for your explanation. As I wrote last night, the problem
seems to have been solved (following one of your early suggestions).
Best regards,
Luis

-- 

  o
W. Luis Mochán,  | tel:(52)(777)329-1734 /(*)
Instituto de Ciencias Físicas, UNAM  | fax:(52)(777)317-5388 `/   /\
Apdo. Postal 48-3, 62251 |   (*)/\/  \
Cuernavaca, Morelos, México  | moc...@fis.unam.mx   /\_/\__/
GPG: DD344B85,  2ADC B65A 5499 C2D3 4A3B  93F3 AE20 0F5E DD34 4B85






Re: Long urls - update

2013-04-02 Thread Luis Mochan
Hello James,
 You are not using the program correctly. extract_urlview has worked
 perfectly with mutt, for me, for probably about 2 years now. Why bother
 trying to integrate it into your shell script, just use it as a
 stand-alone program and don't use urlview at all.
 
 the contents of my ~/.extract_urlview:
 COMMAND firefox %s 
 ...
As there are so many types of URLs, and I am not sure whether my
browser can handle all of them correctly, I wanted to continue using
the same handler that urlview, the program I used until a few days
ago, used. I guess your suggestion would work for most, or maybe all
of the cases, and the complexity of /etc/urlview/url_handler.sh may be
unnecessary, but now it is working and, in the end, the fix was
trivial, i.e. changing 'COMMAND /etc/urlview/url_handler.sh' into
'COMMAND /etc/urlview/url_handler.sh %s'.
Thanks and regards,
Luis








Re: Long urls - update

2013-04-01 Thread James Griffin
Sun 31.Mar'13 at 15:37:28 -0600 Luis Mochan
 I found a mistake in the extract_url.pl program: it doesn't escape
 ampersands when present in the url, so when the command to actually
 view the url is invoked, the shell gets confused. I made a quick fix
 by substituting $command=~s/&/\\&/g before running command.

I installed the curses-ui module to completely replace urlview; I have
never experienced this problem.



Re: Long urls - update

2013-04-01 Thread Chris Bannister
On Sun, Mar 31, 2013 at 10:00:58PM -0600, s. keeling wrote:
 Incoming from Luis Mochan:
  I found a mistake in the extract_url.pl program: it doesn't escape
  ampersands when present in the url, so when the command to actually
  view the url is invoked, the shell gets confused. I made a quick fix
  by substituting $command=~s/&/\\&/g before running command.
 
 Line 633?  634?  So:
 
# $command =~ s/%s/'$url'/g;
$command=~s/&/\\&/g;
 
 I'm a perl guy, yet that's non-trivial here.  Thx.  :-)

Are you sure that will work? You've just commented out a line of code.
(Just wondering what your patch would look like.)

Disclaimer: I haven't looked at extract_url.pl myself.

-- 
If you're not careful, the newspapers will have you hating the people
who are being oppressed, and loving the people who are doing the 
oppressing. --- Malcolm X


Re: Long urls - update

2013-04-01 Thread Chris Bannister
On Tue, Apr 02, 2013 at 01:06:05AM +1300, Chris Bannister wrote:
 On Sun, Mar 31, 2013 at 10:00:58PM -0600, s. keeling wrote:
  Incoming from Luis Mochan:
   I found a mistake in the extract_url.pl program: it doesn't escape
   ampersands when present in the url, so when the command to actually
   view the url is invoked, the shell gets confused. I made a quick fix
   by substituting $command=~s/&/\\&/g before running command.
  
  Line 633?  634?  So:
  
 # $command =~ s/%s/'$url'/g;
 $command=~s/&/\\&/g;
  
  I'm a perl guy, yet that's non-trivial here.  Thx.  :-)
 
 Are you sure that will work? You've just commented out a line of code.
 (Just wondering what your patch would look like.)

Ahh, see it's included in a message by Luis Mochan in this thread.

-- 
If you're not careful, the newspapers will have you hating the people
who are being oppressed, and loving the people who are doing the 
oppressing. --- Malcolm X


Re: Long urls - update

2013-04-01 Thread John Niendorf

Hi Guys,

I guess I'm the slow one on the list.
Is there more to the patch than commenting out 


# $command =~ s/%s/'$url'/g;

and replacing it with

$command=~s/&/\\&/g

Because either way, extract_url.pl isn't working for me.
I can see the list of urls, but if I click on one I still get a page not found 
error.  Another odd thing is that if I press c for context I sometimes get the 
full url displayed in a little box, like it should, but I often see the text 
that is near the link in the email, in the little box.
I've tried changing the view, but that doesn't seem to have much effect.

I've fallen back to using urlscan which seems to work albeit not very elegantly.


On Tue, Apr 02, 2013 at 01:51:22AM +1300, Chris Bannister wrote:

On Tue, Apr 02, 2013 at 01:06:05AM +1300, Chris Bannister wrote:

On Sun, Mar 31, 2013 at 10:00:58PM -0600, s. keeling wrote:
 Incoming from Luis Mochan:
  I found a mistake in the extract_url.pl program: it doesn't escape
  ampersands when present in the url, so when the command to actually
  view the url is invoked, the shell gets confused. I made a quick fix
  by substituting $command=~s/&/\\&/g before running command.

 Line 633?  634?  So:

# $command =~ s/%s/'$url'/g;
$command=~s/&/\\&/g;

 I'm a perl guy, yet that's non-trivial here.  Thx.  :-)

Are you sure that will work? You've just commented out a line of code.
(Just wondering what your patch would look like.)


Ahh, see it's included in a message by Luis Mochan in this thread.

--
If you're not careful, the newspapers will have you hating the people
who are being oppressed, and loving the people who are doing the
oppressing. --- Malcolm X


Re: Long urls - update

2013-04-01 Thread s. keeling
Incoming from John Niendorf:
 
 I guess I'm the slow one on the list.
 Is there more to the patch than commenting out
 
 # $command =~ s/%s/'$url'/g;
 
 and replacing it with
 
 $command=~s/&/\\&/g
 
 Because either way, extract_url.pl isn't working for me.

It looks like that was incorrect; Luis says the changes need to be
made around line 518 and then 637.  I haven't had a chance to test it
yet.


-- 
Any technology distinguishable from magic is insufficiently advanced.
(*) :(){ :|: };:
- -




Re: Long urls - update

2013-04-01 Thread Luis Mochan
Hi John,

 I guess I'm the slow one on the list.
 Is there more to the patch than commenting out
 
 # $command =~ s/%s/'$url'/g;
 
 and replacing it with
 
 $command=~s/&/\\&/g
I didn't comment out that line; it is needed to replace %s by the URL
in the 'COMMAND' that actually opens the URL. What I did was modify
the two 'system' calls. In a previous email to the list I included the
patch.
 Ahh, see it's included in a message by Luis Mochan in this thread.
By the way, the author of the program, Kyle Wheeler, wrote to me that
he expects that adding the line
 COMMAND /etc/urlview/url_handler.sh '%s'
to the configuration file ~/.extract_urlview would be enough to solve
the problem (with %s between quotes). I believe I had tried that and
that it didn't work, but now I'm not completely sure. You could try it. 
Best regards,
Luis





Re: Long urls - update

2013-04-01 Thread Luis Mochan
 By the way, the author of the program, Kyle Wheeler, wrote to me that
 he expects that adding the line
  COMMAND /etc/urlview/url_handler.sh '%s'
 to the configuration file ~/.extract_urlview would be enough to solve
 the problem (with %s between quotes). I believe I had tried that and
 that it didn't work, but now I'm not completely sure. You could try it. 

Well I tried it and it doesn't work without the patch; it is not
enough to add '%s' to COMMAND.  
Regards,
Luis





Re: Long urls - update

2013-04-01 Thread Kyle Wheeler

On Sunday, March 31 at 11:16 PM, quoth Luis Mochan:
 I'm a perl guy, yet that's non-trivial here.  Thx.  :-)

 You're welcome. I don't know if there are other characters that appear 
 in an url and need to be escaped for the shell ([;]?); they could 
 easily be accommodated by modifying 'wlmsanitize'. The page for the 
 extract_url project (http://www.memoryhole.net/~kyle/extract_url/) 
 mentions that the program already transforms characters dangerous to 
 the shell, but then it only mentions explicitly single quotes and 
 dollar signs.

Hello,

I'm the author of extract_url.pl, so perhaps I can shed some light 
here.

The *correct* place to fix the issue of escaping (or otherwise 
sanitizing) ampersands is in the sanitizeuri function (line 208). The 
current version of extract_url.pl uses this:

 sub sanitizeuri {
     my($uri) = @_;
     $uri =~ s/([^a-zA-Z0-9_.!*()\@&:=\?\/%~+-])/sprintf("%%%X",ord($1))/egs;
     return $uri;
 }

Essentially, what that does is explicitly whitelist the characters 
a-z, A-Z, 0-9, _, ., !, *, (, ), @, &, :, =, ?, /, %, ~, +, and -, and 
turn *anything* else into the percent-encoded equivalent (e.g. %26), 
which should be correctly decoded by any standards-compliant 
URL-decoder (see RFC 3986). If you want to eliminate ampersands from 
the characters allowed in a URL, simply remove the ampersand from that 
list. It's as simple as that. I think Luis's patch is a little overly 
complicated, and I think the policy of using backslashes to escape 
such characters (instead of percent-encoding) is dangerous, given that 
it's more likely to be stripped off by intervening scripts. I don't 
want future bug reports that say "my setup strips backslashes, so can 
you create an option that will triple-backslash the $ character?". :) 
(Followed, the next week, by a request for quadruple-backslashing, of 
course!)

I've personally never had a problem with ampersands, and I'm not sure 
why some people do. Extract_url.pl constructs system commands like so:

   /path/to/handler 'http://url.with/an&ampersand'

... which should be perfectly safe and work just fine (and does for 
me). I suspect the problem stems from using another wrapper script (e.g. 
/etc/urlhandler/urlhandler.sh). I bet that the wrapper script is not 
properly quoting its first argument.

In any event, percent-encoding, by modifying that one line, is 
probably the right way to go.
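The percent-encoding that sanitizeuri performs can be checked from the shell. A minimal sketch, assuming a POSIX-compliant printf (the leading-quote argument trick yields a character's numeric code, mirroring sprintf("%%%X", ord($1)) for an ampersand):

```shell
# POSIX printf: an argument beginning with a quote character yields the
# numeric code of the character that follows it, so %X prints the hex
# code of '&' (0x26), and %% prints a literal percent sign.
printf '%%%X\n' "'&"    # prints: %26
```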

~Kyle
-- 
The purpose of computing is insight, not numbers.
  -- Richard W. Hamming


Re: Long urls - update

2013-04-01 Thread Luis Mochan
Hi Kyle,
 I'm the author of extract_url.pl, so perhaps I can shed some light 
 here.
Thanks.
 The *correct* place to fix the issue of escaping (or otherwise 
 sanitizing) ampersands is in the sanitizeuri function (line 208). The 
 current version of extract_url.pl uses this:
 
  sub sanitizeuri {
      my($uri) = @_;
      $uri =~ s/([^a-zA-Z0-9_.!*()\@&:=\?\/%~+-])/sprintf("%%%X",ord($1))/egs;
      return $uri;
  }

I tried now your fix, and it didn't work for me; my browser doesn't
find the resulting pages when the url has ampersands that are
converted to %26 (probably because the % itself is further encoded as
%25 before being sent to the server by the browser (?))

 ...
 I've personally never had a problem with ampersands, and I'm not sure 
 why some people do. Extract_url.pl constructs system commands like so:
 
/path/to/handler 'http://url.with/an&ampersand'

I changed my handler to '/bin/echo %s > tmp.txt' and it wrote the
correct result, so I guess you're right here. 

 ... which should be perfectly safe and work just fine (and does for 
 me). I suspect the problem stems from using another wrapper script (e.g. 
 /etc/urlhandler/urlhandler.sh). I bet that the wrapper script is not 
 properly quoting its first argument.

I don't know much about shell programming, but I found that
/etc/urlhandler/url_handler.sh is a shell script that obtains its url
doing '$url=$1'. I replaced the whole handler by the following
program:
#! /bin/bash 
url=$1; shift
echo $url > tmp.txt; 
and found out that the url is cut short at the first ampersand. 

I don't understand why echo by itself yields the correct result (above)
while echo through a bash script yields the truncated result.

Thanks and best regards,
Luis

-- 

  o
W. Luis Mochán,  | tel:(52)(777)329-1734 /(*)
Instituto de Ciencias Físicas, UNAM  | fax:(52)(777)317-5388 `/   /\
Apdo. Postal 48-3, 62251 |   (*)/\/  \
Cuernavaca, Morelos, México  | moc...@fis.unam.mx   /\_/\__/
GPG: DD344B85,  2ADC B65A 5499 C2D3 4A3B  93F3 AE20 0F5E DD34 4B85






Re: Long urls - update

2013-04-01 Thread s. keeling
Incoming from Luis Mochan:
 
 I don't know much about shell programming, but I found that
 /etc/urlhandler/url_handler.sh is a shell script that obtains its url
 doing '$url=$1'. I replaced the whole handler by the following
 program:
 #! /bin/bash 
 url=$1; shift
 echo $url > tmp.txt; 
 and found out that the url is cut short at the first ampersand. 
 
 I don't understand why echo by itself yields the correct result (above)
 while echo through a bash script yields the truncated result.

Unix shell handles variables abysmally.  You need to help it a lot to
do the right thing.  *Always* quote variables, else if they're empty
they tend to blow up on you.

  -
#!/bin/bash
#
# usage: gbg.sh "http://url.with/an&ampersand"
#
url="$1"; shift
echo "$url" # > tmp.txt

exit 0

# Output:

(0) infidel /home/keeling_ sh/gbg.sh "http://url.with/an&ampersand"
http://url.with/an&ampersand
  -

HTH.  :-)  Oh, you could do ${1} and ${url} instead, but even they
need to be quoted.






Re: Long urls - update

2013-04-01 Thread Luis Mochan
 Unix shell handles variables abysmally.  You need to help it a lot to
 do the right thing.  *Always* quote variables, else if they're empty
 they tend to blow up on you.
Thanks for the advice! Your script did work from the command line, but
it was not enough when called from extract_url.pl.
Anyway, I believe that the fix is as simple as adding %s to the COMMAND
COMMAND /etc/urlview/url_handler.sh %s
(as suggested by Kyle) without touching anything else. I thought I had
attempted that before, but I guess now that I made some mistake (as
also suggested by Kyle). I still don't understand why %s fixes the
problem, though. Thanks everybody. 
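One plausible explanation, from the code fragments quoted elsewhere in this thread: with %s present, extract_url.pl substitutes the URL inside single quotes (the s/%s/'$url'/g line), while without %s the bare URL is appended to the command. A sketch of the difference, assuming a POSIX shell and a made-up URL:

```shell
url='http://url.with/an&ampersand'

# Without %s the URL is appended bare, so the shell that runs the
# handler treats & as a control operator and truncates the URL:
appended=$(sh -c "echo $url" 2>/dev/null)
echo "$appended"      # http://url.with/an

# With %s the URL is substituted inside single quotes, so it survives
# re-parsing intact:
substituted=$(sh -c "echo '$url'")
echo "$substituted"   # http://url.with/an&ampersand
```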
Best regards,
Luis







Re: Long urls - update

2013-03-31 Thread Luis Mochan

Dear John,

On Sat, Mar 30, 2013 at 05:15:28PM +0100, John Niendorf wrote:
 This looks really great, but where do I put the script?
 I made it executable and put it in my path and I got the error that it wasn't 
 there.
 I copied it to ~/ and got the same error.
 
 John
 
 http://www.memoryhole.net/~kyle/extract_url/
 
 Try this. It's brilliant.

I put it in one of the subdirectories of my $PATH (in ~/bin) and made
the program executable (chmod +x ~/bin/extract_url.pl). I made a
configuration file (~/.extract_urlview) with a single line (COMMAND
/etc/urlview/url_handler.sh) which is the COMMAND used by default by
urlview, the program I employed previously, I installed the
prerequisite perl packages and it worked nicely (under
Debian/testing). I hope it helps. In particular, I liked the fact that I
can see the context of the url's so I don't have to guess anymore which url to
choose. 
Best regards,
Luis
 




Re: Long urls - update

2013-03-31 Thread Luis Mochan
I found a mistake in the extract_url.pl program: it doesn't escape
ampersands when present in the url, so when the command to actually
view the url is invoked, the shell gets confused. I made a quick fix
by substituting $command=~s/&/\\&/g before running command.

  http://www.memoryhole.net/~kyle/extract_url/
  Try this. It's brilliant.





Re: Long urls - update

2013-03-31 Thread s. keeling
Incoming from Luis Mochan:
 I found a mistake in the extract_url.pl program: it doesn't escape
 ampersands when present in the url, so when the command to actually
 view the url is invoked, the shell gets confused. I made a quick fix
 by substituting $command=~s/&/\\&/g before running command.

Line 633?  634?  So:

   # $command =~ s/%s/'$url'/g;
   $command=~s/&/\\&/g;

I'm a perl guy, yet that's non-trivial here.  Thx.  :-)






Re: Long urls - update

2013-03-31 Thread Luis Mochan
 Line 633?  634?  So:
 
# $command =~ s/%s/'$url'/g;
$command=~s/&/\\&/g;

Sorry for not having given the line numbers, etc. I actually made
changes around  522 and 647, and defined a new subroutine (I named it
wlmsanitize) which modifies the command to run. A patch
follows. 
 
 I'm a perl guy, yet that's non-trivial here.  Thx.  :-)
 
You're welcome. I don't know if there are other characters that appear
in an url and need to be escaped for the shell ([;]?); they could
easily be accommodated by modifying 'wlmsanitize'. The page for the
extract_url project (http://www.memoryhole.net/~kyle/extract_url/)
mentions that the program already transforms characters dangerous to
the shell, but then it only mentions explicitly single quotes and
dollar signs. 

Best regards,
Luis

--
patch to fix ampersands in urls :
 

--- extract_url.pl~ 2013-03-31 12:35:39.303174972 -0600
+++ extract_url.pl  2013-03-31 15:10:47.822005282 -0600
@@ -519,7 +519,7 @@
} else {
$urlviewcommand .= " $url";
}
-   system $urlviewcommand;
+   system wlmsanitize($urlviewcommand);
exit 0;
}
 
@@ -644,7 +644,7 @@
);
}
if ($return) {
-   system $command;
+   system wlmsanitize($command);
if ($stayopen == 0) {
exit 0 if ($persist == 0);
} else {
@@ -689,6 +689,14 @@
print "$value\n";
}
 }
+
+sub wlmsanitize {
+my $cmd=shift @_;
+$cmd =~ s/&/\\&/g;
+return $cmd;
+}
+
+
 =head1 NAME
 
 B<extract_url.pl> -- extract URLs from email messages


signature.asc
Description: Digital signature


Re: Long urls - update

2013-03-30 Thread John Niendorf

This looks really great, but where do I put the script?
I made it executable and put it in my path and I got the error that it wasn't 
there.
I copied it to ~/ and got the same error.

John

http://www.memoryhole.net/~kyle/extract_url/

Try this. It's brilliant.