Subject: w3-url-e21: Throws an error trying to retrieve CSS info from SSL server
Package: w3-url-e21
Version: 2005.10.23-5
Severity: normal
*** Please type your report below this line ***

When I try to view certain emails with VM, I get these error messages
from w3, and the mail is not shown properly:

  Decoding MIME message...
  Inlining text/html, be patient...
  Parsed 20% of 9058...
  Parsed 29% of 9058...
  Parsed 37% of 9058...
  Parsed 46% of 9058...
  Parsed 55% of 9058...
  Parsed 63% of 9058...
  Parsed 72% of 9058...
  Parsed 80% of 9058...
  Parsed 89% of 9058...
  Parsed 100% of 9058...done
  Drawing... - HTTP/0.9 How I hate thee!
  error in process filter: url-http-generic-filter: Wrong type argument: number-or-marker-p, nil
  error in process filter: Wrong type argument: number-or-marker-p, nil
  ...

It repeats these errors many times per message.

Setting debug-on-error to t and loading the source file instead of the
compiled file, I can see that the error occurs in url-http.el, in
url-http-wait-for-headers-change-function, at line 923:

  (cond
   ((or (= url-http-response-status 204)
        (= url-http-response-status 205))

The debugger shows that url-http-response-status is nil, and the error
is thrown when nil is compared with an integer.  Why is it nil?
Because there is a bug in this function: if old-http is set, the value
of url-http-response-status is never computed.

So that's the first problem: maybe that variable should be set to 404
or something (I have no idea) when old-http is set, to avoid this
error.

However, there's a meta-problem: my HTTP server is definitely _NOT_
old!  It should not have old-http set.  Examining my email message, I
can see that the trigger is CSS; the message contains:

  <style type="text/css" media="all">
  @import url("https://my.server.com/skin/layout.css");

(and some other stuff).  Now, this file cannot be retrieved: the
server not only uses SSL but also requires login before granting
access, so this request will definitely fail.
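To illustrate the "set it to some legal value" idea, a defensive guard
could run before the big cond in
url-http-wait-for-headers-change-function.  This is a hypothetical,
untested sketch; the 200 default is my guess at a sensible value, not
anything the package specifies:

```elisp
;; Hypothetical guard (untested): if the HTTP/0.9 path left
;; url-http-response-status unset, give it a default so the numeric
;; comparisons like (= url-http-response-status 204) don't choke on nil.
(when (null url-http-response-status)
  ;; HTTP/0.9 responses carry no status line at all; assume success
  ;; so the body is still displayed.
  (setq url-http-response-status 200))
```

Whether 200, 404, or skipping the cond entirely is right, someone who
knows the library better would have to decide.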
If I examine the temporary buffer containing the text returned from
the GET request sent to the server, I see that it contains some odd
lines BEFORE the HTTP header starts.  These lines look vaguely like
LDAP output, but they might be SSL certificate information of some
kind; I have no idea where they came from.  No browser I've used to
connect to this server ever mentions or displays them, so either
browsers simply ignore any text before the HTTP status line, or else
these lines somehow appear only when Emacs sends the request; I'm not
sure.  I can't find a good way to recreate what Emacs is doing from
the command line (maybe using Perl?  I tried wget and couldn't come up
with anything).  The buffer looks like:

  verify depth ...
  ...other debug-looking output...
  HTTP/1.1 ...

Because "HTTP/..." is not the first thing in the buffer, this code in
url-http-wait-for-headers-change-function decides the server is an old
server:

  (goto-char (point-min))
  (if (not (looking-at "^HTTP/[1-9]\\.[0-9]"))
      ;; Not HTTP/x.y data, must be 0.9
      ;; God, I wish this could die.
      (setq end-of-headers t
            url-http-end-of-headers 0
            old-http t)

If the code looked further into the buffer, it would see that there
really is an HTTP status line.  In fact, the function
url-http-parse-response already does exactly that:

  (goto-char (point-min))
  (while (and (not (looking-at "^HTTP"))
              (not (forward-line 1))))
  (skip-chars-forward " \t\n")   ; Skip any blank crap
  (skip-chars-forward "HTTP/")   ; Skip HTTP Version
  (read (current-buffer))
  (setq url-http-response-status (read (current-buffer))))

This is the code that actually sets url-http-response-status, and it
would work on my buffer; but it is never called, because old-http is
set.

So, many things:

- If old-http is set, we should set url-http-response-status to some
  legal value, or skip the cond, or SOMETHING, so that we don't get
  these errors in the first place.
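For the detection itself, the 0.9 check could search the whole buffer
for a status line, the way url-http-parse-response does, instead of
only testing point-min.  A sketch of that idea (hypothetical and
untested, not an actual patch):

```elisp
;; Instead of (looking-at "^HTTP/[1-9]\\.[0-9]") at point-min, search
;; the entire buffer for a status line before declaring the server 0.9.
(goto-char (point-min))
(if (re-search-forward "^HTTP/[1-9]\\.[0-9]" nil t)
    ;; Found a real status line; skip whatever junk precedes it.
    (goto-char (match-beginning 0))
  ;; No status line anywhere in the buffer: genuinely HTTP/0.9.
  (setq end-of-headers t
        url-http-end-of-headers 0
        old-http t))
```

With something like this, my server's odd pre-header lines would be
skipped and the real HTTP/1.1 status line would be honored.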
- I have to figure out why my server is sending back this odd text
  before the HTTP status line.

- The setting of old-http in the function above should be smarter: it
  should look for ^HTTP appearing later in the buffer, not just at the
  beginning, the way url-http-parse-response does.

Finally, I'm nervous about having W3 automatically download CSS from
remote sites.  That gives email (especially spam) all sorts of
opportunities to magically "phone home" just by being viewed, via
retrieval of CSS files from remote servers.  Isn't there some way to
configure W3 so it won't even try to download URLs that arrive in
email?  Most browsers provide this capability at least for images, and
I'd think remote CSS downloads need the same flexibility.  Ideally I'd
be able to configure "safe domains" and/or "unsafe domains" using
traditional allow/deny semantics.

-- System Information:
Debian Release: testing/unstable
  APT prefers unstable
  APT policy: (500, 'unstable'), (500, 'testing'), (500, 'stable')
Architecture: i386 (i686)
Shell:  /bin/sh linked to /bin/bash
Kernel: Linux 2.6.9-prep
Locale: LANG=C, LC_CTYPE=C (charmap=ANSI_X3.4-1968)

Versions of packages w3-url-e21 depends on:
ii  emacs21                    21.4a-3               The GNU Emacs editor

Versions of packages w3-url-e21 recommends:
ii  w3-el-e21                  4.0pre.2001.10.27-19  Web browser for GNU Emacs 21

-- no debconf information

--
-------------------------------------------------------------------------------
 Paul D. Smith <[EMAIL PROTECTED]>          HASMAT: HA Software Mthds & Tools
 "Please remain calm...I may be mad, but I am a professional." --Mad Scientist
-------------------------------------------------------------------------------
   These are my opinions--Nortel takes no responsibility for them.