Re: Interwiki features supported in the squid wiki

2009-09-22 Thread Amos Jeffries

Kinkie wrote:

Hi all wiki editors,
  I've stumbled across a functionality of the wiki engine I hadn't
really explored before: interwiki links.
They're special keywords which get translated to external links.
Some of the most useful I found are:

Bug:# (e.g. Bug:123)
points to the Squid Bugzilla bug number specified (only the bug
number is shown in the rendered text)


:) sweet!


Cache:url-without-http-prefix
links the given URL via Google cache
Google:single_search_keyword
only works for single-keyword searches. Link to google search


Hmm, might the 'single word' include + symbols?


RFC:#
creates link to RFC# on ietf.org


More can be added by editing the wiki page InterWikiMap.



:) can we get one added that links to the squid configuration guide for 
squid.conf options?


 squid.conf: perhaps?

That will be a big plus in the Feature pages for referencing the related 
options.


Amos
--
Please be using
  Current Stable Squid 2.7.STABLE7 or 3.0.STABLE19
  Current Beta Squid 3.1.0.13


Re: Build failed in Hudson: 2.HEAD-i386-Debian-sid #57

2009-09-22 Thread Amos Jeffries

n...@squid-cache.org wrote:

See http://build.squid-cache.org/job/2.HEAD-i386-Debian-sid/57/

--
Started by upstream project 2.HEAD-amd64-CentOS-5.3 build number 118
Building remotely on rio.treenet
cvs [checkout aborted]: Name or service not known
FATAL: CVS failed. exit code=1



So what do I have to do to fix this after your changes the other day, 
Kinkie?



Amos


wiki, bugzilla, feature requests

2009-09-22 Thread Robert Collins
AIUI we use the wiki [over and above being a source of docs] to /design
features/ and manage [the little] scheduling that we, as volunteers, can
do.

I think that's great.

However, we also have many bugs that are not strictly-current-defects.
They are wishlist items.

What should we do here?

I've spent quite some time using wikis to try to manage such things;
I think it's a lost cause. Use them for design and notes and so forth,
but not for managing metadata.

I suggest that when there is a bug for a feature that is being designed
in the wiki, just link to bugzilla from that wiki page.

And for management of dependencies and todos, assignees and so forth, we
should do it in bugzilla, which is *designed* for that.

-Rob




Re: Interwiki features supported in the squid wiki

2009-09-22 Thread Kinkie
On Tue, Sep 22, 2009 at 2:06 PM, Amos Jeffries squ...@treenet.co.nz wrote:
 Kinkie wrote:

 Hi all wiki editors,
  I've stumbled across a functionality of the wiki engine I hadn't
 really explored before: interwiki links.
 That's special keywords which get translated to external links.
 Some of the most useful I found are:

 Bug:# (e.g. Bug:123)
    points to the Squid Bugzilla bug number specified (only the bug
 number is shown in the rendered text)

 :) sweet!

 Cache:url-without-http-prefix
    links the given URL via Google cache
 Google:single_search_keyword
    only works for single-keyword searches. Link to google search

 Hmm, might the 'single word' include + symbols?

Tried that. Didn't work; they get escaped. It's of limited
functionality, but better something than nothing :)
If a link to Wikipedia is desired, just use WikiPedia:link_to_wikipedia_page

 RFC:#
    creates link to RFC# on ietf.org


 More can be added by editing the wiki page InterWikiMap.


 :) can we get one added that links to the squid configuration guide for
 squid.conf options?

     squid.conf: perhaps?

Added as SquidConf:directive. Note that links do NOT get verified
for existence automatically.

 That will be a big plus in the Feature pages for referencing the related
 options.

The full list of available interwiki prefixes can be found at
http://wiki.squid-cache.org/InterWiki
The map can be extended by editing http://wiki.squid-cache.org/InterWikiMap
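As a quick illustration, page text using the prefixes confirmed above
(Bug:, RFC:, SquidConf:, Cache:) would look something like this; the page
names and bug/RFC numbers here are hypothetical examples, and exact
rendering depends on the wiki's InterWikiMap:

```
Fixed in Bug:123, per the requirements of RFC:2616.
Access is controlled with the SquidConf:http_access directive.
A cached copy is at Cache:www.example.com/page.html
```

Remember these links are not checked for existence when the page is saved.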

-- 
/kinkie


[PATCH] parameterize deny_info URL redirect

2009-09-22 Thread Amos Jeffries


It's long been an annoyance that deny_info will only accept the full 
canonical URL requested via the %s parameter. Yet the error pages have a 
full range of request and other details available to them.


That problem severely limits the possibilities of replacing a 
url_rewrite redirect with an HTTP-compliant deny_info 3xx redirect, for 
example to bounce people from a.example.com/foo to b.example.com/foo in 
the simple case.
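With the patch applied, that simple-case bounce could be written as a
deny_info rule. A hedged sketch: the acl name is invented, and %R here
carries the new URL-path meaning this patch introduces:

```
acl old_host dstdomain a.example.com
deny_info http://b.example.com%R old_host
http_access deny old_host
```

A request for http://a.example.com/foo denied by old_host would then be
answered with an HTTP redirect to http://b.example.com/foo.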



This patch opens most of the error page template % codes for use in the 
deny_info URL as well, with two alterations: %s, for backward 
compatibility, remains the full canonical URL instead of the Squid 
program signature; and %R is changed from the full request headers to 
the URL-path (since error pages had nothing existing and sensible there).


NP: %P can be used to retain the protocol, i.e. HTTP/HTTPS/FTP, during 
the jump :)


Now: is this acceptable as-is, or do we need to begin discussions about 
changing the % code tags to something more adaptable (in sync with 
logformat, for example)?
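The substitution mechanics the patch relies on can be sketched in
miniature. This is a standalone illustration, not Squid code; the
function name and value map are invented for the example:

```cpp
#include <map>
#include <string>

// Expand %-tokens in a deny_info URL template. Unknown tokens are
// emitted verbatim, mirroring how unrecognised % codes survive.
std::string expandDenyInfoUrl(const std::string &tmpl,
                              const std::map<char, std::string> &values) {
    std::string out;
    for (std::size_t i = 0; i < tmpl.size(); ++i) {
        if (tmpl[i] == '%' && i + 1 < tmpl.size()) {
            const char token = tmpl[++i];
            std::map<char, std::string>::const_iterator it = values.find(token);
            if (it != values.end()) {
                out += it->second;   // known token: substitute its value
            } else {
                out += '%';          // unknown token: pass through untouched
                out += token;
            }
        } else {
            out += tmpl[i];
        }
    }
    return out;
}
```

For example, with 'R' mapped to "/foo" and 'P' to "http", the template
"%P://b.example.com%R" expands to "http://b.example.com/foo".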


Amos
=== modified file 'src/errorpage.cc'
--- src/errorpage.cc	2009-08-23 09:30:49 +0000
+++ src/errorpage.cc	2009-09-22 12:45:24 +0000
@@ -595,11 +595,12 @@
 #define CVT_BUF_SZ 512
 
 const char *
-ErrorState::Convert(char token)
+ErrorState::Convert(char token, bool url_presentable)
 {
 static MemBuf mb;
 const char *p = NULL;	/* takes priority over mb if set */
 int do_quote = 1;
+int no_urlescape = 1;   /* item is NOT to be further URL-encoded */
 char ntoabuf[MAX_IPSTRLEN];
 
 mb.reset();
@@ -607,37 +608,30 @@
 switch (token) {
 
 case 'a':
-
 if (request && request->auth_user_request)
 p = request->auth_user_request->username();
-
 if (!p)
 p = "-";
-
 break;
 
 case 'B':
 p = request ? ftpUrlWith2f(request) : "[no URL]";
-
+no_urlescape = 1;
 break;
 
 case 'c':
 p = errorPageName(type);
-
 break;
 
 case 'e':
 mb.Printf("%d", xerrno);
-
 break;
 
 case 'E':
-
 if (xerrno)
 mb.Printf("(%d) %s", xerrno, strerror(xerrno));
 else
 mb.Printf("[No Error]");
-
 break;
 
 case 'f':
@@ -646,7 +640,6 @@
 p = ftp.request;
 else
 p = "nothing";
-
 break;
 
 case 'F':
@@ -655,13 +648,12 @@
 p = ftp.reply;
 else
 p = "nothing";
-
 break;
 
 case 'g':
+if (url_presentable) break;
 /* FTP SERVER MESSAGE */
 wordlistCat(ftp.server_msg, mb);
-
 break;
 
 case 'h':
@@ -676,12 +668,10 @@
 p = request->GetHost();
 } else
 p = "[unknown host]";
-
 break;
 
 case 'i':
 mb.Printf("%s", src_addr.NtoA(ntoabuf,MAX_IPSTRLEN));
-
 break;
 
 case 'I':
@@ -689,36 +679,34 @@
 mb.Printf("%s", request->hier.host);
 else
 p = "[unknown]";
-
 break;
 
 case 'l':
+if (!url_presentable) break;
 mb.append(error_stylesheet.content(), error_stylesheet.contentSize());
 do_quote = 0;
 break;
 
 case 'L':
+if (!url_presentable) break;
 if (Config.errHtmlText) {
 mb.Printf("%s", Config.errHtmlText);
 do_quote = 0;
 } else
 p = "[not available]";
-
 break;
 
 case 'm':
+if (!url_presentable) break;
 p = auth_user_request->denyMessage("[not available]");
-
 break;
 
 case 'M':
 p = request ? RequestMethodStr(request->method) : "[unknown method]";
-
 break;
 
 case 'o':
 p = external_acl_message ? external_acl_message : "[not available]";
-
 break;
 
 case 'p':
@@ -727,7 +715,6 @@
 } else {
 p = "[unknown port]";
 }
-
 break;
 
 case 'P':
@@ -735,7 +722,10 @@
 break;
 
 case 'R':
-
+if (url_presentable) {
+p = (request->urlpath.size() != 0 ? request->urlpath.termedBuf() : "/");
+break;
+}
 if (NULL != request) {
 Packer p;
 String urlpath_or_slash;
@@ -757,16 +747,21 @@
 } else {
 p = "[no request]";
 }
-
 break;
 
 case 's':
-p = visible_appname_string;
+/* for backward compatibility we need to make %s show the full URL. */
+if (url_presentable) {
+p = request ? urlCanonical(request) : url ? url : "[no URL]";
+debugs(0,0, "WARNING: deny_info now accepts coded tags. Use %u to get the full URL instead of %s");
+}
+else
+p = visible_appname_string;
 break;
 
 case 'S':
+if (!url_presentable) break;
 /* signature may contain %-escapes, recursion */
-
 if (page_id != ERR_SQUID_SIGNATURE) {
 const int saved_id = page_id;
 page_id = ERR_SQUID_SIGNATURE;
@@ -780,7 +775,6 @@
 /* wow, somebody put %S into 

Hudson build is back to normal: 2.HEAD-i386-Debian-sid #59

2009-09-22 Thread noc
See http://build.squid-cache.org/job/2.HEAD-i386-Debian-sid/59/




Build failed in Hudson: 3.HEAD-i386-Debian-sid #56

2009-09-22 Thread noc
See http://build.squid-cache.org/job/3.HEAD-i386-Debian-sid/56/

--
Started by upstream project 3.HEAD-amd64-CentOS-5.3 build number 111
Building remotely on rio.treenet
bzr: ERROR: Connection error: while sending POST 
http://www.squid-cache.org/bzr/squid3/trunk/.bzr/smart: (111, 'Connection 
refused')
Using saved parent location: http://www.squid-cache.org/bzr/squid3/trunk/
ERROR: Failed to pull



Re: wiki, bugzilla, feature requests

2009-09-22 Thread Amos Jeffries

Robert Collins wrote:

AIUI we use the wiki [over and above being a source of docs] to /design
features/ and manage [the little] scheduling that we, as volunteers, can
do.

I think that's great.

However, we also have many bugs that are not strictly-current-defects.
They are wishlist items.

What should we do here?

I've spent quite some time using wikis to try to manage such things;
I think it's a lost cause. Use them for design and notes and so forth,
but not for managing metadata.

I suggest that when there is a bug for a feature that is being designed
in the wiki, just link to bugzilla from that wiki page.

And for management of dependencies and todos, assignees and so forth, we
should do it in bugzilla, which is *designed* for that.

-Rob


I'm mentally grouping the enhancement bugs into three categories:
 - real features : requiring user-visible configuration 
additions/alterations.
  These are what the wiki Features pages were intended for. So users 
could come along after the feature is released and read the 
documentation about the feature and its intended/possible uses. Also 
these pages linked to the roadmap for publishing the user-visible schedules.


 - operational enhancements : largely bugs which improve some internal 
workings but are not relevant to users' configuration. These might get 
mentioned in the wiki as part of some other discussion; or the planning, 
as with SourceLayout, may be so tricky that we require a dynamic page to 
track the changes well.


 - pure code enhancements : pretty much bugs but not problem-causing 
bugs. These don't suit the wiki at all and might stay as enhancement 
bugs only.



Amos


Re: wiki, bugzilla, feature requests

2009-09-22 Thread Robert Collins
I'm proposing:
 - if there is a bug for something, and a wiki page, link them together.
 - scheduling, assignment, and dependency data should be put in bugs
 - whiteboards to sketch, annotate, document, etc. should always be in the
wiki

-Rob




Re: Build failed in Hudson: 2.HEAD-i386-Debian-sid #57

2009-09-22 Thread Henrik Nordstrom
On Wed 2009-09-23 at 00:07 +1200, Amos Jeffries wrote:
 n...@squid-cache.org wrote:
  See http://build.squid-cache.org/job/2.HEAD-i386-Debian-sid/57/
  
  --
  Started by upstream project 2.HEAD-amd64-CentOS-5.3 build number 118
  Building remotely on rio.treenet
  cvs [checkout aborted]: Name or service not known
  FATAL: CVS failed. exit code=1
  
 
 So what do I have to do to fix this after your changes the other day,
 Kinkie?

It needs to do a checkout from the main CVS repository, not the
SourceForge one.

Regards
Henrik



Re: wiki, bugzilla, feature requests

2009-09-22 Thread Henrik Nordstrom
On Tue 2009-09-22 at 23:27 +1000, Robert Collins wrote:
 I'm proposing:
  - if there is a bug for something, and a wiki page, link them together.
  - scheduling, assignment, and dependency data should be put in bugs
  - whiteboards to sketch, annotate, document, etc. should always be in the
 wiki

I fully second this opinion.

Wiki is great for documentation and the like, but very poor for tracking
progress (or lack thereof).

Regards
Henrik



Build failed in Hudson: 2.HEAD-i386-Debian-sid #63

2009-09-22 Thread noc
See http://build.squid-cache.org/job/2.HEAD-i386-Debian-sid/63/

--
Started by user amos
Building remotely on rio.treenet
cvs checkout: CVS password file /home/www/squid/.cvspass does not exist - 
creating a new file
cvs checkout: authorization failed: server cvs.squid-cache.org rejected access 
to /squid for user anoncvs
cvs checkout: used empty password; try cvs login with a real password
FATAL: CVS failed. exit code=1



Build failed in Hudson: 3.HEAD-i386-Debian-sid #57

2009-09-22 Thread noc
See http://build.squid-cache.org/job/3.HEAD-i386-Debian-sid/57/

--
Started by user amos
Building remotely on rio.treenet
bzr: ERROR: Invalid http response for 
http://www.squid-cache.org/bzr/squid3/trunk/.bzr/branch-format: Bad status line 
received
Using saved parent location: http://www.squid-cache.org/bzr/squid3/trunk/
ERROR: Failed to pull