but when it gets to the second http_access rule it doesn't prompt for user authentication for internet access; it's as if it doesn't know what to do with it.
The version of Squid is 3.1.19, by the way.
Please help me with this. I am stuck.
thanks in advance,
Osmany
- End of forwarded message -
On Mon, Aug 20, 2012 at 11:39 PM, Amos Jeffries squ...@treenet.co.nz wrote:
On 20.08.2012 23:54, Osmany Goderich wrote:
It was about the repeats. Can you please explain to me how to make my
configuration work.
Decide whether you want the connections between Squid and the backend/peer
Thanks
Osmany
-Original message-
From: Amos Jeffries [mailto:squ...@treenet.co.nz]
Sent: Sunday, August 19, 2012 2:07 AM
To: squid-users@squid-cache.org
Subject: Re: RV: [squid-users] reverse
://webmail.xxx.xx/ site1
acl port80 proto http
http_access deny port80 site1
http_access deny all
Please help. Thanks in advance,
Osmany
Sorry about that, everyone. I know that repeating a post is annoying. It was just a little mistake.
Apologies.
Osmany
...@andybev.com]
Sent: Monday, April 02, 2012 3:45 PM
To: Osmany Goderich
CC: squid-users@squid-cache.org
Subject: Re: [squid-users] bash/mysql script not working
On Mon, 2012-04-02 at 14:28 -0400, Osmany Goderich wrote:
Hi everyone,
Please have a look at this bash/mysql external helper. Can anyone tell me
why is it not working?
#!/bin/bash
# External ACL helper: Squid passes one %DST value per line on stdin;
# answer OK if the site is in the porn table, ERR otherwise.
connect="mysql -h 127.0.0.1 -D squid -u squid -ppassword -sN -e"
while read url
do
  if [ -n "$($connect "SELECT site FROM porn WHERE site='$url'")" ]
  then
    echo OK
  else
    echo ERR
  fi
done
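For context, a helper like this is normally wired into squid.conf with external_acl_type; a minimal sketch (the helper path and the ACL names here are assumptions, not the poster's actual values):

```
# Pass the destination (%DST) to the helper, one lookup per request
external_acl_type porn_check %DST /usr/local/bin/porn_check.sh
acl porn_sites external porn_check
http_access deny porn_sites
```

Note that Squid feeds the %DST value to the helper on stdin, one per line, and reads OK/ERR back on stdout.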
-Original message-
From: da...@lang.hm [mailto:da...@lang.hm]
Sent: Tuesday, April 05, 2011 11:13 PM
To: osm...@es.quimefa.cu
CC: squid-users@squid-cache.org
Subject: Re: [squid-users] Fwd: squid 3.1 to export access_log to rsyslog
On Tue, 5 Apr 2011, osm...@es.quimefa.cu wrote:
Hi everyone,
I would like to know how to export Squid's access_log to a central rsyslog server in my network. I know I should use a local rsyslog daemon to forward logs to the central server, but I just can't get Squid to actually write to the local rsyslog daemon, and I tried various things with the access_log directive.
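For what it's worth, Squid (2.6 and later) can write the access log straight to syslog via the access_log directive, and rsyslog can then forward that facility to the central host; a sketch, with the facility/priority and hostname chosen arbitrarily:

```
# squid.conf: send access log entries to the local syslog daemon
access_log syslog:local4.info squid

# rsyslog.conf: forward that facility to the central log host
local4.info  @central-loghost.example.com
```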
something like this:
http://dnl-16.geo.kaspersky.com/ftp://dnl-kaspersky.quimefa.cu:2122/Updates/index/u0607g.xml.klz
I've changed the script many times to get what I want, but had no success. Can you please help me?
On Sun, 2011-03-13 at 21:27 -0300, Marcus Kool wrote:
Osmany,
look
So finally this is what I have, and it works perfectly. But I want to go further than this. I want the clients to download what they've requested from my local URLs. For example, if a client wants to update their Kaspersky antivirus and requests an internet update server, I want it to
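The kind of redirection described above is usually implemented as a url_rewrite_program helper. A minimal bash sketch, assuming the local mirror host from the example URL earlier in the thread (the function name and the exact URL pattern are my own):

```shell
#!/bin/bash
# url_rewrite_program helper sketch: send Kaspersky update requests
# to a local mirror, pass every other URL through unchanged.

rewrite_url() {
    case "$1" in
        http://dnl-*.geo.kaspersky.com/*)
            # keep the request path, swap the host for the local mirror
            echo "ftp://dnl-kaspersky.quimefa.cu:2122/${1#http://*.geo.kaspersky.com/}"
            ;;
        *)
            echo "$1"   # no rewrite
            ;;
    esac
}

# Squid writes one request per line to stdin; the URL is the first field.
while read -r url rest
do
    rewrite_url "$url"
done
```

It would be wired up with something like `url_rewrite_program /usr/local/bin/kav_rewrite.sh` in squid.conf (path assumed).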
How can I get this db auth helper? Do I have to recompile Squid, or do I download it from somewhere?
On Thu, 2011-03-10 at 11:55 +1300, Amos Jeffries wrote:
On Wed, 09 Mar 2011 09:44:37 -0500, Osmany wrote:
Greetings,
Ok. I'm sure this is something that has been asked many times but I
googled alot and there is no real documentation or a how to for this.
Basically I want all my ACLs that currently point to a straightforward text file in my configuration to point to a table in a MySQL database that I
On Tue, 2011-03-08 at 12:21 +1300, Amos Jeffries wrote:
On Tue, 08 Mar 2011 11:58:57 +1300, Amos Jeffries wrote:
On Mon, 07 Mar 2011 16:59:07 -0500, Osmany wrote:
Greetings everyone,
So I'm having trouble with my squid proxy-cache server. I recently added a redirect program because I
Forwarded Message
From: Osmany osm...@oc.quimefa.cu
Reply-to: osm...@oc.quimefa.cu
To: squid-users squid-users@squid-cache.org
Subject: Re: [squid-users] help with squid redirectors
Date: Tue, 08 Mar 2011 07:20:11 -0500
On Tue, 2011-03-08 at 12:21 +1300, Amos Jeffries wrote:
On Wed, 2011-03-09 at 01:33 +1300, Amos Jeffries wrote:
On 09/03/11 01:20, Osmany wrote:
On Tue, 2011-03-08 at 12:21 +1300, Amos Jeffries wrote:
On Tue, 08 Mar 2011 11:58:57 +1300, Amos Jeffries wrote:
On Mon, 07 Mar 2011 16:59:07 -0500, Osmany wrote:
Greetings everyone,
So I'm having
Forwarded Message
From: Osmany osm...@oc.quimefa.cu
Reply-to: osm...@oc.quimefa.cu
To: squid-users squid-users@squid-cache.org
Subject: Re: [squid-users] help with squid redirectors
Date: Tue, 08 Mar 2011 07:41:47 -0500
On Wed, 2011-03-09 at 01:33 +1300, Amos Jeffries wrote:
Greetings everyone,
So I'm having trouble with my squid proxy-cache server. I recently added a redirect program because I had to make users go to my Kaspersky admin kit and my WSUS services to get their updates, and it works fine, but I constantly get a warning and Squid just collapses after a few
-Original message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]]
Sent: Thursday, October 23, 2008 2:07 PM
To: Osmany Goderich
CC: squid-users@squid-cache.org
Subject: Re: [squid-users] Problems with downloads
On Thu, 2008-10-23 at 14:34 -0500, Osmany Goderich wrote:
Hi everyone,
I have Squid 3.0.STABLE9 installed on a CentOS 5.2 x86_64 system. I have problems with downloads, especially large files. Downloads are usually slow in my network because of the number of users I have, but I dealt with it using download accelerators like FlashGet. Now the downloads get
Hi,
I get this error when I run make to install chpasswd.cgi. What should I do?
gcc -c -O2 -w -I. -DHAVE_LIBCRYPT=1 -DHAVE_DIRENT_H=1 -DSTDC_HEADERS=1
-DHAVE_CRYPT_H=1 -DHAVE_PWD_H=1 -DHAVE_STDIO_H=1 -DHAVE_STDLIB_H=1
-DHAVE_SYS_STAT_H=1 -DHAVE_SYS_TIME_H=1 -DHAVE_TIME_H=1
Hi,
I have a little problem. I've just installed Squid 2.6.STABLE6 and apparently it is not connecting to the parent proxy. The cache.log says that the parent is dead. I have confirmed with the net-admins that the parent is not dead.
This is what I have in the squid.conf
cache_peer
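For reference, a parent definition in squid.conf usually looks something like this (a sketch; the hostname, ports, and options are assumptions, not the poster's actual values):

```
# parent proxy on port 3128, no ICP (icp-port 0, no-query),
# used as the route of last resort
cache_peer parent.example.com parent 3128 0 no-query default
```

If the parent requires login, a login=user:password option must also be added, otherwise Squid may mark the peer dead after failed requests.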
If I do that, all clients will be allowed to access these allowed sites without password or IP-address verification, considering that Squid uses the first matching rule and doesn't read any further: it will only reach that rule and ignore the rest of the access rules.
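Squid's first-match behaviour means rule order decides the outcome; a sketch of the safe ordering (the ACL names and file path are assumptions, and proxy_auth requires an auth_param scheme to be configured):

```
# Authenticated users must match before the broad allow,
# otherwise the allow short-circuits the auth check.
acl auth_users proxy_auth REQUIRED
acl allowed_sites dstdomain "/etc/squid/allowed_sites.txt"
http_access allow auth_users allowed_sites
http_access deny all
```

Putting a bare `http_access allow allowed_sites` first would grant those sites to everyone, since later rules are never consulted for a matched request.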
Administrator of the
It's been a long time since I first had this problem. I have modified, changed, erased, and added configurations, but I still don't see what the problem is. My Squid consumes swap until it takes it all and the service goes down. I have a 3.0GHz processor and 512MB of RAM to offer internet service to more
a little bit.
-Original message-
From: Henrik Nordstrom [mailto:[EMAIL PROTECTED]]
Sent: Friday, July 13, 2007 11:27 AM
To: Osmany
CC: squid-users@squid-cache.org
Subject: Re: [squid-users] Squid consuming swap
On Fri, 2007-07-13 at 10:54 -0400, Osmany wrote: