On 05/03/2011 06:16, christ...@wanxp.com wrote:
Dear,
I installed and run two Squid 2.7 instances on the same machine to balance
CPU load (quad-core processor). My question is: can I set up a sibling
relationship between the two instances on the same machine to share the
cache? Will it cause any problems in the future? Does anyone have experience about
On 05/03/2011 06:40, christ...@wanxp.com wrote:
On 2/5/2011 11:26 AM, Eliezer Croitoru wrote:
On 05/03/2011 06:16, christ...@wanxp.com wrote:
Dear,
I installed and run two Squid 2.7 instances on the same machine to balance
CPU load (quad-core processor). My question is: can I set up a sibling
relationship between the two instances on the same
I have found something about it.
It happens only when using one method.
I am using an init.d script:
service squid3 reload
or, equivalently,
/etc/init.d/squid3 reload
and after that I'm getting the (null)://
error page.
So I tried another method of reloading/reconfiguring the proxy for
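When the init script's reload misbehaves like this, the reload can also be issued through the binary itself; a hedged sketch (the binary name and config path are the usual Debian/Ubuntu ones, adjust to your install):

```
squid3 -k reconfigure -f /etc/squid3/squid.conf
```

`-k reconfigure` asks the running Squid to re-read its configuration without a full restart.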
I'm running tests on Debian 6, 32-bit and 64-bit.
I compared v3.1.11 vs 3.2.0.5, and 3.2 gives much better response
time than the older versions.
I had a small problem building 3.2 and needed to use the
build-essential package (not using TPROXY).
In order to install the
On 07/04/2011 11:52, Linda Walsh wrote:
Amos Jeffries wrote:
Marked explicitly as private - aka cannot be cached by any
middleware proxy (such as Squid) which may send it to other users.
May be cached by a personal cache such as the browser storage.
---
But I don't have to log in.
More
On 07/04/2011 16:16, Linda Walsh wrote:
Eliezer Croitoru wrote:
On 07/04/2011 11:52, Linda Walsh wrote:
Amos Jeffries wrote:
Marked explicitly as private - aka cannot be cached by any
middleware proxy (such as Squid) which may send it to other users.
May be cached by a personal cache
On 09/04/2011 04:22, igor rocha wrote:
I thank Amos again, but if anyone has any other tips or information, please help me.
Adding to Amos:
the common ISP hit ratio is 30-50 percent, but it really depends on the
knowledge and practices of the ISP cache operators.
Caching using helpers instead of the ordinary
On 11/04/2011 20:53, sq...@sourcesystemsonline.com wrote:
Good day,
Sometimes when I check my ESET antivirus log file, it shows that some
clients in my network are attacking my network, especially the
Squid port (3128), with TCP flooding or DNS poisoning. I checked the
internet for there
On 12/04/2011 06:15, Amos Jeffries wrote:
On Mon, 11 Apr 2011 22:34:02 +0300, Eliezer Croitoru wrote:
On 11/04/2011 20:53, sq...@sourcesystemsonline.com wrote:
Good day,
Some times when i check my ESET Antivirus LogFile, it shows that some
activities of clients in my network are attacking my
On 12/04/2011 19:14, Marcello Romani wrote:
On 12/04/2011 17:46, yogii wrote:
Thank you, Mr. Jeffrey, this link is very useful for me. I have read about
installation: which is better, an automatic install with apt-get, or should I
compile Squid myself?
Thanks.
On 13/04/2011 12:48, Amos Jeffries wrote:
On 13/04/11 21:11, Klaus Darilion wrote:
Hi!
For debugging certain HTTP clients I need to log all HTTP(S)
traffic passing through Squid, i.e. the complete HTTP request and the HTTP
response (all headers and bodies).
I read a lot on the wiki and it
On 13/04/2011 17:45, Klaus Darilion wrote:
On 13.04.2011 12:23, Eliezer Croitoru wrote:
But if it is only certain clients, you might set up a specific parent proxy
for this purpose, such as Paros Proxy.
It is not supposed to be as fast as Squid, but it is built for HTTP(S)
traffic inspection.
Thanks
On 13/04/2011 22:06, childrenofch...@freenet.de wrote:
Hey,
The configuration listed above ran for longer than a year without any problems.
Now we get the Squid message: Timeout - DNS Error.
As a first step I tried: dig google.de from the Squid machine. No problems.
I saw in the cache.log that all
On 12/04/2011 08:37, Amos Jeffries wrote:
On 12/04/11 15:51, Eliezer Croitoru wrote:
On 12/04/2011 06:15, Amos Jeffries wrote:
On Mon, 11 Apr 2011 22:34:02 +0300, Eliezer Croitoru wrote:
On 11/04/2011 20:53, sq...@sourcesystemsonline.com wrote:
Good day,
Some times when i check my ESET
On 15/04/2011 07:05, Amos Jeffries wrote:
On 15/04/11 02:05, sq...@sourcesystemsonline.com wrote:
Good day,
Thanks all for concern. The network topology is as follow:
Workstations are installed with Windows 7 Pro with Spyware Terminator
with
integrated ClamAV, all linked to a Cisco 2950 switch
On 15/04/2011 15:49, Fran Márquez wrote:
Hi,
I have a question: can I set up Squid to allow full, unrestricted access
during a limited time period (for example, 15 minutes per day), so that
users can access any site without restrictions?
I don't want to prohibit access to some popular sites, but I
On 15/04/2011 16:02, Leonardo Rodrigues wrote:
I'm pretty sure that --with-large-files is needed on 32-bit
installations (x86). Large file support is the default on x86_64
machines.
Please, someone correct me if I'm wrong...
On 15/04/11 09:23, Helmut Hullen wrote:
Hello,
On 17/04/2011 19:44, Jenny Lee wrote:
Sorry for not answering. There was just nothing I could be sure
about until now...
3.2.0.7 will be out early (and very soon) with fixes for the critical
and blocker bugs currently known to exist in 3.2.0.6 tarballs. The fixes
are now in 3.HEAD
On 18/04/2011 04:14, Amos Jeffries wrote:
On Sun, 17 Apr 2011 19:57:11 +0300, Eliezer Croitoru wrote:
On 17/04/2011 19:44, Jenny Lee wrote:
Sorry for not answering. There was just nothing I could be sure
about until now...
3.2.0.7 will be out early (and very soon) with fixes
What is your network setup?
What is the position of each device relative to the others on the network?
Are both of them on the same network?
Eliezer
On 22/04/2011 11:43, bmm-mailinglist wrote:
Hi all,
I am a new Squid user. I like Squid's ease of setup and use. Unfortunately,
I've hit a
On 22/04/2011 09:12, Eugene M. Zheganin wrote:
Hi.
On 20.04.2011 13:48, Helmut Hullen wrote:
# stat swap.state
95 4012314 -rw-r- 1 squid squid 16326000 10203960 Apr 19
14:02:21 2011 Apr 20 10:53:45 2011 Apr 20 10:53:45 2011 Apr 19
14:02:21 2011 16384 19968 0 swap.state
What about
On 23/04/2011 17:57, Hasanen AL-Bana wrote:
Edit /etc/sysctl.conf,
change net.ipv4.tcp_syncookies=1 to net.ipv4.tcp_syncookies=0, and
reboot. Don't forget to remove the # from the beginning of the line.
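The same change can be made without a reboot. Below is a hedged demo on a throwaway copy of the file; on the real system you would edit /etc/sysctl.conf itself and apply it live with `sysctl -p` (as root):

```shell
# work on a temporary copy so the demo is safe to run anywhere
conf=$(mktemp)
printf '#net.ipv4.tcp_syncookies=1\n' > "$conf"
# uncomment the line (if commented) and force the value to 0
sed -i 's/^#\{0,1\}net\.ipv4\.tcp_syncookies=.*/net.ipv4.tcp_syncookies=0/' "$conf"
cat "$conf"
```

After editing the real file, `sysctl -p` reloads it immediately; the reboot in the advice above is only the lazy route.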
On Sat, Apr 23, 2011 at 5:39 PM, Andreas Braathen
andreas.braat...@andtux.net wrote:
Squid
On 23/04/2011 18:34, Andreas Braathen wrote:
I tried it, but it did not change anything. Squid still sends SYN packets to
establish state with the destination.
Any other suggestions?
Sorry to tell you, but you had better read the basics of TCP flow
to understand the meaning of a SYN
On 26/04/2011 14:07, sichent wrote:
Well, thanks for the pointer. But as far as I can see there, it's an
installer; how did you generate the binary?
Well, we got the binaries from Acme Consulting (linked from the Squid web
site); they seem to have quite thorough instructions on how to build the
On 19/04/2011 16:44, Ralf Hildebrandt wrote:
* Amos Jeffriessqu...@treenet.co.nz:
Arg! now I'm going crazy (and blind).
Fix applied to trunk.
http://www.squid-cache.org/Versions/v3/3.HEAD/changesets/squid-3-11387.patch
Working...
Starting testing in two days,
on: Ubuntu 10.04 x64.
The
On 27/04/2011 11:53, rioda78.squid wrote:
(Help)
I'm new to Squid.
I will build a Squid server with a YouTube cache.
Can any Squid master here help me enable YouTube caching with Squid
or third-party software (like youtube_cache)?
Thanks in advance.
It's different using third-party software or Squid 2.7 using
On 27/04/2011 23:05, Sheridan Dan Small wrote:
Thanks for your reply again Amos,
I am familiar with wget and curl for fetching headers and files. I
think I can come up with a solution using either wget or curl to
create a static cache. I will write a simple server application to
serve the
On 28/04/2011 12:25, Jannis Kafkoulas wrote:
Hi,
We have a proxy chain of 3 squids (v2.7, RHEL5).
In my opinion, only the last one really needs to resolve the URL's hostname
in order to
send the request directly to the web server's IP address on the Internet.
Would it also work if the first two
On 27/04/2011 22:53, Oscar Andrés Eraso Moncayo wrote:
Hi,
squid.conf:
**
http_port 127.0.0.1:3030
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
On 28/04/2011 18:05, Amos Jeffries wrote:
On 29/04/11 00:49, Eliezer Croitoru wrote:
On 27/04/2011 22:53, Oscar Andrés Eraso Moncayo wrote:
Hi,
squid.conf:
**
http_port
On 28/04/2011 17:18, Amos Jeffries wrote:
proxy was psychic
"my proxy was psychic" -- a good name for a TV show :)
On 30/04/2011 11:58, Jannis Kafkoulas wrote:
OK, I see!
Thanks very much!
Don't you have a local caching DNS?
If you don't, that is one of the basic recommendations.
Another good thing is to change the UDP and TCP timeouts in the Linux
kernel/sysctl.
I don't remember the basic TCP settings for
On 01/05/2011 01:14, Mohsen Saeedi wrote:
Hi
Another problem is that when some users try to download files without a
download manager application, their rate is very slow. For example, with a
download manager you can download at about 120 KB/s, and without it the
speed drops to 10 KB/s. How can I configure Squid
On 01/05/2011 23:58, patrick.oesch...@bluewin.ch wrote:
scenario:
users use a client/browser which is located on the company lan using a private
IP address
the company lan is
protected by a firewall which translates all web access into one public IP
address
the proxy is implemented as a 'cloud
On 02/05/2011 12:32, Jannis Kafkoulas wrote:
Thanks for the hint!
I'll check it too.
I think we should also replace the IP destinations within all of the
intermediate caches with domain names,
thus saving all of the unnecessary DNS lookups (about 80% -- Internet).
I don't know about the amount of
Disable the NTLM auth for the Exchange and local servers.
If they are already protected with passwords and they are for
internal/specific use, you can disable the need for
Squid authentication for these servers.
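A hedged squid.conf sketch of that idea (the ACL name and server domains are invented examples, and `localnet`/`authenticated` are assumed to be defined elsewhere in the config):

```
# internal servers that should be reachable without proxy authentication
acl noauth_servers dstdomain mail.example.local intranet.example.local
http_access allow localnet noauth_servers
# everything else still requires authentication
http_access allow localnet authenticated
```

Order matters: the unauthenticated rule must come before the authenticating one, or Squid will challenge for credentials first.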
On 18/05/2011 11:05, Stefanos Vizikidis wrote:
Hi!
I have recently set up a
I am now using Squid 3.2.0.5 and it works fine for a week, and then the
CPU spikes to around 200%.
I'm almost sure it's related to logs.
Ubuntu 10.04.
I have tried Squid 3.2.0.7 but I got other problems.
I want to recompile Squid as a basic transparent proxy and make sure
it will work
with no authentication at all, and I get the
same results, so I assume that NTLM authentication is not the cause.
I don't know if I am missing the point. Correct me if
disabling NTLM auth will do something else besides what I described
above.
On Wed, May 18, 2011 at 12:01 PM, Eliezer Croitoru
elie
It depends on the machine...
To make sure that loading the 3000 rules is the problem, you can run the
Squid server in verbose mode to see how long it takes to load the rules.
From what I have seen, the stop process depends less on the number of
rules and more on the number of connections used on
Well, I just used wget to download
http://www.bandiweb.it/Left.asp
and it got stuck after a couple of seconds, without Squid or any cache (in
my zone).
It is almost certainly a site problem,
because it's only
http://www.bandiweb.it/Left.asp
and not other parts of the site.
On 20/05/2011 18:36, Brian
Every ASP part of the site...
http://www.bandiweb.it/RegistrazioneProva.asp
behaves the same way.
Contact the site manager.
On 20/05/2011 18:36, Brian Tuley wrote:
I'm running squid proxy 2.7 and that site is almost unresponsive to me as well.
Works fine when I bypass squid however.
I've been
On 04/06/2011 02:05, sichent wrote:
Setup: Gentoo linux OS on squid and privoxy home lan server
Squid-3.1.12
privoxy-3.0.17
I'm not running an html server, just trying to use squid and privoxy
for my own browsing.
Why not use the ICAP or URL-rewriter functionality built
Well, you are giving a nice cache log.
What version of Squid are you using? From the Ubuntu repos?
Also, did you try looking at the access.log (for HITs)?
I have never seen these errors in a Squid cache.log,
but if you provide the additional info I asked for, I think we will manage
to get somewhere...
Regards
The answer is to disable IPv6 in Squid and on the Linux machine and
software.
But we do not know that this is the case.
Do you have a local DNS server on the machine for caching and forwarding?
You can set up Squid to use the local DNS server, and on the DNS
server set up specific
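Pointing Squid at a local resolver is a single directive (127.0.0.1 here is an assumption about where the resolver listens):

```
dns_nameservers 127.0.0.1
```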
Well, if you do want to push an object, you can do it in a more elegant way:
export http_proxy=http://localhost:3128 ; wget http://fqdn/object
and use it on a big site with a recursive download and a RAM drive.
Another tip is to use --delete-after;
this will pull the file into the Squid
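A fuller sketch of the same cache-priming trick (the proxy port, target URL, and RAM-backed directory are all example values):

```
# route wget through the local Squid so fetched objects land in the cache
export http_proxy=http://localhost:3128
# recursive fetch; --delete-after discards the local copies once downloaded
wget --recursive --delete-after --directory-prefix=/dev/shm http://example.com/
```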
Seems like Amos gave you many things to check the results with...
Eliezer
On 04/06/2011 12:08, Amos Jeffries wrote:
On 04/06/11 09:16, MrNicholsB wrote:
Ok Ive had squid3 running rock solid for months, I recently migrated
from Ubuntu 9 to 10.04 and now Squid is clearly not caching, but traffic
IS
On 20/09/2011 08:22, Luis Daniel Lucio Quiroz wrote:
2011/9/19 rocky.lirocky...@italkbb.com.au:
Hi Luis Daniel,
Thank you very much for your reply . Which is the better one? Is it
difficult to use Squid3.2 with ICAP?
Well, you must know some programming and reverse-engineer some web apps.
I used
In the
http://wiki.squid-cache.org/SquidFaq/CompilingSquid#Debian.2C_Ubuntu
page there is nothing about the packages needed for the compilation.
I just provided this list, which works for Ubuntu 10.04+ and also
Debian 6:
sudo apt-get install build-essential libldap2-dev libpam0g-dev
I want to show a splash page every 30 minutes and I don't know a thing
about it.
I have a web server and Squid 2.7/3.0/3.1.15/3.2.0.12;
I just need to choose one.
What I want is that the clients will get a specific page every 20 minutes.
I hope you can help me sort it out in squid.conf.
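The usual recipe for a periodic splash page uses the bundled session helper; a hedged squid.conf sketch (the helper path and splash URL are assumptions, and the helper is named squid_session in 2.7 but ext_session_acl in newer 3.x):

```
# mark a client as "seen" for 30 minutes (1800 seconds)
external_acl_type session ttl=300 negative_ttl=0 %SRC /usr/lib/squid/squid_session -t 1800
acl existing_session external session
# a client without a session is denied once and redirected to the splash page
http_access deny !existing_session
deny_info http://webserver.example/splash.html existing_session
```

Because deny_info points at a full URL, the deny is delivered as a redirect rather than an error page.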
On 22/09/2011 03:17, rocky.li wrote:
Thanks.
I am not sure if the software uses the HTTP protocol. But there is a website
that has the
same video resource, and the end user can watch the video via IE or another
web browser after installing plug-ins. Does that mean it uses HTTP? Or can
we try to cache
On 22/09/2011 02:57, Amos Jeffries wrote:
On 22/09/11 11:26, Eliezer Croitoru wrote:
i want to put a splash page every 30 minutes and i dont know a thing
about it.
i have a webserver and squid 2.7\3.0\3.1.15\3.2.0.12
i just need to choose one.
and what i want is that the clients will get
my squid is configured with
./configure --prefix=/opt/squid32012 --includedir=/include
--mandir=/share/man --infodir=/share/info
--localstatedir=/opt/squid32012/var --disable-maintainer-mode
--disable-dependency-tracking --disable-silent-rules --enable-inline
--enable-async-io=8
On 24/09/2011 15:37, Eliezer Croitoru wrote:
my squid is configured with
./configure --prefix=/opt/squid32012 --includedir=/include
--mandir=/share/man --infodir=/share/info
--localstatedir=/opt/squid32012/var --disable-maintainer-mode
--disable-dependency-tracking --disable-silent-rules
On 24/09/2011 15:52, Eliezer Croitoru wrote:
On 24/09/2011 15:37, Eliezer Croitoru wrote:
my squid is configured with
./configure --prefix=/opt/squid32012 --includedir=/include
--mandir=/share/man --infodir=/share/info
--localstatedir=/opt/squid32012/var --disable-maintainer-mode
--disable
Thanks, I will check the deny approach; it seems much more efficient.
Eliezer
On 24/09/2011 17:41, Helmut Hullen wrote:
Hello, Eliezer,
you wrote on 24.09.11:
I have used the info at:
http://www.cyberciti.biz/tips/linux-unix-squid-proxy-server-authentication.html
and there it gives
FATAL:
I had another problem: using Squid 3.2.0.12, Squid won't show the auth
screen in the browser.
Thanks Eliezer
On 30/08/2011 15:19, Rafal Zawierta wrote:
Hello,
Is it possible to use dual authentication helpers in one Squid 3 instance?
In my example:
auth_param negotiate program
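Multiple auth_param schemes can coexist in one instance; a hedged sketch (the helper paths and file names are examples and vary by distro and Squid version):

```
auth_param negotiate program /usr/lib/squid3/negotiate_kerberos_auth
auth_param negotiate children 10
auth_param basic program /usr/lib/squid3/basic_ncsa_auth /etc/squid3/passwd
auth_param basic children 5
auth_param basic realm proxy
```

Squid offers the configured schemes to the client, which normally picks the strongest one it supports.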
And it seems to work fine.
The main things to note on Gentoo are that you need to create
the proxy user for the Squid kids and for swap-directory creation.
Also insert the cache_dir directive into the squid.conf file.
The Gentoo-based Squid seems to work much faster than on Ubuntu.
opts
On 22/10/2011 23:38, David Touzeau wrote:
Dear
I encounter this error :
2011/10/22 23:32:05 kid2| Target number of buckets: 7908
2011/10/22 23:32:05 kid2| Using 8192 Store buckets
2011/10/22 23:32:05 kid2| Max Mem size: 8192 KB [shared]
2011/10/22 23:32:05 kid2| Max Swap size: 2048000 KB
On 24/10/2011 22:04, Soporte Técnico wrote:
Has anyone been successful?
I had installed Squid 2.7, Python, the Apache web server, and Videocache
1.9.1 for caching YouTube (testing purposes; later, if successful, I'm going to
buy this URL rewriter).
I see nothing in cachevideo.log.
Has anyone
This site is slow as hell.
I know about a couple of sites that have JavaScript in them that causes
this kind of behavior.
It's not related to any DNS issue but to the structure of the site.
The site uses two domains with SSL.
Also, normal surfing to the site takes longer than any
1319762961.344 0 192.168.10.32 TCP_MEM_HIT/200 2518 GET
http://www.google.co.il/ - HIER_NONE/- application/xhtml+xml
[Connection: Keep-Alive\r\nHost: www.google.co.il\r\nAccept:
text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,image/png,*/*;q=0.5,
On 29/10/2011 12:51, Amos Jeffries wrote:
On 28/10/11 14:01, Eliezer Croitoru wrote:
1319762961.344 0 192.168.10.32 TCP_MEM_HIT/200 2518 GET
http://www.google.co.il/ - HIER_NONE/- application/xhtml+xml
[Connection: Keep-Alive\r\nHost: www.google.co.il\r\nAccept:
text/xml,application/xml
On 31/10/2011 13:02, Tymur Islam wrote:
Hi Everybody,
Is it possible to do Transparently Proxy of https (i.e. face book, gmail
etc) traffic?
It is possible to do an (almost) transparent proxy for HTTPS, but not
using Squid.
HTTPS is a secure protocol whose purpose is to prevent
I would like to know if someone has managed to use Ubuntu with TPROXY.
I want to write a detailed manual on how to build a Gentoo- and
Ubuntu-based TPROXY Squid server in bridge and router mode.
If anyone can help me and give me a working guide, I will be more than happy.
For now I managed to
On 04/11/2011 16:55, Rick Chisholm wrote:
clear your cache, then you get the pause again. The light versions of OWA
Did you try the 3.2 branch?
Say, 3.2.0.8?
Eliezer
On 04/11/2011 17:41, Rick Chisholm wrote:
3.2 does not appear to be available in the BSD ports tree at this time.
Compile it yourself; it's pretty simple.
Eliezer
On Fri, November 4, 2011 11:39 am, Eliezer Croitoru wrote:
On 04/11/2011 16:55, Rick Chisholm wrote:
clear your cache
On 17/11/2011 16:11, Nataniel Klug wrote:
Hello all,
I am facing a very difficult problem in my network. I am
using a layout like this:
(internet) == router == squid == [clients]
I am running CentOS v5.1 with Squid-2.6.STABLE22 and TPROXY
Why don't you use interception/transparent mode instead of TPROXY?
For your setup it seems like the perfect idea.
I'm using a range setup like this:
-A PREROUTING -p tcp -m tcp -m iprange ! -d 192.168.0.0/16 -i eth1
--dport 80 -j REDIRECT --to-ports 3128 --src-range
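For comparison, a complete minimal interception rule (the interface and internal range are example values; the REDIRECT target is only valid in the nat table):

```
iptables -t nat -A PREROUTING -i eth1 -p tcp ! -d 192.168.0.0/16 --dport 80 -j REDIRECT --to-ports 3128
```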
On 19/12/2011 19:12, Terry Dobbs wrote:
It's an old issue; from Squid 3.1 to 3.2 there is nothing yet, as far as I
know, that solves this issue.
Regards
Eliezer
Hi All.
I just installed Squid 3 after running Squid 2.5 for a number of years. I
find that after reloading Squid 3 and trying to access the
I updated to Squid 3.2.0.14 and then
I'm getting a request-too-large page while trying to use Facebook.
So what I did, after searching for anything like that in Squid's past, was
to add
reply_header_max_size 30 KB
to the config, and it works, but Squid is moving so slowly that I switched
back to
On 22/12/2011 03:25, Amos Jeffries wrote:
On 22/12/2011 1:13 p.m., Eliezer Croitoru wrote:
updated to squid 3.2.0.14 and then
i'm getting a page of request-too-large while trying to use facebook.
so what i did after searching for anything like that in the past of
squid to add
On 23/12/2011 05:33, Chia Wei LEE wrote:
Hi
Thanks for the advice.
This is because we are selling static IPs to the users; a user should use
their own public IP to access the internet instead of using the proxy
server's IP.
I really recommend using Ubuntu as the OS for that, because I have had too
many good
The administrator is webmaster.
Generated Fri, 23 Dec 2011 13:46:28 GMT by c (squid/3.2.0.14)
I hope to make it work somehow.
Thanks
Eliezer
On 22/12/2011 2:35 p.m., Eliezer Croitoru wrote:
On 22/12/2011 03:25, Amos Jeffries wrote:
On 22/12/2011 1:13 p.m., Eliezer Croitoru wrote:
updated to squid
On 23/12/2011 16:28, Amos Jeffries wrote:
On 24/12/2011 2:47 a.m., Eliezer Croitoru wrote:
What I'm getting is:
ERROR
The requested URL could not be retrieved
An Invalid Request error was encountered while trying to process the
request:
Er, these are: a <b>Invalid Request</b> error was encountered
On 25/12/2011 12:20, S.R. wrote:
I am trying to set up squid in proxy-only (no caching) mode, with
direct access except for specified domains. For specified domains I
want squid to forward the request to another proxy server. This second
part is not working! Here are the relevant config lines:
On 28/12/2011 17:58, Helmut Hullen wrote:
Works on squid 3.2.0.8
Eliezer
Hello, Mario,
you wrote on 28.12.11:
I am running Squid 3.1.0.14, and when I try to access
www.allplanlernen.de I get a 502 error.
Same here (squid 3.2.0.14):
502 Bad Gateway
nginx/0.7.67
It
, Eliezer Croitoru wrote:
updated to squid 3.2.0.14 and then
i'm getting a page of request-too-large while trying to use facebook.
so what i did after searching for anything like that in the past of
squid to add
reply_header_max_size 30 KB
to the config and it works but squid is moving so slow so
As far as I have seen this almost exact error, it means you don't have all
the build dependencies for the compilation to complete.
What Linux version are you using?
Eliezer
On 31/12/2011 13:02, someone wrote:
Well, I copied the configuration from my then-current Squid,
3.1.6, which doesn't
I have a couple of things:
I have gone a long way testing Squid for a couple of days on various
versions of Linux distros such as CentOS 5.7/6.0/6.2, Fedora 15/16, Ubuntu
10.04.3/11.10, and Gentoo (on the latest Portage), using TPROXY and forward
proxy (all i686 except Ubuntu, which is x64).
I couldn't find
On 04/01/2012 11:15, Amos Jeffries wrote:
On 4/01/2012 5:32 p.m., Eliezer Croitoru wrote:
i have couple of things things:
i have made a long way of testing squid for a couple of days on
various versions of linux distors such as centos 5.7\6.0\6.2 fedora
15\16 ubuntu 10.04.3\11.10 gentoo
I made a Squid url_rewriter for caching purposes, and it works on Ubuntu
and on Fedora 16 (i686).
It also works on Fedora 15 with the 3.2.0.12 RPM from the Fedora 16 repo.
The problem is that when the rewriter replies with the address to
Squid, the session that Squid creates is: from the
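For illustration, the core of such a rewriter can be sketched in a few lines of shell. Everything here is an assumption rather than the code from the original post: the host names, the mirror path, and the old echo-the-URL reply format of the 2.x/3.1 helper protocol.

```shell
# hypothetical rewriter logic; a real helper runs this loop over the
# lines Squid writes to its stdin (URL plus request details per line)
rewrite_url() {
  while read -r url rest; do
    case "$url" in
      # send one example host to a local mirror, pass everything else through
      http://example.com/*) echo "http://127.0.0.1/cache/${url#http://example.com/}" ;;
      *) echo "$url" ;;
    esac
  done
}
printf 'http://example.com/video.flv 192.168.0.5/- - GET\n' | rewrite_url
```

Newer Squid (3.4+) helpers answer `OK rewrite-url=...` instead of a bare URL.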
On 06/01/2012 01:48, berry guru wrote:
I'm running Squid 2.7 (stable) on Ubuntu 11.10. I'm having some
trouble with internal DNS. For some reason I get the following error:
ERROR
The requested URL could not be retrieved.
Unable to determine IP address from host name "server name goes here"
The
On 06/01/2012 02:51, Eliezer Croitoru wrote:
On 06/01/2012 01:48, berry guru wrote:
I'm running Squid 2.7(stable) on Ubuntu 11.10. I'm having some
trouble with internal DNS. For some reason I get the following error:
ERROR
The requested URL could not be retrieved.
Unable to determine IP
:36.791| fwdConnectStart: got TCP FD 13
So the main problem is that the request that comes from Squid is not
using the right address in TPROXY mode.
Thanks
Eliezer
On 05/01/2012 17:20, Eliezer Croitoru wrote:
i made a squid url_rewriter for cache purposes and it works on ubunut
and on fedora 16
is not
using the right address in tproxy mode.
Thanks
Eliezer
On 05/01/2012 17:20, Eliezer Croitoru wrote:
i made a squid url_rewriter for cache purposes and it works on ubunut
and on fedora 16(i686).
also it works on fedora 15 with the 3.2.0.12 rpm from fedora 16 repo.
the problem
On 12/01/2012 19:58, Gerson Barreiros wrote:
I have an unique server doing this job. My scenario is most the same
as mentioned above.
I just want to know if I can make this server a virtual machine that
will share hard disk / memory / CPU with other VMs.
A web proxy on a VM is not the
On 23/01/2012 21:56, Henrik Nordström wrote:
On Wed, 2012-01-04 at 12:48 +0200, Eliezer Croitoru wrote:
The funny thing is that Fedora 16, with kernel 3.1.6 and Squid 3.2.0.13
from the repo, just works fine.
And have nothing special for making Squid run at all.. except not
mucking around
On 11/02/2012 19:03, João Paulo Ferreira wrote:
Is there any way to know what parameters were used by the YUM installation?
2012/2/11 Andrew Beverleya...@andybev.com:
On Sat, 2012-02-11 at 11:36 -0200, João Paulo Ferreira wrote:
Does anyone know how I can recompile my Squid that was installed
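As a side note, an installed Squid binary reports the configure options it was built with (standard behaviour of the binary):

```
squid -v
```

The output includes the full `./configure` command line of the installed build.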
On 14/02/2012 20:32, Wladner Klimach wrote:
Squid users,
when I try to get the URL www.tcu.gov.br it takes too long, when it
loads at all. Look at my setup configs in squid.conf:
It's not your Squid configuration.
I am using Squid and it does the same; but also using just wget gave me
the same
Hey there, Eli (I think I know you).
Any SSL interception will make the connection slower, but it can be tricky.
Gmail is one big example of a site that has problems while working on
plain HTTP; on HTTPS it will work better and will also solve many problems,
because most ISPs won't do SSL interception.
On 20/02/2012 17:59, Fried Wil wrote:
The simple way is to use a redirection page on the ows web server.
Change the index.html page at the root (/).
Some sources for that:
http://www.web-source.net/html_redirect.htm
http://www.quackit.com/html/html_redirect.cfm
Also, no problem on Squid 3.2.0.8.
Eliezer
On 23/02/2012 15:54, karj wrote:
Hi all,
I have a problem with the first page of a site that's behind Squid.
The page of the site www.tovima.gr seems to load forever (using Chrome and
Firefox).
When I take Squid out of the equation (direct link to the
On 27/02/2012 14:49, E.S. Rosenberg wrote:
Hi all,
I would like to create a small external ACL program for use in our
organization, and I was wondering: are there any code examples of
existing external ACLs in the public domain?
I have tried searching a bit, but other than the spec I haven't really
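As a starting point, the helper protocol itself is tiny: read one line of tokens (whatever your external_acl_type format sends), answer OK or ERR. A hedged shell sketch, with the matched address purely an example:

```shell
# helper logic as a function; a real helper script would run this loop
# directly, reading the %SRC (or other format tokens) Squid sends per line
check_src() {
  while read -r src; do
    if [ "$src" = "192.168.0.10" ]; then
      echo "OK"    # allow: the acl matches for this request
    else
      echo "ERR"   # deny: the acl does not match
    fi
  done
}
printf '192.168.0.10\n10.0.0.1\n' | check_src
```

Squid 3.x expects the same one-line replies, optionally with key=value pairs appended after OK/ERR.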
On 27/02/2012 19:35, E.S. Rosenberg wrote:
2012/2/27 Eliezer Croitoruelie...@ec.hadorhabaac.com:
On 27/02/2012 14:49, E.S. Rosenberg wrote:
Hi all,
I would like to create a small external acl program for use in our
organization and I was wondering are there any code examples of
existing
It's a Linux module, and you should first check whether it exists or is
loaded.
Use:
lsmod | grep -i tproxy
to see if it's loaded.
To check whether the kernel has the module built, you should run:
modprobe -l | egrep -i 'tproxy|socket'
(note the quotes, or the shell will treat | as a pipe).
You should have two modules for TPROXY and also some iptables socket modules.
If
You need to add a first rule such as:
ip6tables -t mangle -A PREROUTING -p tcp -d (IP of the machine) --dport
80 -j ACCEPT
= here all the other iptables rules =
Regards
Eliezer
On 05/03/2012 20:09, Vignesh Ramamurthy wrote:
Hello,
We are using Squid to transparently proxy the traffic to
. It will be
of great help.
Thanks and regards,
Vijay
--
Eliezer Croitoru
https://www1.ngtech.co.il
IT consulting for Nonprofit organizations
elilezerat ngtech.co.il
all
http_port 3128
visible_hostname loclahost
debug_options ALL,1 33,2 28,9
tcp_outgoing_address 122.166.1.184
Can somebody help me with the configuration for my servers? It would be
of great help.
Thanks and regards,
Vijay
Regards
Vijay
know what to look for.
I am getting, every couple of minutes, something like this:
2012/03/18 15:29:59 kid2| Failed to select source for
'http://www.crunchyroll.com/favicon.ico'
What is the meaning of this statement? (Not with this particular address.)
Thanks,
Eliezer
1 - 100 of 2273 matches