i would like to know if someone managed to use ubuntu and TPROXY?
i want to write a detailed manual on how to build a gentoo and ubuntu
based TPROXY squid server in bridge and router mode.
if anyone can help me and give me a working guide i will be more than happy.
for now i managed to buil
On 04/11/2011 16:55, Rick Chisholm wrote:
clear your cache, then you get the pause again. The light versions of OWA
have you tried the 3.2 branch?
say, 3.2.0.8?
Eliezer
On 04/11/2011 17:41, Rick Chisholm wrote:
3.2 does not appear to be available in the BSD ports tree at this time.
compile it yourself.. it's pretty simple..
Eliezer
On Fri, November 4, 2011 11:39 am, Eliezer Croitoru wrote:
On 04/11/2011 16:55, Rick Chisholm wrote:
clear your cache, then
On 17/11/2011 16:11, Nataniel Klug wrote:
Hello all,
I am facing a very difficult problem in my network. I am
using a layout like this:
(internet) == == == [clients]
I am running CentOS v5.1 with Squid-2.6 STABLE22 and Tproxy
(cttproxy-2.6.1
why don't you use the interception\transparent mode instead of TPROXY?
for your setup it seems just the perfect fit.
i'm using a range setup like this:
-A PREROUTING -p tcp -m tcp -m iprange ! -d 192.168.0.0/16 -i eth1 \
  --dport 80 --src-range 192.168.0.0-192.168.0.190 -j REDIRECT --to-ports 3128
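The same rule may be easier to read as a directly invokable command; this is a sketch assuming it lives in the nat table (the snippet above is in iptables-save format, which omits the table name):

```shell
# intercept port 80 from part of the LAN range and redirect it to squid,
# skipping traffic that is already destined for the local network
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 \
  -m iprange --src-range 192.168.0.0-192.168.0.190 \
  ! -d 192.168.0.0/16 \
  -j REDIRECT --to-ports 3128
```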
On 19/12/2011 19:12, Terry Dobbs wrote:
it's an old issue; from squid 3.1 to 3.2 there is nothing yet, as far as i
know, that solves this issue.
Regards
Eliezer
Hi All.
I just installed squid3 after running squid2.5 for a number of years. I
find after reloading squid3 and trying to access the in
updated to squid 3.2.0.14 and then
i'm getting a page of "request-too-large" while trying to use facebook.
so what i did, after searching for anything like that in squid's past,
was to add
reply_header_max_size 30 KB
to the config, and it works, but squid is moving so slowly that i switched
back to 3
On 22/12/2011 03:25, Amos Jeffries wrote:
On 22/12/2011 1:13 p.m., Eliezer Croitoru wrote:
updated to squid 3.2.0.14 and then
i'm getting a page of "request-too-large" while trying to use facebook.
so what i did after searching for anything like that in the past
On 23/12/2011 05:33, Chia Wei LEE wrote:
Hi
Thanks for the advice.
This is because we are selling static IPs to the users; a user should use
their own public IP to access the internet instead of using the proxy server
IP.
i really recommend using UBUNTU as the os for that cause i have too much
good e
is being asked from an HTTP/1.0 software.
Your cache administrator is webmaster.
Generated Fri, 23 Dec 2011 13:46:28 GMT by c (squid/3.2.0.14)
hope to make it work somehow.
Thanks
Eliezer
On 22/12/2011 2:35 p.m., Eliezer Croitoru wrote:
On 22/12/2011 03:25, Amos Jeffries wrote:
On 22/12/2011 1:1
On 23/12/2011 16:28, Amos Jeffries wrote:
On 24/12/2011 2:47 a.m., Eliezer Croitoru wrote:
what i'm getting is:
ERROR
The requested URL could not be retrieved
Invalid Request error was encountered while trying to process the
request:
Er, these are: "Invalid Request error was encoun
On 25/12/2011 12:20, S.R. wrote:
I am trying to set up squid in proxy-only (no caching) mode, with
direct access except for specified domains. For specified domains I
want squid to forward the request to another proxy server. This second
part is not working! Here are the relevant config lines:
On 28/12/2011 17:58, Helmut Hullen wrote:
Works on squid 3.2.0.8
Eliezer
Hello, Mario,
you wrote on 28.12.11:
i am running Squid 3.1.0.14 and when i try to access
www.allplanlernen.de i get a 502 error.
Same here (squid 3.2.0.14):
502 Bad Gateway
nginx/0.7.67
It works
1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
#end of squid.conf
i compiled on the same machine 3.1.18 and it works fine.
i will try later to compile 3.2.0.13 and the daily updated 3.2.0.14
with what to start?
E
as far as i have seen with an almost identical error, it means you don't
have all the build dependencies needed to complete the compilation.
what linux version are you using?
Eliezer
On 31/12/2011 13:02, someone wrote:
Well I copied the configuration from my then-current squid,
3.1.6, which doesn't i
i have a couple of things:
i have made a long round of testing squid over a couple of days on various
versions of linux distros such as centos 5.7\6.0\6.2, fedora 15\16, ubuntu
10.04.3\11.10, gentoo (on the latest portage), using tproxy and forward
proxy. (all i686 but ubuntu x64)
i couldn't find any
On 04/01/2012 11:15, Amos Jeffries wrote:
On 4/01/2012 5:32 p.m., Eliezer Croitoru wrote:
i have a couple of things:
i have made a long round of testing squid over a couple of days on
various versions of linux distros such as centos 5.7\6.0\6.2, fedora
15\16, ubuntu 10.04.3\11.10, gentoo (on the
i made a squid url_rewriter for cache purposes and it works on ubuntu
and on fedora 16 (i686).
also it works on fedora 15 with the 3.2.0.12 rpm from the fedora 16 repo.
the problem is that when the url_rewriter replies with the address to
squid, the session that squid creates is: from the clie
On 06/01/2012 01:48, berry guru wrote:
I'm running Squid 2.7(stable) on Ubuntu 11.10. I'm having some
trouble with internal DNS. For some reason I get the following error:
ERROR
The requested URL could not be retrieved.
Unable to determine IP address from host name "server name goes here"
The D
On 06/01/2012 02:51, Eliezer Croitoru wrote:
On 06/01/2012 01:48, berry guru wrote:
I'm running Squid 2.7(stable) on Ubuntu 11.10. I'm having some
trouble with internal DNS. For some reason I get the following error:
ERROR
The requested URL could not be retrieved.
Unable to de
04:28:36.791| fwdConnectStart: got TCP FD 13
so the main problem is that the request that comes from squid is not
using the right address in tproxy mode.
Thanks
Eliezer
On 05/01/2012 17:20, Eliezer Croitoru wrote:
i made a squid url_rewriter for cache purposes and it works on ubuntu
and on fedo
On 12/01/2012 19:58, Gerson Barreiros wrote:
I have a unique server doing this job. My scenario is much the same
as mentioned above.
I just want to know if i can make this server a Virtual Machine that
will share hard disk / memory / cpu with other VMs.
web proxy on a vm is not the bes
On 23/01/2012 21:56, Henrik Nordström wrote:
ons 2012-01-04 klockan 12:48 +0200 skrev Eliezer Croitoru:
the funny thing is that fedora 16 with kernel 3.1.6 and squid 3.2.0.13
from the repo just works fine.
And have nothing special for making Squid run at all.. except not
mucking around with
On 11/02/2012 19:03, João Paulo Ferreira wrote:
Is there any way to know what parameters were used by the YUM installation?
2012/2/11 Andrew Beverley:
On Sat, 2012-02-11 at 11:36 -0200, João Paulo Ferreira wrote:
Does anyone know how I can recompile my squid that was installed using
the tool y
On 14/02/2012 20:32, Wladner Klimach wrote:
Squid users,
when I try to get this url www.tcu.gov.br it takes too long, when it
loads at all. Look at my setup configs in squid.conf:
it's not your squid configuration.
i am using squid and it's doing the same, but even plain wget gave me
the same r
hey there Eli (i think i know you)
any ssl interception will make the connection slower, but it can be tricky.
gmail is one big example of a site that has problems while working on
plain http; on https it will work better and will also solve many problems,
because most ISPs won't do ssl interception.
On 20/02/2012 17:59, Fried Wil wrote:
the simple way is to use a redirection page on the ows web server.
change the index.html page at the "/".
some sources for that:
http://www.web-source.net/html_redirect.htm
http://www.quackit.com/html/html_redirect.cfm
http://billstclair.com/html-redirect2.ht
also no problem on squid 3.2.0.8
Eliezer
On 23/02/2012 15:54, karj wrote:
Hi all,
I have a problem with the first page of a site that's behind squid.
The page of the site www.tovima.gr seems to load forever (using chrome and
firefox).
When I bring squid out of the equation (direct link to the
On 27/02/2012 14:49, E.S. Rosenberg wrote:
Hi all,
I would like to create a small external acl program for use in our
organization and I was wondering are there any code examples of
existing external acls in the public domain?
I have tried searching a bit but other than the spec I haven't really
On 27/02/2012 19:35, E.S. Rosenberg wrote:
2012/2/27 Eliezer Croitoru:
On 27/02/2012 14:49, E.S. Rosenberg wrote:
Hi all,
I would like to create a small external acl program for use in our
organization and I was wondering are there any code examples of
existing external acls in the public
it's a linux kernel module and you should first check if it exists or is
loaded.
use:
lsmod | grep -i tproxy
to see if it's loaded.
to check if the kernel has the module built, you should run:
modprobe -l | egrep -i "tproxy|socket"
you should have 2 modules for tproxy and also some iptables socket modules.
if
you need to add as the first rule something such as:
ip6tables -t mangle -A PREROUTING -p tcp -d (IP of the machine) --dport
80 -j ACCEPT
= here all the other iptables rules =
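For context, that ACCEPT rule normally sits in front of the usual TPROXY divert rules. A hedged sketch of a typical mangle-table setup, following the common TPROXY recipe (the example proxy address 10.0.0.1, the mark value and squid's tproxy port 3129 are assumptions):

```shell
# exempt traffic addressed to the proxy itself (the rule from above)
iptables -t mangle -A PREROUTING -p tcp -d 10.0.0.1 --dport 80 -j ACCEPT
# mark packets that belong to locally intercepted sockets
iptables -t mangle -N DIVERT
iptables -t mangle -A DIVERT -j MARK --set-mark 1
iptables -t mangle -A DIVERT -j ACCEPT
iptables -t mangle -A PREROUTING -p tcp -m socket -j DIVERT
# divert the rest of port 80 to squid's tproxy port
iptables -t mangle -A PREROUTING -p tcp --dport 80 -j TPROXY \
  --tproxy-mark 0x1/0x1 --on-port 3129
# route marked packets to the local machine
ip rule add fwmark 1 lookup 100
ip route add local 0.0.0.0/0 dev lo table 100
```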
Regards
Eliezer
On 05/03/2012 20:09, Vignesh Ramamurthy wrote:
Hello,
We are using squid to transparently proxy the traffic to
be
of great help.
Thanks & Regards
Vijay
--
Eliezer Croitoru
https://www1.ngtech.co.il
IT consulting for Nonprofit organizations
eliezer ngtech.co.il
On 20/03/2012 00:36, Vijay S wrote:
Sorry i cannot share the url and hence im replacing the feed as
http://feeds.example.com/newsfeeds.xml
On Tue, Mar 20, 2012 at 1:37 AM, Eliezer Croitoru wrote:
On 19/03/2012 18:58, Vijay S wrote:
Hi
I have a my server box hosting apache and squid on
$xml = curl_exec($s);
$xml = trim($xml);
curl_close($s);
On Tue, Mar 20, 2012 at 5:00 AM, Eliezer Croitoru wrote:
On 20/03/2012 00:36, Vijay S wrote:
Sorry i cannot share the url and hence im replacing the feed as
http://feeds.example.com/newsfeeds.xml
On Tue, Mar 20, 2012 at 1:37 AM, Elie
don't
know what to look for.
i am getting, every couple of minutes, something like this:
2012/03/18 15:29:59 kid2| Failed to select source for
'http://www.crunchyroll.com/favicon.ico'
what is the meaning of this statement? (not with this particular address).
Thanks,
Eliezer
On 20/03/2012 04:50, Amos Jeffries wrote:
On 20.03.2012 15:34, Eliezer Croitoru wrote:
i have a small Gentoo X86_64 with kernel 3.2.1 with squid 3.2.0.16
that is crashing after a while when using workers.
i'm trying to debug it but have no clue on what debug flags to use in
order to get
nfoPage: no match
2012/03/20 10:14:23.892| FilledChecklist.cc(168) ~ACLFilledChecklist:
ACLFilledChecklist destroyed 0x19f0128
2012/03/20 10:14:23.892| ACLChecklist::~ACLChecklist: destroyed 0x19f0128
2012/03/20 10:14:23.893| FilledChecklist.cc(168) ~ACLFilledChecklist:
ACLFilledChecklist destroyed 0x19f0128
2012/03
squid irc channel or via email.
Regards,
Eliezer
Thanks,
Shan
been disabled anywhere?
maybe you are using masquerading on the pfsense server?
Regards,
Eliezer
Any ideas?
Cheers!
'!' ?
The final diagnosis of this problem is that the traffic was not even
entering Squid. No amount of Squid config will cause it to respond to
packets which don't even arrive.
Amos
e.
This is true, but the way I saw it was: "If the URL does not exist, it
can't be a duplicate"; I don't think that's wrong!
* ensure you have manager requests from localhost not going through the ACL
test.
I was getting this wrong: localhost was going through the ACL, but
I just
or squid index.
p.s.
forgot to say that you must never get to a point where your proxy's
ram is at its limit, or else swapping will happen and will slow the
server down.
Regards,
Eliezer
Best regards,
- Christian Loth
have the same uri and host except the low-level
domain.
so just replace the low-level domain with something else.
means a "cdn"-like thing, and very simple to cache using nginx and squid.
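The low-level-domain replacement described above can be sketched as a one-line rewrite; the hostnames and the fixed label "cdn" here are made-up examples, not the real CDN names:

```shell
# replace the leftmost (low-level) domain label with a fixed one, so all
# mirror hostnames collapse into a single cache key
rewrite_host() {
  echo "$1" | sed -E 's#^(http://)[^./]+#\1cdn#'
}
rewrite_host "http://v123.cache.example.com/id_abc"
# → http://cdn.cache.example.com/id_abc
```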
Thanks,
Eliezer
nt=0)
receive a HTTP request (count=1)
- test ACL (count=1 -> OK)
receive b HTTP request (count=2)
- test ACL (count=2 -> ERR)
- reject b (count=1)
done a (count=0)
With your explanation and code from Eliezer Croitoru I made this:
#!/bin/bash
while read line; do
res
nothing to do with where squid caches the
object or where it fetches from. Only how long cacheable things get stored.
Amos
Yes, just run it on the shell as you would any other script, and input the
expected values (as specified in squid.conf) followed by a carriage return. The
script should return OK or ERR as appropriate.
Andy
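A minimal runnable sketch of such a helper, with the kind of manual test described above; the input format (client address followed by a user name) and the allowed range are assumptions for illustration only:

```shell
#!/bin/sh
# answer OK/ERR per request line, as an external_acl_type helper would
check() {
  case "$1" in
    192.168.*) echo "OK"  ;;   # allow the assumed internal range
    *)         echo "ERR" ;;   # deny everything else
  esac
}
# manual test: feed two sample request lines on stdin, one per request
printf '192.168.0.5 alice\n10.0.0.1 bob\n' | while read src login; do
  check "$src"
done
# prints: OK then ERR
```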
's under one isa server then all of them share the same external IP.
corporate proxy
never_direct deny all
Regards,
Eliezer
o we have different filter groups for different users..
Hope it helps.
On 05/04/2012 10:25, Colin Coe wrote:
On Wed, Apr 4, 2012 at 7:40 PM, Amos Jeffries wrote:
On 4/04/2012 6:01 p.m., Eliezer Croitoru wrote:
On 04/04/2012 08:12, Colin Coe wrote:
Hi all
I'm trying to get our squid proxy server to allow clients to do
outbound FTP. The problem is tha
On 05/04/2012 12:14, Colin Coe wrote:
Oops, and send to list.
On Thu, Apr 5, 2012 at 4:26 PM, Eliezer Croitoru wrote:
On 05/04/2012 10:25, Colin Coe wrote:
On Wed, Apr 4, 2012 at 7:40 PM, Amos Jeffries
wrote:
On 4/04/2012 6:01 p.m., Eliezer Croitoru wrote:
On 04/04/2012 08:12, Colin
'm here.
Regards,
Eliezer
On 05/04/2012 16:21, Colin Coe wrote:
On Thu, Apr 5, 2012 at 8:32 PM, Eliezer Croitoru wrote:
On 05/04/2012 14:51, Colin Coe wrote:
OK, I did
export ftp_proxy=http://benpxy1p:3128
wget ftp://ftp2.bom.gov.au/anon/gen/fwo
--2012-04-05 19:43:38-- ftp://ftp2.bom.gov.au/anon/gen/fwo
Resolving
tp);
+      return;
+    }
+  }
+  /* bug fix end here */
  stale = refreshCheckHTTPStale(e, r);
  debug(33, 2) ("clientCacheHit: refreshCheckHTTPStale returned %d\n", stale);
  if (stale == 0) {
on what version of squid are you trying to do it?
it works poorly and only on squid 2.X
you can try this:
http://code.google.com
:%s\n",http->uri);
+      http->log_type = LOG_TCP_MISS;
+      clientProcessMiss(http);
+      return;
+    }
+  }
+  /* bug fix end here */
  stale = refreshCheckHTTPStale(e, r);
  debug(33, 2) ("clientCacheHit: refreshCheckHTTPStale returned %d\n", stale);
  if (stale == 0) {
On 14/04/2012 08:34, Colin Coe wrote:
On Thu, Apr 5, 2012 at 10:07 PM, Eliezer Croitoru wrote:
On 05/04/2012 16:21, Colin Coe wrote:
On Thu, Apr 5, 2012 at 8:32 PM, Eliezer Croitoru
wrote:
On 05/04/2012 14:51, Colin Coe wrote:
OK, I did
export ftp_proxy=http://benpxy1p:3128
wget ftp
tup?
have you tried to use "ssl-bump"? because it's a requirement for this to
work.
what did you try to do on squid until now?
post the squid.conf
Regards,
Eliezer
it worked for me for a long time.
sometimes i found that some file download got corrupted, so it's not
100% in any case.
Regards,
Eliezer
It was working last month so i'm pretty sure they didn't change a thing.
i will try it in the next day or so just to make sure it works again.
Regards,
Eliezer
download more
data of the video file to preserve bandwidth.
this is causing the problem and i will try to analyze it a bit.
if someone has done any research about it i will be glad to hear about it.
Regards,
Eliezer
On 22/04/2012 15:25, Eliezer Croitoru wrote:
On 22/04/2012 11:48, x-man wrote:
I think youtube changed something in the player behavior recently,
even now
for me http://code.google.com/p/youtube-cache/ is not working with squid
2.7.
Is it working for you now?
look like this:
http://c.wrzuta.pl/wv15404/e14f5b450003f3b24f9333d1/4479241
Regards,
Eliezer
transparent? regular forward proxy?
what browser are you using?
do you have some squid logs? or squid.conf?
what dns server are you using?
Regards,
Eliezer
pool/squid
##ykhan squid redirection to squidguard
#redirect_program /usr/bin/squidGuard
#url_rewrite_program /usr/bin/squidGuard
#url_rewrite_children 5
On Mon, Apr 23, 2012 at 8:42 PM, Eliezer Croitoru wrote:
On 23/04/2012 18:38, Muhammad Yousuf Khan wrote:
well i have been experiencin
acl aci_working_hours time MTWH 14:04-18:04
#--General Timing-Friday
acl aci_working_hours time F 10:04-13:04
acl aci_working_hours time F 15:04-18:04
http_access deny aci_dest aci_working_hours aci_general
On Tue, Apr 24, 2012 at 1:11 PM, Eliezer Croitoru wrote:
are you ta
will be implemented i will be happy to get some
help with it.
Thanks,
Eliezer
On 25/04/2012 06:02, Amos Jeffries wrote:
On 25/04/2012 6:02 a.m., Eliezer Croitoru wrote:
as for some people asking me recently about youtube cache i have
checked again and found that youtube changed their video uris and
added an argument called "range" that is managed by the yout
this problem at all.
nginx uses the info from the url args simply and smoothly.
generally the store_url_rewrite has much more potential to be cache
effective than nginx proxy_store, as the nginx proxy_store is a permanent
store mechanism without any time-limit calculation.
as for now nginx has the
ache. If you check
and analyze it more, you will notice the same ID or the same video while
watching; the link changes, for example:
It starts [ITAG/ID/RANGE], then changes to [ID/ITAG/RANGE] and finally
to [RANGE/ITAG/ID], so with my script you can capture the whole
places!.
Ghassan
On 4/25/12, Eliezer Croitoru wrote:
p).
any suggestions?
On Thu, Apr 26, 2012 at 2:41 PM, Christian Loth
wrote:
static content and whatever it is good for.
I'm also expecting someone from the squid team to suggest some way of
doing it properly
read_requests do |r|
  idrx = /.*(id\=)([A-Za-z0-9]*).*/
  itagrx = /.*(itag\=)([0-9]*).*/
  rangerx = /.*(range\=)([0-9\-]*).*/
  newurl = "http://video-srv.youtube.com.SQUIDINTERNAL/id_" +
    r.url.match(idrx)[2] + "_itag_" + r.url.match(itagrx)[2] + "_range_" +
    r
On 27/04/2012 09:52, Hasanen AL-Bana wrote:
On Fri, Apr 27, 2012 at 7:43 AM, Eliezer Croitoru wrote:
On 25/04/2012 20:48, Hasanen AL-Bana wrote:
wouldn't it be better if we saved the video chunks? youtube is streaming
files in 1.7MB flv chunks, and the youtube flash player knows how to merge
the
h better with
performance and memory footprint.
JAVA is one step above the interpreted scripts\programs.
my opinion is that in your case you should use something other than perl
as a url_rewriter\store_url_rewrite if the system has some kind of static
options.
Regards,
Eliezer
On Fri, Apr 27, 2012 at 10:
perfectly.
have a look at:
http://www.visolve.com/squid/whitepapers/redirector.php#Configuring_Squid_for_squidGuard
On 24/04/2012 21:02, Eliezer Croitoru wrote:
as for some people asking me recently about youtube cache i have checked
again and found that youtube changed their video uris and added an
argument called "range" that is managed by the youtube player.
the original url\uri doesn't include ran
d be added.
by the way,
What video\mp3 sites are you caching using your scripts?
Eliezer
On Mon, Apr 30, 2012 at 1:29 AM, Eliezer Croitoru wrote:
On 24/04/2012 21:02, Eliezer Croitoru wrote:
as for some people asking me recently about youtube cache i have checked
again and found that yo
done using iptables also,
but i don't remember how it should be done.
what did you try to do with iptables?
i also found this nice iptables method sample:
http://www.pmoghadam.com/homepage/HTML/Round-robin-load-balancing-NAT.html
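The round-robin idea from that sample can be sketched with iptables' statistic match; the proxy addresses and port here are assumptions:

```shell
# send every other new port-80 connection to the first proxy...
iptables -t nat -A PREROUTING -p tcp --dport 80 -m state --state NEW \
  -m statistic --mode nth --every 2 --packet 0 \
  -j DNAT --to-destination 192.168.1.10:3128
# ...and the remainder to the second proxy
iptables -t nat -A PREROUTING -p tcp --dport 80 -m state --state NEW \
  -j DNAT --to-destination 192.168.1.11:3128
```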
Regards,
Eliezer
it shows different data, it can be because the
site has some browser-based templates.
Regards,
Eliezer
ks fine with drupal so i suppose you can try to compile
a more advanced and supported version of squid as a starter.
Eliezer
login page get refreshed.
Any ideas?
on 3119 it seems to work fine.
and what about some logs?
or squid.conf?
Eliezer
and I added OVI STORE.
Do you have any website that is not cacheable or that uses a CDN or
something? because I'm really interested to look at it :)
Ghassan
On Mon, Apr 30, 2012 at 2:53 AM, Eliezer Croitoru wrote:
On 30/04/2012 02:18, Ghassan Gharabli wrote:
Hello Eliezer,
Are you trying to save
had the time to analyze the reason yet.
by the way, you can use a squid2.7 instance as a cache_peer instead of nginx.
did you try my code (ruby)?
i will need to make some changes to make sure it will fit more videos
that don't use the range parameter (there are a couple).
Eliezer
Squid versions:
request_header_access Range deny youtube
in any case the range is not used via plain "headers" but via a url
argument on youtube.
so it won't help anyway.
Eliezer
Amos
ess deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow westhants-network
http_access deny all
Thanks!
--
Jeff MacDonald
j...@terida.com
902 880 7375
> , but
final ack from client doesn't..
i upgraded the kernel to 3.3.4 and iptables to 1.4.13 .. all works fine
except the problem with the tplink wireless router..
ct --redirect-target DROP
cd /proc/sys/net/bridge/
for i in *
do
echo 0 > $i
done
unset i
echo 0 > /proc/sys/net/ipv4/conf/lo/rp_filter
echo 0 > /proc/sys/net/ipv4/conf/all/rp_filter
echo 1 > /proc/sys/net/ipv4/ip_forward
hope someone can help.. i don't know how to track where the syn/ack
closely most of the page content is being cached by the
browser so you will see mostly misses on the log after first loading the
page.
Regards,
Eliezer
lable.
Regards,
error message on firefox :
The requested URL could not be retrieved
What am I missing here ?
Regards,
Hugo
is there any effect on "in cache" (already cached) objects from new refresh
patterns, or
do the refresh patterns only affect an object while it is being cached?
and is there a way to change a cached object's min\max times, or to extend
the expiration time manually?
Thanks,
Eliezer
e statistics will be relevant.
also it seems that if you add some custom parameters to the uri\url,
such as the "redirect=1" found in urls, it won't change anything for yt
servers about serving the file that matches the basic parameters.
Will update
Eliezer