Questions:
1 - Squid is installed and working; all users are forced to use it.
2 - squidGuard is installed and working, tested with rewrites and functional.

It is an antivirus program that requests update files from the internet. Every
computer on the home network gets these updates every day. Some of the files
are as big as 5 MB. If I download the files to my proxy server and manually
point the antivirus program at the files on the server then it works, but I
want the server to do that for me.

If it would get the files from the Squid cache, then why does it take so long
to update the system?
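One quick way to check whether the updates are actually coming out of the cache is to count the result codes in Squid's access.log. The snippet below is a minimal sketch; the sample log lines and the function name are illustrations, not anything from this thread:

```python
# Count cache hits vs. misses in Squid access.log lines.
# In the native log format the fourth field is the result code,
# e.g. "TCP_HIT/200" or "TCP_MISS/200".
from collections import Counter

def hit_miss_counts(lines):
    counts = Counter()
    for line in lines:
        fields = line.split()
        if len(fields) < 4:
            continue
        result = fields[3].split("/")[0]  # "TCP_HIT" from "TCP_HIT/200"
        counts[result] += 1
    return counts

# Hypothetical sample entries, shortened to the first four fields:
sample = [
    "1145355000.123 52 10.0.0.5 TCP_HIT/200",
    "1145355001.456 940 10.0.0.6 TCP_MISS/200",
    "1145355002.789 61 10.0.0.5 TCP_HIT/200",
]
print(hit_miss_counts(sample))
```

If most of the update requests show up as TCP_MISS, one common cause is Squid's `maximum_object_size` setting, which defaults to only a few MB and would keep 5 MB update files out of the cache entirely.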

chris

>>> "marcusv" <marcusv>  >>>
Ok Question...

The internal program: what is it, and can you make changes to it?
Do you have Squid running on your local server, and does the cache work?

If the cache works, then if you download something now and tomorrow you
want to download the same file, Squid is going to look at the cache on your
server first before it goes to the external site.

Questions:
1. Do you have Squid installed and working 100%?
2. Do you have squidGuard installed?

If you have both, then redirecting the users will work.

But if you only have Squid, then I don't think it will work.
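For reference, the redirect itself is expressed as a rewrite rule in squidGuard's configuration. This is a minimal sketch using the server names from the original question; the dbhome/logdir paths are assumptions and vary by install:

```
# /etc/squid/squidGuard.conf (minimal rewrite-only sketch)
dbhome /var/lib/squidguard/db
logdir /var/log/squidguard

# Send any request for the old update server to the new one.
rewrite updates {
    s@^http://www\.old_update_server\.com/files/@http://www.new_update_server.com/files/@
}

acl {
    default {
        pass all
        rewrite updates
    }
}
```

Appending an `r` flag after the closing `@` would make squidGuard answer with a visible HTTP redirect instead of rewriting the request transparently.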




-----Original Message-----
From: Chris Botha [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, April 18, 2006 2:15 PM
To: [EMAIL PROTECTED]
Subject: RE: Rewrite options


Thanks so much for the answer.

I have no access to the old server. Perhaps I can explain a bit more. I
have an internal program that the users use, and I want to reduce the
internet traffic, so I thought I could download the files to a local
server once, which works. But now I must capture the requests sent to the
old server and redirect them to the new server, and this I cannot set on
the clients either.

I have been struggling with this for about two weeks and would really
like some sort of solution.

Hope to hear from you soon again.

Yours truly

Chris

>>> "marcusv" <marcusv>  >>>
Chris,

If you still have access to the old server, create an index.html file in
the root of the files dir, with the following in it:

<meta http-equiv="refresh" content="0;URL=http://yournewserver.com/files/">

If not, install Squid and a helper program called squidGuard.
squidGuard helps with ACLs and any redirection that you might need done.
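For what it's worth, hooking squidGuard into Squid is a one-line change in squid.conf. A sketch follows; the binary and config paths are assumptions and differ by distribution (on the Squid 2.5 series the directive is called redirect_program):

```
# /etc/squid/squid.conf
redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
redirect_children 5
```

Running `squid -k reconfigure` afterwards makes Squid pick up the change.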

Also look at a GUI for Linux admin called webmin.

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Chris Botha
Sent: Tuesday, April 18, 2006 12:20 PM
To: [email protected]
Subject: Rewrite options


Dear users, 

I'm in a fix and need help. On my local network I need to capture a
website link and rewrite it to point to a new server.

eg: 
www.old_update_server.com/files/3289sde3.zip
www.old_update_server.com/files/5289rte3.zip
www.old_update_server.com/files/1089sde3.zip
www.old_update_server.com/files/3589oru3.zip

I want to rewrite and redirect these links to 

www.new_update_server.com/files/3289sde3.zip
www.new_update_server.com/files/5289rte3.zip
www.new_update_server.com/files/1089sde3.zip
www.new_update_server.com/files/3589oru3.zip

The problem I sit with is that I never know what the names of the "zip"
files are. Each user can request different files according to their
needs. This is a short list that I included; there can be up to 20 such
files per day.
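Since the file names vary, any rewrite has to match on the host and path prefix rather than on individual files. As an illustration (not something proposed in this thread), a Squid redirector can be a small script that reads requests on stdin and answers with the rewritten URL; the sketch below follows that protocol, with the mapping pulled out into a function:

```python
#!/usr/bin/env python
# Minimal Squid redirector sketch. Squid feeds the helper one request
# per line ("URL client-ip/fqdn ident method"); the helper answers with
# the rewritten URL, or an empty line to leave the request untouched.
import sys

OLD_PREFIX = "http://www.old_update_server.com/files/"
NEW_PREFIX = "http://www.new_update_server.com/files/"

def rewrite(url):
    """Map any file under the old server's /files/ path to the new server."""
    if url.startswith(OLD_PREFIX):
        return NEW_PREFIX + url[len(OLD_PREFIX):]
    return ""  # empty answer = no rewrite

def main():
    for line in sys.stdin:
        fields = line.split()
        if not fields:
            continue
        sys.stdout.write(rewrite(fields[0]) + "\n")
        sys.stdout.flush()  # Squid waits for one answer per request

if __name__ == "__main__":
    main()
```

Wired into squid.conf with something like `redirect_program /usr/local/bin/rewrite_updates.py` (a hypothetical path), this never needs to know the zip file names in advance, because it matches only on the prefix.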

I will appreciate any help.

Thank you
