Re: [squid-users] Reports in Squids

2018-04-11 Thread Amos Jeffries
On 12/04/18 08:01, Informatico Neurodesarrollo wrote:
> Hi friends:
> I am using openSUSE Leap 42.3 as a proxy server.
> I need to send a monthly report covering:
> 
> * The top 10 sites by number of accesses, with their total traffic.
> * The top 10 users by number of accesses, with their total traffic.
> 
> * The average download and upload traffic in these time ranges:
> 
> 07:00 to 12:00
> 12:00 to 17:00
> 17:00 to 24:00
> 
> Which software can I use to implement this?
> 

Have a look at the software listed here:
 


Amos
___
squid-users mailing list
squid-users@lists.squid-cache.org
http://lists.squid-cache.org/listinfo/squid-users


Re: [squid-users] Ideas for better caching these popular urls

2018-04-11 Thread Eliezer Croitoru
Hey Omid,

I found the service I wrote and packed it in a RPM at:
http://ngtech.co.il/repo/centos/7/x86_64/response-dumper-icap-1.0.0-1.el7.centos.x86_64.rpm

If you are using another OS, let me know and I will try to package it for yours.
Currently, alien converts the RPM smoothly for Debian/Ubuntu.

The dumps directory is at:
/var/response-dumper

But the cleanup and the filtering ACLs are your job.
You can define which GET requests the service dumps/logs to files.
Each file in this directory is named in the following format:
-<8 bytes uuid>-

This format lets multiple simultaneous requests produce files with
different names while still sharing the same URL hash, so you can filter
files by that hash.
To calculate the hash of a URL, use:
$ echo -n "GET:http://url-to-hash.com/path?query=terms" | md5sum
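The hash step can be wrapped into a small lookup helper. A sketch, assuming the hash really is the md5 of "GET:<url>" and that dump files embed it in their names; the URL here is a made-up example:

```shell
# Hypothetical example URL; "GET:" prefix and md5 follow the recipe above.
url="http://url-to-hash.com/path?query=terms"
hash=$(printf 'GET:%s' "$url" | md5sum | cut -d' ' -f1)
echo "$hash"
# Dump files for this URL could then be located with something like:
#   ls /var/response-dumper | grep "$hash"
```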

Each file contains the full ICAP RESPMOD exchange, i.e.:
ICAP Request\r\n
HTTP Request \r\n
HTTP Response\r\n

By default, the Cookie and Authorization headers are censored from both
the request and the response in the dump, to avoid privacy-law issues.

Now the only missing feature is feeding a single request and a single
response to RedBot to get a full analysis.

Let me know if it works OK for you (it has been working fine here for a while now).

Eliezer


Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il


-Original Message-
From: squid-users  On Behalf Of Omid 
Kosari
Sent: Wednesday, April 11, 2018 12:32
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Ideas for better caching these popular urls

Eliezer Croitoru wrote
> You will need more than just the URLs; you also need the response
> headers for them.
> I might be able to write an ICAP service that logs request and
> response headers to help cache admins improve their efficiency, but
> this could take a while.

Hi Eliezer,

Nice idea. I am ready to test/help/share whatever you need in a real
production environment. Please also make it general enough to cover the
other domains in the first post's attachment. They are worth a try.

Thanks




--
Sent from: 
http://squid-web-proxy-cache.1019090.n4.nabble.com/Squid-Users-f1019091.html


Re: [squid-users] Reports in Squids

2018-04-11 Thread Periko Support
Check out SARG (the Squid Analysis Report Generator).
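If a full report generator is more than you need, the top-10 site totals can also be pulled straight from access.log. A rough sketch, assuming Squid's default native log format (field 5 = bytes, field 7 = URL) and the default log path:

```shell
# Top 10 destination hosts by request count, with total bytes transferred.
awk '{
    split($7, u, "/")                  # http://host/path -> u[3] = host
    host = (u[3] != "") ? u[3] : $7    # CONNECT lines carry host:port in $7
    count[host]++
    bytes[host] += $5
}
END { for (h in count) printf "%d %d %s\n", count[h], bytes[h], h }' \
    /var/log/squid/access.log | sort -rn | head -10
```

User totals work the same way keyed on field 8 (the username), and the time-range averages can be derived by bucketing field 1 (the Unix timestamp).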

2018-04-11 13:01 GMT-07:00 Informatico Neurodesarrollo
:
> Hi friends:
> I am using openSUSE Leap 42.3 as a proxy server.
> I need to send a monthly report covering:
>
> * The top 10 sites by number of accesses, with their total traffic.
> * The top 10 users by number of accesses, with their total traffic.
>
> * The average download and upload traffic in these time ranges:
>
> 07:00 to 12:00
> 12:00 to 17:00
> 17:00 to 24:00
>
> Which software can I use to implement this?
>
> I hope that somebody can help me.
>
> My best regards.
>
>
> --
>
> Jesús Reyes Piedra
> Admin Red Neurodesarrollo,Cárdenas
> The box said: "Requires Windows 95 or better"...
> So I installed LINUX.
>
>
> --
> This message reached you via the e-mail service that Infomed provides
> to support the missions of the National Health System. The sender of
> this message undertakes to use the service for those purposes and to
> comply with the established regulations.
>
> Infomed: http://www.sld.cu/
>


[squid-users] Reports in Squids

2018-04-11 Thread Informatico Neurodesarrollo

Hi friends:
I am using openSUSE Leap 42.3 as a proxy server.
I need to send a monthly report covering:

* The top 10 sites by number of accesses, with their total traffic.
* The top 10 users by number of accesses, with their total traffic.

* The average download and upload traffic in these time ranges:

07:00 to 12:00
12:00 to 17:00
17:00 to 24:00

Which software can I use to implement this?

I hope that somebody can help me.

My best regards.


--

Jesús Reyes Piedra
Admin Red Neurodesarrollo,Cárdenas
The box said: "Requires Windows 95 or better"...
So I installed LINUX.


--
This message reached you via the e-mail service that Infomed provides to
support the missions of the National Health System. The sender of this
message undertakes to use the service for those purposes and to comply
with the established regulations.

Infomed: http://www.sld.cu/



Re: [squid-users] IP Lookup Failure

2018-04-11 Thread Eliezer Croitoru
Hey Aaron,

If you share your squid.conf and/or the output of "squid -k parse", it
might help us understand.
As far as I remember, the Eui48.cc code handles MAC address lookups, so
I'm not sure what is causing this.

With more relevant details we might be able to help you work out what,
if anything, is wrong.
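For reference, Eui48.cc logs under Squid's ACL debug section (28). One way to see which http_access rule is actually matching is to raise that section's verbosity; a minimal squid.conf sketch (debug_options is a standard directive, and the levels here are just a suggestion):

```
# Keep general logging at level 1; trace ACL checks (section 28)
# at level 5 in cache.log. Revert when done: this gets verbose.
debug_options ALL,1 28,5
```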

Eliezer


http://ngtech.co.il/lmgtfy/
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il


From: squid-users  On Behalf Of 
Aaron Hall
Sent: Wednesday, April 11, 2018 21:20
To: squid-users@lists.squid-cache.org
Subject: [squid-users] IP Lookup Failure

Hello Users -

I've searched the forums and can't find an appropriate answer.

I'm running Squid Cache: Version 3.5.20.

I'm receiving the following line in my cache.log file:

"2018/04/11 14:05:50.370 kid1| 28,3| Eui48.cc(520) lookup: id=0x7f9d0bd92b84 
 NOT found"

The server connecting to the proxy is seeing: "Received HTTP code 0 from proxy 
after CONNECT".

The client's IP address is listed in an ACL, and that ACL is used in an
`http_access allow` rule.

Can someone point me in the direction of what this might indicate? The internet 
and Google searches have failed to provide an answer.

Cheers.

--
Aaron Hall
The Paranoids
Network Security 
mailto:aaron.h...@oath.com



[squid-users] IP Lookup Failure

2018-04-11 Thread Aaron Hall
Hello Users -

I've searched the forums and can't find an appropriate answer.

I'm running Squid Cache: Version 3.5.20.

I'm receiving the following line in my cache.log file:

"2018/04/11 14:05:50.370 kid1| 28,3| Eui48.cc(520) lookup:
id=0x7f9d0bd92b84  NOT found"

The server connecting to the proxy is seeing: "Received HTTP code 0 from
proxy after CONNECT".

The client's IP address is listed in an ACL, and that ACL is used in an
`http_access allow` rule.

Can someone point me in the direction of what this might indicate? The
internet and Google searches have failed to provide an answer.

Cheers.
--
Aaron Hall
The Paranoids
Network Security
aaron.h...@oath.com


Re: [squid-users] Ideas for better caching these popular urls

2018-04-11 Thread Eliezer Croitoru
Hey Omid,

I will try to use a file format similar to this:
## FILENAME = unixtime-sha256
RESPMOD icap://127.0.0.1:1344/dumper ICAP/1.0
date: Wed, 11 Apr 2018 16:52:13 GMT
encapsulated: req-hdr=0, res-hdr=105, res-body=413
preview: 0
allow: 204
host: 127.0.0.1:1344
Socket-Remote-Addr: 127.0.0.1:55178

GET http://ngtech.co.il/index.html HTTP/1.1
Accept: */*
User-Agent: curl/7.29.0

HTTP/1.1 200 OK
Content-Length: 17230
Accept-Ranges: bytes
Access-Control-Allow-Methods: GET, POST, OPTIONS
Access-Control-Allow-Origin: *
Content-Type: text/html
Date: Wed, 11 Apr 2018 16:52:13 GMT
Last-Modified: Tue, 03 Apr 2018 20:19:05 GMT
Server: nginx/1.10.3 (Ubuntu)
Vary: Accept-Encoding
## EOF
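For post-processing, a dump in that layout splits cleanly on blank lines. A small sketch (my own helper, not part of the service; it assumes the three header blocks shown above and skips the `## ` marker lines):

```python
def split_dump(text: str):
    """Split a dump into its header blocks:
    (ICAP request, HTTP request, HTTP response).

    Assumes the layout sketched above: blocks separated by blank
    lines, with optional '## ' marker lines that are ignored.
    """
    sections, current = [], []
    for line in text.splitlines():
        if line.startswith("## "):      # filename / EOF markers
            continue
        if line.strip() == "":          # a blank line ends a block
            if current:
                sections.append("\n".join(current))
                current = []
        else:
            current.append(line)
    if current:
        sections.append("\n".join(current))
    return sections
```

Run against the example above, this yields the ICAP header block, the GET request, and the 200 response as three strings.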

I have a prototype that I wrote three years ago, but it needs to be
polished for general use.
I will post an update when I make some progress.

Eliezer


Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: elie...@ngtech.co.il



-Original Message-
From: squid-users  On Behalf Of Omid 
Kosari
Sent: Wednesday, April 11, 2018 12:32
To: squid-users@lists.squid-cache.org
Subject: Re: [squid-users] Ideas for better caching these popular urls

Eliezer Croitoru wrote
> You will need more than just the URLs; you also need the response
> headers for them.
> I might be able to write an ICAP service that logs request and
> response headers to help cache admins improve their efficiency, but
> this could take a while.

Hi Eliezer,

Nice idea. I am ready to test/help/share whatever you need in a real
production environment. Please also make it general enough to cover the
other domains in the first post's attachment. They are worth a try.

Thanks






Re: [squid-users] Ideas for better caching these popular urls

2018-04-11 Thread Omid Kosari
Eliezer Croitoru wrote
> You will need more than just the URLs; you also need the response
> headers for them.
> I might be able to write an ICAP service that logs request and
> response headers to help cache admins improve their efficiency, but
> this could take a while.

Hi Eliezer,

Nice idea. I am ready to test/help/share whatever you need in a real
production environment. Please also make it general enough to cover the
other domains in the first post's attachment. They are worth a try.

Thanks



