Re: Count uniq Client ips

2020-10-15 Thread Tim Düsterhus
Aleks,

Am 15.10.20 um 22:54 schrieb Aleksandar Lazic:
> I need to know how many concurrent clients request a specific URL *NOW*,
> display it in Prometheus, and limit access to a maximum of, let's say,
> 50 clients per URL.
> 
> That's my requirement.

A stick table keyed on the `base32` fetch (or possibly a string key fed by
the `path` fetch), combined with the `table_conn_cur` converter, should
provide the necessary information.
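
For example, a minimal untested sketch (all names, the table sizing and the
limit of 50 are placeholders):

```
backend st_per_url
    # one entry per request path; conn_cur counts concurrent connections
    stick-table type string len 128 size 1m expire 10m store conn_cur

frontend fe_main
    bind :80
    # the str-key variant: track every request by its path
    http-request track-sc1 path table st_per_url
    # reject once more than 50 clients are concurrently on the same path
    http-request deny deny_status 429 if { path,table_conn_cur(st_per_url) gt 50 }
    default_backend be_app

backend be_app
    server app1 127.0.0.1:8080
```

The per-URL values can then be read out with `show table st_per_url` on the
stats socket.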

Best regards
Tim Düsterhus



Re: Count uniq Client ips

2020-10-15 Thread Aleksandar Lazic

Tim.

On 15.10.20 19:05, Tim Düsterhus wrote:

> Aleks,
> 
> Am 15.10.20 um 14:08 schrieb Aleksandar Lazic:
> 
>> The target is to know how many concurrent IPs request a specific URL.
> 
> What *exactly* would you like to extract? Do you actually want
> concurrent IP addresses? Log parsing would then be impossible by definition.


I need to know how many concurrent clients request a specific URL *NOW*,
display it in Prometheus, and limit access to a maximum of, let's say,
50 clients per URL.

That's my requirement.

I agree that the logfile is the wrong way to get this information.



> Best regards
> Tim Düsterhus


Regards
Aleks



Re: Count uniq Client ips

2020-10-15 Thread Tim Düsterhus
Aleks,

Am 15.10.20 um 14:08 schrieb Aleksandar Lazic:
> The target is to know how many concurrent IPs request a specific URL.

What *exactly* would you like to extract? Do you actually want
concurrent IP addresses? Log parsing would then be impossible by definition.



Best regards
Tim Düsterhus



Re: Count uniq Client ips

2020-10-15 Thread Aleksandar Lazic

Hi Adis,

On 15.10.20 15:03, Adis Nezirovic wrote:

> On 10/15/20 2:08 PM, Aleksandar Lazic wrote:
> 
>> Hi.
>> 
>> I thought maybe the peers could help me if I just add the client IP
>> with the URL, but I'm not sure whether I can query the peers store in
>> an efficient way.
>> 
>> The target is to know how many concurrent IPs request a specific URL.
>> 
>> Could Lua be a solution?


> Hey Aleks,
> 
> I'm not sure Lua would be the right solution for your situation;
> counting stuff is tricky.


Hm, so you mean that Lua could be a performance bottleneck at YouTube scale?
As I haven't used Lua in HAProxy or Nginx, I have no experience with how it
behaves on high-traffic sites.

I thought of using something like this, but "proc"-wide:

function action(txn)
  -- Get source IP (string form)
  local clientip = txn.sf:src()
  -- Get the request path (path_beg is an ACL match, not a fetch)
  local url = txn.sf:path()

  -- only count the URL of interest
  if url and url:find("/MY_URL", 1, true) == 1 then
    -- placeholder for some process-wide store
    save_in_global_hash(clientip .. url)
  end
end

and query this save_in_global_hash with a service.
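
Roughly like this, as an untested sketch (all names are made up, and with
multiple worker processes each one keeps its own Lua state, so this is only
per-process, not truly global):

```
-- per-process store: every HAProxy process has its own Lua state
local seen = {}

core.register_action("remember", { "http-req" }, function(txn)
  local clientip = txn.sf:src()
  local url = txn.sf:path()
  if clientip and url then
    -- remember when this (URL, client) pair was last seen
    seen[url .. "|" .. clientip] = os.time()
  end
end)

-- HTTP applet to dump the table, reachable via "use-service"
core.register_service("dump_seen", "http", function(applet)
  local body = ""
  for k, v in pairs(seen) do
    body = body .. k .. " " .. v .. "\n"
  end
  applet:set_status(200)
  applet:add_header("content-length", string.len(body))
  applet:add_header("content-type", "text/plain")
  applet:start_response()
  applet:send(body)
end)
```

The file would be loaded with `lua-load`, the action attached with
`http-request lua.remember`, and the dump exposed on a separate frontend
with `http-request use-service lua.dump_seen`.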

> However, I think Redis has INCR, so you can store per-URL counters and
> maybe (just maybe) use a Lua action in HAProxy to write to Redis.


> Obviously, you'd need to look out for performance, added latency, etc.,
> but it would be a start.
> You can then access Redis outside of the HAProxy context and observe
> the counters.


Maybe stick tables could also be a solution, because I already use them for
limiting access:

```
  # https://www.haproxy.com/blog/application-layer-ddos-attack-protection-with-haproxy/
  http-request track-sc0 src table per_ip_rates
```

```
# table: per_ip_rates, type: ip, size:1048576, used:3918
0x7f3c58fa9620: key= use=0 exp=597470 http_req_rate(1)=1
0x7f3c4d299960: key= use=0 exp=588433 http_req_rate(1)=2
0x7f3c50cc8830: key= use=0 exp=241004 http_req_rate(1)=0
0x7f3c5c6b3eb0: key= use=0 exp=586046 http_req_rate(1)=1
...
```

Can I add a URL part there, something like path_beg("/MYURL")?
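
I imagine something like this (just an untested sketch; the names are made up):

```
backend per_url_rates
    stick-table type string len 128 size 1m expire 10m store http_req_rate(10s)

frontend fe_main
    bind :80
    # key the table on the path instead of src, only for matching URLs
    http-request track-sc1 path table per_url_rates if { path_beg /MYURL }
```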


> Just my 2c, hope it helps you (like you helped many people on this list)


Thank you for your input.


> Best regards,





Re: Count uniq Client ips

2020-10-15 Thread Adis Nezirovic

On 10/15/20 2:08 PM, Aleksandar Lazic wrote:

> Hi.
> 
> I thought maybe the peers could help me if I just add the client IP with
> the URL, but I'm not sure whether I can query the peers store in an
> efficient way.
> 
> The target is to know how many concurrent IPs request a specific URL.
> 
> Could Lua be a solution?


Hey Aleks,

I'm not sure Lua would be the right solution for your situation;
counting stuff is tricky.


However, I think Redis has INCR, so you can store per-URL counters and
maybe (just maybe) use a Lua action in HAProxy to write to Redis.
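
Something like this, perhaps (an untested sketch: the Redis address is made
up, and a plain INCR counts cumulative requests, so for concurrency you'd
pair it with a DECR when the request finishes):

```
-- Lua action: increment a per-URL counter in Redis, failing open on errors
core.register_action("redis_count", { "http-req" }, function(txn)
  local path = txn.sf:path()
  if not path then
    return
  end

  local sock = core.tcp()
  sock:settimeout(1)
  if sock:connect("127.0.0.1", 6379) == nil then
    return -- never block traffic just because the counter is down
  end

  -- RESP encoding of: INCR url:<path>
  local key = "url:" .. path
  sock:send("*2\r\n$4\r\nINCR\r\n$" .. #key .. "\r\n" .. key .. "\r\n")
  sock:receive("*l") -- reply like ":42", unused here
  sock:close()
end)
```

Attached with `http-request lua.redis_count` after loading the file with
`lua-load`.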


Obviously, you'd need to look out for performance, added latency, etc.,
but it would be a start.
You can then access Redis outside of the HAProxy context and observe the
counters.


Just my 2c, hope it helps you (like you helped many people on this list)

Best regards,
--
Adis Nezirovic
Software Engineer
HAProxy Technologies - Powering your uptime!
375 Totten Pond Road, Suite 302 | Waltham, MA 02451, US
+1 (844) 222-4340 | https://www.haproxy.com



Count uniq Client ips

2020-10-15 Thread Aleksandar Lazic
Hi.

I have quite a tricky requirement and hope to get some input for an
efficient solution.

I use HAProxy in front of a streaming server.

The access log, in JSON format, writes the HTTP requests to syslog, which is
consumed by this plugin:
https://github.com/influxdata/telegraf/tree/release-1.14/plugins/inputs/syslog

Now I have tried
https://github.com/influxdata/telegraf/tree/release-1.14/plugins/processors/dedup
to get unique IPs, but that's quite imprecise.

I thought maybe the peers could help me if I just add the client IP with the
URL, but I'm not sure whether I can query the peers store in an efficient way.

The target is to know how many concurrent IPs request a specific URL.

Could Lua be a solution?

Thanks for any ideas.

Best regards
Aleks