Hi Willy,

Thanks for the quick reply and the helpful answers.

I run HAProxy in multi-process mode, so my first guess is that each
process has to load the map files into its own memory.

Each map file has a CIDR IP block as its key, so I can indeed merge them
on that. I'll try to use the word converter to apply the right headers
based on a single merged file and will get back to the mailing list to
let you know how it went.
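
Concretely, I have something along these lines in mind. This is only a
rough, untested sketch: the map path, the field order and the comma
separator are placeholders on my side, not what I generate today.

    # merged map file: CIDR key, then all fields on one line, comma-separated, e.g.
    #   192.0.2.0/24 NL,Amsterdam,52.37,4.89,10
    frontend fe_main
        bind :80
        # one lookup of the client address against the merged map (CIDR keys)
        http-request set-var(txn.geo) src,map_ip(/etc/haproxy/geo.map)
        # split the stored line with the "word" converter, one header per field
        http-request set-header X-Geo-Country   %[var(txn.geo),word(1,',')]
        http-request set-header X-Geo-City      %[var(txn.geo),word(2,',')]
        http-request set-header X-Geo-Latitude  %[var(txn.geo),word(3,',')]
        http-request set-header X-Geo-Longitude %[var(txn.geo),word(4,',')]
        http-request set-header X-Geo-Radius    %[var(txn.geo),word(5,',')]

That way there is a single map lookup per request; the only thing I'll
have to watch is picking a separator that never shows up in the values
(some city names contain commas).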



On 04/05/2017 at 07:30, Willy Tarreau wrote:
> Hi Arnaud,
>
> On Wed, May 03, 2017 at 01:54:13PM +0200, Arnaud B. wrote:
>> Hi there,
>>
>> I'm currently wondering, based on
>> https://www.haproxy.com/blog/use-geoip-database-within-haproxy/ and
>> related notes, is there a more convenient way now ? I've created
>> mapfiles for latitude, longitude, accuracy radius, country code, cities
>> names etc. 
>>
>> Right now, my process is based on Maxmind's databases scraping and
>> generating mapfiles for each added header.
>>
>> Now my HAProxy is around 10GB ram and takes about 70secs to reload.
> Why does it take that long? That seems totally abnormal. Do you use your
> maps in many places, possibly causing them to be loaded multiple times?
> In theory, even in that case they should be merged.
>
> Now regarding the size, do you think your maps could be aggregated
> so that you have all the fields at once on a single line? If so, you
> could set a variable based on a single lookup of the source address in
> the map, then use the "word" converter to split it depending on the
> field you're interested in. The risk is ending up with an even larger
> map, but I think that most entries can be merged.
>
> Willy
>

