Is this issue going to be fixed?
Please let me know.
Thanks,
Madhu
Sent from my iPhone
> On Oct 17, 2016, at 10:33 PM, Madhu wrote:
> [...]
Any update on this?
Madhu
> On Oct 14, 2016, at 11:40 AM, Madhu wrote:
> [...]
I still see the crash with the changes. Below is the trace.
[ 1037.412668] Call Trace:
[ 1037.415402] [] dump_stack+0x64/0x82
[ 1037.421146] [] dump_header+0x7f/0x1f1
[ 1037.427083] [] ? put_online_cpus+0x56/0x80
[ 1037.433507] [] ? rcu_oom_notify+0xcc/0xf0
[ 1037.439834] [] oom_kill_process+
It would be:
bird_1.6.2-1~bpo8+1madhupatch1_amd64.deb
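For completeness, a minimal sketch of the install step that the file name above implies. The .deb name is taken from this thread; the dry-run guard is an addition of mine so the snippet is safe to run anywhere:

```shell
# Install the locally built binary package named above.
# Dry-run by default: set RUN_INSTALL=1 on a host where the .deb exists.
install_patched_bird() {
    deb="bird_1.6.2-1~bpo8+1madhupatch1_amd64.deb"
    if [ "${RUN_INSTALL:-0}" != "1" ]; then
        echo "dry run: would run: dpkg -i $deb"
        return 0
    fi
    dpkg -i "$deb"   # installs the patched bird daemon package
}
install_patched_bird
```

The `_amd64.deb` appears to be the architecture-specific daemon package; the `_all.deb` files listed later in the thread (`bird-bgp`, `bird-doc`) are architecture-independent.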
Apologies, but I just realised I forgot the step to actually apply the
patch to the file! :)
You may have realised that, and done it - but if not, just repeat the steps
again, but change the "apply patch" section to this:
# Apply patch #
cd
Thanks Just. I followed your steps and got these files. Which one do I have
to install?
ls
bird-1.6.2
bird-bgp_1.6.2-1~bpo8+1madhupatch1_all.deb
bird-doc_1.6.2-1~bpo8+1madhupatch1_all.deb
bird_1.6.2-1~bpo8+1.debian.tar.xz
bird_1.6.2-1~bpo8+1.dsc
bird_1.6.2-1~bpo8+1madhupatch1.debian.tar.xz
bird_1.6.2-
I'd like to, but sorry I can't really.
I can give you a lot of pointers though - and it's good to learn something
about package management :)
First, get yourself a host that is the same debian or ubuntu version that
you want to deploy the package on, and use it to build the package. Then
cre
Hi Just,
Is there any way you can give me your built Debian package, so that I
can install and check it :)?
Madhu
On Wed, Oct 12, 2016 at 8:37 AM, Justin Cattle wrote:
> [...]
Personally - I get the debian source package, then apply the patch to that
using quilt, bump the version with my own local identifier, and build the
new package.
Cheers,
Just
On 12 October 2016 at 16:19, Madhu wrote:
> [...]
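The quilt workflow Just describes above (get the source package, apply the patch with quilt, bump the version with a local identifier, rebuild) could be sketched roughly as follows. The patch filename and changelog message are illustrative, not from the thread, and the snippet stays in dry-run mode unless RUN_BUILD=1 is set, so by default it only prints the plan:

```shell
# Rebuild the Debian bird package with a local patch (quilt workflow).
# Dry-run by default: set RUN_BUILD=1 on a build host with deb-src enabled.
build_patched_bird() {
    patch_file="ecmp-memory-fix.patch"      # illustrative name for the patch
    if [ "${RUN_BUILD:-0}" != "1" ]; then
        echo "dry run: apt-get source bird; quilt import $patch_file;" \
             "quilt push; dch --local madhupatch; dpkg-buildpackage -us -uc"
        return 0
    fi
    apt-get source bird                     # fetch the Debian source package
    cd bird-1.6.2 || return 1
    export QUILT_PATCHES=debian/patches     # Debian keeps patches here
    quilt import "../$patch_file"           # register the patch in the series
    quilt push                              # apply it to the source tree
    dch --local madhupatch "Apply ECMP memory fix"   # bump the version
    dpkg-buildpackage -us -uc               # build unsigned packages
}
build_patched_bird
```

The `--local` suffix keeps the rebuilt package distinguishable from (and newer than) the stock Debian version, which matches the `madhupatch1` suffix seen in the file names elsewhere in the thread.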
How to install this patch?
Madhu
> On Oct 12, 2016, at 5:27 AM, Jan Matejka wrote:
> [...]
The patch looks very promising in the lab. All hosts are at around 153kB
for 195 routes each.
As the patch is quite small, if it remains stable over the next day or so I
may consider rolling this one out sooner rather than later.
Thanks :)
Cheers,
Just
On 12 October 2016 at 13:27, Jan Matejka wrote:
Hi!
Please try the last commit in GIT, branch master. Commit ID
2e7fb11a6e31324151c6db98df2fe26d2d6cffab.
Attaching the patch as well.
Thank you both for reporting this issue.
Jan
On 10/12/2016 02:10 PM, Justin Cattle wrote:
> [...]
Good stuff - once again, please let me know if you want me to test any
patching at this end :)
Cheers,
Just
On 12 October 2016 at 13:02, Ondrej Zajicek wrote:
> [...]
On Wed, Oct 12, 2016 at 12:09:17PM +0100, Justin Cattle wrote:
> [...]
Hi
We are doing some testing and trying to identify the cause of the problem.
We found some problems and strange
Are there any thoughts as to why I still see quite large memory usage, and
only on some [ seemingly random ] hosts ?
Cheers,
Just
On 11 October 2016 at 19:05, Madhu wrote:
> [...]
Hi everyone,
Now i tested with 16 path and 32 path, I don't see the crash. Below are
the memory usage for each.
*16 path*
bird> show memory
BIRD memory usage
Routing tables: 5109 kB
Route attributes: 13 kB
ROA tables: 192 B
Protocols: 163 kB
Total: 5384 kB
*32
It looks like an oom-kill from the call trace, I think.
Cheers,
Just
On 11 October 2016 at 13:07, Ondrej Zajicek wrote:
> [...]
On Mon, Oct 10, 2016 at 06:21:44PM -0700, Madhu wrote:
> [...]
Hi
I would like to confirm: was it BIRD that was killed by the out-of-memory
killer and could just be restarted? Or Linux k
I must admit, after deploying 1.6.2 to production, I am still seeing more
memory usage than expected.
I was going to take some time to gather some more stats - but seeing this
made me jump early :)
Memory usage is still better than it was originally, but I see 2G in some
cases, which seems pretty
I have 3000 routes with 64-path ECMP. It is crashing. I don't see the coredump.
Is there any way to fix it using a trace?
Madhu
> On Oct 10, 2016, at 6:08 PM, Jonathan Stewart
> wrote:
> [...]
It is not a general issue, many here run hundreds of thousands of routes.
It must be something specific to your environment.
Jonathan
On 10 Oct 2016 4:40 p.m., "Madhu" wrote:
> [...]
Looks like the problem is still there. I used 1.6.0 and also 1.6.2.
On Mon, Oct 10, 2016 at 10:56 AM, james machado wrote:
> Madhu check out the following thread.
> http://bird.network.cz/pipermail/bird-users/2016-September/010578.html
>
> On Mon, Oct 10, 2016 at 10:02 AM, Madhu wrote: