Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-17 Thread Craig Lewis
From: Sage Weil sw...@redhat.com To: Quenten Grasso qgra...@onq.com.au Cc: ceph-users@lists.ceph.com Sent: Thursday, 17 July 2014 4:44:45 PM Subject: Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time On Thu, 17 Jul 2014, Quenten Grasso wrote: Hi Sage

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-16 Thread Quenten Grasso
:38 PM To: Sage Weil Cc: ceph-users@lists.ceph.com Subject: Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time Hi Sage, since this problem is tunables-related, do we need to expect the same behavior or not when we do regular data rebalancing caused by adding new

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-16 Thread Andrei Mikhailovsky
From: Quenten Grasso qgra...@onq.com.au To: Andrija Panic andrija.pa...@gmail.com, Sage Weil sw...@redhat.com Cc: ceph-users@lists.ceph.com Sent: Wednesday, 16 July 2014 1:20:19 PM Subject: Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time Hi Sage, Andrija, List, I

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-16 Thread Andrija Panic
...@gmail.com, Sage Weil sw...@redhat.com Cc: ceph-users@lists.ceph.com Sent: Wednesday, 16 July 2014 1:20:19 PM Subject: Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time Hi Sage, Andrija, List, I have seen the tunables issue on our cluster when I

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-16 Thread Danny Luhde-Thompson
ceph-users@lists.ceph.com Sent: Wednesday, 16 July 2014 1:20:19 PM Subject: Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time Hi Sage, Andrija, List, I have seen the tunables issue on our cluster when I upgraded to firefly. I ended up going back
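
"Going back" here presumably means reverting the CRUSH tunables profile so the large remap stops. A minimal sketch of that rollback, assuming the cluster was on legacy-era tunables before the firefly upgrade:

    # revert to a pre-firefly profile; this starts its own (reverse) remap
    ceph osd crush tunables legacy      # or, less drastic: ceph osd crush tunables bobtail
    ceph -s                             # watch the degraded/misplaced percentage drop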

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-15 Thread Andrija Panic
Hi Sage, since this problem is tunables-related, do we need to expect the same behavior or not when we do regular data rebalancing caused by adding or removing an OSD? I guess not, but I would like your confirmation. I'm already on optimal tunables, but I'm afraid to test this by, e.g., shutting down 1
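
For a test like the one described (shutting down a single OSD), the usual way to avoid kicking off a rebalance is to set noout first. A minimal sketch, assuming a sysvinit-style service script and osd.0 as a hypothetical example:

    ceph osd set noout           # stopped OSDs stay "in", so no backfill starts
    service ceph stop osd.0      # stop one OSD (the init command varies by distro)
    ceph -s                      # PGs show degraded, but data does not move
    service ceph start osd.0
    ceph osd unset noout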

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-15 Thread Sage Weil
On Tue, 15 Jul 2014, Andrija Panic wrote: Hi Sage, since this problem is tunables-related, do we need to expect the same behavior or not when we do regular data rebalancing caused by adding or removing an OSD? I guess not, but I would like your confirmation. I'm already on optimal tunables, but

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-14 Thread Andrei Mikhailovsky
, 13 July 2014 9:54:17 PM Subject: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time Hi, after the Ceph upgrade (0.72.2 to 0.80.3) I issued ceph osd crush tunables optimal and after only a few minutes I added 2 more OSDs to the Ceph cluster... So

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-14 Thread Andrija Panic
tunables optimal AND add new OSD at the same time Hi, after the Ceph upgrade (0.72.2 to 0.80.3) I issued ceph osd crush tunables optimal and after only a few minutes I added 2 more OSDs to the Ceph cluster... So these 2 changes were more or less done at the same time

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-14 Thread Sage Weil
I've added some additional notes/warnings to the upgrade and release notes: https://github.com/ceph/ceph/commit/fc597e5e3473d7db6548405ce347ca7732832451 If there is somewhere else where you think a warning flag would be useful, let me know! Generally speaking, we want to be able to cope with

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-14 Thread Andrija Panic
Perhaps here: http://ceph.com/releases/v0-80-firefly-released/ Thanks. On 14 July 2014 18:18, Sage Weil sw...@redhat.com wrote: I've added some additional notes/warnings to the upgrade and release notes: https://github.com/ceph/ceph/commit/fc597e5e3473d7db6548405ce347ca7732832451 If there

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-14 Thread Udo Lembke
Hi, which values are changed by ceph osd crush tunables optimal? Is it perhaps possible to change some of the parameters on the weekends before the upgrade, to have more time? (depending on whether the parameters are available in 0.72...). The warning says it can take days... we have a cluster
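
The tunables currently in effect can be inspected directly. A minimal sketch (show-tunables may not exist on very old releases; decompiling the CRUSH map works everywhere):

    ceph osd crush show-tunables         # dump the active CRUSH tunables
    # or decompile the CRUSH map and read the 'tunable ...' lines at the top:
    ceph osd getcrushmap -o crushmap.bin
    crushtool -d crushmap.bin -o crushmap.txt
    head crushmap.txt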

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-14 Thread Sage Weil
On Mon, 14 Jul 2014, Udo Lembke wrote: Hi, which values are changed by ceph osd crush tunables optimal? There are some brand new crush tunables that fix... I don't even remember offhand. In general, you probably want to stay away from 'optimal' unless this is a fresh cluster and all
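
In the firefly era the main new tunable implied by 'optimal' is chooseleaf_vary_r. A hedged sketch of picking a more conservative profile instead of jumping straight to optimal:

    # bobtail-era tunables avoid the new firefly chooseleaf_vary_r behaviour
    # and are supported by older clients
    ceph osd crush tunables bobtail
    # rather than:
    ceph osd crush tunables optimal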

Re: [ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-14 Thread Andrija Panic
Udo, I had all VMs completely inoperable - so don't set optimal for now... On 14 July 2014 20:48, Udo Lembke ulem...@polarzone.de wrote: Hi, which values are changed by ceph osd crush tunables optimal? Is it perhaps possible to change some of the parameters on the weekends before the upgrade
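
When a remap of this size makes client I/O (VMs) unusable, the usual mitigation is to throttle backfill and recovery. A hedged sketch with illustrative values:

    # inject at runtime on all OSDs; persist in the [osd] section of ceph.conf if desired
    ceph tell osd.* injectargs '--osd-max-backfills 1 --osd-recovery-max-active 1 --osd-recovery-op-priority 1'

    # ceph.conf equivalent:
    #   osd max backfills = 1
    #   osd recovery max active = 1
    #   osd recovery op priority = 1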

[ceph-users] ceph osd crush tunables optimal AND add new OSD at the same time

2014-07-13 Thread Andrija Panic
Hi, after the Ceph upgrade (0.72.2 to 0.80.3) I issued ceph osd crush tunables optimal and after only a few minutes I added 2 more OSDs to the Ceph cluster... So these 2 changes were more or less done at the same time - rebalancing because of tunables optimal, and rebalancing
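
A hedged sketch of the ordering that avoids stacking the two rebalances on top of each other (host and device names below are hypothetical):

    ceph osd crush tunables optimal
    ceph -s                                  # wait until all PGs are active+clean again
    # only then add the new OSDs, e.g. via ceph-deploy:
    ceph-deploy osd create node1:/dev/sdb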