100GB objects (or ~40 on a hard drive!) are way too large for you to
get an effective random distribution.
-Greg
On Thu, Jan 8, 2015 at 5:25 PM, Mark Nelson mark.nel...@inktank.com wrote:
On 01/08/2015 03:35 PM, Michael J Brewer wrote:
Hi all,
I'm working on filling a cluster to near capacity for testing purposes.
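Greg's point is statistical: with only a few dozen objects per disk, uniform random placement leaves large gaps between the fullest and emptiest OSDs. A quick simulation illustrates this (a sketch with assumed figures: 10 OSDs and 400 objects of 100 GB each, i.e. the ~40 objects per drive from Greg's example; none of these numbers come from the thread itself):

```python
import random

# Illustrative assumptions, not from the thread:
# 10 OSDs, 400 objects of 100 GB each (~40 objects per OSD on average).
random.seed(1)
num_osds = 10
num_objects = 400

counts = [0] * num_osds
for _ in range(num_objects):
    counts[random.randrange(num_osds)] += 1  # uniform random placement

print("objects per OSD:", counts)
print("fill range: %d GB to %d GB" % (min(counts) * 100, max(counts) * 100))
```

Re-running with smaller objects (more objects per OSD) narrows the spread, which is the distribution effect Greg is describing.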
I didn't actually calculate the per-OSD object density, but yes, I agree
that will hurt.
On 01/09/2015 12:09 PM, Gregory Farnum wrote:
100GB objects (or ~40 on a hard drive!) are way too large for you to
get an effective random distribution.
-Greg
On Thu, Jan 8, 2015 at 5:25 PM, Mark Nelson mark.nel...@inktank.com wrote:
Hi all,
I'm working on filling a cluster to near capacity for testing purposes.
However, I'm noticing that it isn't storing the data uniformly across OSDs
during the filling process. I currently have the following levels:
Node 1:
Filesystem     1K-blocks       Used  Available
/dev/sdb1     3904027124 2884673100 1019354024
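The per-OSD object density Mark mentions can be put in rough numbers. Under uniform random placement, the object count on each OSD is approximately binomial, so its standard deviation is about sqrt(k) for k objects per OSD, and the relative spread in fill level is about 1/sqrt(k). A back-of-the-envelope sketch (the ~4 TB drive size is an assumption; the 100 GB object size is from the thread):

```python
import math

# Assumed figures for illustration: a ~4 TB drive holds ~40 of the
# 100 GB objects discussed in the thread.
osd_capacity_gb = 4000
object_size_gb = 100
k = osd_capacity_gb / object_size_gb  # ~40 objects per OSD

# Binomial approximation: std dev of the per-OSD count is ~sqrt(k),
# so the relative spread in fill level is ~1/sqrt(k).
rel_spread = 1 / math.sqrt(k)
print(f"~{k:.0f} objects/OSD -> ~{rel_spread:.1%} expected relative spread")
```

At ~40 objects per OSD that works out to roughly a 16% relative spread, which is consistent with the uneven df numbers above.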