Hi all,
This may not be the best list for this question, but I figure that the
number of disks connected to users here should be pretty big...
I upgraded from 2.6.17-rc4 to 2.6.18.3 about a week ago, and I've since
had 3 drives kicked out of my 10-drive RAID5 array. Previously, I had no
kicks over
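When drives get kicked like this, the kernel log is usually the first place to look: md logs why it ejected a member, and the ATA/SCSI layer logs the underlying I/O errors. A minimal sketch of the kind of grep I'd run (the log lines below are made-up examples, not from this thread):

```shell
#!/bin/sh
# Hypothetical kernel-log excerpt; real output comes from `dmesg` or
# /var/log/kern.log on the affected box.
log='ata3.00: exception Emask 0x0 SAct 0x0 SErr 0x0 action 0x2 frozen
md: kicking non-fresh sdc1 from array!
raid5: Disk failure on sdc1, disabling device.'

# Count the md-level ejection messages in the excerpt.
printf '%s\n' "$log" | grep -c 'kicking\|Disk failure'
```

On a live system you'd pipe `dmesg` in instead of the canned excerpt, and follow up with `mdadm --detail /dev/mdX` and `cat /proc/mdstat` to see the array's current view of its members.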
Neil Brown wrote:
On Wednesday December 13, [EMAIL PROTECTED] wrote:
Before I tell you that doesn't work, could you provide a complete
command line you expect to work rather than just one argument? Showing
the array designator and the location of the actual new UUID to use? We
think we have
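For reference, one complete invocation that changes an array's UUID is `--update=uuid` at assemble time. A sketch only; the array name, member devices, and UUID value below are placeholders, not taken from this thread:

```shell
# Hypothetical: stop the array, then reassemble it while rewriting the
# UUID stored in each member's superblock. The UUID format is four
# colon-separated 32-bit hex words, as mdadm expects.
mdadm --stop /dev/md0
mdadm --assemble /dev/md0 --update=uuid \
      --uuid=00000000:00000000:00000000:00000000 \
      /dev/sd[abcd]1
```

Whether this covers the case being discussed I can't say from the truncated quote, but it is the usual complete command line for a UUID change.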
which is right at the edge of what I need. I want to read the doc on
stripe_cache_size before going huge; if that's in KB, 10MB is a LOT of
cache when 256 works perfectly in RAID-0.
but they are basically unrelated. in r5/6, the stripe cache is absolutely
critical for caching parity chunks. in r0 there is no parity, so no
stripe cache exists at all.
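A back-of-the-envelope sketch of where a figure like 10MB comes from, assuming the commonly cited formula stripe_cache_size × PAGE_SIZE × number of member disks (an approximation, not authoritative; the numbers match the 10-drive array from this thread):

```shell
#!/bin/sh
# Estimate stripe-cache memory use for a RAID-5/6 array.
stripe_cache_size=256   # entries, as read from /sys/block/mdX/md/stripe_cache_size
nr_disks=10             # member devices in the array
page_kib=4              # 4 KiB pages on x86

# One page per disk per cache entry: 256 * 4 * 10 = 10240 KiB, i.e. ~10 MB.
echo "$(( stripe_cache_size * page_kib * nr_disks )) KiB"
```

So even the default of 256 already means roughly 10MB on a 10-disk array, which is presumably the number being reacted to above.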
Neil Brown wrote:
> Patches to the man page to add useful examples are always welcome.
And if people would like to be more verbose, the wiki is available at
http://linux-raid.osdl.org/
It's now kinda useful but definitely not fully migrated from the old RAID FAQ.
David