I would mainly consider future upgrades. E.g. create one vdisk per disk shelf per recovery group: for a GL6S you would have 12 vdisks, and if you later add a GL4S you would add 8 more vdisks, so each spindle of both systems should get approximately the same number of IOs.
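The arithmetic behind that layout can be sketched as follows. This is a hypothetical illustration, not GPFS code; the enclosure and recovery-group counts are assumptions based on the typical ESS GLxS building block (x enclosures, 2 recovery groups), which matches the 12/8 figures above.

```python
# Sketch of "one vdisk per disk shelf (enclosure) per recovery group".
# Assumption: a GLxS building block has x enclosures and 2 recovery groups.

def vdisk_count(enclosures: int, recovery_groups: int = 2) -> int:
    """One vdisk per enclosure per recovery group."""
    return enclosures * recovery_groups

gl6s = vdisk_count(enclosures=6)  # GL6S: 6 enclosures -> 12 vdisks
gl4s = vdisk_count(enclosures=4)  # GL4S: 4 enclosures -> 8 vdisks

print(gl6s, gl4s, gl6s + gl4s)  # 12 8 20
```

With equal-sized vdisks spread this way, file system striping lands roughly the same IO load on every spindle in both building blocks.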
Another thing to consider is re-allocating capacity. If your vdisks are very large, it might be difficult to free up a full vdisk if you need to reorganize and create new filesystems, or add more 3-way replicated metadata. But mainly I create large vdisks; I haven't been able to show any performance difference between one vdisk per RG and 10 vdisks per RG.

-jf

On Mon, Jul 1, 2019 at 07:42, Ryan Novosielski <[email protected]> wrote:
> Good morning,
>
> Was wondering if anyone could point me to any tips or provide some advice
> regarding choosing vdisk size? My understanding is that too small is a
> waste of resources in the form of overhead, and that there is an upper
> limit, but that generally within a pool you want them to be the same size.
> If you aren't allocating the entire storage space on the system straight
> off, you'll need to choose a reasonable size or be left with unusable
> space (e.g. you will waste space if you went with, say, 200 GB vdisk
> sizes on a 500 GB array).
>
> Anyone have any tips? Do people just generally allocate the whole thing
> and have one or two vdisks (for redundancy)? It seems you have more
> flexibility if you don't do that and have to, say, create a storage pool
> or filesystem from scratch to take advantage of features in a newer FS
> version or what have you.
>
> Thanks for the help -- spent a lot of time looking for this previously,
> but never asked on the list.
>
> --
> ____
> || \\UTGERS, |---------------------------*O*---------------------------
> ||_// the State | Ryan Novosielski - [email protected]
> || \\ University | Sr. Technologist - 973/972.0922 (2x0922) ~*~ RBHS Campus
> || \\ of NJ | Office of Advanced Research Computing - MSB C630, Newark
> `'
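The wasted-space concern in the quoted question is simple remainder arithmetic; a quick sketch using the hypothetical figures from the question (200 GB vdisks on a 500 GB pool):

```python
# Fixed-size vdisks that don't evenly divide the pool leave a remainder
# too small to hold another vdisk. Figures are the question's example.

def leftover(pool_gb: int, vdisk_gb: int) -> tuple[int, int]:
    """Return (number of vdisks that fit, unusable remainder in GB)."""
    return pool_gb // vdisk_gb, pool_gb % vdisk_gb

count, wasted = leftover(pool_gb=500, vdisk_gb=200)
print(count, wasted)  # 2 vdisks fit, 100 GB unusable
```

Choosing a vdisk size that divides the pool evenly (or allocating everything up front) avoids the stranded remainder.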
_______________________________________________
gpfsug-discuss mailing list
gpfsug-discuss at spectrumscale.org
http://gpfsug.org/mailman/listinfo/gpfsug-discuss
