this continues to be a fundamental weakness in our modern computing environment.

various factors collide here:

1) growing disk sizes and the need for RAID 6 (at least) mean your average
logical disk is now 5-6TB or more.
2) there is enough legacy software around that "disks" seen by the O/S need
        to be <2TB.
3) your mileage may vary, but i am loath to wait more than 3 hours for fsck's,
        which in my world, limits filesystem sizes to about 600GB.
4) adding a small 20TB raid now means i have 30 filesystems to manage;
        for many, this is becoming a lot of work (see the sizing sketch
        after this list).
5) reducing the work of 4) means big logical disks and big filesystems, but by 3)
        i then have to go with XFS (or maybe ZFS) so i can check them quickly
        (but xfs's quick check is too unreliable for me).
6) i haven't worried about backup time before, and still don't: backup time
        depends on the total amount to be backed up and, in my experience, is
        not much affected by filesystem size.
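
to make the arithmetic behind 1), 3) and 4) concrete, here is a rough
sizing sketch in python (the fsck rate is back-figured from my own
3-hour/600GB numbers, so treat every figure as an assumption, not a
measurement):

    # illustrative sizing arithmetic; all figures are assumptions
    n_disks, disk_tb = 8, 1.0                  # e.g. 8 x 1TB spindles
    raid6_usable_tb = (n_disks - 2) * disk_tb  # RAID 6 gives up two disks to parity
    print(raid6_usable_tb)                     # 6.0 -> the "5-6TB" logical disk in 1)

    fsck_rate_gb_per_hr = 200                  # assumed; implied by 600GB in ~3 hours
    fsck_budget_hr = 3
    fs_limit_gb = fsck_rate_gb_per_hr * fsck_budget_hr  # 600GB, as in 3)

    raid_gb = 20 * 1000                        # the "small 20TB raid" in 4)
    print(raid_gb // fs_limit_gb)              # 33 filesystems -- roughly the 30 in 4)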

for what it's worth, here is how i solved the problem:
1) all local storage is FC connected to an external raid.
2) all partitioning is done on the raid; it exports approx 600-700GB slices
        (these slices are used by multiple O/S's, so i can't use local partitioning).
3) i have software that auto-mounts filesystems (one per slice) on the correct
        system, so managing 100 filesystems is now just editing a small text
        file (sketched below).
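
to give a flavour of 3), the slice table and mount loop could be as simple
as this (the file format, path and script here are illustrative only; the
real tool is more involved):

    #!/usr/bin/env python
    # minimal sketch: one line per slice in a small text file,
    # mounted only on the host that the line names.
    # format:  <host>  <device>          <mountpoint>
    # e.g.:    web1    /dev/mapper/s17   /vol/s17
    import socket, subprocess

    CONF = "/etc/slices.conf"   # hypothetical path
    me = socket.gethostname()

    with open(CONF) as f:
        for line in f:
            line = line.split("#", 1)[0].strip()   # allow comments
            if not line:
                continue
            host, device, mountpoint = line.split()
            if host == me:
                subprocess.run(["mount", device, mountpoint])

adding a new slice to every relevant box is then one line in the config
file plus a re-run of the script.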

On Apr 29, 2010, at 8:13 PM, Robinson, Greg wrote:

I am after some recommendations on volume sizes.  I have set my own
limit for $work, but have been asked to justify the limit with
whitepapers etc.  I cannot find any, as the usual response is "whatever
suits your organisation".

What volume sizes do you run, and do you have any trouble meeting your
backup window?

I have set our limit to 1TB in a single volume, and even that might be
too big.

------------------
Andrew Hume  (best -> Telework) +1 732-886-1886
[email protected]  (Work) +1 973-360-8651
AT&T Labs - Research; member of USENIX and LOPSA


