A 64-bit hashed file can, if the operating system permits, be up to about 19
million terabytes in size.
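
For what it's worth, that figure lines up with the ceiling of a 64-bit byte
address space; here's a quick back-of-the-envelope check in Python, on the
assumption that the limit really is the full 2^64 bytes:

    # Rough check of the "about 19 million terabytes" figure, assuming the
    # ceiling is the 2^64-byte address space of 64-bit file offsets.
    max_bytes = 2 ** 64          # 18,446,744,073,709,551,616 bytes
    terabyte = 10 ** 12          # decimal terabyte

    print(f"{max_bytes / terabyte:,.0f} TB")   # ~18,446,744 TB, i.e. 18-19 million TB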

If this is insufficient, you can create a Distributed File containing N of 
these, where N is an arbitrarily large number.
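
To illustrate the general idea only (this is not U2's own part-numbering
algorithm, and the record IDs and part count below are made up), a
distributed file just needs some rule that maps each record ID to one of
the N part files, along these lines:

    # Illustrative sketch only: route a record ID to one of N part files.
    # NOT the U2 implementation; names and IDs here are hypothetical.
    from zlib import crc32

    def part_for(record_id: str, n_parts: int) -> int:
        """Return a 1-based part number for a record ID."""
        return crc32(record_id.encode()) % n_parts + 1

    # Example: spread some customer IDs across 16 part files.
    for rid in ("CUST1001", "CUST1002", "CUST1003"):
        print(rid, "-> part", part_for(rid, 16))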

You'll get better I/O throughput by spreading your commonly used files over
several filesystems (avoiding those that hold swap space and transaction
logs), and placing the stale ones in the remaining free space.  Defragmenting
is always beneficial, but make sure everything is backed up first and that you
can actually read the backup.
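
If it helps with deciding where things go, here's a rough Python sketch that
reports free space on a list of candidate filesystems; the mount points are
placeholders for your own, and you'd still want to rule out the ones holding
swap and transaction logs yourself:

    # Survey free space on candidate filesystems to help decide where to
    # place busy files versus stale ones. Mount points are placeholders.
    import shutil

    candidates = ["/u2data1", "/u2data2", "/u2data3"]   # hypothetical mounts

    for mount in candidates:
        usage = shutil.disk_usage(mount)
        print(f"{mount}: {usage.free / 10**9:,.1f} GB free "
              f"of {usage.total / 10**9:,.1f} GB")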