Michael B. Allen wrote:

> > No. Strange effects can happen at many different file sizes. If you
> > do not test it, you do not know that it works.
>
> Can you give me a specific example? I've written a client and I never
> tested it past 5-6 GB. You have me worried now :-/
There may not be a problem in your client.

But problems may show up in the file system and the supporting C library calls. In older systems bits were precious, so there may be many fields that do not have enough of them, and backwards compatibility may now be showing its age. Sometimes the problem turns up in a device driver where, because a 1 GB disk was unimaginable at the time, the bits above that limit were used as flags.
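As a concrete illustration of the library side (my own sketch, not anything from your client, and it assumes a build where long and off_t are 32 bits, i.e. no -D_FILE_OFFSET_BITS=64): the classic fseek() interface takes a plain long offset, so it cannot address anything past 2 GB no matter what the file system supports, and offset arithmetic done in long can silently wrap before the call is even made:

    /* Hedged sketch: assumes 32-bit long/off_t (no large-file support). */
    #include <stdio.h>

    int main(void)
    {
        long block   = 4096L;
        long nblocks = 600000L;              /* ~2.3 GB worth of data      */
        long offset  = block * nblocks;      /* overflows a signed 32-bit
                                                long; often wraps negative */
        printf("computed offset = %ld\n", offset);

        FILE *fp = fopen("bigfile", "rb");
        if (fp != NULL) {
            /* fseek() takes a long, so even a correct 64-bit offset would
             * be truncated here; fseeko()/off_t with large-file support
             * exist precisely for this. */
            fseek(fp, offset, SEEK_SET);
            fclose(fp);
        }
        return 0;
    }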

Some algorithms are sound but do not scale well, hence the unexplained slowdowns.

Every 4-bit (nybble) boundary in a size field can be an issue, and signed/unsigned usage may also be an issue, as can the granularity of blocks in the file system.
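For the signed/unsigned part, this is the sort of thing I mean (again just an illustrative sketch, with made-up variable names): a size that fits fine in an unsigned 32-bit wire field comes out negative once the client stores it in a signed 32-bit int.

    /* Illustrative only: a 3 GB size fits in an unsigned 32-bit field,
     * but read into a signed 32-bit int it typically shows up negative. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint32_t size_on_wire   = 3221225472u;            /* 3 GB */
        int32_t  size_in_client = (int32_t)size_on_wire;  /* implementation-
                                                             defined; usually
                                                             -1073741824 */
        if (size_in_client < 0)
            printf("3 GB file reported as %d bytes\n", (int)size_in_client);
        return 0;
    }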

Once you get past 4 GB, I would expect the next hiccup to be at the 1 TB level and then at every power of two beyond that.

How many people are dealing with files larger than 4G on a regular basis?

You cannot test everything, though. :-)

-John
[EMAIL PROTECTED]
Personal Opinion Only


