Hello,

In org.apache.hadoop.hdfs.DFSClient.DFSOutputStream.writeChunk(byte[] b, int offset, int len, byte[] checksum), the second-to-last line is:

int psize = Math.min((int)(blockSize-bytesCurBlock), writePacketSize);

When I use a blockSize bigger than 2GB, which is outside the range of a
32-bit integer, something weird happens: the cast truncates the long and
the result goes negative. For example, for a 3GB block it creates more
than 2 million packets.
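A minimal standalone sketch (not Hadoop code; the variable names just mirror the line above) showing how the cast truncates a >2GB blockSize, and one possible fix: take the min in long arithmetic first, which is safe because writePacketSize already fits in an int.

```java
public class CastOverflowDemo {
    public static void main(String[] args) {
        long blockSize = 3L * 1024 * 1024 * 1024; // 3 GB, exceeds Integer.MAX_VALUE
        long bytesCurBlock = 0;
        int writePacketSize = 64 * 1024;          // assumed 64 KB default

        // The cast keeps only the low 32 bits, so 3 GB becomes negative.
        int truncated = (int) (blockSize - bytesCurBlock);
        System.out.println("truncated = " + truncated); // -1073741824

        int psize = Math.min(truncated, writePacketSize);
        System.out.println("broken psize = " + psize);  // negative, not 64 KB

        // Possible fix: compute the min as a long, then cast; the result
        // is bounded by writePacketSize, so the cast cannot overflow.
        int fixed = (int) Math.min(blockSize - bytesCurBlock, (long) writePacketSize);
        System.out.println("fixed psize = " + fixed);   // 65536
    }
}
```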

Anyone noticed this before?

Elton
