Re: Regarding loading Image file into HDFS
Hi, blocks are split at arbitrary byte boundaries once the configured block size is reached. Readers can read the whole file back by reading all of its blocks in order; this is handled transparently by the underlying DFS reader classes, so a developer does not have to care about it. HDFS does not care about what _type_ of file you store; it treats every file as an opaque sequence of bytes.
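To make the idea concrete, here is a minimal sketch in Python (not actual Hadoop code; block size and data are made up for illustration) of how type-agnostic block splitting and transparent reassembly work: the file is cut at fixed byte offsets regardless of content, and concatenating the blocks in order recovers the original file byte for byte.

```python
BLOCK_SIZE = 128  # bytes for this sketch; the real HDFS default is 128 MB


def split_into_blocks(data: bytes, block_size: int = BLOCK_SIZE) -> list[bytes]:
    """Split a byte stream at fixed block-size boundaries (content-agnostic)."""
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]


def read_all_blocks(blocks: list[bytes]) -> bytes:
    """What the DFS reader does conceptually: concatenate blocks in order."""
    return b"".join(blocks)


# A fake "image": arbitrary binary data with a PNG-like signature.
image = b"\x89PNG\r\n\x1a\n" + bytes(range(256)) * 3

blocks = split_into_blocks(image)
assert read_all_blocks(blocks) == image  # byte-for-byte identical
```

The split points can fall anywhere, even in the middle of the image's internal structures, and that is fine for storage; it only matters for processing if you try to interpret a single block in isolation, which is why non-splittable formats are typically handled by one mapper reading the whole file.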
Regarding loading Image file into HDFS
Hi, I have a basic doubt... How does Hadoop split an image file into blocks and put them in HDFS? Usually an image file cannot be split, right? So how does this work in Hadoop? regards, Rams