[ http://issues.apache.org/jira/browse/HADOOP-738?page=comments#action_12451479 ]

Doug Cutting commented on HADOOP-738:
-------------------------------------
In Hadoop, local files generally *do* have a .crc file associated with them. This has caught lots of data corruption problems (mostly for folks w/o ECC memory). I'm okay having an option to not create crc files when exporting files, but I think it's the wrong fix for the problem in the description of this issue. Getting and putting files should work with local .crc files enabled.

> dfs get or copyToLocal should not copy crc file
> -----------------------------------------------
>
>                 Key: HADOOP-738
>                 URL: http://issues.apache.org/jira/browse/HADOOP-738
>             Project: Hadoop
>          Issue Type: Bug
>          Components: dfs
>    Affects Versions: 0.8.0
>        Environment: all
>            Reporter: Milind Bhandarkar
>        Assigned To: Milind Bhandarkar
>            Fix For: 0.9.0
>
>         Attachments: hadoop-crc.patch
>
>
> Currently, when we -get or -copyToLocal a directory from DFS, all the files
> including crc files are also copied. When we -put or -copyFromLocal again,
> since the crc files already exist on DFS, this put fails. The solution is not
> to copy checksum files when copying to local. Patch is forthcoming.
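
The actual fix is in the attached hadoop-crc.patch, which is not reproduced in this message. Purely as an illustration of the approach described above (skip checksum side-files when copying out of DFS), here is a minimal standalone Java sketch; the class name CopyWithoutCrc, the isChecksumFile helper, and the use of java.nio.file in place of Hadoop's own FileSystem API are assumptions made for the example, not the contents of the patch. The only Hadoop-specific detail relied on is the naming convention that the checksum side-file for foo is .foo.crc.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;
    import java.util.stream.Stream;

    // Illustrative sketch only -- not the attached hadoop-crc.patch.
    public class CopyWithoutCrc {

        // Hadoop names the checksum side-file for "foo" as ".foo.crc";
        // this helper recognizes that naming convention.
        static boolean isChecksumFile(Path p) {
            String name = p.getFileName().toString();
            return name.startsWith(".") && name.endsWith(".crc");
        }

        // Recursively copies src into dst, skipping checksum side-files,
        // analogous to what -get / -copyToLocal should do.
        static void copyToLocal(Path src, Path dst) throws IOException {
            try (Stream<Path> paths = Files.walk(src)) {
                for (Path from : (Iterable<Path>) paths::iterator) {
                    if (isChecksumFile(from)) {
                        continue;                  // do not export .crc files
                    }
                    Path to = dst.resolve(src.relativize(from).toString());
                    if (Files.isDirectory(from)) {
                        Files.createDirectories(to);
                    } else {
                        Path parent = to.getParent();
                        if (parent != null) {
                            Files.createDirectories(parent);
                        }
                        Files.copy(from, to, StandardCopyOption.REPLACE_EXISTING);
                    }
                }
            }
        }

        public static void main(String[] args) throws IOException {
            copyToLocal(Paths.get(args[0]), Paths.get(args[1]));
        }
    }

With a copy that filters this way, a -get followed by a -put of the resulting directory would no longer try to re-upload .crc files that DFS already maintains itself, which is the failure mode described in the issue.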
