[
https://issues.apache.org/jira/browse/HBASE-3782?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Esteban Gutierrez resolved HBASE-3782.
--------------------------------------
Resolution: Won't Fix
Should be fixed by the atomic bulk loading introduced in HBASE-4552.
> Multi-Family support for bulk upload tools causes File Not Found Exception
> --------------------------------------------------------------------------
>
> Key: HBASE-3782
> URL: https://issues.apache.org/jira/browse/HBASE-3782
> Project: HBase
> Issue Type: Bug
> Components: mapreduce
> Affects Versions: 0.90.3
> Reporter: Nichole Treadway
> Attachments: HBASE-3782.patch
>
>
> I've been testing HBASE-1861 in 0.90.2, which adds multi-family support for the
> bulk upload tools.
> I found that when running the importtsv program, some reduce tasks fail with a
> File Not Found exception if no keys in the input data fall into the region
> assigned to that reduce task. From what I can determine, the output directory
> is created in the write() method and is expected to exist in the
> writeMetaData() method. If there are no keys to be written for that reduce
> task, write() is never called and the output directory is never created, but
> writeMetaData() still expects the directory to exist, hence the
> FileNotFoundException:
> 2011-03-17 11:52:48,095 WARN org.apache.hadoop.mapred.TaskTracker: Error running child
> java.io.FileNotFoundException: File does not exist: hdfs://master:9000/awardsData/_temporary/_attempt_201103151859_0066_r_000000_0
>         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:468)
>         at org.apache.hadoop.hbase.regionserver.StoreFile.getUniqueFile(StoreFile.java:580)
>         at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat$1.writeMetaData(HFileOutputFormat.java:186)
>         at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat$1.close(HFileOutputFormat.java:247)
>         at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:567)
>         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:408)
>         at org.apache.hadoop.mapred.Child.main(Child.java:170)
> Simply checking whether the output directory exists should fix the issue.
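A minimal sketch of the guard described above. The class, method, and field names are taken from the stack trace; the helper class and its signature are assumptions for illustration, not the exact contents of HBASE-3782.patch or the 0.90 HFileOutputFormat internals.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    /** Hypothetical guard for the empty-reducer case described above. */
    class HFileOutputDirGuard {
      /**
       * Returns true only if the reducer's per-attempt output directory
       * exists, i.e. write() was called at least once and created it.
       * Callers such as writeMetaData()/close() can return early on false
       * instead of letting StoreFile.getUniqueFile() throw
       * FileNotFoundException on the missing directory.
       */
      static boolean outputDirExists(Configuration conf, Path outputdir)
          throws IOException {
        FileSystem fs = outputdir.getFileSystem(conf);
        return fs.exists(outputdir);
      }
    }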