Hello there, I'm starting to use Pig for processing events and I'm having
one specific issue.
Currently, the writing process writes a line to the file and syncs it to readers
(org.apache.hadoop.fs.FSDataOutputStream.sync()).
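To give an idea, the writer does something like this (path, delimiter, and
record contents are just simplified placeholders):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class EventWriter {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            FSDataOutputStream out = fs.create(new Path("/data/events/events.log"));
            for (String line : new String[] {"1\tfoo", "2\tbar"}) {
                out.writeBytes(line + "\n"); // one event per line
                out.sync();                  // push the data to readers without closing
            }
            // the file may still be open for writing while other processes read it
        }
    }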

If I try to read the file from another process, it works fine, at least using
org.apache.hadoop.fs.FSDataInputStream.
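The standalone reader is essentially (same placeholder path):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class EventReader {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            FSDataInputStream in = fs.open(new Path("/data/events/events.log"));
            BufferedReader reader = new BufferedReader(new InputStreamReader(in));
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // the synced lines all show up here
            }
            reader.close();
        }
    }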

But it looks like Pig doesn't read any data. I tried PigStorage, CSVLoader,
and CSVExcelStorage, but no luck.
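The script is basically just a load and a dump, for example (same placeholder
path and a simplified delimiter):

    REGISTER piggybank.jar;  -- only needed for the CSVLoader / CSVExcelStorage attempts

    events = LOAD '/data/events/events.log' USING PigStorage('\t');
    -- also tried:
    -- events = LOAD '/data/events/events.log' USING org.apache.pig.piggybank.storage.CSVLoader();
    -- events = LOAD '/data/events/events.log' USING org.apache.pig.piggybank.storage.CSVExcelStorage();
    DUMP events;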

One weird thing is the following line in the Pig output:

    Successfully read 0 records (376 bytes) from: "...."

So it does seem to read 376 bytes, but the file spans more than one HDFS block
(64 MB).

I'm using Hadoop 1.0.3 and Pig 0.10.0.

Thanks!
Lucas
