Hi Ehsan

Thanks for your quick response. Is the BlobDeserializer the only way to
go? Thanks.

Cheers
aij


On Fri, Apr 11, 2014 at 4:27 PM, Ehsan ul Haq <[email protected]> wrote:

> Yes, it is possible to transfer binary files into Hadoop (HDFS/HBase etc.)
> using Flume.
> If your binary files are already stored in folders on a local file system,
> you can use the Spooling Directory source
> "https://flume.apache.org/FlumeUserGuide.html#spooling-directory-source".
> The default deserializer is a LINE-based text deserializer, which is not
> suitable for binary data. You can use the BlobDeserializer
> "https://flume.apache.org/FlumeUserGuide.html#blobdeserializer", which can
> handle binary data.
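>
> A minimal agent configuration along these lines (the agent name,
> directories and HDFS path below are only placeholders) could look like
> this:
>
>   agent1.sources = spool-src
>   agent1.channels = ch1
>   agent1.sinks = hdfs-sink
>
>   # Spooling Directory source that picks up completed files from a folder
>   agent1.sources.spool-src.type = spooldir
>   agent1.sources.spool-src.spoolDir = /var/spool/binary-files
>   agent1.sources.spool-src.channels = ch1
>   # BLOB deserializer: reads the whole file as a single event
>   agent1.sources.spool-src.deserializer = org.apache.flume.sink.solr.morphline.BlobDeserializer$Builder
>   agent1.sources.spool-src.deserializer.maxBlobLength = 100000000
>
>   agent1.channels.ch1.type = file
>
>   # HDFS sink writing the raw bytes as a data stream (no text framing)
>   agent1.sinks.hdfs-sink.type = hdfs
>   agent1.sinks.hdfs-sink.channel = ch1
>   agent1.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/flume/binary
>   agent1.sinks.hdfs-sink.hdfs.fileType = DataStream
>
> Keep in mind that the BLOB deserializer buffers each file in memory, so it
> is only practical for files that fit comfortably in the agent's heap.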
>
> Cheers
> Ehsan
>
>
> On Fri, Apr 11, 2014 at 9:58 AM, R W <[email protected]> wrote:
>
>> Hi All
>>
>> We have an app that generates a huge amount of binary files. Unlike text
>> files, they have no newline separators. We want to collect these binary
>> files and store them in a Hadoop cluster. I'm new to Flume, so could
>> anyone share some ideas on how to do this with Flume? Thanks in advance.
>>
>> Cheers
>> aij
>>
>
>
