[ https://issues.apache.org/jira/browse/HADOOP-3719?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12613398#action_12613398 ]

Ari Rabkin commented on HADOOP-3719:
------------------------------------

Pete --
Yes, the sink file writers are pluggable.  In fact, our current writer uses the
Hadoop FileSystem class, so I believe that if you pass it a local path that points
at an NFS mount, it'll "just work".  We haven't tested that, though.
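
To illustrate the point above: a minimal sketch (not Chukwa's actual writer code) of writing through the Hadoop FileSystem API, where a "file://" path resolves to the local filesystem and so covers an NFS mount the same way an "hdfs://" path covers HDFS.  The paths and class name below are hypothetical, for illustration only.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SinkWriterSketch {
  public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();

    // A "file://" URI resolves to the local filesystem, so an NFS mount
    // visible at /mnt/nfs is written through the same API as an hdfs:// path.
    Path sinkFile = new Path("file:///mnt/nfs/chukwa/logs/sink.chukwa");
    FileSystem fs = sinkFile.getFileSystem(conf);

    FSDataOutputStream out = fs.create(sinkFile);
    try {
      out.writeBytes("sample chunk payload\n");
    } finally {
      out.close();
    }
  }
}

Swapping the scheme (or the writer implementation, since the writers are pluggable) changes where the data lands without changing the write path itself.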

> Chukwa
> ------
>
>                 Key: HADOOP-3719
>                 URL: https://issues.apache.org/jira/browse/HADOOP-3719
>             Project: Hadoop Core
>          Issue Type: Improvement
>            Reporter: Ari Rabkin
>         Attachments: chukwa_08.pdf
>
>
> We'd like to contribute Chukwa, a data collection and analysis framework 
> being developed at Yahoo!.  Chukwa is a natural complement to Hadoop, since 
> it is built on top of HDFS and Map-Reduce, and since Hadoop clusters are a 
> key use case.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
