Yes, it is possible to have both Spark and HDFS running on the same
cluster; we have a lot of clusters running that way without any issues. And
yes, it is possible to hook Spark up to a remote HDFS. You might notice some
lag if they are on different networks.
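
For the Spark Streaming checkpoint part, you just point the checkpoint
directory at a fully qualified hdfs:// URI so it does not fall back to the
local filesystem. A minimal sketch in Scala is below; the NameNode address
(namenode-host:8020) and the checkpoint path are placeholders you would
replace with your own cluster's values.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object RemoteHdfsCheckpointExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("RemoteHdfsCheckpointExample")

    // Fully qualified URI pointing at the remote HDFS NameNode
    // (placeholder host/port/path, adjust for your cluster).
    val checkpointDir = "hdfs://namenode-host:8020/user/spark/checkpoints"

    def createContext(): StreamingContext = {
      val ssc = new StreamingContext(conf, Seconds(10))
      ssc.checkpoint(checkpointDir)
      // ... define your DStream transformations here ...
      ssc
    }

    // Recover from an existing checkpoint on restart, or create a fresh context.
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}

As long as the machine running the driver and executors can reach the
NameNode and DataNodes over the network, the checkpoints will be written to
the remote HDFS.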

Thanks
Best Regards

On Fri, Nov 21, 2014 at 8:43 PM, EH <eas...@gmail.com> wrote:

> Unfortunately whether it is possible to have both Spark and HDFS running on
> the same machine is not under our control.  :(  Right now we have Spark and
> HDFS running in different machines.  In this case, is it still possible to
> hook up a remote HDFS with Spark so that we can use Spark Streaming
> checkpoints?  Thank you for your help.
>
> Best,
> EH
