I am afraid not, because YARN needs a DFS.
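
In yarn-cluster mode Spark uploads the application jar and configs to a
.sparkStaging directory under the cluster's default filesystem
(fs.defaultFS), which is HDFS in a typical setup, so stopping the
NameNode/DataNode breaks job submission. A minimal Scala sketch (assuming
HADOOP_CONF_DIR is on the classpath along with hadoop-common; the object
name is just for illustration) of checking which filesystem that staging
path resolves against:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    object StagingFsCheck {
      def main(args: Array[String]): Unit = {
        val conf = new Configuration()   // picks up core-site.xml / hdfs-site.xml
        val fs   = FileSystem.get(conf)  // resolves fs.defaultFS (hdfs:// by default)
        // Spark's yarn-cluster mode stages files under the user's home dir here:
        val staging = new Path(fs.getHomeDirectory, ".sparkStaging")
        println(s"defaultFS = ${fs.getUri}")
        println(s"staging dir would be = $staging")
      }
    }

With fs.defaultFS pointing at hdfs://, resolving this path needs a live
NameNode, which is why running stop-dfs.sh leads to the submission error.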

Huizhe Wang <wang.h...@husky.neu.edu> wrote on Mon, May 20, 2019 at 9:50 AM:

> Hi,
>
> I want to use Spark on YARN without HDFS. I store my resources in AWS and
> use s3a to fetch them. However, when I used stop-dfs.sh it stopped the
> NameNode and DataNode, and I got an error when using yarn cluster mode. Can
> I use YARN without starting DFS, and if so, how do I use this mode?
>
> Yours,
> Jane
>


-- 
Best Regards

Jeff Zhang
