Please post the question on the vendor's forum.
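
That said, the exception itself is a hint: "java.nio.file.FileSystemNotFoundException: Provider "hdfs" not installed" comes from the java.nio.file API, which has no FileSystemProvider registered for the hdfs:// scheme. HDFS paths are normally read through Spark's Hadoop integration (sc.textFile, sc.hadoopFile, ...) rather than java.nio.file.Paths. A minimal sketch, assuming a driver run directly from Eclipse; the object name, namenode host/port, and file path below are placeholders, not taken from your setup:

    import org.apache.spark.{SparkConf, SparkContext}

    object HdfsReadSketch {
      def main(args: Array[String]): Unit = {
        // Local master so the sketch can run inside the IDE.
        val conf = new SparkConf().setAppName("hdfs-read-sketch").setMaster("local[*]")
        val sc = new SparkContext(conf)

        // Go through Spark / Hadoop, not java.nio.file.Paths.get(...),
        // which throws FileSystemNotFoundException for the "hdfs" scheme.
        // "namenode:8020" and the path are placeholders.
        val lines = sc.textFile("hdfs://namenode:8020/user/sample/input.txt")
        println("line count: " + lines.count())

        sc.stop()
      }
    }

If the code already goes through sc.textFile and still fails, the hadoop-client jars matching your CDH version are probably missing from the Eclipse build path.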

> On Sep 25, 2015, at 7:13 AM, Angel Angel <areyouange...@gmail.com> wrote:
> 
> Hello,
> I am running a Spark application.
> 
> I have installed Cloudera Manager; it ships with Spark version 1.2.0.
> 
> Now I want to use Spark version 1.4.0, and it also works fine on its own.
> 
> But when I try to access HDFS from Spark 1.4.0 in Eclipse, I get the
> following error:
> 
> "Exception in thread "main" java.nio.file.FileSystemNotFoundException:
> Provider "hdfs" not installed"
> 
> 
> My Spark 1.4.0 spark-env.sh file is:
> 
> export HADOOP_CONF_DIR=/etc/hadoop/conf
> export SPARK_HOME=/root/spark-1.4.0
> 
> export DEFAULT_HADOOP_HOME=/opt/cloudera/parcels/CDH-5.3.5-1.cdh5.3.5.p0.4/lib/hadoop
> 
> I am still getting the error.
> 
> Please give me suggestions.
> 
> Thank you,
> Sagar Jadhav.
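
One more note on the spark-env.sh quoted above: that file is only sourced by the spark-submit / spark-shell launch scripts, so HADOOP_CONF_DIR has no effect when the driver is started directly from Eclipse. A possible workaround is to hand the client configuration to the SparkContext yourself; the /etc/hadoop/conf paths are the ones from your mail, while the fs.defaultFS host/port is a placeholder:

    import org.apache.hadoop.fs.Path

    // spark-env.sh is not read inside the IDE, so load the Hadoop
    // client configuration explicitly on the SparkContext.
    sc.hadoopConfiguration.addResource(new Path("/etc/hadoop/conf/core-site.xml"))
    sc.hadoopConfiguration.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"))
    // Or set the default filesystem by hand (placeholder host/port):
    // sc.hadoopConfiguration.set("fs.defaultFS", "hdfs://namenode:8020")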
