It should be
val file = sc.textFile("hdfs:///localhost:9000/sigmoid/input.txt")
Note the 3 slashes: "///".
Thanks
Tri
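
For reference, the two URI forms behave differently (a minimal sketch; assumes fs.defaultFS in core-site.xml is set to hdfs://localhost:9000):

// Explicit authority: scheme, "//", host:port, then an absolute path
val a = sc.textFile("hdfs://localhost:9000/sigmoid/input.txt")
// Default filesystem: scheme and "///"; the NameNode comes from fs.defaultFS
val b = sc.textFile("hdfs:///sigmoid/input.txt")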
From: rapelly kartheek [mailto:kartheek.m...@gmail.com]
Sent: Friday, November 14, 2014 9:42 AM
To: Akhil Das; user@spark.apache.org
Subject: Re: Read a HDFS file from Spark using HDFS API
Can you not create a SparkContext inside the scheduler code? If you are just
looking to access HDFS, then you can use the following object; with it you
can create/read/write files.
val hdfs = org.apache.hadoop.fs.FileSystem.get(
  new URI("hdfs://localhost:9000"), hadoopConf)
Thanks
Best Regards
On Fri, Nov 14, 2014 at 9:42 AM, rapelly kartheek wrote:
No, I am not accessing HDFS from either the shell or a Spark application; I
want to access it from the Spark "scheduler code". I get an error when I use
sc.textFile(), because the SparkContext hasn't been created yet at that
point. So the error says: "sc not found".
On Fri, Nov 14, 2014 at 9:07 PM, Akhil Das wrote:
Like this?
val file = sc.textFile("hdfs://localhost:9000/sigmoid/input.txt")
Thanks
Best Regards
On Fri, Nov 14, 2014 at 9:02 PM, rapelly kartheek wrote:
> Hi,
> I am trying to read an HDFS file from Spark "scheduler code". I could find
> how to write HDFS reads/writes in Java.
>
> But I need t