Sorry, wrong format:
file:///home/wxhsdp/spark/example/standalone/README.md
An extra / is needed at the start.
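The extra slash matters because of how `file:` URIs are parsed: with only two slashes, the first path component is read as the URI authority (host) and disappears from the path. A minimal sketch showing the difference with plain `java.net.URI` (no Spark needed):

```scala
import java.net.URI

// Two slashes: "home" is parsed as the authority (host),
// so the path loses its first component.
val wrong = new URI("file://home/wxhsdp/spark/example/standalone/README.md")
// wrong.getAuthority == "home"
// wrong.getPath == "/wxhsdp/spark/example/standalone/README.md"

// Three slashes: the authority is empty and the full path survives.
val right = new URI("file:///home/wxhsdp/spark/example/standalone/README.md")
// right.getPath == "/home/wxhsdp/spark/example/standalone/README.md"
```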
On Thu, Apr 24, 2014 at 1:46 PM, Adnan Yaqoob wrote:
> You need to use the proper URL format:
>
> file://home/wxhsdp/spark/example/standalone/README.md
>
>
> On Thu, Apr 24, 2014 at 1:29 PM, wxhsdp wrote:
You need to use the proper URL format:
file://home/wxhsdp/spark/example/standalone/README.md
On Thu, Apr 24, 2014 at 1:29 PM, wxhsdp wrote:
> I think maybe it's a problem with reading the local file
>
> val logFile = "/home/wxhsdp/spark/example/standalone/README.md"
> val logData = sc.textFile(logFile).c
to access the last element.
>
>
> On Thu, Apr 24, 2014 at 10:33 AM, Sai Prasanna wrote:
>
>> Oh ya, Thanks Adnan.
>>
>>
>> On Thu, Apr 24, 2014 at 10:30 AM, Adnan Yaqoob wrote:
>>
>>> You can use the following code:
>>>
>>> RDD.take(RDD.count())
You can use the following code:
RDD.take(RDD.count())
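One caveat with this answer: `RDD.count()` returns a `Long` while `take` expects an `Int`, so a `.toInt` cast is presumably intended, and `take(count)` pulls the entire RDD back to the driver, so it is only reasonable for small data. A sketch of the same idea on a plain Scala collection (local stand-in for the RDD, not the Spark API itself):

```scala
// On an RDD this would be (note the cast, and that the whole
// dataset is materialized on the driver):
//   val last = rdd.take(rdd.count().toInt).last

// The same take-everything-then-last idea on a local collection:
val data = Seq(1, 2, 3, 4, 5)
val last = data.take(data.size).last  // last == 5
```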
On Thu, Apr 24, 2014 at 9:51 AM, Sai Prasanna wrote:
> Hi all, some help!
> RDD.first or RDD.take(1) gives the first item; is there a straightforward
> way to access the last element in a similar way?
>
> I couldn't find a tail/last method for
When I was testing Spark, I faced this issue. It is not related to a memory
shortage; it is because your configuration is not correct. Try to pass your
current jar to the SparkContext with SparkConf's setJars function and try
again.
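For reference, a minimal sketch of what that configuration looks like; the master URL, app name, and jar path below are hypothetical placeholders, not values from this thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical standalone-cluster setup: setJars ships your application
// jar to the executors so your classes can be found at runtime.
val conf = new SparkConf()
  .setMaster("spark://master:7077")          // hypothetical master URL
  .setAppName("Example")                     // hypothetical app name
  .setJars(Seq("/path/to/your-app.jar"))     // hypothetical path to your built jar
val sc = new SparkContext(conf)
```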
On Thu, Apr 24, 2014 at 8:38 AM, wxhsdp wrote:
> by t