Hi,
I am new to Spark and am planning to write a machine learning application
with Spark MLlib. My dataset is in JSON format. Is it possible to load the
data into Spark without using any external JSON libraries? I have explored
the option of Spark SQL, but I believe that is only for interactive use or
lo
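As far as I know, Spark SQL is not limited to interactive use: since Spark 1.1, SQLContext.jsonFile can load a file of one-JSON-record-per-line data with no external library. Failing that, you can read the file with sc.textFile and parse each line in a map. Below is a minimal sketch of that per-line parsing step in plain Python, using only the standard json module; the field names ("label", "features") are made up for illustration, and in Spark the same parse function would be applied via sc.textFile(path).map(parse).

```python
import json

# Each line is assumed to hold one complete JSON record -- the same
# layout Spark's jsonFile/jsonRDD expects.
lines = [
    '{"label": 1.0, "features": [0.5, 1.2]}',
    '{"label": 0.0, "features": [2.0, 0.1]}',
]

def parse(line):
    # json.loads turns one line of text into a Python dict.
    record = json.loads(line)
    return (record["label"], record["features"])

# Stand-in for rdd = sc.textFile(path).map(parse) on a real cluster.
parsed = [parse(line) for line in lines]
print(parsed)
```

The resulting (label, features) pairs could then be mapped into MLlib's LabeledPoint before training.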
Now, I can't figure out why it runs successfully in this case even though
it could not find the SparkContext. I am sure there must be a good reason
behind this behavior. Does anyone have any idea?
Thanks,
Pankaj Channe
On Saturday, November 22, 2014, pankaj channe wrote:
>
>> Best Regards
>>
>> On Sat, Nov 22, 2014 at 8:39 AM, pankaj channe wrote:
>>
I have seen similar posts on this issue but could not find a solution.
Apologies if this has been discussed here before.
I am running a Spark Streaming job with YARN on a 5-node cluster. I am
using the following command to submit my streaming job:
spark-submit --class class_name --master yarn-cluster -
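For reference, a full yarn-cluster submission for a streaming job typically looks roughly like the sketch below. The resource numbers and the jar path are placeholders I have filled in for illustration, not values from the original (truncated) command; only --class, --master, and the standard YARN resource flags are real spark-submit options.

```shell
spark-submit \
  --class class_name \
  --master yarn-cluster \
  --num-executors 4 \
  --executor-memory 2g \
  --executor-cores 2 \
  path/to/application.jar
```

In yarn-cluster mode the driver runs inside the YARN application master on the cluster, not in the client process that ran spark-submit, which is worth keeping in mind when reading client-side logs.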