I prefer using CentOS/SLES/Ubuntu personally.

Thanks
Deepak

On Mon, Apr 18, 2016 at 5:57 PM, My List <mylistt...@gmail.com> wrote:

> Deepak,
>
> I love the Unix flavors and have programmed on them. I just have a
> Windows laptop and PC, hence haven't moved to a Unix flavor. I was trying
> to run big data stuff on Windows, and have run into so many issues that I
> could just throw the Windows laptop out.
>
> Your view - Red Hat, Ubuntu, or CentOS?
> Does Red Hat give a one-year licence on purchase?
>
> Thanks
>
> On Mon, Apr 18, 2016 at 5:52 PM, Deepak Sharma <deepakmc...@gmail.com>
> wrote:
>
>> It works well with all flavors of Linux.
>> It all depends on your experience with these flavors.
>>
>> Thanks
>> Deepak
>>
>> On Mon, Apr 18, 2016 at 5:51 PM, My List <mylistt...@gmail.com> wrote:
>>
>>> Deepak,
>>>
>>> Would you advise that I use Ubuntu, or Red Hat? Windows support issues
>>> are galore on Spark.
>>> Since I am starting afresh, what would you advise?
>>>
>>> On Mon, Apr 18, 2016 at 5:45 PM, Deepak Sharma <deepakmc...@gmail.com>
>>> wrote:
>>>
>>>> "Binary for Spark" means it's Spark built against Hadoop 2.6.
>>>> It will not have any Hadoop executables.
>>>> You'll have to set up Hadoop separately.
>>>> I have not used the Windows version yet, but there are some.
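The distinction above can be sketched as a quick shell check: the Spark tarball name encodes which Hadoop client libraries it was built against, while the HDFS/YARN daemons come from a separate Hadoop download. The URLs below assume the standard Apache archive layout; versions match this thread (Spark 1.4.1, Hadoop 2.6).

```shell
# Sketch: the Spark package is built *against* Hadoop 2.6 client libs,
# but does not ship the Hadoop daemons themselves.
SPARK_VERSION=1.4.1
HADOOP_VERSION=2.6.0
SPARK_PKG="spark-${SPARK_VERSION}-bin-hadoop2.6"
HADOOP_PKG="hadoop-${HADOOP_VERSION}"

# Spark tarball (jars only, no NameNode/DataNode executables):
echo "Spark:  https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${SPARK_PKG}.tgz"
# Hadoop itself (HDFS, YARN) is a separate download and setup:
echo "Hadoop: https://archive.apache.org/dist/hadoop/common/${HADOOP_PKG}/${HADOOP_PKG}.tar.gz"
```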
>>>>
>>>> Thanks
>>>> Deepak
>>>>
>>>> On Mon, Apr 18, 2016 at 5:43 PM, My List <mylistt...@gmail.com> wrote:
>>>>
>>>>> Deepak,
>>>>>
>>>>> The following could be very dumb questions, so pardon me for the same.
>>>>> 1) When I download the binary for Spark with a version of
>>>>> Hadoop (Hadoop 2.6), does it not come in the zip or tar file?
>>>>> 2) If it does not come along, is there an Apache Hadoop for Windows?
>>>>> Is it in binary format, or will I have to build it?
>>>>> 3) Is there a basic tutorial for Hadoop on Windows for the basic needs
>>>>> of Spark?
>>>>>
>>>>> Thanks in Advance !
>>>>>
>>>>> On Mon, Apr 18, 2016 at 5:35 PM, Deepak Sharma <deepakmc...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> Once you download Hadoop and format the namenode, you can use
>>>>>> start-dfs.sh to start HDFS.
>>>>>> Then use 'jps' to see if the datanode/namenode services are up and
>>>>>> running.
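A minimal sketch of those steps, using the Hadoop 2.x commands. It is guarded so it is safe to paste on a machine where Hadoop is not (yet) on the PATH; it assumes HADOOP_HOME's bin and sbin directories are on PATH once Hadoop is installed.

```shell
# Guarded sketch: format the NameNode once, start HDFS, then verify
# with jps. Harmless if Hadoop is not installed.
if command -v hdfs >/dev/null 2>&1; then
  hdfs namenode -format -nonInteractive   # one-time NameNode initialisation
  start-dfs.sh                            # starts NameNode/DataNode daemons
  HDFS_CHECK="$(jps)"                     # should list NameNode and DataNode
else
  HDFS_CHECK="hadoop-not-on-PATH"
fi
echo "$HDFS_CHECK"
```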
>>>>>>
>>>>>> Thanks
>>>>>> Deepak
>>>>>>
>>>>>> On Mon, Apr 18, 2016 at 5:18 PM, My List <mylistt...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi ,
>>>>>>>
>>>>>>> I am a newbie on Spark. I wanted to know how to start and verify if
>>>>>>> HDFS has started on Spark standalone.
>>>>>>>
>>>>>>> Env -
>>>>>>> Windows 7 - 64 bit
>>>>>>> Spark 1.4.1 With Hadoop 2.6
>>>>>>>
>>>>>>> Using Scala Shell - spark-shell
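Once HDFS is up, one way to verify it end to end from the standalone Spark install is to put a small file into HDFS and read it back through spark-shell. This is a hedged sketch: the probe file name and path are made up for illustration, and it assumes `hdfs` and `spark-shell` are on PATH, so it is guarded for machines without the tools.

```shell
# Hypothetical end-to-end probe: write a file to HDFS, count its lines
# from the Scala shell (sc is predefined in spark-shell).
if command -v hdfs >/dev/null 2>&1 && command -v spark-shell >/dev/null 2>&1; then
  echo "hello hdfs" | hdfs dfs -put -f - /tmp/probe.txt   # '-' reads stdin
  # should print 1 (one line) among the shell output
  echo 'println(sc.textFile("/tmp/probe.txt").count())' | spark-shell
  PROBE_STATUS="checked"
else
  PROBE_STATUS="tools-missing"
fi
echo "$PROBE_STATUS"
```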
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Thanks,
>>>>>>> Harry
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Thanks
>>>>>> Deepak
>>>>>> www.bigdatabig.com
>>>>>> www.keosha.net
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Thanks,
>>>>> Harmeet
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>>
>>
>>
>>
>>
>
>
>
>



