Take a look at this gist:
https://gist.github.com/bigaidream/40fe0f8267a80e7c9cf8
That worked for me.
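For reference, the setup in that gist boils down to something like the sketch below: point the process at your Spark installation before importing pyspark, so the import resolves both at runtime and for PyCharm's autocompletion. The `/path/to/spark` value is a placeholder, and the py4j glob is my assumption (the zip name varies by Spark version):

```python
import glob
import os
import sys

# Placeholder -- replace with your actual Spark installation directory.
SPARK_HOME = "/path/to/spark"

# Make Spark visible to this process before any pyspark import.
os.environ["SPARK_HOME"] = SPARK_HOME
sys.path.append(os.path.join(SPARK_HOME, "python"))

# Spark bundles py4j under python/lib; the exact zip name depends on the
# Spark version, so glob for it rather than hard-coding it.
for py4j_zip in glob.glob(os.path.join(SPARK_HOME, "python", "lib", "py4j-*.zip")):
    sys.path.append(py4j_zip)

# With the paths in place, these imports should resolve:
# from pyspark import SparkConf, SparkContext
# conf = SparkConf().setMaster("local[*]").setAppName("demo")
# sc = SparkContext(conf=conf)
```

Running with `setMaster("local[*]")` answers the cluster question below: Spark can run in local mode inside the same JVM, so no separate cluster is needed for development.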


On Wed, Aug 6, 2014 at 7:32 PM, Sathish Kumaran Vairavelu <
vsathishkuma...@gmail.com> wrote:

> Mohit, this doesn't seem to be working. Can you please provide more
> details? When I use "from pyspark import SparkContext", it is disabled in
> PyCharm. I use PyCharm Community Edition. Where should I set the
> environment variables: in the same Python script or a different one?
>
> Also, should I run a local Spark cluster so that the Spark program runs
> on top of it?
>
>
> Appreciate your help
>
> -Sathish
>
>
> On Wed, Aug 6, 2014 at 6:22 PM, Mohit Singh <mohit1...@gmail.com> wrote:
>
>> My naive setup: adding
>> import os
>> import sys
>> os.environ['SPARK_HOME'] = "/path/to/spark"
>> sys.path.append("/path/to/spark/python")
>> at the top of my script, before
>> from pyspark import SparkContext
>> from pyspark import SparkConf
>> Execution works from within PyCharm.
>>
>> Though my next step is to figure out autocompletion, and I bet there are
>> better ways to develop apps for Spark...
>>
>>
>>
>> On Wed, Aug 6, 2014 at 4:16 PM, Sathish Kumaran Vairavelu <
>> vsathishkuma...@gmail.com> wrote:
>>
>>> Hello,
>>>
>>> I am trying to use the Python IDE PyCharm for Spark application
>>> development. How can I use pyspark with a Python IDE? Can anyone help
>>> me with this?
>>>
>>>
>>> Thanks
>>>
>>> Sathish
>>>
>>>
>>>
>>
>>
>> --
>> Mohit
>>
>> "When you want success as badly as you want the air, then you will get
>> it. There is no other secret of success."
>> -Socrates
>>
>
>


-- 
Mohit

