If you are talking about a standalone program, have a look at this doc:
<https://spark.apache.org/docs/0.9.1/python-programming-guide.html#standalone-programs>

from pyspark import SparkConf, SparkContext
from pyspark.sql import HiveContext

# Build the configuration and create the SparkContext yourself --
# unlike the pyspark shell, a standalone program does not get `sc` for free.
conf = (SparkConf()
         .setMaster("local")
         .setAppName("My app")
         .set("spark.executor.memory", "1g"))
sc = SparkContext(conf=conf)

# Wrap the SparkContext in a HiveContext to run HiveQL queries.
sqlContext = HiveContext(sc)
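
Once sc and sqlContext are created, you can query Hive from the standalone script the same way you would from the shell. A minimal sketch (the table name my_table is just a placeholder, and it assumes your Spark build includes Hive support):

# Run a HiveQL query and pull the results back to the driver.
rows = sqlContext.sql("SELECT * FROM my_table LIMIT 10").collect()
for row in rows:
    print(row)

# Shut down the context when the job is done.
sc.stop()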


Thanks
Best Regards

On Sat, Nov 8, 2014 at 4:35 AM, Pagliari, Roberto <rpagli...@appcomsci.com>
wrote:

> I’m running the latest version of spark with Hadoop 1.x and scala 2.9.3
> and hive 0.9.0.
>
> When using python 2.7
>
> from pyspark.sql import HiveContext
>
> sqlContext = HiveContext(sc)
>
> I’m getting ‘sc not defined’
>
> On the other hand, I can see ‘sc’ from pyspark CLI.
>
> Is there a way to fix it?
>
