Do you still need help on the PR?
btw, does this apply to YARN client mode?

------
From: andrewweiner2...@u.northwestern.edu
Date: Sun, 17 Jan 2016 17:00:39 -0600
Subject: Re: SparkContext SyntaxError: invalid syntax
To: cutl...@gmail.com
CC: user@spark.apache.org

Yeah, I do think it would be worth explicitly stating this in the docs. I
was going to try to edit the docs myself
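[For readers landing on this thread later: the fix being discussed amounts to pointing every YARN container at the same, new-enough interpreter as the driver. A minimal sketch, assuming a Python 2.7 binary at /usr/bin/python2.7 -- the path is illustrative, not from this thread:]

```shell
# conf/spark-env.sh -- interpreter used by executors (and the driver in
# YARN client mode):
export PYSPARK_PYTHON=/usr/bin/python2.7

# For YARN cluster mode the driver runs inside the Application Master, so
# the equivalent setting goes in conf/spark-defaults.conf:
#   spark.yarn.appMasterEnv.PYSPARK_PYTHON  /usr/bin/python2.7
```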
>>>>>>>>> ... mentioned in the prior email:
>>>>>>>>> Error from python worker:
>>>>>>>>> python: module pyspark.daemon not found
>>>>>>>>>
>>>>>>
>>>>>> On Fri, Jan 8, 2016 at 2:31 PM, Bryan Cutler <cutl...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Hi Andrew,
>>>>>>>
l/lib/python2.7/site-packages:/home/jpr123/mobile-cdn-analysis:/home//lib/python2.7/site-packages:/scratch4/hadoop/yarn/local/usercache//appcache/application_1450370639491_0136/container_1450370639491_0136_01_02/pyspark.zip:/scratch4/hadoop/yarn/local/usercache//appcache/application_14
l job to just run on one node.
>>>>>>>>
>>>>>>>> On Fri, Jan 8, 2016 at 5:22 PM, Andrew Weiner <
>>>>>>>> andrewweiner2...@u.northwestern.edu> wrote:
>>>>>>>>
>>>>>>>>> Now for simplicity I'm testing with wordcount.py from the provided
>>>>>
at []
>>>>>>>>
>>>>>>>> A bit lower down, I see this error:
>>>>>>>>
>>>>>>>> 16/01/08 19:14:48 WARN scheduler.TaskSetManager: Lost task 0.0 in
>>>>>>>> stage 0.0 (TI
>>>>>>>>> ... you are using? If you are sure
>>>>>>>>> everything
>>>>>>>>> is installed correctly on each node following the guide on "Running
>>>>>>>>> Spark
>>>>>>>>> on YARN" ...
>>>>>> /Traceback (most recent call last):
>>>>>>   File "loss_rate_by_probe.py", line 15, in ?
>>>>>>     from pyspark import SparkContext
>>>>>>   File
>>>>>> "/scratch5/hadoop/yarn/local/usercache//filecache/18/spark-assembly-1.3.1-hadoop2.4.0.jar/pyspark/context.py",
>>>>>> line 219
>>>>>>     with SparkContext._lock:
>>>>>>                     ^
>>>>>> SyntaxError: invalid syntax/
>>>>>
>>>>> This is very similar to this issue:
>>>>> .../SparkContext-lock-Error-td18233.html
>>>> , but unlike that person I am using Python 2.7.8.
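[A note for later readers: the /with SparkContext._lock:/ line in the traceback above only parses on Python 2.6+ (or 2.5 with a __future__ import), so a SyntaxError there strongly suggests the worker launched an older system Python than the 2.7.8 on the driver. A quick probe, offered as a sketch -- run it with the exact binary the workers resolve to:]

```python
# Probe whether the current interpreter can parse a bare `with` statement,
# the construct at pyspark/context.py line 219 in the traceback above.
import sys

src = "with lock:\n    pass\n"
try:
    compile(src, "<probe>", "exec")
    print("ok: `with` parses on", sys.version.split()[0])
except SyntaxError:
    print("too old: this interpreter cannot parse `with` (needs >= 2.6)")
```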
>>>>
>>>> Here is what I'm using:
>>>> Spark 1.3.1
>>>> Hadoop 2.4.0.2.1.5.0-695
>>>> Python 2.7.8
>>>>
p/pyspark/__init__.py",
> line 61
> indent = ' ' * (min(len(m) for m in indents) if indents else 0)
> ^
> SyntaxError: invalid syntax/
>
> Any thoughts?
>
> Andrew
>
>> File "loss_rate_by_probe.py", line 15, in ?
>> from pyspark import SparkContext
>> File
>>
>> "/scratch5/hadoop/yarn/local/usercache//appcache/application_1450370639491_0119/container_1450370639491_0119_01_01/pyspark.zip/pyspark/__init__.py",
>> line 61
>> indent = ' ' * (min(len(m) for m in indents) if indents else 0)
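[The line 61 expression uses a conditional expression (/X if C else Y/), which only parses on Python 2.5 and later, so the same version-mismatch explanation fits this traceback too. Evaluating it standalone on a modern interpreter, with made-up sample values for illustration:]

```python
# The expression from pyspark/__init__.py line 61, run with sample data.
indents = ["  ", "    "]  # hypothetical captured indentation strings
indent = ' ' * (min(len(m) for m in indents) if indents else 0)
print(repr(indent))   # shortest indent wins: '  '

indents = []
indent = ' ' * (min(len(m) for m in indents) if indents else 0)
print(repr(indent))   # empty list falls back to 0: ''
```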
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-SyntaxError-invalid-syntax-tp25910.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org