[ 
https://issues.apache.org/jira/browse/SPARK-5191?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen updated SPARK-5191:
-----------------------------
    Component/s:     (was: PySpark)
                 Scheduler

> Pyspark: scheduler hangs when importing a standalone pyspark app
> ----------------------------------------------------------------
>
>                 Key: SPARK-5191
>                 URL: https://issues.apache.org/jira/browse/SPARK-5191
>             Project: Spark
>          Issue Type: Bug
>          Components: Scheduler
>    Affects Versions: 1.0.2, 1.1.1, 1.2.1, 1.3.0
>            Reporter: Daniel Liu
>
> In a.py:
> {code}
> from pyspark import SparkContext
> sc = SparkContext("local", "test spark")
> rdd = sc.parallelize(range(1, 10))
> print(rdd.count())
> {code}
> In b.py:
> {code}
> from a import *
> {code}
> {{python a.py}} runs fine.
> {{python b.py}} hangs after the log line {{TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool}}.
> {{./bin/spark-submit --py-files a.py b.py}} has the same problem.
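Note that in the report above, a.py creates its SparkContext and runs a job at module top level, so {{from a import *}} in b.py re-executes all of that at import time. A minimal sketch (no Spark required; the module source here is hypothetical, standing in for a.py) of why top-level code runs on import, and of the usual {{if __name__ == "__main__"}} guard that keeps driver code out of the import path:

```python
# Demonstration of import-time execution: Python runs every top-level
# statement of a module when it is imported, which is why importing
# a.py (which builds a SparkContext at top level) kicks off Spark work.
import textwrap
import types

# Hypothetical restructured a.py: side effects recorded in a list
# instead of creating a SparkContext, so this runs anywhere.
source = textwrap.dedent("""
    ran_at_import = []
    ran_at_import.append("top-level code runs on import")

    def main():
        # In the real a.py this is where SparkContext("local", ...)
        # and the job would go.
        return "only runs when called explicitly"

    if __name__ == "__main__":
        main()  # skipped on import: __name__ is "a" here, not "__main__"
""")

# Simulate `import a` by executing the source in a fresh module namespace.
mod = types.ModuleType("a")
exec(compile(source, "a.py", "exec"), mod.__dict__)

# The top-level side effect happened, but the guarded main() did not run.
print(mod.ran_at_import)
```

With the SparkContext creation moved under such a guard, b.py's import would no longer start a second driver, which avoids the hang described above; this is a common workaround, not a fix for the scheduler behavior itself.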



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
