[ https://issues.apache.org/jira/browse/SPARK-33568?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17239399#comment-17239399 ]

Shane Knapp commented on SPARK-33568:
-------------------------------------

This is now installed on the Ubuntu 16 workers.
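
For reference, a minimal sketch of how the install could be verified from a worker (hypothetical commands; assumes pypy3 and its bundled pip module are on the PATH, which may differ from the actual Jenkins provisioning):

    # Install step (run on the worker, hypothetical): pypy3 -m pip install coverage
    #
    # Verify that coverage is importable from pypy3, mirroring the condition
    # behind the error quoted below.
    import subprocess

    result = subprocess.run(
        ["pypy3", "-c", "import coverage; print(coverage.__version__)"],
        capture_output=True, text=True)
    if result.returncode == 0:
        print("coverage %s is available under pypy3" % result.stdout.strip())
    else:
        print("Coverage is not installed in Python executable 'pypy3'")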

> install coverage for pypy3
> --------------------------
>
>                 Key: SPARK-33568
>                 URL: https://issues.apache.org/jira/browse/SPARK-33568
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, PySpark
>    Affects Versions: 3.0.0
>            Reporter: Shane Knapp
>            Assignee: Shane Knapp
>            Priority: Major
>
> from:
> https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-branch-3.0-test-sbt-hadoop-2.7-hive-1.2/1002/console
>  
> Coverage is not installed in Python executable 'pypy3' but 'COVERAGE_PROCESS_START' environment variable is set, exiting.


