[https://issues.apache.org/jira/browse/SPARK-43972?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17781917#comment-17781917]

Jamie commented on SPARK-43972:
-------------------------------

This issue appears to be fixed in pyspark 3.5.0.

 

Here's a [run of the same tests|https://github.com/jamiekt/jstark/actions/runs/6725570531/job/18280243390], 
which [ran on pyspark 
3.5.0|https://github.com/jamiekt/jstark/actions/runs/6725570531/job/18280243390#step:6:53].

> Tests never succeed on pyspark 3.4.0 (work OK on pyspark 3.3.2)
> ---------------------------------------------------------------
>
>                 Key: SPARK-43972
>                 URL: https://issues.apache.org/jira/browse/SPARK-43972
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.4.0
>         Environment: Sorry, not sure what I'm supposed to put in this section.
>            Reporter: Jamie
>            Priority: Major
>
> I have a project that uses pyspark. The tests have always run fine on pyspark 
> versions prior to pyspark 3.4.0 but now fail on that version (which was 
> released on 2023-04-13).
> My project is configured to use the latest available version of pyspark:
> {code:toml}
> dependencies = [
>   "pyspark",
>   "faker"
> ]
> {code}
> [https://github.com/jamiekt/jstark/blob/c1629cee4e4b8fb0b4471f6fc2941f1b0a99a4bf/pyproject.toml#L26-L29]
> The tests are run using GitHub Actions. An example of the failing tests is at 
> [https://github.com/jamiekt/jstark/actions/runs/4977164046]; you can see 
> there that the tests run on various combinations of OS and Python 
> version, and all of them are cancelled after running for over 5 hours.
> If I [pin the version of pyspark to 
> 3.3.2|https://github.com/jamiekt/jstark/commit/5fd7115d3719a7d6ef2547e8e35feb3ed76ee99f]
 then the tests all succeed in ~10 minutes; see 
> [https://github.com/jamiekt/jstark/actions/runs/5061332947] for such a 
> successful run.
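> For reference, the pin can be expressed directly in pyproject.toml; a minimal sketch (only the dependencies table is shown, the rest of the file is assumed unchanged):
> {code:toml}
> dependencies = [
>   "pyspark==3.3.2",
>   "faker"
> ]
> {code}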
> ----
> This can be reproduced by cloning the repository and running only one test. 
> The project uses hatch for managing environments and dependencies so you 
> would need that installed ({{pipx install hatch}} / {{brew install hatch}}). 
> I have reproduced the problem on python3.10.
> Reproduce the problem by running these commands:
> {code:bash}
> # force use of python3.10
> export HATCH_PYTHON=/path/to/python3.10
> git clone https://github.com/jamiekt/jstark.git
> cd jstark
> # the following command will create a virtualenv & install all dependencies,
> # including pyspark 3.4.0
> hatch run pytest -k test_basketweeks_by_product_and_customer
> {code}
> On my machine this never completes. I need to CTRL+C to crash out of it. I 
> consider this to be equivalent behaviour to the tests that fail in the GitHub 
> Actions pipeline after 6 hours.
> Now let's check out the branch which pins pyspark to 3.3.2 and run the same 
> thing (the hatch environment will be rebuilt with pyspark 3.3.2):
> {code:bash}
> git checkout try-pyspark3-3-2
> hatch run pytest -k test_basketweeks_by_product_and_customer
> {code}
> This time it succeeds in ~31 seconds:
> {code:bash}
> ➜  hatch run pytest -k test_basketweeks_by_product_and_customer
> ============================= test session starts =============================
> platform darwin -- Python 3.10.10, pytest-7.3.1, pluggy-1.0.0
> rootdir: /private/tmp/jstark
> plugins: Faker-18.9.0, cov-4.0.0
> collected 79 items / 78 deselected / 1 selected
> tests/test_grocery_retailer_feature_generator.py .                      [100%]
> ====================== 1 passed, 78 deselected in 31.30s ======================
> {code}
> That particular test constructs a very complex pyspark dataframe, which I 
> suspect might be contributing to the problem; however, the issue here is that 
> it works on pyspark 3.3.2 but not on pyspark 3.4.0.
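> As an aside, when comparing runs across pyspark versions it can help to fail fast if the environment resolved a version other than the one under test. A minimal, hypothetical sketch (this helper is not part of jstark; names are illustrative):
> {code:python}
> from importlib.metadata import PackageNotFoundError, version
>
>
> def version_tuple(raw: str) -> tuple[int, ...]:
>     """Parse a version string such as '3.3.2' into (3, 3, 2), ignoring non-numeric parts."""
>     return tuple(int(p) for p in raw.split(".") if p.isdigit())
>
>
> def assert_pyspark_is(expected: str) -> None:
>     """Abort the test session early if the installed pyspark is not the expected version."""
>     try:
>         installed = version("pyspark")
>     except PackageNotFoundError:
>         raise SystemExit("pyspark is not installed in this environment")
>     if version_tuple(installed) != version_tuple(expected):
>         raise SystemExit(f"expected pyspark {expected}, found {installed}")
> {code}
> Calling e.g. {{assert_pyspark_is("3.3.2")}} from a conftest.py makes a mismatched environment fail in seconds rather than hang for hours.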



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
