[https://issues.apache.org/jira/browse/DATAFU-179?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17903909#comment-17903909]
Eyal Allweil commented on DATAFU-179:
-------------------------------------
I have a Jupyter notebook that runs and checks most of the Scala APIs, and they look good.
It looks like the guide I wrote for calling them from PySpark is outdated and no longer works. I need to update it based on my PySpark notebook - this time I didn't run them from a VM the way I did when I wrote the guide.
Is that what you meant? Going forward, we should probably set up something automatic (or at least more automatic) than this.
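For reference, here's a rough sketch of the kind of call the PySpark guide covers - reaching the Scala API through the Py4J gateway to datafu.spark.SparkDFUtils. The method name dedupWithOrder, the gateway path, and the DataFrame wrapping are assumptions on my part and may differ between DataFu/Spark versions, so treat this as an illustration rather than the updated guide:
{code:python}
# Hedged sketch: calling a DataFu-Spark Scala method from PySpark via the
# Py4J gateway. Assumes the datafu-spark jar is on the driver classpath
# (e.g. passed with --jars) and that dedupWithOrder exists with this shape
# on datafu.spark.SparkDFUtils in the version being tested.
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql.functions import col

spark = (SparkSession.builder
         .appName("datafu-pyspark-smoke-test")
         .getOrCreate())

df = spark.createDataFrame(
    [("a", 1), ("a", 2), ("b", 3)],
    ["key", "value"],
)

# Reach the Scala object through the JVM gateway; pass the underlying
# Java DataFrame/Column objects (._jdf / ._jc) and wrap the result back
# into a Python DataFrame (passing a SparkSession to the DataFrame
# constructor works on Spark 3.3+).
jvm_utils = spark._jvm.datafu.spark.SparkDFUtils
deduped_jdf = jvm_utils.dedupWithOrder(
    df._jdf,                  # Java DataFrame
    col("key")._jc,           # group column
    col("value").desc()._jc,  # order column
)
deduped = DataFrame(deduped_jdf, spark)
deduped.show()
{code}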
> Support Spark 3.3.x and 3.4.x
> -----------------------------
>
> Key: DATAFU-179
> URL: https://issues.apache.org/jira/browse/DATAFU-179
> Project: DataFu
> Issue Type: Improvement
> Affects Versions: 2.1.0
> Reporter: Eyal Allweil
> Assignee: Eyal Allweil
> Priority: Major
> Fix For: 2.1.0
>
> Time Spent: 10m
> Remaining Estimate: 0h
>
> Make sure that DataFu compiles with and tests successfully for Spark 3.3.x
> and 3.4.x