[ https://issues.apache.org/jira/browse/SPARK-38661?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17512577#comment-17512577 ]

Apache Spark commented on SPARK-38661:
--------------------------------------

User 'martin-g' has created a pull request for this issue:
https://github.com/apache/spark/pull/35976

> [TESTS] Replace 'abc & Symbol("abc") symbols with $"abc" in tests
> -----------------------------------------------------------------
>
>                 Key: SPARK-38661
>                 URL: https://issues.apache.org/jira/browse/SPARK-38661
>             Project: Spark
>          Issue Type: Improvement
>          Components: Tests
>    Affects Versions: 3.2.1
>            Reporter: Martin Tzvetanov Grigorov
>            Assignee: Martin Tzvetanov Grigorov
>            Priority: Minor
>             Fix For: 3.3.0
>
>
> This ticket is a follow up of SPARK-38351.
>  
> When building with Scala 2.13 many test classes produce warnings like:
> {code:java}
> [warn] 
> /home/runner/work/spark/spark/sql/core/src/test/scala/org/apache/spark/sql/execution/BaseScriptTransformationSuite.scala:562:11:
>  [deprecation @  | origin= | version=2.13.0] symbol literal is deprecated; 
> use Symbol("d") instead
> [warn]           'd.cast("string"),
> [warn]           ^
> [warn] 
> /home/runner/work/spark/spark/sql/core/src/test/scala/org/apache/spark/sql/execution/BaseScriptTransformationSuite.scala:563:11:
>  [deprecation @  | origin= | version=2.13.0] symbol literal is deprecated; 
> use Symbol("e") instead
> [warn]           'e.cast("string")).collect())
>  {code}
> For easier migration to Scala 3.x later, it would be good to fix these warnings!
>  
> Also, as suggested by [https://github.com/HeartSaVioR], it would be good to use
> Spark's $"abc" syntax for columns.
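The migration the ticket describes can be sketched as follows. To keep the example self-contained (no Spark dependency), `Column` and the `ColumnInterpolator` implicit below are simplified stand-ins for Spark's `Column` and the `StringToColumn` interpolator that `import spark.implicits._` normally provides; they are illustrative assumptions, not Spark's actual implementation:

```scala
// Stand-in for org.apache.spark.sql.Column (assumption for illustration).
case class Column(name: String)

object SymbolMigration {
  // Simplified model of Spark's $"..." interpolator: a StringContext
  // extension whose $ method builds a Column from the literal column name.
  implicit class ColumnInterpolator(val sc: StringContext) extends AnyVal {
    def $(args: Any*): Column = Column(sc.s(args: _*))
  }

  def main(args: Array[String]): Unit = {
    // Deprecated in Scala 2.13 (symbol literal):  'abc
    // Verbose replacement:                        Symbol("abc")
    // Preferred Spark test style:                 $"abc"
    val col = $"abc"
    println(col.name) // prints "abc"
  }
}
```

With real Spark, the same `$"abc"` expression yields a `ColumnName`, so call sites like `'d.cast("string")` from the warning above become `$"d".cast("string")` with no behavior change.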



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
