pralabhkumar opened a new pull request #34275:
URL: https://github.com/apache/spark/pull/34275


   ### What changes were proposed in this pull request?
   
   Hide the JVM traceback for SparkUpgradeException raised through PySpark.
   With this PR, the following example (where `df` is a DataFrame with a string column `date_str`) produces:
   
   ```
   from pyspark.sql.functions import to_date, unix_timestamp, from_unixtime
   df2 = df.select('date_str', to_date(from_unixtime(unix_timestamp('date_str', 'yyyy-dd-aa'))))
   df2.show(1, False)

    raise converted from None

   pyspark.sql.utils.SparkUpgradeException: You may get a different result due to the upgrading of Spark 3.0: Fail to recognize 'yyyy-dd-aa' pattern in the DateTimeFormatter. 1) You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0. 2) You can form a valid datetime pattern with the guide from https://spark.apache.org/docs/latest/sql-ref-datetime-pattern.html
   ```
   ### Why are the changes needed?

   This change removes the JVM traceback from SparkUpgradeException in PySpark, so users see the error as a concise, Pythonic stack trace instead of a long Java traceback.
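
   For context, PySpark already converts certain JVM-side exceptions into Python exceptions and re-raises them with `from None`, which is why the output above shows `raise converted from None` without Java frames; this PR extends that handling to SparkUpgradeException. Below is a minimal, hedged sketch of the `raise ... from None` pattern only, not the actual implementation in `pyspark/sql/utils.py`; the class and helper names are illustrative assumptions.

```python
# Illustrative sketch only (names are assumptions, not Spark's real API):
# shows how re-raising with `from None` hides the original (JVM-side) cause
# from the Python traceback the user finally sees.


class SparkUpgradeException(Exception):
    """Python-side stand-in for org.apache.spark.SparkUpgradeException."""


def _convert(jvm_class: str, message: str):
    # Map a fully qualified JVM exception class name to a Python exception.
    if jvm_class == "org.apache.spark.SparkUpgradeException":
        return SparkUpgradeException(message)
    return None


def call_jvm(func, *args, **kwargs):
    try:
        return func(*args, **kwargs)
    except Exception as e:  # in real PySpark this would be py4j.protocol.Py4JJavaError
        converted = _convert(getattr(e, "jvm_class", ""), str(e))
        if converted is not None:
            # `from None` suppresses exception chaining, so the long JVM
            # traceback is not printed; only the Python exception remains.
            raise converted from None
        raise
```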
   
    ### Does this PR introduce any user-facing change?
   
   Yes. Users will now see only the Python stack trace when a SparkUpgradeException is raised.
   
    ### How was this patch tested?
   
   Unit tests.
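
   Presumably the tests assert that, after conversion, the Python exception no longer chains back to the JVM error. A self-contained sketch of that kind of check (the helper below is illustrative, not an actual Spark test):

```python
import unittest


class UpgradeError(Exception):
    """Stand-in for pyspark.sql.utils.SparkUpgradeException."""


def convert_and_reraise():
    try:
        raise RuntimeError("pretend this is the JVM-side error")
    except RuntimeError:
        # `from None` is what hides the original (JVM) traceback.
        raise UpgradeError("converted message") from None


class SuppressedTracebackTest(unittest.TestCase):
    def test_cause_is_suppressed(self):
        with self.assertRaises(UpgradeError) as ctx:
            convert_and_reraise()
        # With `from None`, the original exception is not attached as __cause__
        # and implicit context printing is suppressed.
        self.assertIsNone(ctx.exception.__cause__)
        self.assertTrue(ctx.exception.__suppress_context__)


if __name__ == "__main__":
    unittest.main()
```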

