Github user dilipbiswal commented on the issue:
https://github.com/apache/spark/pull/22274
@felixcheung Yeah... it may be a newer change. Actually, I am new to R as
well. Here is the test I did -
```
00:15:22-dbiswal~/mygit/apache/spark/bin (SPARK-25308)$ ./sparkR
R version 3.5.1 (2018-07-02) -- "Feather Spray"
Copyright (C) 2018 The R Foundation for Statistical Computing
Platform: x86_64-apple-darwin15.6.0 (64-bit)
R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.
Natural language support but running in an English locale
R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.
Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.
Launching java with spark-submit command
/Users/dbiswal/mygit/apache/spark/bin/spark-submit "sparkr-shell"
/var/folders/z5/scf6bthx6cbcsxz3h91t3rpr0000gn/T//Rtmpy7rA5y/backend_port1381e4a3fb4ce
18/09/02 00:15:30 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.4.0-SNAPSHOT
/_/
SparkSession available as 'spark'.
> Sys.setenv(TZ = "UTC")
> library(testthat)
Attaching package: 'testthat'
The following objects are masked from 'package:SparkR':
describe, not
> l2 <- list(list(a = 1L, b = as.POSIXlt("2012-12-13 12:34:00", tz =
"UTC")),
+ list(a = 2L, b = as.POSIXlt("2014-12-15 01:24:34", tz =
"UTC")))
> df2 <- createDataFrame(l2)
> value_from_df <- collect(select(df2, from_utc_timestamp(df2$b, "JST")))[,
1]
> class(value_from_df)
[1] "POSIXct" "POSIXt"
>
```
So the value collected from the DataFrame is a POSIXct, which is what we
create in our deserializer. But the test compares it against a POSIXlt,
which carries the extra timezone attribute information, so the comparison
fails.
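To illustrate the class mismatch, here is a small standalone sketch (with hypothetical values, not the PR's actual test code) showing that a POSIXlt and a POSIXct can represent the same instant yet have different classes, and that converting to a common class before comparing avoids the problem:

```r
# POSIXlt is a list-like type with extra components (including timezone
# attributes), while POSIXct is a plain numeric timestamp.
lt <- as.POSIXlt("2012-12-13 12:34:00", tz = "UTC")
ct <- as.POSIXct(lt)

class(lt)  # "POSIXlt" "POSIXt"
class(ct)  # "POSIXct" "POSIXt"

# Both represent the same instant, but their classes differ, so a
# class-sensitive comparison of the raw objects can fail.
# Converting the expected POSIXlt value to POSIXct first puts both
# sides on the same class:
identical(class(ct), class(as.POSIXct(lt)))  # TRUE
```

So one way to make the test pass would be to wrap the expected values in `as.POSIXct()` before comparing against the collected column.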