Hi sir,
Could you please help me implement the scenario below using Spark Scala?
How do I convert a string date column to DateType so I can check that
agg_start_date is less than data_date? I want to take only the 2019-01-09
data for my aggregations.
data_date: 2019-01-10 (currently the processing date)
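A minimal sketch of one way to do this, assuming both columns arrive as "yyyy-MM-dd" strings (the sample rows below are made up for illustration). Casting with `.cast("date")` works across Spark versions, and a plain `<` comparison on DateType columns then gives the filter:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().appName("DateFilter").getOrCreate()
import spark.implicits._

// Hypothetical sample data matching the question's dates.
val df = Seq(
  ("2019-01-09", "2019-01-10"),
  ("2019-01-10", "2019-01-10")
).toDF("agg_start_date", "data_date")

// Cast the string columns to DateType, then keep only rows where
// agg_start_date is strictly before data_date (i.e. the 2019-01-09 row).
val filtered = df
  .withColumn("agg_start_date", col("agg_start_date").cast("date"))
  .withColumn("data_date", col("data_date").cast("date"))
  .filter(col("agg_start_date") < col("data_date"))

filtered.show()
```

After the cast the comparison is a real date comparison, not a lexicographic string one, so it stays correct even if the input format changes to something that doesn't sort textually.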
Hi,
I want to calculate the LAG difference between two data_dates (the current
date and the previous date) on the same column. Could you please help me
implement the expression below using a Scala Spark Dataset?
uniq_im - LAG(uniq_im, 1, 0) OVER (PARTITION BY name, sit, plc, country, state ORDER BY data_date) AS ca
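A sketch of the equivalent DataFrame code, using the partition and order columns from the SQL expression above (the sample rows are invented for illustration). `lag(col, 1, 0)` takes the previous row's value within the partition, defaulting to 0 for the first row, which matches the `LAG(uniq_im, 1, 0)` in the question:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.{col, lag}

val spark = SparkSession.builder().appName("LagDiff").getOrCreate()
import spark.implicits._

// Hypothetical sample data with the columns named in the question.
val df = Seq(
  ("a", "s1", "p1", "US", "CA", "2019-01-09", 10L),
  ("a", "s1", "p1", "US", "CA", "2019-01-10", 15L)
).toDF("name", "sit", "plc", "country", "state", "data_date", "uniq_im")

// One window per (name, sit, plc, country, state), ordered by data_date.
val w = Window
  .partitionBy("name", "sit", "plc", "country", "state")
  .orderBy("data_date")

// ca = current data_date's uniq_im minus the previous data_date's uniq_im.
val result = df.withColumn("ca", col("uniq_im") - lag(col("uniq_im"), 1, 0).over(w))

result.show()
```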
Could someone please help me fix the error below in Spark 2.1.0 with
Scala 2.11.8? Basically, I'm migrating code from Spark 1.6.0 to
Spark 2.1.0, and I'm getting the following exception in Spark 2.1.0:
Error: java.lang.ClassCastException: java.sql.Date cannot be cast to
java.lang.String at org.apa
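The full stack trace is cut off, but this exception typically appears during a 1.6-to-2.x migration when a column the old code treated as a String is now materialized as DateType (for example via CSV/JDBC schema inference), and row-level code still does `row.getString(...)` or `.asInstanceOf[String]` on it. Two common fixes, sketched here on an invented one-column DataFrame:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().appName("DateCastFix").getOrCreate()
import spark.implicits._

// Hypothetical data_date column that Spark now types as DateType.
val df = Seq("2019-01-09").toDF("data_date")
  .withColumn("data_date", col("data_date").cast("date"))

// Fix 1: cast the column back to a string in DataFrame code, so the
// downstream 1.6-era string logic keeps working unchanged.
val asString = df.withColumn("data_date", col("data_date").cast("string"))

// Fix 2: in row-level code, read the value with its actual runtime type
// instead of casting the Row field to String.
val firstDate = df.head().getAs[java.sql.Date]("data_date")
val formatted: String = firstDate.toString   // "yyyy-MM-dd" form
```

Which fix applies depends on where in the migrated code the cast happens, which the truncated trace doesn't show.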
1st Approach:
Error: value split is not a member of org.apache.spark.sql.Row

val newRdd = spark.read.text("/xyz/a/b/filename").rdd
anotherRDD = newRdd.
  map(ip => ip.split("\\|")).map(ip => Row(if (ip(0).isEmpty()) {
    null.asInstanceOf[Int] }
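The compiler error is accurate: `spark.read.text(...).rdd` is an `RDD[Row]` (one Row per line, with a single "value" column), so `split` must be called on the extracted String, not on the Row. A second problem lurks in the snippet: `null.asInstanceOf[Int]` yields 0, not null, because Int is a primitive; a boxed `Integer` is needed to represent null. A sketch of both fixes, keeping the path from the question:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.Row

val spark = SparkSession.builder().appName("SplitFix").getOrCreate()

// Each element is a Row; pull out the String before splitting.
val newRdd = spark.read.text("/xyz/a/b/filename").rdd

val anotherRdd = newRdd
  .map(row => row.getString(0).split("\\|", -1))  // -1 keeps trailing empty fields
  .map { parts =>
    // A boxed Integer can hold null; a primitive Int cannot.
    val first: Integer = if (parts(0).isEmpty) null else parts(0).toInt
    Row(first)
  }
```

If the goal is ultimately a DataFrame, `spark.read.option("delimiter", "|").csv(...)` with an explicit schema avoids the manual split and null handling entirely.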
Please help me with the error below, and suggest a different approach for
this data manipulation.
Error: Unable to find encoder for type stored in a Dataset. Primitive types
(Int, String, etc.) and Product types (case classes) are supported by
importing spark.implicits._ Support for serializing other t
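As the error message itself suggests, Datasets need an implicit Encoder for their element type. The usual fixes are to import `spark.implicits._` after creating the session and, for structured records, to model rows as a case class (a Product type). A minimal sketch with an invented case class:

```scala
import org.apache.spark.sql.SparkSession

// A case class gives Spark a Product encoder automatically.
// "Metric" and its fields are hypothetical names for illustration.
case class Metric(name: String, uniq_im: Long)

val spark = SparkSession.builder().appName("EncoderFix").getOrCreate()
import spark.implicits._   // brings the implicit Encoders into scope

val ds = spark.createDataset(Seq(Metric("a", 10L), Metric("b", 20L)))
ds.show()
```

For element types that are neither primitives nor case classes (e.g. arbitrary Java classes), an explicit encoder such as `org.apache.spark.sql.Encoders.kryo[MyClass]` can be passed instead, at the cost of losing columnar optimizations.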