Hello.

I am working to create a persistent table using the Spark SQL HiveContext. I
have a basic Windows event case class:

case class WindowsEvent(
                         targetEntity: String,
                         targetEntityType: String,
                         dateTimeUtc: DateTime,
                         eventId: String,
                         eventData: Map[String, String],
                         description: String,
                         eventRecordId: String,
                         level: String,
                         machineName: String,
                         sequenceNumber: String,
                         source: String,
                         sourceMachineName: String,
                         taskCategory: String,
                         user: String,
                         machineIp: String,
                         additionalData: Map[String, String]
                         )

The case class is written to a Hive table as follows:

    val hc = new HiveContext(sc)
    import hc.implicits._

    windowsEvents.foreachRDD { rdd =>
      val eventsDataFrame = rdd.toDF()
      eventsDataFrame.write.mode(SaveMode.Append).saveAsTable("eventsTable")
    }

I am seeing the following error:

Exception in thread "main" java.lang.UnsupportedOperationException: Schema for type org.joda.time.DateTime is not supported

Evidently Spark SQL cannot derive a schema for org.joda.time.DateTime. How
can the Joda DateTime field be converted so that the case class can be
written to a persistent Hive table? Has anyone else run into the same issue?
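
One workaround I am considering is mapping each event to a mirror case class
that uses java.sql.Timestamp (a type Spark SQL does support) before calling
toDF(). A rough sketch of what I mean, with the WindowsEventRow name purely
illustrative and most fields elided for brevity:

    import java.sql.Timestamp

    // Hypothetical mirror of WindowsEvent that swaps the Joda DateTime
    // for java.sql.Timestamp, which Spark SQL can derive a schema for.
    case class WindowsEventRow(
      targetEntity: String,
      dateTimeUtc: Timestamp,
      eventId: String
      // ... remaining WindowsEvent fields, unchanged ...
    )

    object WindowsEventRow {
      def fromEvent(e: WindowsEvent): WindowsEventRow =
        WindowsEventRow(
          e.targetEntity,
          new Timestamp(e.dateTimeUtc.getMillis), // Joda millis -> SQL timestamp
          e.eventId
          // ... copy the remaining fields ...
        )
    }

    windowsEvents.foreachRDD { rdd =>
      rdd.map(WindowsEventRow.fromEvent).toDF()
        .write.mode(SaveMode.Append).saveAsTable("eventsTable")
    }

Is something like this the expected approach, or is there a way to register
an implicit conversion so that toDF() handles the Joda type directly?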

Regards,

Bryan Jeffrey
