Hi Shashank,

Scala case classes are treated as a special tuple type in Flink. If you want to make a POJO out of it, just remove the "case" keyword and make sure that the class is static (i.e., if it is nested, declare it in the companion object).
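For illustration, a minimal sketch of what that could look like for the class from the original mail, assuming the field names stay the same (the Cassandra @Table annotation is left out here so the snippet compiles without the driver on the classpath; in the real code it stays on the class):

```scala
import scala.beans.BeanProperty

// Plain (non-case) class: Flink's type extraction sees this as a POJO
// because it has a public no-arg constructor and public getters/setters
// generated by @BeanProperty.
@SerialVersionUID(507L)
class OrderFinal(
    @BeanProperty var order_name: String,
    @BeanProperty var user: String) extends Serializable {

  // Public no-argument constructor, required for POJO serialization.
  def this() = this("NA", "NA")
}
```

With the "case" keyword removed, CassandraSink should no longer route the stream through the case-class/product code path.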

I hope that helps.

Timo


On 12/19/17 11:04 AM, shashank agarwal wrote:
HI,

I have upgraded Flink from 1.3.2 to 1.4.0. I am using the Cassandra sink in my Scala application. Before sinking, I was converting my Scala DataStream to a Java stream and sinking into Cassandra. I have created a POJO class in Scala like this:

@SerialVersionUID(507L)
@Table(keyspace = "neofp", name = "order_detail")
case class OrderFinal(
                       @BeanProperty var order_name: String,
                       @BeanProperty var user: String) extends Serializable
{
  def this() {
    this("NA", "NA")
  }
}

and this was working fine with the sink. After upgrading to 1.4.0 it gives the error "Query must not be null or empty."

After digging into the CassandraSink code, I found that it treats the class as a case class and runs CassandraScalaProductSinkBuilder, which does a sanity check for query existence.

So how can I create a POJO class in Scala so that CassandraSink treats it with CassandraPojoSinkBuilder?

As a workaround, for now I have downgraded only the connector to 1.3.2.


Thanks
Shashank
