Hi guys,

I need some help with a problem. In our use case, we need to continuously
insert values into a database. Our approach is to create the JDBC
connection object in the main method and then do the inserts inside the
DStream's foreachRDD operation. Is this approach reasonable?
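For context, here is a simplified sketch of what we are doing now (the host, database, table, and stream source are placeholders, not our real setup):

```scala
import java.sql.DriverManager
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object InsertStream {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(new SparkConf(), Seconds(1))

    // Connection is created once, on the driver.
    val conn = DriverManager.getConnection(
      "jdbc:mysql://db-host:3306/mydb", "user", "password")

    val lines = ssc.socketTextStream("localhost", 9999)
    lines.foreachRDD { rdd =>
      rdd.foreach { line =>
        // This closure captures `conn`, so Spark tries to serialize it
        // when shipping the task to the executors -- which is where the
        // NotSerializableException appears.
        val stmt = conn.prepareStatement(
          "INSERT INTO logs (line) VALUES (?)")
        stmt.setString(1, line)
        stmt.executeUpdate()
        stmt.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```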

Here is the problem: since we are using com.mysql.jdbc.java, which is
not serializable, we keep seeing a NotSerializableException. I think that
is because Spark Streaming is trying to serialize and then checkpoint the
whole class that contains the StreamingContext, not just the
StreamingContext object itself, right? Or is there some other reason the
serialization is triggered? Is there any workaround for this (other than
not using com.mysql.jdbc.java)?

Thank you.

Cheers,
Fang, Yan
yanfang...@gmail.com
+1 (206) 849-4108