I've done this:

1. `foreachPartition` over the RDD.
2. Open the connection.
3. `foreach` over the records inside the partition.
4. Close the connection.

Slightly crufty, but it works. Would love to see a better approach.
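A minimal sketch of the four steps above. To keep it self-contained, `Conn` is a hypothetical stand-in for a real `java.sql.Connection`; in an actual Spark job the `writePartition` body would sit inside `dstream.foreachRDD { rdd => rdd.foreachPartition(writePartition) }`, with the connection obtained from `DriverManager.getConnection`:

```scala
// Hypothetical stand-in for a JDBC connection, so the pattern runs without a database.
final class Conn {
  private var open = true
  def insert(record: String): Unit = require(open, "connection is closed")
  def close(): Unit = open = false
}

object PartitionWriter {
  // Tracks how many connections were opened, to show it is once per partition.
  var connectionsOpened = 0
  def openConnection(): Conn = { connectionsOpened += 1; new Conn }

  // Steps 2-4: one connection per partition, reused for every record in it.
  def writePartition(records: Iterator[String]): Unit = {
    val conn = openConnection()          // step 2: open the connection
    try records.foreach(conn.insert)     // step 3: foreach inside the partition
    finally conn.close()                 // step 4: close the connection
  }
}
```

The point of the pattern is that the connection is opened on the executor that owns the partition (connections are not serializable, so they cannot be created on the driver and shipped), and its cost is amortized over all records in the partition rather than paid per record.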

Regards,
Ashic.

Date: Fri, 5 Dec 2014 12:32:24 -0500
Subject: Spark Streaming Reusing JDBC Connections
From: asimja...@gmail.com
To: user@spark.apache.org

Is there a way I can keep a JDBC connection open throughout a streaming job? I
have a foreach that runs once per batch. However, I don't want to open a new
connection for each batch; I'd rather have a persistent connection that I can
reuse. How can I do this?

Thanks.
Asim
