[ 
https://issues.apache.org/jira/browse/SPARK-29361?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16944987#comment-16944987
 ] 

Jungtaek Lim commented on SPARK-29361:
--------------------------------------

The plan for now is to overload the methods below, marking the overloads as 
"DeveloperApi" and adding a boolean parameter "isStreaming", like 
SQLContext.internalCreateDataFrame which is not a public API.

> SQLContext

{code}
def createDataFrame(rowRDD: RDD[Row], schema: StructType, isStreaming: Boolean): DataFrame
def createDataFrame(rowRDD: JavaRDD[Row], schema: StructType, isStreaming: Boolean): DataFrame
{code}

> SparkSession

{code}
def createDataFrame(rowRDD: RDD[Row], schema: StructType, isStreaming: Boolean): DataFrame
def createDataFrame(rowRDD: JavaRDD[Row], schema: StructType, isStreaming: Boolean): DataFrame
{code}

Since these ultimately call SparkSession.internalCreateDataFrame, which already 
has an isStreaming field, the change is just passing an additional parameter. 
Given that we don't allow default parameters in developer APIs (to keep interop 
with Java), 4 new methods should be introduced instead of modifying the 
existing 4 methods.
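The reason overloads are needed rather than a default parameter is that Scala default arguments are awkward to call from Java (the compiler emits synthetic `$default$` accessors rather than real overloads). A minimal self-contained sketch of the overload-delegation pattern described above, using hypothetical stand-in types rather than Spark's actual classes:

```scala
// Hypothetical stand-ins, for illustrating the delegation pattern only;
// not Spark's real DataFrame or SparkSession.
case class DataFrame(schema: String, isStreaming: Boolean)

object SessionSketch {
  // Existing public signature stays unchanged and keeps its behavior:
  // it delegates with isStreaming = false.
  def createDataFrame(rows: Seq[String], schema: String): DataFrame =
    internalCreateDataFrame(rows, schema, isStreaming = false)

  // New DeveloperApi-style overload: the flag is an explicit parameter,
  // not a default argument, so Java callers can invoke it directly.
  def createDataFrame(rows: Seq[String], schema: String,
      isStreaming: Boolean): DataFrame =
    internalCreateDataFrame(rows, schema, isStreaming)

  // Non-public method that both overloads funnel into, mirroring the role
  // of SparkSession.internalCreateDataFrame.
  private def internalCreateDataFrame(rows: Seq[String], schema: String,
      isStreaming: Boolean): DataFrame =
    DataFrame(schema, isStreaming)
}
```

Both overloads funnel into the one internal method, so adding streaming support is only a matter of threading the flag through.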

> Enable streaming source support on DSv1 
> ----------------------------------------
>
>                 Key: SPARK-29361
>                 URL: https://issues.apache.org/jira/browse/SPARK-29361
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Jungtaek Lim
>            Priority: Major
>
> DSv2 has diverged heavily between Spark 2.x and 3.x, and for some time the 
> Spark community has suggested not building on the old DSv2 and waiting for 
> the new DSv2. 
> The only consistent option between Spark 2.x and 3.x is DSv1, but support for 
> streaming sources is missing in DSv1. This issue tracks the effort to add 
> streaming source support to DSv1.



