If you want to contribute to the project, open a JIRA ticket or a PR:
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark
On Sat, Dec 12, 2015 at 3:13 AM, kali.tumm...@gmail.com <
kali.tumm...@gmail.com> wrote:
Hi All,
https://github.com/apache/spark/blob/branch-1.5/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala#L48
In the present Spark version there is a bug at line 48: the check for whether a
table exists in a database uses the LIMIT keyword, which does not work for all
databases, e.g. SQL Server.
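For context, the existence check at that line is roughly the following (a paraphrased sketch, not the verbatim Spark source), together with a more portable variant of the probe; the method names here are illustrative:

```scala
import java.sql.Connection
import scala.util.Try

// Approximate shape of the branch-1.5 check: it reports the table as
// existing only if the LIMIT query parses and executes. SQL Server uses
// SELECT TOP 1 rather than LIMIT, so the probe always fails there and an
// existing table is reported as missing.
def tableExistsLimit(conn: Connection, table: String): Boolean =
  Try(conn.prepareStatement(s"SELECT 1 FROM $table LIMIT 1").executeQuery()).isSuccess

// A more portable probe: a query guaranteed to return zero rows, which
// parses on SQL Server, Oracle, MySQL, PostgreSQL, and others.
def tableExistsPortable(conn: Connection, table: String): Boolean =
  Try(conn.prepareStatement(s"SELECT * FROM $table WHERE 1=0").executeQuery()).isSuccess
```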
Not sure, but I think it is a bug as of 1.5.
Spark uses the LIMIT keyword to check whether a table exists:
https://github.com/apache/spark/blob/branch-1.5/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala#L48
If your database does not support the LIMIT keyword, such as SQL Server, this check fails.
Hi Spark Contributors,
I am trying to append data to a target table using the df.write.mode("append")
functionality, but Spark throws a "table already exists" exception.
Is there a fix scheduled in a later Spark release? I am using Spark 1.5.
val sourcedfmode=sourcedf.write.mode("append")
sourcedfmod
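The snippet above is cut off; a minimal sketch of how the append write is typically completed (the JDBC URL, table name, and credentials below are placeholders for illustration, not from the original message) would look like:

```scala
import java.util.Properties

// Placeholder connection details for illustration only.
val connectionProperties = new Properties()
connectionProperties.setProperty("user", "username")
connectionProperties.setProperty("password", "password")

val sourcedfmode = sourcedf.write.mode("append")
// On Spark 1.5 against SQL Server this can still throw "table already exists":
// the broken LIMIT-based existence check reports the table as missing, so
// Spark issues a CREATE TABLE even though the table is already there.
sourcedfmode.jdbc("jdbc:sqlserver://host:1433;databaseName=db", "target_table", connectionProperties)
```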