Please take a look at SPARK-9078, which allows JDBC dialects to override the 
query used for checking table existence. 
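
For example, a custom dialect can supply its own existence-check query along 
these lines (a minimal sketch against the Spark 1.6 JdbcDialect API; the 
object name and the jdbc:sqlserver URL prefix check are just illustrative):

import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

// Illustrative dialect that swaps the LIMIT-based existence check
// for a WHERE 1=2 query, which SQL Server also accepts.
object SqlServerExistsDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:sqlserver")

  override def getTableExistsQuery(table: String): String =
    s"SELECT 1 FROM $table WHERE 1=2"
}

// Register the dialect so it is picked up for matching JDBC URLs.
JdbcDialects.registerDialect(SqlServerExistsDialect)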

> On Dec 12, 2015, at 7:12 PM, sri hari kali charan Tummala 
> <kali.tumm...@gmail.com> wrote:
> 
> Hi Michael, Ted, 
> 
> https://github.com/apache/spark/blob/branch-1.5/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala#L48
> 
> In the present Spark version, line 48 has a bug: checking whether a table 
> exists in a database using LIMIT doesn't work for all databases (SQL Server, 
> for example).
> 
> The best way to check whether a table exists in any database is to use 
> select * from table where 1=2; or select 1 from table where 1=2; these 
> queries are supported by all databases.
> 
> Can this change be implemented in Spark 1.6? It would make the 
> write.mode("append") bug go away (see the usage sketch after the proposed 
> fix below).
> 
> 
> 
> def tableExists(conn: Connection, table: String): Boolean = {
>   // Somewhat hacky, but there isn't a good way to identify whether a table
>   // exists for all SQL database systems, considering "table" could also
>   // include the database name.
>   Try(conn.prepareStatement(s"SELECT 1 FROM $table LIMIT 1").executeQuery().next()).isSuccess
> }
> 
> Solution:
> def tableExists(conn: Connection, table: String): Boolean = {
>   // Somewhat hacky, but there isn't a good way to identify whether a table
>   // exists for all SQL database systems, considering "table" could also
>   // include the database name.
>   Try(conn.prepareStatement(s"SELECT 1 FROM $table WHERE 1=2").executeQuery().next()).isSuccess
> }
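> 
> For reference, a minimal sketch of the append write that goes through this 
> check (the DataFrame df, the connection URL, the table name, and the 
> credentials below are all placeholders):
> 
> import java.util.Properties
> 
> // Hypothetical connection details; df is any existing DataFrame.
> val props = new Properties()
> props.setProperty("user", "username")
> props.setProperty("password", "password")
> 
> // The append path checks table existence before deciding whether to
> // create the table or insert into the existing one.
> df.write
>   .mode("append")
>   .jdbc("jdbc:sqlserver://host:1433;databaseName=mydb", "dbo.my_table", props)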
> 
> 
> 
> Thanks
> Sri 
> 
> 
> 
>> On Wed, Dec 9, 2015 at 10:30 PM, Michael Armbrust <mich...@databricks.com> 
>> wrote:
>> The release date is "as soon as possible".  In order to make an Apache 
>> release we must present a release candidate and have 72 hours of voting by 
>> the PMC.  As soon as there are no known bugs, the vote will pass and 1.6 
>> will be released.
>> 
>> In the meantime, I'd love support from the community testing the most 
>> recent release candidate.
>> 
>>> On Wed, Dec 9, 2015 at 2:19 PM, Sri <kali.tumm...@gmail.com> wrote:
>>> Hi Ted,
>>> 
>>> Thanks for the info, but there is no particular release date; from my 
>>> understanding, the package is in testing and no release date has been mentioned.
>>> 
>>> Thanks
>>> Sri
>>> 
>>> 
>>> 
>>> Sent from my iPhone
>>> 
>>> > On 9 Dec 2015, at 21:38, Ted Yu <yuzhih...@gmail.com> wrote:
>>> >
>>> > See this thread:
>>> >
>>> > http://search-hadoop.com/m/q3RTtBMZpK7lEFB1/Spark+1.6.0+RC&subj=Re+VOTE+Release+Apache+Spark+1+6+0+RC1+
>>> >
>>> >> On Dec 9, 2015, at 1:20 PM, "kali.tumm...@gmail.com" 
>>> >> <kali.tumm...@gmail.com> wrote:
>>> >>
>>> >> Hi All,
>>> >>
>>> >> Does anyone know the exact release date for Spark 1.6?
>>> >>
>>> >> Thanks
>>> >> Sri
>>> >>
>>> >>
>>> >>
>>> >> --
>>> >> View this message in context: 
>>> >> http://apache-spark-user-list.1001560.n3.nabble.com/Release-data-for-spark-1-6-tp25654.html
>>> >> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>> >>
>>> 
> 
> 
> 
> -- 
> Thanks & Regards
> Sri Tummala
> 
