Re: Spark SQL API taking longer time than DF API.

2019-03-31 Thread Jörn Franke
Is the select taking longer, or the saving to a file? You seem to save to a 
file only in the second case.
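
One quick way to separate the two costs is to time the computation and the
write independently. A rough sketch, assuming a live PySpark session named
spark, with qry_1/qry_2 standing in for the real queries and the output table
name hypothetical:

import time

# Rebuild the union from Case 2 of the pseudo code below
df1 = spark.sql(qry_1)
df2 = spark.sql(qry_2)
df3 = df1.union(df2)

# Force execution without writing anything. Note that count() lets the
# optimizer skip reading some columns, so this is only a rough lower
# bound on the cost of the select itself.
start = time.time()
df3.count()
print("compute only: %.1fs" % (time.time() - start))

# Now time the write on top.
start = time.time()
df3.write.saveAsTable("timing_check")  # hypothetical table name
print("compute + write: %.1fs" % (time.time() - start))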

> On 29.03.2019 at 15:10, neeraj bhadani wrote:
> 
> Hi Team,
>I am executing the same Spark code using the Spark SQL API and the DataFrame 
> API; however, Spark SQL is taking longer than expected.
> 
> PFB pseudo code.
> ---
> Case 1 : Spark SQL
> ---
> %sql
> CREATE TABLE <table_name>
> AS
> 
> WITH <temp_view_1> AS (
>     <qry_1>
> )
> , <temp_view_2> AS (
>     <qry_2>
> )
> 
> SELECT * FROM <temp_view_1>
> UNION ALL
> SELECT * FROM <temp_view_2>
> 
> ---
> Case 2 : DataFrame API
> ---
> 
> df1 = spark.sql(<qry_1>)
> df2 = spark.sql(<qry_2>)
> df3 = df1.union(df2)
> df3.write.saveAsTable(<table_name>)
> ---
> 
> As per my understanding, both the Spark SQL and DataFrame APIs generate the 
> same execution plan under the hood, so the execution time should be similar.
> 
> Regards,
> Neeraj
> 
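
One way to test the "same plan under the hood" claim directly is to compare
the plans Spark produces for both formulations. A minimal sketch with toy
stand-in queries (the table t and both queries are hypothetical, not the
poster's actual code):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
spark.range(100).createOrReplaceTempView("t")

# Toy stand-ins for the real qry_1 / qry_2
qry_1 = "SELECT id % 10 AS k, COUNT(*) AS c FROM t GROUP BY id % 10"
qry_2 = "SELECT id % 5 AS k, COUNT(*) AS c FROM t GROUP BY id % 5"

# Case 1 shape: everything in one SQL statement
sql_df = spark.sql(
    "WITH v1 AS ({q1}), v2 AS ({q2}) "
    "SELECT * FROM v1 UNION ALL SELECT * FROM v2".format(q1=qry_1, q2=qry_2))

# Case 2 shape: union built through the DataFrame API
api_df = spark.sql(qry_1).union(spark.sql(qry_2))

# If the two paths really compile to the same plan, the optimized and
# physical plans printed here should match.
sql_df.explain(True)
api_df.explain(True)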


Re: Spark SQL API taking longer time than DF API.

2019-03-31 Thread neeraj bhadani
qry_1 and qry_2 are simple SELECT queries with a GROUP BY clause.

Are there any specific kinds of queries that behave differently under Spark SQL
and the DataFrame API?

Regards,
Neeraj
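
For a plain SELECT with GROUP BY, both front ends feed the same Catalyst
optimizer, so the two forms can be checked side by side. A sketch with a toy
table (names assumed, not the poster's data):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
spark.range(100).createOrReplaceTempView("t")

# The same aggregation expressed as SQL and as DataFrame calls
sql_version = spark.sql(
    "SELECT id % 10 AS k, COUNT(*) AS c FROM t GROUP BY id % 10")
api_version = (spark.table("t")
               .groupBy((F.col("id") % 10).alias("k"))
               .agg(F.count("*").alias("c")))

# Both should print essentially the same physical plan
sql_version.explain()
api_version.explain()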

On Sat, Mar 30, 2019 at 7:27 PM Jason Nerothin wrote:

> Can you please quantify the difference and provide the query code?
>
> On Fri, Mar 29, 2019 at 9:11 AM neeraj bhadani <bhadani.neeraj...@gmail.com> wrote:
>
>> [...]
>
> --
> Thanks,
> Jason
>
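
To quantify the difference Jason asks about, one rough approach is to
wall-clock both cases end to end, with the real queries substituted for the
toy stand-ins below (table names hypothetical):

import time
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
spark.range(100).createOrReplaceTempView("t")
qry_1 = "SELECT id % 10 AS k, COUNT(*) AS c FROM t GROUP BY id % 10"
qry_2 = "SELECT id % 5 AS k, COUNT(*) AS c FROM t GROUP BY id % 5"

def timed(label, fn):
    start = time.time()
    fn()
    print("%s took %.1fs" % (label, time.time() - start))

# Case 1: one CTAS statement, mirroring the %sql version
timed("spark sql", lambda: spark.sql(
    "CREATE TABLE case1_out AS "
    "WITH v1 AS ({q1}), v2 AS ({q2}) "
    "SELECT * FROM v1 UNION ALL SELECT * FROM v2".format(q1=qry_1, q2=qry_2)))

# Case 2: DataFrame union written out with saveAsTable
timed("dataframe api", lambda: spark.sql(qry_1)
      .union(spark.sql(qry_2))
      .write.saveAsTable("case2_out"))

On a toy table the timings are noise; the point is the harness, run against
the actual qry_1/qry_2 and a fresh warehouse for each case.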