On 7 Feb 2017 4:17 a.m., "Mars Xu" <xujiao.myc...@gmail.com> wrote:

Hello All,

        Some Spark SQL queries produce one or more jobs. I have 2 questions:

        1. How is cc.sql("sql statement") divided into one or more jobs?


It's an implementation detail. You can have zero or more jobs for a single
structured query (query DSL or SQL).
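For illustration, here is a minimal spark-shell sketch (assuming Spark 2.x and
the `spark` session the shell pre-defines); the exact job counts depend on the
physical plan and are not guaranteed:

    // spark-shell sketch; `spark` is the SparkSession pre-defined by the shell
    val q = spark.sql("SELECT 1 + 1")

    q.explain()    // planning only: typically no job is submitted
    q.collect()    // an action: submits at least one job

    // Depending on the physical plan, a single query can submit several jobs,
    // e.g. a broadcast join runs an extra job to collect the broadcast side.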

        2. When I execute a Spark SQL query in the spark-shell client, how do I
get the execution time (Spark 2.1.0)? If a SQL query produces 3 jobs, in my
opinion the execution time is the sum of the 3 jobs' durations.


Yes. What's the question then?
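One way to do that from spark-shell is to register a SparkListener and sum the
per-job wall-clock time. A rough sketch (the SQL text is a placeholder, and the
listener is just an illustration, not a built-in query-timing API):

    import scala.collection.mutable
    import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerJobStart}

    // Sum the wall-clock duration of every job submitted after registration
    val jobStart = mutable.Map.empty[Int, Long]
    var totalMs = 0L

    spark.sparkContext.addSparkListener(new SparkListener {
      override def onJobStart(e: SparkListenerJobStart): Unit =
        jobStart(e.jobId) = e.time
      override def onJobEnd(e: SparkListenerJobEnd): Unit =
        totalMs += e.time - jobStart.getOrElse(e.jobId, e.time)
    })

    spark.sql("SELECT ...").collect()   // run the query (placeholder SQL)
    println(s"Sum of job durations: $totalMs ms")

Note that summing job durations ignores driver-side planning time and any
overlap between jobs; the SQL tab of the web UI also shows a duration per
executed query.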

Jacek
