Hi Mayur,

In Java, you can call the future's get with a timeout and then cancel the future 
in the catch block once the timeout has been reached. There should be something 
similar in Scala as well.

Eg: https://stackoverflow.com/a/16231834
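Something like the following pattern, as a minimal Spark-free sketch (the name `runWithTimeout` and the example tasks are illustrative, not from any library). In real code the task body would be the spark.sql(...) call; note that cancelling the future only interrupts the driver-side thread, so you would likely also pair it with spark.sparkContext.setJobGroup / cancelJobGroup to actually stop the job on the cluster:

```scala
import java.util.concurrent.{Callable, Executors, TimeUnit, TimeoutException}

object QueryTimeout {
  // Run `task` on a worker thread; if it exceeds `timeoutMs`, cancel it
  // and return None. Stand-in for wrapping a spark.sql("...") call.
  def runWithTimeout[T](timeoutMs: Long)(task: => T): Option[T] = {
    val pool = Executors.newSingleThreadExecutor()
    val future = pool.submit(new Callable[T] { def call(): T = task })
    try Some(future.get(timeoutMs, TimeUnit.MILLISECONDS))
    catch {
      case _: TimeoutException =>
        future.cancel(true) // interrupt the worker thread
        // For Spark you would also call sparkContext.cancelJobGroup here
        None
    } finally pool.shutdown()
  }

  def main(args: Array[String]): Unit = {
    // A fast task completes; a slow task is cancelled after 200 ms.
    println(runWithTimeout(200)(42))                       // Some(42)
    println(runWithTimeout(200) { Thread.sleep(5000); 1 }) // None
  }
}
```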



Regards,
Vibhor
________________________________
From: Gourav Sengupta <gourav.sengu...@gmail.com>
Sent: Thursday, September 15, 2022 10:22 PM
To: Mayur Benodekar <askma...@gmail.com>
Cc: user <user@spark.apache.org>; i...@spark.apache.org <i...@spark.apache.org>
Subject: EXT: Re: Spark SQL

EXTERNAL: Report suspicious emails to Email Abuse.

Okay, so from the problem to the solution 👍 that is powerful

On Thu, 15 Sept 2022, 14:48 Mayur Benodekar, 
<askma...@gmail.com<mailto:askma...@gmail.com>> wrote:
Hi Gourav,

It’s the way the framework is


Sent from my iPhone

On Sep 15, 2022, at 02:02, Gourav Sengupta 
<gourav.sengu...@gmail.com<mailto:gourav.sengu...@gmail.com>> wrote:


Hi,

Why Spark and why Scala?

Regards,
Gourav

On Wed, 7 Sept 2022, 21:42 Mayur Benodekar, 
<askma...@gmail.com<mailto:askma...@gmail.com>> wrote:

I am new to both Scala and Spark.

I have Scala code that executes queries in a while loop, one after the other.

What we need is: if a particular query takes more than a certain time, for 
example 10 minutes, we should be able to stop the execution of that particular 
query and move on to the next one

for example

import java.util.concurrent.TimeUnit
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration
import scala.util.{Success, Failure}

do {
  val f = Future {
    spark.sql("some query")
  }

  f onComplete {
    case Success(_)  => println("Query ran within 10 mins")
    case Failure(ex) => println(s"Query failed: $ex")
  }

  // Block for at most 10 minutes; throws a TimeoutException if exceeded
  val result = Await.ready(f, Duration(10, TimeUnit.MINUTES))
} while (someCondition)

I understand that when we call spark.sql, control is handed to Spark, which I 
need to kill/stop when the duration is over so that I can get back the resources.

I have tried multiple things but I am not sure how to solve this. Any help 
would be welcome as I am stuck on this.

--
Cheers,
Mayur
