Hi,
   Currently I am submitting multiple Hive jobs from different scripts using the 
Hive CLI with "hive -f". All of these jobs show up in the application tracker 
and run in parallel.
Now I have switched to HiveServer2 and am submitting the jobs with the Beeline 
client from multiple scripts, for example: "nohup beeline -u 
jdbc:hive2://<host>:<port>#<variable>=$epoch -n <user> -p <pass> -d 
org.apache.hive.jdbc.HiveDriver -f <hql script>.hql &".
But now the jobs are submitted serially: only one job appears in the 
application tracker at a time, even though cluster resources are available, and 
the next job is submitted only after the previous one finishes. Why is that? Is 
there a setting that needs to be changed so that jobs are submitted in parallel?
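
For reference, these are the kinds of HiveServer2 properties I have been 
looking at in hive-site.xml. They are only guesses on my part, not a confirmed 
fix, and the values shown are just the defaults as I understand them:

```xml
<!-- Candidate settings only; I have not verified that these cause the serial behaviour. -->

<!-- Size of HiveServer2's thread pool for asynchronous query execution. -->
<property>
  <name>hive.server2.async.exec.threads</name>
  <value>100</value>
</property>

<!-- When running on Tez: number of concurrent default sessions per YARN queue. -->
<property>
  <name>hive.server2.tez.sessions.per.default.queue</name>
  <value>1</value>
</property>
```

If anyone can confirm whether these (or something else) control how many 
queries HiveServer2 runs concurrently, that would help.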

Thanks,
Chandra
