Hi,
Are you using Hive on Spark? We also ran into a similar timeout exception. Adding
the following property to hive-site.xml resolved it for us (we verified it on our
cluster):
  <property>
    <name>spark.master</name>
    <value>yarn-cluster</value>
    <description>spark on yarn</description>
  </property>
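If you want to test it before editing hive-site.xml, the same property can also be
set per session from the Hive CLI (just a sketch; adjust the value to whatever your
cluster actually uses):

  set spark.master=yarn-cluster;
  set spark.master;   -- prints the current value so you can confirm it took effect
  -- then rerun the flat-table statement in the same session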

By the way, we also ran into a Spark job submission timeout that caused Kylin to
report an error. The property that resolved it is as follows:
  <property>
    <name>spark.network.timeout</name>
    <value>300</value>
  </property>
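If the connection still times out, note that the message in your log ("timed out
waiting for connection from the Remote Spark Driver") is also governed by Hive's own
Spark client timeouts, so raising those may be worth trying. The property names below
are the standard Hive ones, but the values are only example guesses; tune them for
your environment:

  <property>
    <name>hive.spark.client.connect.timeout</name>
    <value>30000ms</value>
  </property>
  <property>
    <name>hive.spark.client.server.connect.timeout</name>
    <value>300000ms</value>
  </property>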
I hope this is useful to you!

[email protected] <[email protected]> wrote on Sunday, April 12, 2020 at 5:42 PM:

> Hi,
> This exception often occurs during the "Create Intermediate Flat Hive Table" step
> of building a cube. What is the reason?
>
> FAILED: Execution Error, return code 30041 from
> org.apache.hadoop.hive.ql.exec.spark.SparkTask. Failed to create Spark
> client for Spark session c4332937-4c0b-43e6-9be7-7d787e6777a2:
> java.util.concurrent.TimeoutException: Client
> 'c4332937-4c0b-43e6-9be7-7d787e6777a2' timed out waiting for connection
> from the Remote Spark Driver
> The command is:
> hive -e "set mapred.job.name='Create Intermediate Flat Hive Table'
>
> ------------------------------
> Name: 徐建文 (Xu Jianwen)
> Mobile: 18201820247
> Email:  [email protected]
> Address: 6C, Jinhe Center, No. 68 Hongcao Road, Xuhui District, Shanghai
> Postal code: 200233
> Phone: 021-33582102
> Website:  www.seassoon.com
>
