[
https://issues.apache.org/jira/browse/KYLIN-5087?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17413068#comment-17413068
]
曹勇 commented on KYLIN-5087:
---------------------------
1. I replaced all Spark-related jars in the directory $HIVE_HOME/lib/ :
{code:java}
[hadoop@node4 lib]$ ll *spark*
-rw-r--r--. 1 hadoop hadoop 142037 9月 3 18:22 spark-client-2.3.9.jar
-rw-r--r--. 1 hadoop hadoop 11688921 9月 3 18:22 spark-core_2.11-2.0.0.bak
-rw-r--r--. 1 hadoop hadoop 10496903 9月 9 18:50 spark-core_2.12-3.1.2.jar
-rw-r--r--. 1 hadoop hadoop 65651 9月 3 18:22 spark-launcher_2.11-2.0.0.bak
-rw-r--r--. 1 hadoop hadoop 76347 9月 9 19:03 spark-launcher_2.12-3.1.2.jar
-rw-r--r--. 1 hadoop hadoop 2350593 9月 3 18:22
spark-network-common_2.11-2.0.0.bak
-rw-r--r--. 1 hadoop hadoop 2400596 9月 9 19:02
spark-network-common_2.12-3.1.2.jar
-rw-r--r--. 1 hadoop hadoop 58626 9月 3 18:22
spark-network-shuffle_2.11-2.0.0.bak
-rw-r--r--. 1 hadoop hadoop 127362 9月 9 19:02
spark-network-shuffle_2.12-3.1.2.jar
-rw-r--r--. 1 hadoop hadoop 15303 9月 3 18:22 spark-tags_2.11-2.0.0.bak
-rw-r--r--. 1 hadoop hadoop 15155 9月 9 19:01 spark-tags_2.12-3.1.2.jar
-rw-r--r--. 1 hadoop hadoop 41081 9月 3 18:22 spark-unsafe_2.11-2.0.0.bak
-rw-r--r--. 1 hadoop hadoop 51457 9月 9 19:00 spark-unsafe_2.12-3.1.2.jar
{code}
!image-2021-09-10-16-12-45-012.png|width=861,height=233!
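For reference, the swap in step 1 can be scripted. The sketch below demos the rename-then-copy pattern against a scratch directory so it is safe to run anywhere; on a real node you would point LIB at $HIVE_HOME/lib and copy the actual 2.12-3.1.2 jars from the Spark 3.1.2 distribution instead of creating empty stand-ins. The jar names come from the listing above; everything else is an assumption.

```shell
#!/bin/sh
# Demo of the jar swap: rename the old Scala-2.11/Spark-2.0.0 jars to .bak
# (Hive only puts *.jar files on the classpath) and drop in the 2.12/3.1.2
# versions. Runs in a scratch dir; LIB would be $HIVE_HOME/lib on a node.
set -eu
LIB=$(mktemp -d)
JARS="spark-core spark-launcher spark-network-common spark-network-shuffle spark-tags spark-unsafe"

# Empty stand-ins for the jars Hive shipped with.
for name in $JARS; do touch "$LIB/${name}_2.11-2.0.0.jar"; done

# 1. Back up the old jars so they drop off the classpath.
for jar in "$LIB"/*_2.11-2.0.0.jar; do mv "$jar" "$jar.bak"; done

# 2. Copy in the new jars (empty stand-ins here; use real jars on a node).
for name in $JARS; do touch "$LIB/${name}_2.12-3.1.2.jar"; done

ls "$LIB"
```

Note that `spark-client-2.3.9.jar` (Hive's own jar) is deliberately left alone: the glob only matches the `*_2.11-2.0.0.jar` names.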
2. I replaced all mismatched jackson jars and put the correct versions into
$KYLIN_HOME/ext/ .
!image-2021-09-10-16-38-32-279.png!
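A quick way to verify step 2 is to pull the version number out of every jackson jar filename and report the distinct values: more than one distinct version usually means a classpath conflict. This is a sketch against a scratch directory with made-up filenames; on a real node you would point DIR at $KYLIN_HOME/ext (and any other lib directories Kylin loads).

```shell
#!/bin/sh
# List the distinct jackson versions present in a directory of jars.
set -eu
DIR=$(mktemp -d)
# Made-up filenames standing in for what might be on the classpath:
touch "$DIR/jackson-core-2.11.4.jar" \
      "$DIR/jackson-databind-2.11.4.jar" \
      "$DIR/jackson-annotations-2.6.5.jar"

# Extract the trailing version number from each jackson-*.jar name.
# Two lines of output (2.11.4 and 2.6.5) means mixed versions.
ls "$DIR" | sed -n 's/^jackson-.*-\([0-9][0-9.]*\)\.jar$/\1/p' | sort -u
```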
3. I ran the SQL `select part_dt, sum(price) as total_selled, count(distinct
seller_id) as sellers from kylin_sales group by part_dt order by part_dt`, and
the error log is:
{code:java}
2021-09-10 11:57:44,057 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
common.KylinConfig:493 : Creating new manager instance of class
org.apache.kylin.metadata.acl.TableACLManager
2021-09-10 11:57:44,058 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
acl.TableACLManager:58 : Initializing TableACLManager with config
kylin_metadata@jdbc,url=jdbc:mysql://10.110.147.229:3306/kylin,username=hadoop,password=hadoop1234,maxActive=10,maxIdle=10,driverClassName=com.mysql.cj.jdbc.Driver
2021-09-10 11:57:44,058 DEBUG [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
cachesync.CachedCrudAssist:122 : Reloading TableACL from
kylin_metadata(key='/table_acl')@kylin_metadata@jdbc,url=jdbc:mysql://10.110.147.229:3306/kylin,username=hadoop,password=hadoop1234,maxActive=10,maxIdle=10,driverClassName=com.mysql.cj.jdbc.Driver
2021-09-10 11:57:44,066 DEBUG [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
cachesync.CachedCrudAssist:155 : Loaded 3 TableACL(s) out of 3 resource with 0
errors
2021-09-10 11:57:44,071 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
routing.RealizationChooser:189 : Force choose DataModelDesc
[name=kylin_sales_model] as selected model for specific purpose.
2021-09-10 11:57:44,074 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
routing.QueryRouter:84 : Find candidates by table DEFAULT.KYLIN_SALES and
project=learn_kylin : CUBE[name=kylin_sales_cube]
2021-09-10 11:57:44,074 DEBUG [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
routing.QueryRouter:51 : Applying rule: class
org.apache.kylin.query.routing.rules.RemoveBlackoutRealizationsRule,
realizations before: [CUBE[name=kylin_sales_cube]], realizations after:
[CUBE[name=kylin_sales_cube]]
2021-09-10 11:57:44,076 DEBUG [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
routing.QueryRouter:51 : Applying rule: class
org.apache.kylin.query.routing.rules.RemoveUncapableRealizationsRule,
realizations before: [CUBE[name=kylin_sales_cube]], realizations after:
[CUBE[name=kylin_sales_cube]]
2021-09-10 11:57:44,077 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
rules.RealizationSortRule:40 : CUBE[name=kylin_sales_cube] priority 1 cost 838.
2021-09-10 11:57:44,077 DEBUG [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
routing.QueryRouter:51 : Applying rule: class
org.apache.kylin.query.routing.rules.RealizationSortRule, realizations before:
[CUBE[name=kylin_sales_cube]], realizations after: [CUBE[name=kylin_sales_cube]]
2021-09-10 11:57:44,077 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
routing.QueryRouter:101 : The realizations remaining:
[CUBE[name=kylin_sales_cube]],and the final chosen one for current olap context
0 is CUBE[name=kylin_sales_cube]
2021-09-10 11:57:44,250 DEBUG [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
common.QueryContext:319 : Cannot find CubeSegmentStatisticsResult for context 0
2021-09-10 11:57:44,265 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
common.KylinConfig:493 : Creating new manager instance of class
org.apache.kylin.cube.cuboid.CuboidManager
2021-09-10 11:57:44,623 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
sql.SparderContext:57 : Current thread 36 create a SparkSession.
2021-09-10 11:57:44,627 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
sql.SparderContext:57 : Init spark.
2021-09-10 11:57:44,630 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
sql.SparderContext:57 : Initializing Spark thread starting.
2021-09-10 11:57:44,630 INFO [Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36]
sql.SparderContext:57 : Initializing Spark, waiting for done.
2021-09-10 11:57:44,633 INFO [Thread-7] sql.SparderContext:57 : SparderContext
deploy with spark master: yarn
2021-09-10 11:57:44,704 INFO [Thread-7] conf.HiveConf:181 : Found
configuration file file:/home/hadoop/hive-2.3.9/conf/hive-site.xml
2021-09-10 11:57:46,685 WARN [Thread-7] util.NativeCodeLoader:60 : Unable to
load native-hadoop library for your platform... using builtin-java classes
where applicable
2021-09-10 11:57:46,741 DEBUG [Thread-7] common.KylinConfig:363 : KYLIN_CONF
property was not set, will seek KYLIN_HOME env variable
2021-09-10 11:57:46,741 INFO [Thread-7] common.KylinConfig:369 : Use
KYLIN_HOME=/home/hadoop/kylin-4.0.0
2021-09-10 11:57:46,741 DEBUG [Thread-7] common.KylinConfig:363 : KYLIN_CONF
property was not set, will seek KYLIN_HOME env variable
2021-09-10 11:57:46,741 INFO [Thread-7] common.KylinConfig:369 : Use
KYLIN_HOME=/home/hadoop/kylin-4.0.0
2021-09-10 11:57:46,769 INFO [Thread-7] spark.SparkContext:57 : Running Spark
version 3.1.2
2021-09-10 11:57:46,817 INFO [Thread-7] resource.ResourceUtils:57 :
==============================================================
2021-09-10 11:57:46,818 INFO [Thread-7] resource.ResourceUtils:57 : No custom
resources configured for spark.driver.
2021-09-10 11:57:46,818 INFO [Thread-7] resource.ResourceUtils:57 :
==============================================================
2021-09-10 11:57:46,819 INFO [Thread-7] spark.SparkContext:57 : Submitted
application: sparder_on_node4-7070
2021-09-10 11:57:46,839 INFO [Thread-7] resource.ResourceProfile:57 : Default
ResourceProfile created, executor resources: Map(memoryOverhead -> name:
memoryOverhead, amount: 1024, script: , vendor: , cores -> name: cores, amount:
1, script: , vendor: , memory -> name: memory, amount: 4096, script: , vendor:
, offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources:
Map(cpus -> name:cpus, amount: 1.0)
2021-09-10 11:57:46,854 INFO [Thread-7] resource.ResourceProfile:57 : Limiting
resource is cpus at 1 tasks per executor
2021-09-10 11:57:46,856 INFO [Thread-7] resource.ResourceProfileManager:57 :
Added ResourceProfile id: 0
2021-09-10 11:57:46,909 INFO [Thread-7] spark.SecurityManager:57 : Changing
view acls to: hadoop
2021-09-10 11:57:46,910 INFO [Thread-7] spark.SecurityManager:57 : Changing
modify acls to: hadoop
2021-09-10 11:57:46,910 INFO [Thread-7] spark.SecurityManager:57 : Changing
view acls groups to:
2021-09-10 11:57:46,911 INFO [Thread-7] spark.SecurityManager:57 : Changing
modify acls groups to:
2021-09-10 11:57:46,911 INFO [Thread-7] spark.SecurityManager:57 :
SecurityManager: authentication disabled; ui acls disabled; users with view
permissions: Set(hadoop); groups with view permissions: Set(); users with
modify permissions: Set(hadoop); groups with modify permissions: Set()
2021-09-10 11:57:47,174 INFO [Thread-7] util.Utils:57 : Successfully started
service 'sparkDriver' on port 44920.
2021-09-10 11:57:47,207 INFO [Thread-7] spark.SparkEnv:57 : Registering
MapOutputTracker
2021-09-10 11:57:47,241 INFO [Thread-7] spark.SparkEnv:57 : Registering
BlockManagerMaster
2021-09-10 11:57:47,263 INFO [Thread-7] storage.BlockManagerMasterEndpoint:57
: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology
information
2021-09-10 11:57:47,264 INFO [Thread-7] storage.BlockManagerMasterEndpoint:57
: BlockManagerMasterEndpoint up
2021-09-10 11:57:47,304 INFO [Thread-7] spark.SparkEnv:57 : Registering
BlockManagerMasterHeartbeat
2021-09-10 11:57:47,319 INFO [Thread-7] storage.DiskBlockManager:57 : Created
local directory at
/home/hadoop/kylin-4.0.0/tomcat/temp/blockmgr-8d603a4a-b39b-4a32-96c8-c7157095626f
2021-09-10 11:57:47,373 INFO [Thread-7] memory.MemoryStore:57 : MemoryStore
started with capacity 2004.6 MiB
2021-09-10 11:57:47,421 INFO [Thread-7] spark.SparkEnv:57 : Registering
OutputCommitCoordinator
2021-09-10 11:57:47,544 INFO [Thread-7] util.log:169 : Logging initialized
@18323ms to org.sparkproject.jetty.util.log.Slf4jLog
2021-09-10 11:57:47,624 INFO [Thread-7] server.Server:375 :
jetty-9.4.40.v20210413; built: 2021-04-13T20:42:42.668Z; git:
b881a572662e1943a14ae12e7e1207989f218b74; jvm 1.8.0_302-b08
2021-09-10 11:57:47,650 INFO [Thread-7] server.Server:415 : Started @18430ms
2021-09-10 11:57:47,689 INFO [Thread-7] server.AbstractConnector:331 : Started
ServerConnector@652b0b37{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
2021-09-10 11:57:47,689 INFO [Thread-7] util.Utils:57 : Successfully started
service 'SparkUI' on port 4040.
2021-09-10 11:57:47,716 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@6a6c2431{/jobs,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,719 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@3ef94e25{/jobs/json,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,720 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@68c34762{/jobs/job,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,720 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@77ec7215{/jobs/job/json,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,721 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@29486393{/stages,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,721 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@3a56183a{/stages/json,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,722 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@1d59f51d{/stages/stage,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,723 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@53d5c725{/stages/stage/json,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,724 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@3311c9d7{/stages/pool,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,725 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@7e9e8176{/stages/pool/json,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,725 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@ae6ce7a{/storage,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,726 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@7f9f16e6{/storage/json,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,727 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@53132c2c{/storage/rdd,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,727 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@24acf85b{/storage/rdd/json,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,728 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@54416c1c{/environment,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,728 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@2b449ba3{/environment/json,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,729 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@209a91dd{/executors,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,730 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@905d099{/executors/json,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,730 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@2342f463{/executors/threadDump,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,732 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@1c3cac80{/executors/threadDump/json,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,743 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@6b911948{/static,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,744 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@5c0c06e1{/,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,745 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@2c5e59fc{/api,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,746 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@7a51285c{/jobs/job/kill,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,746 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@5c59af43{/stages/stage/kill,null,AVAILABLE,@Spark}
2021-09-10 11:57:47,748 INFO [Thread-7] ui.SparkUI:57 : Bound SparkUI to
0.0.0.0, and started at http://node4:4040
2021-09-10 11:57:47,819 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Creating Fair Scheduler pools from
/home/hadoop/kylin-4.0.0/conf/fairscheduler.xml
2021-09-10 11:57:47,839 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Created pool: query_pushdown, schedulingMode: FAIR, minShare: 0, weight: 1
2021-09-10 11:57:47,840 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Created pool: heavy_tasks, schedulingMode: FAIR, minShare: 1, weight: 5
2021-09-10 11:57:47,840 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Created pool: lightweight_tasks, schedulingMode: FAIR, minShare: 1, weight: 10
2021-09-10 11:57:47,841 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Created pool: vip_tasks, schedulingMode: FAIR, minShare: 0, weight: 15
2021-09-10 11:57:47,842 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Created default pool: default, schedulingMode: FIFO, minShare: 0, weight: 1
2021-09-10 11:57:47,876 WARN [Thread-7] config.package:422 : Can not load the
default value of `spark.yarn.isHadoopProvided` from
`org/apache/spark/deploy/yarn/config.properties` with error,
java.lang.NullPointerException. Using `false` as a default value.
2021-09-10 11:57:47,981 INFO [Thread-7] client.RMProxy:134 : Connecting to
ResourceManager at /0.0.0.0:8032
2021-09-10 11:57:49,381 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 0 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:57:50,383 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 1 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:58:07,426 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 8 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:58:08,428 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 9 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:58:08,430 INFO [Thread-7] retry.RetryInvocationHandler:411 :
java.net.ConnectException: Your endpoint configuration is wrong; For more
details see: http://wiki.apache.org/hadoop/UnsetHostnameOrPort, while invoking
ApplicationClientProtocolPBClientImpl.getClusterMetrics over null after 1
failover attempts. Trying to failover after sleeping for 25847ms.
2021-09-10 11:58:08,838 INFO [FetcherRunner 2098527970-29]
threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual
running, 0 stopped, 0 ready, 2 already succeed, 0 error, 0 discarded, 0 others
2021-09-10 11:58:35,279 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 0 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:58:35,982 DEBUG [BadQueryDetector] service.BadQueryDetector:148 :
Detect bad query.
2021-09-10 11:58:36,281 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 1 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:58:37,282 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 2 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:58:38,284 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 3 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:58:38,838 INFO [FetcherRunner 2098527970-29]
threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual
running, 0 stopped, 0 ready, 2 already succeed, 0 error, 0 discarded, 0 others
2021-09-10 11:58:39,286 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 4 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:58:40,288 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 5 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:59:27,493 INFO [Thread-7] ipc.Client:966 : Retrying connect to
server: 0.0.0.0/0.0.0.0:8032. Already tried 9 time(s); retry policy is
RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
2021-09-10 11:59:27,496 INFO [Thread-7] retry.RetryInvocationHandler:411 :
java.net.ConnectException: Your endpoint configuration is wrong; For more
details see: http://wiki.apache.org/hadoop/UnsetHostnameOrPort, while invoking
ApplicationClientProtocolPBClientImpl.getClusterMetrics over null after 3
failover attempts. Trying to failover after sleeping for 32335ms.
2021-09-10 11:59:35,983 DEBUG [BadQueryDetector] service.BadQueryDetector:148 :
Detect bad query.
2021-09-10 11:59:35,984 INFO [BadQueryDetector] service.BadQueryDetector:204 :
Slow query has been running 114.003 seconds (project:learn_kylin, thread: 0x24,
user:ADMIN, query id:d8e1d20c-6b50-19a3-24b4-725f548bb275) -- select part_dt,
sum(price) as total_selled, count(distinct seller_id) as sellers from
kylin_sales group by part_dt order by part_dt
2021-09-10 11:59:35,990 DEBUG [BadQueryDetector]
badquery.BadQueryHistoryManager:65 : Loaded 0 Bad Query(s)
2021-09-10 11:59:36,026 INFO [BadQueryDetector] service.BadQueryDetector:192 :
Problematic thread 0x24 Query d8e1d20c-6b50-19a3-24b4-725f548bb275-36, query
id: d8e1d20c-6b50-19a3-24b4-725f548bb275
at java.lang.Object.wait(Native Method)
at java.lang.Thread.join(Thread.java:1252)
at java.lang.Thread.join(Thread.java:1326)
at
org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:205)
at
org.apache.spark.sql.SparderContext$$$Lambda$79/945500108.apply(Unknown Source)
at
org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
at
org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
at
org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
at
org.apache.spark.sql.SparderContext$$$Lambda$77/945342003.apply(Unknown Source)
at
org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
2021-09-10 11:59:38,838 INFO [FetcherRunner 2098527970-29]
threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual
running, 0 stopped, 0 ready, 2 already succeed, 0 error, 0 discarded, 0 others
{code}
I have enabled YARN high availability. I don't know why Kylin calls YARN at
the address 0.0.0.0:8032.
The relevant part of yarn-site.xml is as follows:
{code:xml}
<property>
  <description>The address of the applications manager interface in the
  RM.</description>
  <name>yarn.resourcemanager.address.rm1</name>
  <value>${yarn.resourcemanager.hostname.rm1}:8032</value>
</property>
<property>
  <description>The address of the applications manager interface in the
  RM.</description>
  <name>yarn.resourcemanager.address.rm2</name>
  <value>${yarn.resourcemanager.hostname.rm2}:8032</value>
</property>
{code}
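Possibly relevant: a ResourceManager target of 0.0.0.0:8032 usually means the client resolved `yarn.resourcemanager.hostname.rmX` to its default of `0.0.0.0` — for example, if the yarn-site.xml visible on Kylin's classpath defines the per-RM address indirectly via `${yarn.resourcemanager.hostname.rm1}` (as above) but does not itself set that hostname, or does not enable HA mode for the client. A sketch of the properties a client typically needs; the hostnames `node1`/`node2` are assumptions:

```xml
<!-- Sketch of client-side RM HA properties; hostnames are placeholders. -->
<property>
  <name>yarn.resourcemanager.ha.enabled</name>
  <value>true</value>
</property>
<property>
  <name>yarn.resourcemanager.ha.rm-ids</name>
  <value>rm1,rm2</value>
</property>
<property>
  <name>yarn.resourcemanager.hostname.rm1</name>
  <value>node1</value>
</property>
<property>
  <name>yarn.resourcemanager.hostname.rm2</name>
  <value>node2</value>
</property>
```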
> java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> --------------------------------------------------------------------------
>
> Key: KYLIN-5087
> URL: https://issues.apache.org/jira/browse/KYLIN-5087
> Project: Kylin
> Issue Type: Bug
> Components: Environment, Integration, Query Engine, Spark Engine, Web
> Affects Versions: v4.0.0
> Environment: hadoop-3.2.2
> hive-2.3.9
> kylin-4.0.0
> scala-2.12.14
> spark-3.1.2-bin-hadoop3.2
> zookeeper-3.7.0
> Reporter: 曹勇
> Priority: Major
> Attachments: image-2021-09-09-20-11-08-870.png,
> image-2021-09-10-16-12-45-012.png, image-2021-09-10-16-38-32-279.png
>
>
> When I ran the example SQL `select part_dt, sum(price) as total_selled,
> count(distinct seller_id) as sellers from kylin_sales group by part_dt order
> by part_dt` against kylin_sales_cube from the documentation, this issue
> occurred:
> {code:java}
> 2021-09-09 20:06:10,063 WARN [Thread-7] config.package:422 : Can not load
> the default value of `spark.yarn.isHadoopProvided` from
> `org/apache/spark/deploy/yarn/config.properties` with error,
> java.lang.NullPointerException. Using `false` as a default value.
> 2021-09-09 20:06:10,482 INFO [Thread-7] client.AHSProxy:42 : Connecting to
> Application History server at node1/192.168.111.49:10200
> 2021-09-09 20:06:10,601 INFO [Thread-7] yarn.Client:57 : Requesting a new
> application from cluster with 3 NodeManagers
> 2021-09-09 20:06:11,337 INFO [Thread-7] conf.Configuration:2795 :
> resource-types.xml not found
> 2021-09-09 20:06:11,338 INFO [Thread-7] resource.ResourceUtils:442 : Unable
> to find 'resource-types.xml'.
> 2021-09-09 20:06:11,350 INFO [Thread-7] yarn.Client:57 : Verifying our
> application has not requested more than the maximum memory capability of the
> cluster (8192 MB per container)
> 2021-09-09 20:06:11,350 INFO [Thread-7] yarn.Client:57 : Will allocate AM
> container, with 896 MB memory including 384 MB overhead
> 2021-09-09 20:06:11,351 INFO [Thread-7] yarn.Client:57 : Setting up
> container launch context for our AM
> 2021-09-09 20:06:11,351 INFO [Thread-7] yarn.Client:57 : Setting up the
> launch environment for our AM container
> 2021-09-09 20:06:11,356 INFO [Thread-7] yarn.Client:57 : Preparing resources
> for our AM container
> 2021-09-09 20:06:11,434 WARN [Thread-7] yarn.Client:69 : Neither
> spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading
> libraries under SPARK_HOME.
> 2021-09-09 20:06:13,436 INFO [Thread-7] yarn.Client:57 : Uploading resource
> file:/home/hadoop/kylin-4.0.0/tomcat/temp/spark-d250f1d5-51aa-47d1-9aa5-b36d3b60d336/__spark_libs__3169881654091659892.zip
> ->
> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/__spark_libs__3169881654091659892.zip
> 2021-09-09 20:06:14,657 INFO [Thread-7] yarn.Client:57 : Uploading resource
> file:/home/hadoop/kylin-4.0.0/lib/kylin-parquet-job-4.0.0.jar ->
> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/kylin-parquet-job-4.0.0.jar
> 2021-09-09 20:06:15,058 INFO [Thread-7] yarn.Client:57 : Uploading resource
> file:/home/hadoop/kylin-4.0.0/conf/spark-executor-log4j.properties ->
> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/spark-executor-log4j.properties
> 2021-09-09 20:06:15,238 INFO [Thread-7] yarn.Client:57 : Uploading resource
> file:/home/hadoop/kylin-4.0.0/tomcat/temp/spark-d250f1d5-51aa-47d1-9aa5-b36d3b60d336/__spark_conf__4477603375905676389.zip
> ->
> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/__spark_conf__.zip
> 2021-09-09 20:06:15,354 INFO [Thread-7] spark.SecurityManager:57 : Changing
> view acls to: hadoop
> 2021-09-09 20:06:15,354 INFO [Thread-7] spark.SecurityManager:57 : Changing
> modify acls to: hadoop
> 2021-09-09 20:06:15,354 INFO [Thread-7] spark.SecurityManager:57 : Changing
> view acls groups to:
> 2021-09-09 20:06:15,354 INFO [Thread-7] spark.SecurityManager:57 : Changing
> modify acls groups to:
> 2021-09-09 20:06:15,354 INFO [Thread-7] spark.SecurityManager:57 :
> SecurityManager: authentication disabled; ui acls disabled; users with view
> permissions: Set(hadoop); groups with view permissions: Set(); users with
> modify permissions: Set(hadoop); groups with modify permissions: Set()
> 2021-09-09 20:06:15,378 INFO [Thread-7] yarn.Client:57 : Submitting
> application application_1631008247268_0018 to ResourceManager
> 2021-09-09 20:06:15,637 INFO [Thread-7] impl.YarnClientImpl:329 : Submitted
> application application_1631008247268_0018
> 2021-09-09 20:06:16,646 INFO [Thread-7] yarn.Client:57 : Application report
> for application_1631008247268_0018 (state: ACCEPTED)
> 2021-09-09 20:06:16,656 INFO [Thread-7] yarn.Client:57 :
> client token: N/A
> diagnostics: AM container is launched, waiting for AM container to
> Register with RM
> ApplicationMaster host: N/A
> ApplicationMaster RPC port: -1
> queue: default
> start time: 1631189175391
> final status: UNDEFINED
> tracking URL: http://node1:8088/proxy/application_1631008247268_0018/
> user: hadoop
> 2021-09-09 20:06:17,659 INFO [Thread-7] yarn.Client:57 : Application report
> for application_1631008247268_0018 (state: ACCEPTED)
> 2021-09-09 20:06:18,662 INFO [Thread-7] yarn.Client:57 : Application report
> for application_1631008247268_0018 (state: ACCEPTED)
> 2021-09-09 20:06:19,665 INFO [Thread-7] yarn.Client:57 : Application report
> for application_1631008247268_0018 (state: ACCEPTED)
> 2021-09-09 20:06:19,947 INFO [dispatcher-event-loop-7]
> cluster.YarnClientSchedulerBackend:57 : Add WebUI Filter.
> org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS
> -> node1,node2, PROXY_URI_BASES ->
> http://node1:8088/proxy/application_1631008247268_0018,http://node2:8088/proxy/application_1631008247268_0018,
> RM_HA_URLS -> node1:8088,node2:8088), /proxy/application_1631008247268_0018
> 2021-09-09 20:06:20,668 INFO [Thread-7] yarn.Client:57 : Application report
> for application_1631008247268_0018 (state: RUNNING)
> 2021-09-09 20:06:20,669 INFO [Thread-7] yarn.Client:57 :
> client token: N/A
> diagnostics: N/A
> ApplicationMaster host: 192.168.111.25
> ApplicationMaster RPC port: -1
> queue: default
> start time: 1631189175391
> final status: UNDEFINED
> tracking URL: http://node1:8088/proxy/application_1631008247268_0018/
> user: hadoop
> 2021-09-09 20:06:20,672 INFO [Thread-7]
> cluster.YarnClientSchedulerBackend:57 : Application
> application_1631008247268_0018 has started running.
> 2021-09-09 20:06:20,693 INFO [Thread-7] util.Utils:57 : Successfully started
> service 'org.apache.spark.network.netty.NettyBlockTransferService' on port
> 37707.
> 2021-09-09 20:06:20,693 INFO [Thread-7] netty.NettyBlockTransferService:81 :
> Server created on node3:37707
> 2021-09-09 20:06:20,696 INFO [Thread-7] storage.BlockManager:57 : Using
> org.apache.spark.storage.RandomBlockReplicationPolicy for block replication
> policy
> 2021-09-09 20:06:20,706 INFO [Thread-7] storage.BlockManagerMaster:57 :
> Registering BlockManager BlockManagerId(driver, node3, 37707, None)
> 2021-09-09 20:06:20,713 INFO [dispatcher-BlockManagerMaster]
> storage.BlockManagerMasterEndpoint:57 : Registering block manager node3:37707
> with 2004.6 MiB RAM, BlockManagerId(driver, node3, 37707, None)
> 2021-09-09 20:06:20,719 INFO [Thread-7] storage.BlockManagerMaster:57 :
> Registered BlockManager BlockManagerId(driver, node3, 37707, None)
> 2021-09-09 20:06:20,721 INFO [Thread-7] storage.BlockManager:57 :
> Initialized BlockManager: BlockManagerId(driver, node3, 37707, None)
> 2021-09-09 20:06:20,832 INFO [dispatcher-event-loop-2]
> cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:57 : ApplicationMaster
> registered as NettyRpcEndpointRef(spark-client://YarnAM)
> 2021-09-09 20:06:20,922 INFO [Thread-7] ui.ServerInfo:57 : Adding filter to
> /metrics/json: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
> 2021-09-09 20:06:20,924 INFO [Thread-7] handler.ContextHandler:916 : Started
> o.s.j.s.ServletContextHandler@71ddd2af{/metrics/json,null,AVAILABLE,@Spark}
> 2021-09-09 20:06:25,825 INFO [dispatcher-CoarseGrainedScheduler]
> cluster.YarnSchedulerBackend$YarnDriverEndpoint:57 : Registered executor
> NettyRpcEndpointRef(spark-client://Executor) (192.168.111.50:36340) with ID
> 1, ResourceProfileId 0
> 2021-09-09 20:06:25,872 INFO [Thread-7]
> cluster.YarnClientSchedulerBackend:57 : SchedulerBackend is ready for
> scheduling beginning after reached minRegisteredResourcesRatio: 0.8
> 2021-09-09 20:06:25,949 ERROR [Thread-7] sql.SparderContext:94 : Error for
> initializing spark
> java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> at
> org.apache.spark.utils.KylinReflectUtils$.getSessionState(KylinReflectUtils.scala:42)
> at
> org.apache.spark.sql.KylinSession.sessionState$lzycompute(KylinSession.scala:47)
> at
> org.apache.spark.sql.KylinSession.sessionState(KylinSession.scala:46)
> at
> org.apache.kylin.query.UdfManager.$anonfun$registerBuiltInFunc$1(UdfManager.scala:35)
> at
> org.apache.kylin.query.UdfManager.$anonfun$registerBuiltInFunc$1$adapted(UdfManager.scala:34)
> at scala.collection.immutable.List.foreach(List.scala:392)
> at
> org.apache.kylin.query.UdfManager.org$apache$kylin$query$UdfManager$$registerBuiltInFunc(UdfManager.scala:34)
> at org.apache.kylin.query.UdfManager$.create(UdfManager.scala:85)
> at
> org.apache.spark.sql.KylinSession$KylinBuilder.getOrCreateKylinSession(KylinSession.scala:116)
> at
> org.apache.spark.sql.SparderContext$$anon$1.run(SparderContext.scala:146)
> at java.lang.Thread.run(Thread.java:748)
> 2021-09-09 20:06:25,951 INFO [Thread-7] sql.SparderContext:57 : Setting initializing Spark thread to null.
> 2021-09-09 20:06:25,952 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] monitor.SparderContextCanary:68 : Start monitoring Sparder
> 2021-09-09 20:06:25,953 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] sql.SparderContext:57 : Init spark.
> 2021-09-09 20:06:25,954 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] sql.SparderContext:57 : Initializing Spark thread starting.
> 2021-09-09 20:06:25,954 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] sql.SparderContext:57 : Initializing Spark, waiting for done.
> 2021-09-09 20:06:25,955 INFO [Thread-37] sql.SparderContext:57 : SparderContext deploy with spark master: yarn
> 2021-09-09 20:06:25,964 INFO [Thread-37] sql.SparderContext:57 : Spark application id is application_1631008247268_0018
> 2021-09-09 20:06:25,965 INFO [Thread-37] sql.SparderContext:57 : Spark context started successfully with stack trace:
> 2021-09-09 20:06:25,965 INFO [Thread-37] sql.SparderContext:57 : java.lang.Thread.getStackTrace(Thread.java:1559)
> org.apache.spark.sql.SparderContext$$anon$1.$anonfun$run$8(SparderContext.scala:171)
> org.apache.spark.internal.Logging.logInfo(Logging.scala:57)
> org.apache.spark.internal.Logging.logInfo$(Logging.scala:56)
> org.apache.spark.sql.SparderContext$.logInfo(SparderContext.scala:45)
> org.apache.spark.sql.SparderContext$$anon$1.run(SparderContext.scala:171)
> java.lang.Thread.run(Thread.java:748)
> 2021-09-09 20:06:25,966 INFO [Thread-37] sql.SparderContext:57 : Class loader: org.apache.kylin.spark.classloader.SparkClassLoader@3187093f
> 2021-09-09 20:06:25,969 INFO [Thread-37] memory.MonitorEnv:57 : create driver monitor env
> 2021-09-09 20:06:25,976 INFO [dispatcher-BlockManagerMaster] storage.BlockManagerMasterEndpoint:57 : Registering block manager node2:44050 with 2004.6 MiB RAM, BlockManagerId(1, node2, 44050, None)
> 2021-09-09 20:06:25,988 INFO [Thread-37] sql.SparderContext:57 : setup master endpoint finished.hostPort:node3:40116
> 2021-09-09 20:06:26,025 INFO [Thread-37] client.AHSProxy:42 : Connecting to Application History server at node1/192.168.111.49:10200
> 2021-09-09 20:06:26,029 INFO [Thread-37] sql.SparderContext:57 : Setting initializing Spark thread to null.
> 2021-09-09 20:06:26,037 ERROR [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] service.QueryService:576 : Exception while executing query
> java.sql.SQLException: Error while executing SQL "select * from (select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt) limit 50000": java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> at org.apache.calcite.avatica.Helper.createException(Helper.java:56)
> at org.apache.calcite.avatica.Helper.createException(Helper.java:41)
> at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:163)
> at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:227)
> at org.apache.kylin.rest.service.QueryService.executeRequest(QueryService.java:1026)
> at org.apache.kylin.rest.service.QueryService.queryWithSqlMassage(QueryService.java:710)
> at org.apache.kylin.rest.service.QueryService.query(QueryService.java:221)
> at org.apache.kylin.rest.service.QueryService.queryAndUpdateCache(QueryService.java:514)
> at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:469)
> at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:405)
> at org.apache.kylin.rest.controller.QueryController.query(QueryController.java:93)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
> at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
> at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)
> at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:854)
> at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:765)
> at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
> at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
> at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
> at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
> at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:872)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:647)
> at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:347)
> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:263)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:209)
> at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:244)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)
> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:492)
> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:165)
> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:1025)
> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:452)
> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1195)
> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:654)
> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:317)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.RuntimeException: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:50)
> at Baz.bind(Unknown Source)
> at org.apache.calcite.jdbc.CalcitePrepare$CalciteSignature.enumerable(CalcitePrepare.java:365)
> at org.apache.calcite.jdbc.CalciteConnectionImpl.enumerable(CalciteConnectionImpl.java:301)
> at org.apache.calcite.jdbc.CalciteMetaImpl._createIterable(CalciteMetaImpl.java:559)
> at org.apache.calcite.jdbc.CalciteMetaImpl.createIterable(CalciteMetaImpl.java:550)
> at org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:182)
> at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:67)
> at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
> at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
> at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:619)
> at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
> at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
> ... 84 more
> Caused by: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> at org.apache.spark.utils.KylinReflectUtils$.getSessionState(KylinReflectUtils.scala:42)
> at org.apache.spark.sql.KylinSession.sessionState$lzycompute(KylinSession.scala:47)
> at org.apache.spark.sql.KylinSession.sessionState(KylinSession.scala:46)
> at org.apache.spark.sql.SQLContext.sessionState(SQLContext.scala:78)
> at org.apache.spark.sql.SQLContext.conf(SQLContext.scala:80)
> at org.apache.spark.sql.execution.datasources.FileStatusCache$.getOrCreate(FileStatusCache.scala:44)
> at org.apache.spark.sql.execution.datasource.ShardFileStatusCache$.getFileStatusCache(ShardFileStatusCache.scala:29)
> at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
> at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
> at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
> at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
> at org.apache.spark.sql.SparderContextFacade$.current(SparderContextFacade.scala:34)
> at org.apache.spark.sql.SparderContext$.getSparkSession(SparderContext.scala:65)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.$anonfun$createOLAPTable$1(TableScanPlan.scala:78)
> at org.apache.spark.utils.LogEx.logTime(LogEx.scala:40)
> at org.apache.spark.utils.LogEx.logTime$(LogEx.scala:38)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.logTime(TableScanPlan.scala:43)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.createOLAPTable(TableScanPlan.scala:51)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.$anonfun$visit$2(CalciteToSparkPlaner.scala:58)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.logTime(CalciteToSparkPlaner.scala:120)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:58)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.RelVisitor.go(RelVisitor.java:61)
> at org.apache.kylin.query.runtime.SparkEngine.toSparkPlan(SparkEngine.java:60)
> at org.apache.kylin.query.runtime.SparkEngine.compute(SparkEngine.java:49)
> at org.apache.kylin.query.exec.QueryEngineFactory.compute(QueryEngineFactory.java:47)
> at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:41)
> ... 96 more
> 2021-09-09 20:06:26,041 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] service.QueryService:1222 : Processed rows for each storageContext: 0
> 2021-09-09 20:06:26,048 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] service.QueryService:391 :
> ==========================[QUERY]===============================
> Query Id: 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384
> SQL: select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt
> User: ADMIN
> Success: false
> Duration: 21.519
> Project: learn_kylin
> Realization Names: [CUBE[name=kylin_sales_cube]]
> Cuboid Ids: [16384]
> Is Exactly Matched: [false]
> Total scan count: 0
> Total scan files: 0
> Total metadata time: 0ms
> Total spark scan time: 0ms
> Total scan bytes: -1
> Result row count: 0
> Storage cache used: false
> Is Query Push-Down: false
> Is Prepare: false
> Used Spark pool: null
> Trace URL: null
> Message: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported while executing SQL: "select * from (select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt) limit 50000"
> ==========================[QUERY]===============================
> 2021-09-09 20:06:26,052 ERROR [http-bio-7070-exec-1] controller.BasicController:65 :
> org.apache.kylin.rest.exception.InternalErrorException: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported while executing SQL: "select * from (select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt) limit 50000"
> at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:486)
> at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:405)
> at org.apache.kylin.rest.controller.QueryController.query(QueryController.java:93)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
> at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
> at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)
> at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:854)
> at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:765)
> at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
> at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
> at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
> at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
> at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:872)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:647)
> at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:347)
> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:263)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:209)
> at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:244)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)
> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:492)
> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:165)
> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:1025)
> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:452)
> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1195)
> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:654)
> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:317)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> at org.apache.spark.utils.KylinReflectUtils$.getSessionState(KylinReflectUtils.scala:42)
> at org.apache.spark.sql.KylinSession.sessionState$lzycompute(KylinSession.scala:47)
> at org.apache.spark.sql.KylinSession.sessionState(KylinSession.scala:46)
> at org.apache.spark.sql.SQLContext.sessionState(SQLContext.scala:78)
> at org.apache.spark.sql.SQLContext.conf(SQLContext.scala:80)
> at org.apache.spark.sql.execution.datasources.FileStatusCache$.getOrCreate(FileStatusCache.scala:44)
> at org.apache.spark.sql.execution.datasource.ShardFileStatusCache$.getFileStatusCache(ShardFileStatusCache.scala:29)
> at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
> at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
> at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
> at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
> at org.apache.spark.sql.SparderContextFacade$.current(SparderContextFacade.scala:34)
> at org.apache.spark.sql.SparderContext$.getSparkSession(SparderContext.scala:65)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.$anonfun$createOLAPTable$1(TableScanPlan.scala:78)
> at org.apache.spark.utils.LogEx.logTime(LogEx.scala:40)
> at org.apache.spark.utils.LogEx.logTime$(LogEx.scala:38)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.logTime(TableScanPlan.scala:43)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.createOLAPTable(TableScanPlan.scala:51)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.$anonfun$visit$2(CalciteToSparkPlaner.scala:58)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.logTime(CalciteToSparkPlaner.scala:120)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:58)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.RelVisitor.go(RelVisitor.java:61)
> at org.apache.kylin.query.runtime.SparkEngine.toSparkPlan(SparkEngine.java:60)
> at org.apache.kylin.query.runtime.SparkEngine.compute(SparkEngine.java:49)
> at org.apache.kylin.query.exec.QueryEngineFactory.compute(QueryEngineFactory.java:47)
> at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:41)
> at Baz.bind(Unknown Source)
> at org.apache.calcite.jdbc.CalcitePrepare$CalciteSignature.enumerable(CalcitePrepare.java:365)
> at org.apache.calcite.jdbc.CalciteConnectionImpl.enumerable(CalciteConnectionImpl.java:301)
> at org.apache.calcite.jdbc.CalciteMetaImpl._createIterable(CalciteMetaImpl.java:559)
> at org.apache.calcite.jdbc.CalciteMetaImpl.createIterable(CalciteMetaImpl.java:550)
> at org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:182)
> at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:67)
> at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
> at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
> at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:619)
> at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
> at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
> at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:227)
> at org.apache.kylin.rest.service.QueryService.executeRequest(QueryService.java:1026)
> at org.apache.kylin.rest.service.QueryService.queryWithSqlMassage(QueryService.java:710)
> at org.apache.kylin.rest.service.QueryService.query(QueryService.java:221)
> at org.apache.kylin.rest.service.QueryService.queryAndUpdateCache(QueryService.java:514)
> at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:469)
> ... 78 more
> 2021-09-09 20:06:35,790 INFO [FetcherRunner 1341171673-29] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 2 already succeed, 0 error, 0 discarded, 0 others
> {code}
> The web UI shows the following:
> !image-2021-09-09-20-11-08-870.png!
>
> The Spark version in my environment is clearly spark-3.1.2-bin-hadoop3.2,
> so why does Kylin report Spark version 2.0.0?
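One possible lead: Spark derives `org.apache.spark.SPARK_VERSION` from a `spark-version-info.properties` file bundled inside spark-core, and the trace shows `KylinReflectUtils$.getSessionState` rejecting whatever version string it reads. If any other jar on Kylin's query-server classpath also bundles that properties file (from an old Spark 2.0.0 build), the first copy the class loader finds wins, which would explain the 2.0.0 report despite a 3.1.2 install. The standalone probe below (a hypothetical helper, not part of Kylin) lists every classpath location that supplies the file:

```java
import java.net.URL;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class SparkVersionProbe {

    // Return every classpath location that supplies the given resource,
    // in the order the class loader would consult them.
    public static List<String> findResourceProviders(String resource) throws Exception {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        List<String> hits = new ArrayList<>();
        for (URL url : Collections.list(cl.getResources(resource))) {
            hits.add(url.toString());
        }
        return hits;
    }

    public static void main(String[] args) throws Exception {
        // Spark bakes its version string into this file inside spark-core;
        // if more than one jar provides it, the first hit decides what
        // SPARK_VERSION reports at runtime.
        for (String hit : findResourceProviders("spark-version-info.properties")) {
            System.out.println(hit);
        }
    }
}
```

Running this with the same classpath the Kylin query server uses should reveal whether a jar other than spark-core_2.12-3.1.2.jar still carries a 2.0.0 version file.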
--
This message was sent by Atlassian Jira
(v8.3.4#803005)