[
https://issues.apache.org/jira/browse/KYLIN-5087?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17413077#comment-17413077
]
曹勇 commented on KYLIN-5087:
---------------------------
I then deployed Kylin on the YARN ResourceManager node.
The running log is:
{code:java}
2021-09-10 14:06:17,101 INFO [http-bio-7070-exec-3] service.QueryService:404 :
Check query permission in 0 ms.
2021-09-10 14:06:17,106 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
service.QueryService:443 : Using project: learn_kylin
2021-09-10 14:06:17,107 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
service.QueryService:444 : The original query: select part_dt, sum(price) as
total_selled, count(distinct seller_id) as sellers from kylin_sales group by
part_dt order by part_dt
2021-09-10 14:06:17,128 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
service.QueryService:690 : The corrected query: select * from (select part_dt,
sum(price) as total_selled, count(distinct seller_id) as sellers from
kylin_sales group by part_dt order by part_dt) limit 50000
2021-09-10 14:06:18,287 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
common.KylinConfig:493 : Creating new manager instance of class
org.apache.kylin.metadata.acl.TableACLManager
2021-09-10 14:06:18,287 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
acl.TableACLManager:58 : Initializing TableACLManager with config
kylin_metadata@jdbc,url=jdbc:mysql://10.110.147.229:3306/kylin,username=hadoop,password=hadoop1234,maxActive=10,maxIdle=10,driverClassName=com.mysql.cj.jdbc.Driver
2021-09-10 14:06:18,288 DEBUG [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
cachesync.CachedCrudAssist:122 : Reloading TableACL from
kylin_metadata(key='/table_acl')@kylin_metadata@jdbc,url=jdbc:mysql://10.110.147.229:3306/kylin,username=hadoop,password=hadoop1234,maxActive=10,maxIdle=10,driverClassName=com.mysql.cj.jdbc.Driver
2021-09-10 14:06:18,295 DEBUG [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
cachesync.CachedCrudAssist:155 : Loaded 3 TableACL(s) out of 3 resource with 0
errors
2021-09-10 14:06:18,301 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
routing.RealizationChooser:189 : Force choose DataModelDesc
[name=kylin_sales_model] as selected model for specific purpose.
2021-09-10 14:06:18,304 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
routing.QueryRouter:84 : Find candidates by table DEFAULT.KYLIN_SALES and
project=learn_kylin : CUBE[name=kylin_sales_cube]
2021-09-10 14:06:18,304 DEBUG [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
routing.QueryRouter:51 : Applying rule: class
org.apache.kylin.query.routing.rules.RemoveBlackoutRealizationsRule,
realizations before: [CUBE[name=kylin_sales_cube]], realizations after:
[CUBE[name=kylin_sales_cube]]
2021-09-10 14:06:18,307 DEBUG [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
routing.QueryRouter:51 : Applying rule: class
org.apache.kylin.query.routing.rules.RemoveUncapableRealizationsRule,
realizations before: [CUBE[name=kylin_sales_cube]], realizations after:
[CUBE[name=kylin_sales_cube]]
2021-09-10 14:06:18,307 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
rules.RealizationSortRule:40 : CUBE[name=kylin_sales_cube] priority 1 cost 838.
2021-09-10 14:06:18,307 DEBUG [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
routing.QueryRouter:51 : Applying rule: class
org.apache.kylin.query.routing.rules.RealizationSortRule, realizations before:
[CUBE[name=kylin_sales_cube]], realizations after: [CUBE[name=kylin_sales_cube]]
2021-09-10 14:06:18,308 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
routing.QueryRouter:101 : The realizations remaining:
[CUBE[name=kylin_sales_cube]],and the final chosen one for current olap context
0 is CUBE[name=kylin_sales_cube]
2021-09-10 14:06:18,459 DEBUG [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
common.QueryContext:319 : Cannot find CubeSegmentStatisticsResult for context 0
2021-09-10 14:06:18,475 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
common.KylinConfig:493 : Creating new manager instance of class
org.apache.kylin.cube.cuboid.CuboidManager
2021-09-10 14:06:18,804 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
sql.SparderContext:57 : Current thread 37 create a SparkSession.
2021-09-10 14:06:18,808 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
sql.SparderContext:57 : Init spark.
2021-09-10 14:06:18,811 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
sql.SparderContext:57 : Initializing Spark thread starting.
2021-09-10 14:06:18,812 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
sql.SparderContext:57 : Initializing Spark, waiting for done.
2021-09-10 14:06:18,814 INFO [Thread-7] sql.SparderContext:57 : SparderContext
deploy with spark master: yarn
2021-09-10 14:06:18,881 INFO [Thread-7] conf.HiveConf:181 : Found
configuration file file:/home/hadoop/hive-2.3.9/conf/hive-site.xml
2021-09-10 14:06:20,889 WARN [Thread-7] util.NativeCodeLoader:60 : Unable to
load native-hadoop library for your platform... using builtin-java classes
where applicable
2021-09-10 14:06:20,946 DEBUG [Thread-7] common.KylinConfig:363 : KYLIN_CONF
property was not set, will seek KYLIN_HOME env variable
2021-09-10 14:06:20,946 INFO [Thread-7] common.KylinConfig:369 : Use
KYLIN_HOME=/home/hadoop/kylin-4.0.0
2021-09-10 14:06:20,946 DEBUG [Thread-7] common.KylinConfig:363 : KYLIN_CONF
property was not set, will seek KYLIN_HOME env variable
2021-09-10 14:06:20,946 INFO [Thread-7] common.KylinConfig:369 : Use
KYLIN_HOME=/home/hadoop/kylin-4.0.0
2021-09-10 14:06:20,974 INFO [Thread-7] spark.SparkContext:57 : Running Spark
version 3.1.2
2021-09-10 14:06:21,023 INFO [Thread-7] resource.ResourceUtils:57 :
==============================================================
2021-09-10 14:06:21,023 INFO [Thread-7] resource.ResourceUtils:57 : No custom
resources configured for spark.driver.
2021-09-10 14:06:21,024 INFO [Thread-7] resource.ResourceUtils:57 :
==============================================================
2021-09-10 14:06:21,024 INFO [Thread-7] spark.SparkContext:57 : Submitted
application: sparder_on_node1-7070
2021-09-10 14:06:21,044 INFO [Thread-7] resource.ResourceProfile:57 : Default
ResourceProfile created, executor resources: Map(memoryOverhead -> name:
memoryOverhead, amount: 1024, script: , vendor: , cores -> name: cores, amount:
1, script: , vendor: , memory -> name: memory, amount: 4096, script: , vendor:
, offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources:
Map(cpus -> name: cpus, amount: 1.0)
2021-09-10 14:06:21,059 INFO [Thread-7] resource.ResourceProfile:57 : Limiting
resource is cpus at 1 tasks per executor
2021-09-10 14:06:21,061 INFO [Thread-7] resource.ResourceProfileManager:57 :
Added ResourceProfile id: 0
2021-09-10 14:06:21,114 INFO [Thread-7] spark.SecurityManager:57 : Changing
view acls to: hadoop
2021-09-10 14:06:21,114 INFO [Thread-7] spark.SecurityManager:57 : Changing
modify acls to: hadoop
2021-09-10 14:06:21,115 INFO [Thread-7] spark.SecurityManager:57 : Changing
view acls groups to:
2021-09-10 14:06:21,115 INFO [Thread-7] spark.SecurityManager:57 : Changing
modify acls groups to:
2021-09-10 14:06:21,116 INFO [Thread-7] spark.SecurityManager:57 :
SecurityManager: authentication disabled; ui acls disabled; users with view
permissions: Set(hadoop); groups with view permissions: Set(); users with
modify permissions: Set(hadoop); groups with modify permissions: Set()
2021-09-10 14:06:21,384 INFO [Thread-7] util.Utils:57 : Successfully started
service 'sparkDriver' on port 41418.
2021-09-10 14:06:21,415 INFO [Thread-7] spark.SparkEnv:57 : Registering
MapOutputTracker
2021-09-10 14:06:21,446 INFO [Thread-7] spark.SparkEnv:57 : Registering
BlockManagerMaster
2021-09-10 14:06:21,467 INFO [Thread-7] storage.BlockManagerMasterEndpoint:57
: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology
information
2021-09-10 14:06:21,468 INFO [Thread-7] storage.BlockManagerMasterEndpoint:57
: BlockManagerMasterEndpoint up
2021-09-10 14:06:21,511 INFO [Thread-7] spark.SparkEnv:57 : Registering
BlockManagerMasterHeartbeat
2021-09-10 14:06:21,533 INFO [Thread-7] storage.DiskBlockManager:57 : Created
local directory at
/home/hadoop/kylin-4.0.0/tomcat/temp/blockmgr-56a37a93-ee6f-4ef2-ac0c-d40bd386e363
2021-09-10 14:06:21,578 INFO [Thread-7] memory.MemoryStore:57 : MemoryStore
started with capacity 2004.6 MiB
2021-09-10 14:06:21,623 INFO [Thread-7] spark.SparkEnv:57 : Registering
OutputCommitCoordinator
2021-09-10 14:06:21,728 INFO [Thread-7] util.log:169 : Logging initialized
@59637ms to org.sparkproject.jetty.util.log.Slf4jLog
2021-09-10 14:06:21,797 INFO [Thread-7] server.Server:375 :
jetty-9.4.40.v20210413; built: 2021-04-13T20:42:42.668Z; git:
b881a572662e1943a14ae12e7e1207989f218b74; jvm 1.8.0_302-b08
2021-09-10 14:06:21,824 INFO [Thread-7] server.Server:415 : Started @59734ms
2021-09-10 14:06:21,860 INFO [Thread-7] server.AbstractConnector:331 : Started
ServerConnector@685c5052{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
2021-09-10 14:06:21,860 INFO [Thread-7] util.Utils:57 : Successfully started
service 'SparkUI' on port 4040.
2021-09-10 14:06:21,884 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@3099f074{/jobs,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,887 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@51464c0{/jobs/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,887 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@299f7ab4{/jobs/job,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,888 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@3a6c319d{/jobs/job/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,889 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@7eb7ed0e{/stages,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,889 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@7a0ec6c8{/stages/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,890 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@6c2f3699{/stages/stage,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,891 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@5d698613{/stages/stage/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,891 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@74e4adaa{/stages/pool,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,892 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@126ffa9d{/stages/pool/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,892 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@7be77dff{/storage,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,893 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@28c70647{/storage/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,894 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@3a853de7{/storage/rdd,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,894 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@6202b20f{/storage/rdd/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,895 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@66dff3c2{/environment,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,896 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@582ddba8{/environment/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,896 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@867101{/executors,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,897 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@5dd7cd72{/executors/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,897 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@1db717f{/executors/threadDump,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,899 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@42a7795{/executors/threadDump/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,910 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@5cc612ac{/static,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,911 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@197de55f{/,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,912 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@61c1285e{/api,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,912 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@518b12b0{/jobs/job/kill,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,913 INFO [Thread-7] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@288df71f{/stages/stage/kill,null,AVAILABLE,@Spark}
2021-09-10 14:06:21,915 INFO [Thread-7] ui.SparkUI:57 : Bound SparkUI to
0.0.0.0, and started at http://node1:4040
2021-09-10 14:06:21,986 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Creating Fair Scheduler pools from
/home/hadoop/kylin-4.0.0/conf/fairscheduler.xml
2021-09-10 14:06:22,006 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Created pool: query_pushdown, schedulingMode: FAIR, minShare: 0, weight: 1
2021-09-10 14:06:22,007 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Created pool: heavy_tasks, schedulingMode: FAIR, minShare: 1, weight: 5
2021-09-10 14:06:22,007 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Created pool: lightweight_tasks, schedulingMode: FAIR, minShare: 1, weight: 10
2021-09-10 14:06:22,008 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Created pool: vip_tasks, schedulingMode: FAIR, minShare: 0, weight: 15
2021-09-10 14:06:22,009 INFO [Thread-7] scheduler.FairSchedulableBuilder:57 :
Created default pool: default, schedulingMode: FIFO, minShare: 0, weight: 1
2021-09-10 14:06:22,041 WARN [Thread-7] config.package:422 : Can not load the
default value of `spark.yarn.isHadoopProvided` from
`org/apache/spark/deploy/yarn/config.properties` with error,
java.lang.NullPointerException. Using `false` as a default value.
2021-09-10 14:06:22,141 INFO [Thread-7] client.RMProxy:134 : Connecting to
ResourceManager at /0.0.0.0:8032
2021-09-10 14:06:22,557 INFO [Thread-7] yarn.Client:57 : Requesting a new
application from cluster with 3 NodeManagers
2021-09-10 14:06:22,712 INFO [Thread-7] conf.Configuration:2795 :
resource-types.xml not found
2021-09-10 14:06:22,713 INFO [Thread-7] resource.ResourceUtils:442 : Unable to
find 'resource-types.xml'.
2021-09-10 14:06:22,726 INFO [Thread-7] yarn.Client:57 : Verifying our
application has not requested more than the maximum memory capability of the
cluster (8192 MB per container)
2021-09-10 14:06:22,726 INFO [Thread-7] yarn.Client:57 : Will allocate AM
container, with 896 MB memory including 384 MB overhead
2021-09-10 14:06:22,727 INFO [Thread-7] yarn.Client:57 : Setting up container
launch context for our AM
2021-09-10 14:06:22,727 INFO [Thread-7] yarn.Client:57 : Setting up the launch
environment for our AM container
2021-09-10 14:06:22,732 INFO [Thread-7] yarn.Client:57 : Preparing resources
for our AM container
2021-09-10 14:06:22,787 WARN [Thread-7] yarn.Client:69 : Neither
spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading
libraries under SPARK_HOME.
2021-09-10 14:06:24,627 INFO [Thread-7] yarn.Client:57 : Uploading resource
file:/home/hadoop/kylin-4.0.0/tomcat/temp/spark-73ca4639-8305-4d10-b8a6-7d81cdaebbe4/__spark_libs__8605010036484437165.zip
->
file:/home/hadoop/.sparkStaging/application_1631008247268_0022/__spark_libs__8605010036484437165.zip
2021-09-10 14:06:25,095 INFO [Thread-7] yarn.Client:57 : Uploading resource
file:/home/hadoop/kylin-4.0.0/lib/kylin-parquet-job-4.0.0.jar ->
file:/home/hadoop/.sparkStaging/application_1631008247268_0022/kylin-parquet-job-4.0.0.jar
2021-09-10 14:06:25,262 INFO [Thread-7] yarn.Client:57 : Uploading resource
file:/home/hadoop/kylin-4.0.0/conf/spark-executor-log4j.properties ->
file:/home/hadoop/.sparkStaging/application_1631008247268_0022/spark-executor-log4j.properties
2021-09-10 14:06:25,417 INFO [Thread-7] yarn.Client:57 : Uploading resource
file:/home/hadoop/kylin-4.0.0/tomcat/temp/spark-73ca4639-8305-4d10-b8a6-7d81cdaebbe4/__spark_conf__2711509430408003218.zip
->
file:/home/hadoop/.sparkStaging/application_1631008247268_0022/__spark_conf__.zip
2021-09-10 14:06:25,501 INFO [Thread-7] spark.SecurityManager:57 : Changing
view acls to: hadoop
2021-09-10 14:06:25,501 INFO [Thread-7] spark.SecurityManager:57 : Changing
modify acls to: hadoop
2021-09-10 14:06:25,501 INFO [Thread-7] spark.SecurityManager:57 : Changing
view acls groups to:
2021-09-10 14:06:25,502 INFO [Thread-7] spark.SecurityManager:57 : Changing
modify acls groups to:
2021-09-10 14:06:25,502 INFO [Thread-7] spark.SecurityManager:57 :
SecurityManager: authentication disabled; ui acls disabled; users with view
permissions: Set(hadoop); groups with view permissions: Set(); users with
modify permissions: Set(hadoop); groups with modify permissions: Set()
2021-09-10 14:06:25,527 INFO [Thread-7] yarn.Client:57 : Submitting
application application_1631008247268_0022 to ResourceManager
2021-09-10 14:06:25,794 INFO [Thread-7] impl.YarnClientImpl:329 : Submitted
application application_1631008247268_0022
2021-09-10 14:06:26,804 INFO [Thread-7] yarn.Client:57 : Application report
for application_1631008247268_0022 (state: ACCEPTED)
2021-09-10 14:06:26,812 INFO [Thread-7] yarn.Client:57 :
client token: N/A
diagnostics: [Fri Sep 10 14:06:26 +0800 2021] Application is Activated,
waiting for resources to be assigned for AM. Details : AM Partition =
<DEFAULT_PARTITION> ; Partition Resource = <memory:76800, vCores:24> ; Queue's
Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 10.666667 % ;
Queue's Absolute max capacity = 100.0 % ; Queue's capacity (absolute resource)
= <memory:76800, vCores:24> ; Queue's used capacity (absolute resource) =
<memory:8192, vCores:2> ; Queue's max capacity (absolute resource) =
<memory:76800, vCores:24> ;
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1631253985550
final status: UNDEFINED
tracking URL: http://node1:8088/proxy/application_1631008247268_0022/
user: hadoop
2021-09-10 14:06:27,816 INFO [Thread-7] yarn.Client:57 : Application report
for application_1631008247268_0022 (state: FAILED)
2021-09-10 14:06:27,818 INFO [Thread-7] yarn.Client:57 :
client token: N/A
diagnostics: Application application_1631008247268_0022 failed 2 times
due to AM Container for appattempt_1631008247268_0022_000002 exited with
exitCode: -1000
Failing this attempt.Diagnostics: [2021-09-10 14:06:27.176]File
file:/home/hadoop/.sparkStaging/application_1631008247268_0022/__spark_libs__8605010036484437165.zip
does not exist
java.io.FileNotFoundException: File
file:/home/hadoop/.sparkStaging/application_1631008247268_0022/__spark_libs__8605010036484437165.zip
does not exist
at
org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:668)
at
org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:989)
at
org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:658)
at
org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:458)
at
org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:270)
at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:68)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:418)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:415)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:415)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:246)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:239)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:227)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)For more detailed output, check
the application tracking page:
http://node1:8188/applicationhistory/app/application_1631008247268_0022 Then
click on links to logs of each attempt.
. Failing the application.
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1631253985550
final status: FAILED
tracking URL:
http://node1:8188/applicationhistory/app/application_1631008247268_0022
user: hadoop
2021-09-10 14:06:27,867 INFO [Thread-7] yarn.Client:57 : Deleted staging
directory file:/home/hadoop/.sparkStaging/application_1631008247268_0022
2021-09-10 14:06:27,868 ERROR [Thread-7] cluster.YarnClientSchedulerBackend:73
: The YARN application has already ended! It might have been killed or the
Application Master may have failed to start. Check the YARN application logs
for more details.
2021-09-10 14:06:27,870 ERROR [Thread-7] spark.SparkContext:94 : Error
initializing SparkContext.
org.apache.spark.SparkException: Application application_1631008247268_0022
failed 2 times due to AM Container for appattempt_1631008247268_0022_000002
exited with exitCode: -1000
Failing this attempt.Diagnostics: [2021-09-10 14:06:27.176]File
file:/home/hadoop/.sparkStaging/application_1631008247268_0022/__spark_libs__8605010036484437165.zip
does not exist
java.io.FileNotFoundException: File
file:/home/hadoop/.sparkStaging/application_1631008247268_0022/__spark_libs__8605010036484437165.zip
does not exist
at
org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:668)
at
org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:989)
at
org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:658)
at
org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:458)
at
org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:270)
at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:68)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:418)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:415)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:415)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:246)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:239)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:227)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)For more detailed output, check
the application tracking page:
http://node1:8188/applicationhistory/app/application_1631008247268_0022 Then
click on links to logs of each attempt.
. Failing the application.
at
org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:97)
at
org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:64)
at
org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2672)
at
org.apache.spark.sql.KylinSession$KylinBuilder.$anonfun$getOrCreateKylinSession$2(KylinSession.scala:102)
at scala.Option.getOrElse(Option.scala:189)
at
org.apache.spark.sql.KylinSession$KylinBuilder.getOrCreateKylinSession(KylinSession.scala:96)
at
org.apache.spark.sql.SparderContext$$anon$1.run(SparderContext.scala:146)
at java.lang.Thread.run(Thread.java:748)
2021-09-10 14:06:27,882 INFO [Thread-7] server.AbstractConnector:381 : Stopped
Spark@685c5052{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
2021-09-10 14:06:27,883 INFO [Thread-7] ui.SparkUI:57 : Stopped Spark web UI
at http://node1:4040
2021-09-10 14:06:27,898 WARN [dispatcher-event-loop-0]
cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:69 : Attempted to request
executors before the AM has registered!
2021-09-10 14:06:27,901 INFO [Thread-7] cluster.YarnClientSchedulerBackend:57
: Shutting down all executors
2021-09-10 14:06:27,907 INFO [dispatcher-CoarseGrainedScheduler]
cluster.YarnSchedulerBackend$YarnDriverEndpoint:57 : Asking each executor to
shut down
2021-09-10 14:06:27,912 INFO [Thread-7] cluster.YarnClientSchedulerBackend:57
: YARN client scheduler backend Stopped
2021-09-10 14:06:27,924 INFO [dispatcher-event-loop-2]
spark.MapOutputTrackerMasterEndpoint:57 : MapOutputTrackerMasterEndpoint
stopped!
2021-09-10 14:06:27,939 INFO [Thread-7] memory.MemoryStore:57 : MemoryStore
cleared
2021-09-10 14:06:27,939 INFO [Thread-7] storage.BlockManager:57 : BlockManager
stopped
2021-09-10 14:06:27,949 INFO [Thread-7] storage.BlockManagerMaster:57 :
BlockManagerMaster stopped
2021-09-10 14:06:27,949 WARN [Thread-7] metrics.MetricsSystem:69 : Stopping a
MetricsSystem that is not running
2021-09-10 14:06:27,953 INFO [dispatcher-event-loop-6]
scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:57 :
OutputCommitCoordinator stopped!
2021-09-10 14:06:27,975 INFO [Thread-7] spark.SparkContext:57 : Successfully
stopped SparkContext
2021-09-10 14:06:27,976 ERROR [Thread-7] sql.SparderContext:94 : Error for
initializing spark
org.apache.spark.SparkException: Application application_1631008247268_0022
failed 2 times due to AM Container for appattempt_1631008247268_0022_000002
exited with exitCode: -1000
Failing this attempt.Diagnostics: [2021-09-10 14:06:27.176]File
file:/home/hadoop/.sparkStaging/application_1631008247268_0022/__spark_libs__8605010036484437165.zip
does not exist
java.io.FileNotFoundException: File
file:/home/hadoop/.sparkStaging/application_1631008247268_0022/__spark_libs__8605010036484437165.zip
does not exist
at
org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:668)
at
org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:989)
at
org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:658)
at
org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:458)
at
org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:270)
at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:68)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:418)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:415)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:415)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:246)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:239)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:227)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)For more detailed output, check
the application tracking page:
http://node1:8188/applicationhistory/app/application_1631008247268_0022 Then
click on links to logs of each attempt.
. Failing the application.
at
org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:97)
at
org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:64)
at
org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2672)
at
org.apache.spark.sql.KylinSession$KylinBuilder.$anonfun$getOrCreateKylinSession$2(KylinSession.scala:102)
at scala.Option.getOrElse(Option.scala:189)
at
org.apache.spark.sql.KylinSession$KylinBuilder.getOrCreateKylinSession(KylinSession.scala:96)
at
org.apache.spark.sql.SparderContext$$anon$1.run(SparderContext.scala:146)
at java.lang.Thread.run(Thread.java:748)
2021-09-10 14:06:27,978 INFO [Thread-7] sql.SparderContext:57 : Setting
initializing Spark thread to null.
2021-09-10 14:06:27,981 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
monitor.SparderContextCanary:68 : Start monitoring Sparder
2021-09-10 14:06:27,986 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
sql.SparderContext:57 : Init spark.
2021-09-10 14:06:27,987 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
sql.SparderContext:57 : Initializing Spark thread starting.
2021-09-10 14:06:27,987 INFO [Query 486c8f55-fb58-c50e-59b0-f67e65a3de7e-37]
sql.SparderContext:57 : Initializing Spark, waiting for done.
2021-09-10 14:06:27,988 INFO [Thread-47] sql.SparderContext:57 :
SparderContext deploy with spark master: yarn
2021-09-10 14:06:27,997 DEBUG [Thread-47] common.KylinConfig:363 : KYLIN_CONF
property was not set, will seek KYLIN_HOME env variable
2021-09-10 14:06:27,998 INFO [Thread-47] common.KylinConfig:369 : Use
KYLIN_HOME=/home/hadoop/kylin-4.0.0
2021-09-10 14:06:27,998 DEBUG [Thread-47] common.KylinConfig:363 : KYLIN_CONF
property was not set, will seek KYLIN_HOME env variable
2021-09-10 14:06:27,998 INFO [Thread-47] common.KylinConfig:369 : Use
KYLIN_HOME=/home/hadoop/kylin-4.0.0
2021-09-10 14:06:28,001 WARN [Thread-47] spark.SparkContext:69 : Another
SparkContext is being constructed (or threw an exception in its constructor).
This may indicate an error, since only one SparkContext should be running in
this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.sql.SparderContext$$anon$1.run(SparderContext.scala:146)
java.lang.Thread.run(Thread.java:748)
2021-09-10 14:06:28,002 INFO [Thread-47] spark.SparkContext:57 : Running Spark
version 3.1.2
2021-09-10 14:06:28,004 INFO [Thread-47] resource.ResourceUtils:57 :
==============================================================
2021-09-10 14:06:28,004 INFO [Thread-47] resource.ResourceUtils:57 : No custom
resources configured for spark.driver.
2021-09-10 14:06:28,004 INFO [Thread-47] resource.ResourceUtils:57 :
==============================================================
2021-09-10 14:06:28,004 INFO [Thread-47] spark.SparkContext:57 : Submitted
application: sparder_on_node1-7070
2021-09-10 14:06:28,006 INFO [Thread-47] resource.ResourceProfile:57 : Default
ResourceProfile created, executor resources: Map(memoryOverhead -> name:
memoryOverhead, amount: 1024, script: , vendor: , cores -> name: cores, amount:
1, script: , vendor: , memory -> name: memory, amount: 4096, script: , vendor:
, offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources:
Map(cpus -> name: cpus, amount: 1.0)
2021-09-10 14:06:28,007 INFO [Thread-47] resource.ResourceProfile:57 :
Limiting resource is cpus at 1 tasks per executor
2021-09-10 14:06:28,007 INFO [Thread-47] resource.ResourceProfileManager:57 :
Added ResourceProfile id: 0
2021-09-10 14:06:28,008 INFO [Thread-47] spark.SecurityManager:57 : Changing
view acls to: hadoop
2021-09-10 14:06:28,009 INFO [Thread-47] spark.SecurityManager:57 : Changing
modify acls to: hadoop
2021-09-10 14:06:28,009 INFO [Thread-47] spark.SecurityManager:57 : Changing
view acls groups to:
2021-09-10 14:06:28,009 INFO [Thread-47] spark.SecurityManager:57 : Changing
modify acls groups to:
2021-09-10 14:06:28,009 INFO [Thread-47] spark.SecurityManager:57 :
SecurityManager: authentication disabled; ui acls disabled; users with view
permissions: Set(hadoop); groups with view permissions: Set(); users with
modify permissions: Set(hadoop); groups with modify permissions: Set()
2021-09-10 14:06:28,032 INFO [Thread-47] util.Utils:57 : Successfully started
service 'sparkDriver' on port 33082.
2021-09-10 14:06:28,036 INFO [Thread-47] spark.SparkEnv:57 : Registering
MapOutputTracker
2021-09-10 14:06:28,039 INFO [Thread-47] spark.SparkEnv:57 : Registering
BlockManagerMaster
2021-09-10 14:06:28,040 INFO [Thread-47] storage.BlockManagerMasterEndpoint:57
: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology
information
2021-09-10 14:06:28,040 INFO [Thread-47] storage.BlockManagerMasterEndpoint:57
: BlockManagerMasterEndpoint up
2021-09-10 14:06:28,073 INFO [Thread-47] spark.SparkEnv:57 : Registering
BlockManagerMasterHeartbeat
2021-09-10 14:06:28,075 INFO [Thread-47] storage.DiskBlockManager:57 : Created
local directory at
/home/hadoop/kylin-4.0.0/tomcat/temp/blockmgr-2aaeb6b8-0d0a-48f4-a824-3bd45810d854
2021-09-10 14:06:28,076 INFO [Thread-47] memory.MemoryStore:57 : MemoryStore
started with capacity 2004.6 MiB
2021-09-10 14:06:28,111 INFO [Thread-47] spark.SparkEnv:57 : Registering
OutputCommitCoordinator
2021-09-10 14:06:28,123 INFO [Thread-47] server.Server:375 :
jetty-9.4.40.v20210413; built: 2021-04-13T20:42:42.668Z; git:
b881a572662e1943a14ae12e7e1207989f218b74; jvm 1.8.0_302-b08
2021-09-10 14:06:28,125 INFO [Thread-47] server.Server:415 : Started @66035ms
2021-09-10 14:06:28,130 INFO [Thread-47] server.AbstractConnector:331 :
Started ServerConnector@121f6ee2{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
2021-09-10 14:06:28,131 INFO [Thread-47] util.Utils:57 : Successfully started
service 'SparkUI' on port 4040.
2021-09-10 14:06:28,132 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@1a76151b{/jobs,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,133 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@3869d2c1{/jobs/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,134 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@54ed0d44{/jobs/job,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,134 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@35e8ff90{/jobs/job/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,135 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@3f764975{/stages,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,135 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@ab37a0{/stages/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,136 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@61a1fb8d{/stages/stage,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,136 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@202be676{/stages/stage/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,137 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@2b18a335{/stages/pool,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,137 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@b82e487{/stages/pool/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,138 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@7a55dfc1{/storage,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,138 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@3ff2967e{/storage/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,139 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@29ce64cf{/storage/rdd,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,139 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@664acd9e{/storage/rdd/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,141 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@4907c262{/environment,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,142 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@33e29ace{/environment/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,142 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@7e285460{/executors,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,143 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@7e607ba5{/executors/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,143 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@17d6a113{/executors/threadDump,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,144 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@2e540714{/executors/threadDump/json,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,144 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@2cb37c45{/static,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,145 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@562d420a{/,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,145 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@13a3f9da{/api,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,146 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@d07c27{/jobs/job/kill,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,146 INFO [Thread-47] handler.ContextHandler:916 : Started
o.s.j.s.ServletContextHandler@522aaf07{/stages/stage/kill,null,AVAILABLE,@Spark}
2021-09-10 14:06:28,147 INFO [Thread-47] ui.SparkUI:57 : Bound SparkUI to
0.0.0.0, and started at http://node1:4040
2021-09-10 14:06:28,173 INFO [Thread-47] scheduler.FairSchedulableBuilder:57 :
Creating Fair Scheduler pools from
/home/hadoop/kylin-4.0.0/conf/fairscheduler.xml
2021-09-10 14:06:28,179 INFO [Thread-47] scheduler.FairSchedulableBuilder:57 :
Created pool: query_pushdown, schedulingMode: FAIR, minShare: 0, weight: 1
2021-09-10 14:06:28,180 INFO [Thread-47] scheduler.FairSchedulableBuilder:57 :
Created pool: heavy_tasks, schedulingMode: FAIR, minShare: 1, weight: 5
2021-09-10 14:06:28,180 INFO [Thread-47] scheduler.FairSchedulableBuilder:57 :
Created pool: lightweight_tasks, schedulingMode: FAIR, minShare: 1, weight: 10
2021-09-10 14:06:28,181 INFO [Thread-47] scheduler.FairSchedulableBuilder:57 :
Created pool: vip_tasks, schedulingMode: FAIR, minShare: 0, weight: 15
2021-09-10 14:06:28,181 INFO [Thread-47] scheduler.FairSchedulableBuilder:57 :
Created default pool: default, schedulingMode: FIFO, minShare: 0, weight: 1
2021-09-10 14:06:28,215 INFO [Thread-47] client.RMProxy:134 : Connecting to
ResourceManager at /0.0.0.0:8032
2021-09-10 14:06:28,224 INFO [Thread-47] yarn.Client:57 : Requesting a new
application from cluster with 3 NodeManagers
2021-09-10 14:06:28,230 INFO [Thread-47] yarn.Client:57 : Verifying our
application has not requested more than the maximum memory capability of the
cluster (8192 MB per container)
2021-09-10 14:06:28,231 INFO [Thread-47] yarn.Client:57 : Will allocate AM
container, with 896 MB memory including 384 MB overhead
2021-09-10 14:06:28,231 INFO [Thread-47] yarn.Client:57 : Setting up container
launch context for our AM
2021-09-10 14:06:28,231 INFO [Thread-47] yarn.Client:57 : Setting up the
launch environment for our AM container
2021-09-10 14:06:28,232 INFO [Thread-47] yarn.Client:57 : Preparing resources
for our AM container
2021-09-10 14:06:28,259 WARN [Thread-47] yarn.Client:69 : Neither
spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading
libraries under SPARK_HOME.
2021-09-10 14:06:28,575 DEBUG [BadQueryDetector] service.BadQueryDetector:148 :
Detect bad query.
2021-09-10 14:06:30,096 INFO [Thread-47] yarn.Client:57 : Uploading resource
file:/home/hadoop/kylin-4.0.0/tomcat/temp/spark-73ca4639-8305-4d10-b8a6-7d81cdaebbe4/__spark_libs__783599303237637597.zip
->
file:/home/hadoop/.sparkStaging/application_1631008247268_0023/__spark_libs__783599303237637597.zip
2021-09-10 14:06:30,473 INFO [Thread-47] yarn.Client:57 : Uploading resource
file:/home/hadoop/kylin-4.0.0/lib/kylin-parquet-job-4.0.0.jar ->
file:/home/hadoop/.sparkStaging/application_1631008247268_0023/kylin-parquet-job-4.0.0.jar
2021-09-10 14:06:30,636 INFO [Thread-47] yarn.Client:57 : Uploading resource
file:/home/hadoop/kylin-4.0.0/conf/spark-executor-log4j.properties ->
file:/home/hadoop/.sparkStaging/application_1631008247268_0023/spark-executor-log4j.properties
2021-09-10 14:06:30,729 INFO [Thread-47] yarn.Client:57 : Uploading resource
file:/home/hadoop/kylin-4.0.0/tomcat/temp/spark-73ca4639-8305-4d10-b8a6-7d81cdaebbe4/__spark_conf__9195544102719910017.zip
->
file:/home/hadoop/.sparkStaging/application_1631008247268_0023/__spark_conf__.zip
2021-09-10 14:06:30,792 INFO [Thread-47] spark.SecurityManager:57 : Changing
view acls to: hadoop
2021-09-10 14:06:30,792 INFO [Thread-47] spark.SecurityManager:57 : Changing
modify acls to: hadoop
2021-09-10 14:06:30,792 INFO [Thread-47] spark.SecurityManager:57 : Changing
view acls groups to:
2021-09-10 14:06:30,793 INFO [Thread-47] spark.SecurityManager:57 : Changing
modify acls groups to:
2021-09-10 14:06:30,793 INFO [Thread-47] spark.SecurityManager:57 :
SecurityManager: authentication disabled; ui acls disabled; users with view
permissions: Set(hadoop); groups with view permissions: Set(); users with
modify permissions: Set(hadoop); groups with modify permissions: Set()
2021-09-10 14:06:30,817 INFO [Thread-47] yarn.Client:57 : Submitting
application application_1631008247268_0023 to ResourceManager
2021-09-10 14:06:31,031 INFO [Thread-47] impl.YarnClientImpl:329 : Submitted
application application_1631008247268_0023
2021-09-10 14:06:31,436 INFO [FetcherRunner 1297812949-29]
threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual
running, 0 stopped, 0 ready, 2 already succeed, 0 error, 0 discarded, 0 others
2021-09-10 14:06:32,038 INFO [Thread-47] yarn.Client:57 : Application report
for application_1631008247268_0023 (state: ACCEPTED)
2021-09-10 14:06:32,039 INFO [Thread-47] yarn.Client:57 :
client token: N/A
diagnostics: [Fri Sep 10 14:06:31 +0800 2021] Application is Activated,
waiting for resources to be assigned for AM. Details : AM Partition =
<DEFAULT_PARTITION> ; Partition Resource = <memory:76800, vCores:24> ; Queue's
Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 10.666667 % ;
Queue's Absolute max capacity = 100.0 % ; Queue's capacity (absolute resource)
= <memory:76800, vCores:24> ; Queue's used capacity (absolute resource) =
<memory:8192, vCores:2> ; Queue's max capacity (absolute resource) =
<memory:76800, vCores:24> ;
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1631253990821
final status: UNDEFINED
tracking URL: http://node1:8088/proxy/application_1631008247268_0023/
user: hadoop
2021-09-10 14:06:33,043 INFO [Thread-47] yarn.Client:57 : Application report
for application_1631008247268_0023 (state: FAILED)
2021-09-10 14:06:33,044 INFO [Thread-47] yarn.Client:57 :
client token: N/A
diagnostics: Application application_1631008247268_0023 failed 2 times
due to AM Container for appattempt_1631008247268_0023_000002 exited with
exitCode: -1000
Failing this attempt.Diagnostics: [2021-09-10 14:06:32.286]File
file:/home/hadoop/.sparkStaging/application_1631008247268_0023/__spark_libs__783599303237637597.zip
does not exist
java.io.FileNotFoundException: File
file:/home/hadoop/.sparkStaging/application_1631008247268_0023/__spark_libs__783599303237637597.zip
does not exist
at
org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:668)
at
org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:989)
at
org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:658)
at
org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:458)
at
org.apache.hadoop.yarn.util.FSDownload.verifyAndCopy(FSDownload.java:270)
at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:68)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:418)
at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:415)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762)
at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:415)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.doDownloadCall(ContainerLocalizer.java:246)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:239)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer$FSDownloadWrapper.call(ContainerLocalizer.java:227)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)For more detailed output, check
the application tracking page:
http://node1:8188/applicationhistory/app/application_1631008247268_0023 Then
click on links to logs of each attempt.
. Failing the application.
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1631253990821
final status: FAILED
tracking URL:
http://node1:8188/applicationhistory/app/application_1631008247268_0023
user: hadoop
2021-09-10 14:06:33,096 INFO [Thread-47] yarn.Client:57 : Deleted staging
directory file:/home/hadoop/.sparkStaging/application_1631008247268_0023
2021-09-10 14:06:33,096 ERROR [Thread-47] cluster.YarnClientSchedulerBackend:73
: The YARN application has already ended! It might have been killed or the
Application Master may have failed to start. Check the YARN application logs
for more details.
2021-09-10 14:06:33,096 ERROR [Thread-47] spark.SparkContext:94 : Error
initializing SparkContext.
org.apache.spark.SparkException: Application application_1631008247268_0023
failed 2 times due to AM Container for appattempt_1631008247268_0023_000002
exited with exitCode: -1000
Failing this attempt.Diagnostics: [2021-09-10 14:06:32.286]File
file:/home/hadoop/.sparkStaging/application_1631008247268_0023/__spark_libs__783599303237637597.zip
does not exist
java.io.FileNotFoundException: File
file:/home/hadoop/.sparkStaging/application_1631008247268_0023/__spark_libs__783599303237637597.zip
does not exist
at
org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:668)
at
org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:989)
at
org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:658)
{code}
At this point, I honestly do not know what is causing the problem.
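One difference I notice between the two runs: in the failing run the Spark staging resources are uploaded to `file:/home/hadoop/.sparkStaging/application_1631008247268_0023/...`, whereas the earlier run uploaded them to `hdfs://ns1/user/hadoop/.sparkStaging/...`. Since the NodeManagers cannot read the driver's local filesystem, the containers then fail with the `FileNotFoundException` above. That pattern usually means the Kylin process on the ResourceManager node is not picking up `fs.defaultFS` from `core-site.xml`. A hedged sketch of what might be worth checking; the archive path and property values below are assumptions for this environment, not settings taken from the logs:

{code:java}
# kylin.properties on the ResourceManager node (hypothetical values).
# Point the sparder/query Spark at a jar archive that already lives on HDFS,
# so localization does not depend on the driver's local filesystem:
kylin.query.spark-conf.spark.yarn.archive=hdfs://ns1/spark/spark-libs.zip

# Also verify on this node that core-site.xml sets
#   fs.defaultFS=hdfs://ns1
# and that HADOOP_CONF_DIR points at that configuration directory
# in the environment that starts Kylin.
{code}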
> java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> --------------------------------------------------------------------------
>
> Key: KYLIN-5087
> URL: https://issues.apache.org/jira/browse/KYLIN-5087
> Project: Kylin
> Issue Type: Bug
> Components: Environment , Integration, Query Engine, Spark Engine,
> Web
> Affects Versions: v4.0.0
> Environment: hadoop-3.2.2
> hive-2.3.9
> kylin-4.0.0
> scala-2.12.14
> spark-3.1.2-bin-hadoop3.2
> zookeeper-3.7.0
> Reporter: 曹勇
> Priority: Major
> Attachments: image-2021-09-09-20-11-08-870.png,
> image-2021-09-10-16-12-45-012.png, image-2021-09-10-16-38-32-279.png
>
>
> When I run the example SQL `select part_dt, sum(price) as total_selled,
> count(distinct seller_id) as sellers from kylin_sales group by part_dt order
> by part_dt` against kylin_sales_cube from the documentation, this issue occurred:
> {code:java}
> 2021-09-09 20:06:10,063 WARN [Thread-7] config.package:422 : Can not load
> the default value of `spark.yarn.isHadoopProvided` from
> `org/apache/spark/deploy/yarn/config.properties` with error,
> java.lang.NullPointerException. Using `false` as a default value.
> 2021-09-09 20:06:10,482 INFO [Thread-7] client.AHSProxy:42 : Connecting to
> Application History server at node1/192.168.111.49:10200
> 2021-09-09 20:06:10,601 INFO [Thread-7] yarn.Client:57 : Requesting a new
> application from cluster with 3 NodeManagers
> 2021-09-09 20:06:11,337 INFO [Thread-7] conf.Configuration:2795 :
> resource-types.xml not found
> 2021-09-09 20:06:11,338 INFO [Thread-7] resource.ResourceUtils:442 : Unable
> to find 'resource-types.xml'.
> 2021-09-09 20:06:11,350 INFO [Thread-7] yarn.Client:57 : Verifying our
> application has not requested more than the maximum memory capability of the
> cluster (8192 MB per container)
> 2021-09-09 20:06:11,350 INFO [Thread-7] yarn.Client:57 : Will allocate AM
> container, with 896 MB memory including 384 MB overhead
> 2021-09-09 20:06:11,351 INFO [Thread-7] yarn.Client:57 : Setting up
> container launch context for our AM
> 2021-09-09 20:06:11,351 INFO [Thread-7] yarn.Client:57 : Setting up the
> launch environment for our AM container
> 2021-09-09 20:06:11,356 INFO [Thread-7] yarn.Client:57 : Preparing resources
> for our AM container
> 2021-09-09 20:06:11,434 WARN [Thread-7] yarn.Client:69 : Neither
> spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading
> libraries under SPARK_HOME.
> 2021-09-09 20:06:13,436 INFO [Thread-7] yarn.Client:57 : Uploading resource
> file:/home/hadoop/kylin-4.0.0/tomcat/temp/spark-d250f1d5-51aa-47d1-9aa5-b36d3b60d336/__spark_libs__3169881654091659892.zip
> ->
> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/__spark_libs__3169881654091659892.zip
> 2021-09-09 20:06:14,657 INFO [Thread-7] yarn.Client:57 : Uploading resource
> file:/home/hadoop/kylin-4.0.0/lib/kylin-parquet-job-4.0.0.jar ->
> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/kylin-parquet-job-4.0.0.jar
> 2021-09-09 20:06:15,058 INFO [Thread-7] yarn.Client:57 : Uploading resource
> file:/home/hadoop/kylin-4.0.0/conf/spark-executor-log4j.properties ->
> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/spark-executor-log4j.properties
> 2021-09-09 20:06:15,238 INFO [Thread-7] yarn.Client:57 : Uploading resource
> file:/home/hadoop/kylin-4.0.0/tomcat/temp/spark-d250f1d5-51aa-47d1-9aa5-b36d3b60d336/__spark_conf__4477603375905676389.zip
> ->
> hdfs://ns1/user/hadoop/.sparkStaging/application_1631008247268_0018/__spark_conf__.zip
> 2021-09-09 20:06:15,354 INFO [Thread-7] spark.SecurityManager:57 : Changing
> view acls to: hadoop
> 2021-09-09 20:06:15,354 INFO [Thread-7] spark.SecurityManager:57 : Changing
> modify acls to: hadoop
> 2021-09-09 20:06:15,354 INFO [Thread-7] spark.SecurityManager:57 : Changing
> view acls groups to:
> 2021-09-09 20:06:15,354 INFO [Thread-7] spark.SecurityManager:57 : Changing
> modify acls groups to:
> 2021-09-09 20:06:15,354 INFO [Thread-7] spark.SecurityManager:57 :
> SecurityManager: authentication disabled; ui acls disabled; users with view
> permissions: Set(hadoop); groups with view permissions: Set(); users with
> modify permissions: Set(hadoop); groups with modify permissions: Set()
> 2021-09-09 20:06:15,378 INFO [Thread-7] yarn.Client:57 : Submitting
> application application_1631008247268_0018 to ResourceManager
> 2021-09-09 20:06:15,637 INFO [Thread-7] impl.YarnClientImpl:329 : Submitted
> application application_1631008247268_0018
> 2021-09-09 20:06:16,646 INFO [Thread-7] yarn.Client:57 : Application report
> for application_1631008247268_0018 (state: ACCEPTED)
> 2021-09-09 20:06:16,656 INFO [Thread-7] yarn.Client:57 :
> client token: N/A
> diagnostics: AM container is launched, waiting for AM container to
> Register with RM
> ApplicationMaster host: N/A
> ApplicationMaster RPC port: -1
> queue: default
> start time: 1631189175391
> final status: UNDEFINED
> tracking URL: http://node1:8088/proxy/application_1631008247268_0018/
> user: hadoop
> 2021-09-09 20:06:17,659 INFO [Thread-7] yarn.Client:57 : Application report
> for application_1631008247268_0018 (state: ACCEPTED)
> 2021-09-09 20:06:18,662 INFO [Thread-7] yarn.Client:57 : Application report
> for application_1631008247268_0018 (state: ACCEPTED)
> 2021-09-09 20:06:19,665 INFO [Thread-7] yarn.Client:57 : Application report
> for application_1631008247268_0018 (state: ACCEPTED)
> 2021-09-09 20:06:19,947 INFO [dispatcher-event-loop-7]
> cluster.YarnClientSchedulerBackend:57 : Add WebUI Filter.
> org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS
> -> node1,node2, PROXY_URI_BASES ->
> http://node1:8088/proxy/application_1631008247268_0018,http://node2:8088/proxy/application_1631008247268_0018,
> RM_HA_URLS -> node1:8088,node2:8088), /proxy/application_1631008247268_0018
> 2021-09-09 20:06:20,668 INFO [Thread-7] yarn.Client:57 : Application report
> for application_1631008247268_0018 (state: RUNNING)
> 2021-09-09 20:06:20,669 INFO [Thread-7] yarn.Client:57 :
> client token: N/A
> diagnostics: N/A
> ApplicationMaster host: 192.168.111.25
> ApplicationMaster RPC port: -1
> queue: default
> start time: 1631189175391
> final status: UNDEFINED
> tracking URL: http://node1:8088/proxy/application_1631008247268_0018/
> user: hadoop
> 2021-09-09 20:06:20,672 INFO [Thread-7]
> cluster.YarnClientSchedulerBackend:57 : Application
> application_1631008247268_0018 has started running.
> 2021-09-09 20:06:20,693 INFO [Thread-7] util.Utils:57 : Successfully started
> service 'org.apache.spark.network.netty.NettyBlockTransferService' on port
> 37707.
> 2021-09-09 20:06:20,693 INFO [Thread-7] netty.NettyBlockTransferService:81 :
> Server created on node3:37707
> 2021-09-09 20:06:20,696 INFO [Thread-7] storage.BlockManager:57 : Using
> org.apache.spark.storage.RandomBlockReplicationPolicy for block replication
> policy
> 2021-09-09 20:06:20,706 INFO [Thread-7] storage.BlockManagerMaster:57 :
> Registering BlockManager BlockManagerId(driver, node3, 37707, None)
> 2021-09-09 20:06:20,713 INFO [dispatcher-BlockManagerMaster]
> storage.BlockManagerMasterEndpoint:57 : Registering block manager node3:37707
> with 2004.6 MiB RAM, BlockManagerId(driver, node3, 37707, None)
> 2021-09-09 20:06:20,719 INFO [Thread-7] storage.BlockManagerMaster:57 :
> Registered BlockManager BlockManagerId(driver, node3, 37707, None)
> 2021-09-09 20:06:20,721 INFO [Thread-7] storage.BlockManager:57 :
> Initialized BlockManager: BlockManagerId(driver, node3, 37707, None)
> 2021-09-09 20:06:20,832 INFO [dispatcher-event-loop-2]
> cluster.YarnSchedulerBackend$YarnSchedulerEndpoint:57 : ApplicationMaster
> registered as NettyRpcEndpointRef(spark-client://YarnAM)
> 2021-09-09 20:06:20,922 INFO [Thread-7] ui.ServerInfo:57 : Adding filter to
> /metrics/json: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
> 2021-09-09 20:06:20,924 INFO [Thread-7] handler.ContextHandler:916 : Started
> o.s.j.s.ServletContextHandler@71ddd2af{/metrics/json,null,AVAILABLE,@Spark}
> 2021-09-09 20:06:25,825 INFO [dispatcher-CoarseGrainedScheduler]
> cluster.YarnSchedulerBackend$YarnDriverEndpoint:57 : Registered executor
> NettyRpcEndpointRef(spark-client://Executor) (192.168.111.50:36340) with ID
> 1, ResourceProfileId 0
> 2021-09-09 20:06:25,872 INFO [Thread-7]
> cluster.YarnClientSchedulerBackend:57 : SchedulerBackend is ready for
> scheduling beginning after reached minRegisteredResourcesRatio: 0.8
> 2021-09-09 20:06:25,949 ERROR [Thread-7] sql.SparderContext:94 : Error for
> initializing spark
> java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> at
> org.apache.spark.utils.KylinReflectUtils$.getSessionState(KylinReflectUtils.scala:42)
> at
> org.apache.spark.sql.KylinSession.sessionState$lzycompute(KylinSession.scala:47)
> at
> org.apache.spark.sql.KylinSession.sessionState(KylinSession.scala:46)
> at
> org.apache.kylin.query.UdfManager.$anonfun$registerBuiltInFunc$1(UdfManager.scala:35)
> at
> org.apache.kylin.query.UdfManager.$anonfun$registerBuiltInFunc$1$adapted(UdfManager.scala:34)
> at scala.collection.immutable.List.foreach(List.scala:392)
> at
> org.apache.kylin.query.UdfManager.org$apache$kylin$query$UdfManager$$registerBuiltInFunc(UdfManager.scala:34)
> at org.apache.kylin.query.UdfManager$.create(UdfManager.scala:85)
> at
> org.apache.spark.sql.KylinSession$KylinBuilder.getOrCreateKylinSession(KylinSession.scala:116)
> at
> org.apache.spark.sql.SparderContext$$anon$1.run(SparderContext.scala:146)
> at java.lang.Thread.run(Thread.java:748)
> 2021-09-09 20:06:25,951 INFO [Thread-7] sql.SparderContext:57 : Setting
> initializing Spark thread to null.
> 2021-09-09 20:06:25,952 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35]
> monitor.SparderContextCanary:68 : Start monitoring Sparder
> 2021-09-09 20:06:25,953 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35]
> sql.SparderContext:57 : Init spark.
> 2021-09-09 20:06:25,954 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35]
> sql.SparderContext:57 : Initializing Spark thread starting.
> 2021-09-09 20:06:25,954 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35]
> sql.SparderContext:57 : Initializing Spark, waiting for done.
> 2021-09-09 20:06:25,955 INFO [Thread-37] sql.SparderContext:57 :
> SparderContext deploy with spark master: yarn
> 2021-09-09 20:06:25,964 INFO [Thread-37] sql.SparderContext:57 : Spark
> application id is application_1631008247268_0018
> 2021-09-09 20:06:25,965 INFO [Thread-37] sql.SparderContext:57 : Spark
> context started successfully with stack trace:
> 2021-09-09 20:06:25,965 INFO [Thread-37] sql.SparderContext:57 :
> java.lang.Thread.getStackTrace(Thread.java:1559)
> org.apache.spark.sql.SparderContext$$anon$1.$anonfun$run$8(SparderContext.scala:171)
> org.apache.spark.internal.Logging.logInfo(Logging.scala:57)
> org.apache.spark.internal.Logging.logInfo$(Logging.scala:56)
> org.apache.spark.sql.SparderContext$.logInfo(SparderContext.scala:45)
> org.apache.spark.sql.SparderContext$$anon$1.run(SparderContext.scala:171)
> java.lang.Thread.run(Thread.java:748)
> 2021-09-09 20:06:25,966 INFO [Thread-37] sql.SparderContext:57 : Class
> loader: org.apache.kylin.spark.classloader.SparkClassLoader@3187093f
> 2021-09-09 20:06:25,969 INFO [Thread-37] memory.MonitorEnv:57 : create
> driver monitor env
> 2021-09-09 20:06:25,976 INFO [dispatcher-BlockManagerMaster]
> storage.BlockManagerMasterEndpoint:57 : Registering block manager node2:44050
> with 2004.6 MiB RAM, BlockManagerId(1, node2, 44050, None)
> 2021-09-09 20:06:25,988 INFO [Thread-37] sql.SparderContext:57 : setup
> master endpoint finished.hostPort:node3:40116
> 2021-09-09 20:06:26,025 INFO [Thread-37] client.AHSProxy:42 : Connecting to
> Application History server at node1/192.168.111.49:10200
> 2021-09-09 20:06:26,029 INFO [Thread-37] sql.SparderContext:57 : Setting
> initializing Spark thread to null.
> 2021-09-09 20:06:26,037 ERROR [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35]
> service.QueryService:576 : Exception while executing query
> java.sql.SQLException: Error while executing SQL "select * from (select
> part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers
> from kylin_sales group by part_dt order by part_dt) limit 50000":
> java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> at org.apache.calcite.avatica.Helper.createException(Helper.java:56)
> at org.apache.calcite.avatica.Helper.createException(Helper.java:41)
> at
> org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:163)
> at
> org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:227)
> at
> org.apache.kylin.rest.service.QueryService.executeRequest(QueryService.java:1026)
> at
> org.apache.kylin.rest.service.QueryService.queryWithSqlMassage(QueryService.java:710)
> at
> org.apache.kylin.rest.service.QueryService.query(QueryService.java:221)
> at
> org.apache.kylin.rest.service.QueryService.queryAndUpdateCache(QueryService.java:514)
> at
> org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:469)
> at
> org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:405)
> at
> org.apache.kylin.rest.controller.QueryController.query(QueryController.java:93)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
> at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
> at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)
> at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:854)
> at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:765)
> at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
> at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
> at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
> at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
> at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:872)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:647)
> at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:347)
> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:263)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:209)
> at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:244)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)
> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:492)
> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:165)
> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:1025)
> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:452)
> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1195)
> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:654)
> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:317)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.RuntimeException: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:50)
> at Baz.bind(Unknown Source)
> at org.apache.calcite.jdbc.CalcitePrepare$CalciteSignature.enumerable(CalcitePrepare.java:365)
> at org.apache.calcite.jdbc.CalciteConnectionImpl.enumerable(CalciteConnectionImpl.java:301)
> at org.apache.calcite.jdbc.CalciteMetaImpl._createIterable(CalciteMetaImpl.java:559)
> at org.apache.calcite.jdbc.CalciteMetaImpl.createIterable(CalciteMetaImpl.java:550)
> at org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:182)
> at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:67)
> at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
> at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
> at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:619)
> at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
> at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
> ... 84 more
> Caused by: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> at org.apache.spark.utils.KylinReflectUtils$.getSessionState(KylinReflectUtils.scala:42)
> at org.apache.spark.sql.KylinSession.sessionState$lzycompute(KylinSession.scala:47)
> at org.apache.spark.sql.KylinSession.sessionState(KylinSession.scala:46)
> at org.apache.spark.sql.SQLContext.sessionState(SQLContext.scala:78)
> at org.apache.spark.sql.SQLContext.conf(SQLContext.scala:80)
> at org.apache.spark.sql.execution.datasources.FileStatusCache$.getOrCreate(FileStatusCache.scala:44)
> at org.apache.spark.sql.execution.datasource.ShardFileStatusCache$.getFileStatusCache(ShardFileStatusCache.scala:29)
> at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
> at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
> at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
> at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
> at org.apache.spark.sql.SparderContextFacade$.current(SparderContextFacade.scala:34)
> at org.apache.spark.sql.SparderContext$.getSparkSession(SparderContext.scala:65)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.$anonfun$createOLAPTable$1(TableScanPlan.scala:78)
> at org.apache.spark.utils.LogEx.logTime(LogEx.scala:40)
> at org.apache.spark.utils.LogEx.logTime$(LogEx.scala:38)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.logTime(TableScanPlan.scala:43)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.createOLAPTable(TableScanPlan.scala:51)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.$anonfun$visit$2(CalciteToSparkPlaner.scala:58)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.logTime(CalciteToSparkPlaner.scala:120)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:58)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.RelVisitor.go(RelVisitor.java:61)
> at org.apache.kylin.query.runtime.SparkEngine.toSparkPlan(SparkEngine.java:60)
> at org.apache.kylin.query.runtime.SparkEngine.compute(SparkEngine.java:49)
> at org.apache.kylin.query.exec.QueryEngineFactory.compute(QueryEngineFactory.java:47)
> at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:41)
> ... 96 more
> 2021-09-09 20:06:26,041 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] service.QueryService:1222 : Processed rows for each storageContext: 0
> 2021-09-09 20:06:26,048 INFO [Query 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384-35] service.QueryService:391 :
> ==========================[QUERY]===============================
> Query Id: 71e4c4e7-df0a-2b6b-ee48-1b0a8c1d6384
> SQL: select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt
> User: ADMIN
> Success: false
> Duration: 21.519
> Project: learn_kylin
> Realization Names: [CUBE[name=kylin_sales_cube]]
> Cuboid Ids: [16384]
> Is Exactly Matched: [false]
> Total scan count: 0
> Total scan files: 0
> Total metadata time: 0ms
> Total spark scan time: 0ms
> Total scan bytes: -1
> Result row count: 0
> Storage cache used: false
> Is Query Push-Down: false
> Is Prepare: false
> Used Spark pool: null
> Trace URL: null
> Message: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> while executing SQL: "select * from (select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt) limit 50000"
> ==========================[QUERY]===============================
> 2021-09-09 20:06:26,052 ERROR [http-bio-7070-exec-1] controller.BasicController:65 : org.apache.kylin.rest.exception.InternalErrorException: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported while executing SQL: "select * from (select part_dt, sum(price) as total_selled, count(distinct seller_id) as sellers from kylin_sales group by part_dt order by part_dt) limit 50000"
> at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:486)
> at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:405)
> at org.apache.kylin.rest.controller.QueryController.query(QueryController.java:93)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205)
> at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:133)
> at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:97)
> at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:854)
> at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:765)
> at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
> at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
> at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
> at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
> at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:872)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:647)
> at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:317)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
> at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:137)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:63)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.www.BasicAuthenticationFilter.doFilterInternal(BasicAuthenticationFilter.java:215)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:116)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.header.HeaderWriterFilter.doFilterInternal(HeaderWriterFilter.java:64)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56)
> at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
> at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:331)
> at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:214)
> at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:177)
> at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:347)
> at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:263)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:209)
> at com.thetransactioncompany.cors.CORSFilter.doFilter(CORSFilter.java:244)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)
> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:492)
> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:165)
> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:104)
> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:1025)
> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:452)
> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1195)
> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:654)
> at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:317)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.UnsupportedOperationException: Spark version 2.0.0 not supported
> at org.apache.spark.utils.KylinReflectUtils$.getSessionState(KylinReflectUtils.scala:42)
> at org.apache.spark.sql.KylinSession.sessionState$lzycompute(KylinSession.scala:47)
> at org.apache.spark.sql.KylinSession.sessionState(KylinSession.scala:46)
> at org.apache.spark.sql.SQLContext.sessionState(SQLContext.scala:78)
> at org.apache.spark.sql.SQLContext.conf(SQLContext.scala:80)
> at org.apache.spark.sql.execution.datasources.FileStatusCache$.getOrCreate(FileStatusCache.scala:44)
> at org.apache.spark.sql.execution.datasource.ShardFileStatusCache$.getFileStatusCache(ShardFileStatusCache.scala:29)
> at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
> at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
> at org.apache.spark.sql.SparderContext$.$anonfun$initSpark$1(SparderContext.scala:214)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.initSpark(SparderContext.scala:129)
> at org.apache.spark.sql.SparderContext$.$anonfun$getOriginalSparkSession$1(SparderContext.scala:58)
> at org.apache.spark.sql.SparderContext$.withClassLoad(SparderContext.scala:257)
> at org.apache.spark.sql.SparderContext$.getOriginalSparkSession(SparderContext.scala:55)
> at org.apache.spark.sql.SparderContextFacade$.current(SparderContextFacade.scala:34)
> at org.apache.spark.sql.SparderContext$.getSparkSession(SparderContext.scala:65)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.$anonfun$createOLAPTable$1(TableScanPlan.scala:78)
> at org.apache.spark.utils.LogEx.logTime(LogEx.scala:40)
> at org.apache.spark.utils.LogEx.logTime$(LogEx.scala:38)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.logTime(TableScanPlan.scala:43)
> at org.apache.kylin.query.runtime.plans.TableScanPlan$.createOLAPTable(TableScanPlan.scala:51)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.$anonfun$visit$2(CalciteToSparkPlaner.scala:58)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.logTime(CalciteToSparkPlaner.scala:120)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:58)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.SingleRel.childrenAccept(SingleRel.java:72)
> at org.apache.kylin.query.runtime.CalciteToSparkPlaner.visit(CalciteToSparkPlaner.scala:45)
> at org.apache.calcite.rel.RelVisitor.go(RelVisitor.java:61)
> at org.apache.kylin.query.runtime.SparkEngine.toSparkPlan(SparkEngine.java:60)
> at org.apache.kylin.query.runtime.SparkEngine.compute(SparkEngine.java:49)
> at org.apache.kylin.query.exec.QueryEngineFactory.compute(QueryEngineFactory.java:47)
> at org.apache.kylin.query.exec.SparkExec.collectToEnumerable(SparkExec.java:41)
> at Baz.bind(Unknown Source)
> at org.apache.calcite.jdbc.CalcitePrepare$CalciteSignature.enumerable(CalcitePrepare.java:365)
> at org.apache.calcite.jdbc.CalciteConnectionImpl.enumerable(CalciteConnectionImpl.java:301)
> at org.apache.calcite.jdbc.CalciteMetaImpl._createIterable(CalciteMetaImpl.java:559)
> at org.apache.calcite.jdbc.CalciteMetaImpl.createIterable(CalciteMetaImpl.java:550)
> at org.apache.calcite.avatica.AvaticaResultSet.execute(AvaticaResultSet.java:182)
> at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:67)
> at org.apache.calcite.jdbc.CalciteResultSet.execute(CalciteResultSet.java:44)
> at org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:667)
> at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:619)
> at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
> at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
> at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:227)
> at org.apache.kylin.rest.service.QueryService.executeRequest(QueryService.java:1026)
> at org.apache.kylin.rest.service.QueryService.queryWithSqlMassage(QueryService.java:710)
> at org.apache.kylin.rest.service.QueryService.query(QueryService.java:221)
> at org.apache.kylin.rest.service.QueryService.queryAndUpdateCache(QueryService.java:514)
> at org.apache.kylin.rest.service.QueryService.doQueryWithCache(QueryService.java:469)
> ... 78 more
> 2021-09-09 20:06:35,790 INFO [FetcherRunner 1341171673-29] threadpool.DefaultFetcherRunner:117 : Job Fetcher: 0 should running, 0 actual running, 0 stopped, 0 ready, 2 already succeed, 0 error, 0 discarded, 0 others
> {code}
> The web UI shows the following:
> !image-2021-09-09-20-11-08-870.png!
>
> Obviously, the Spark version in my environment is spark-3.1.2-bin-hadoop3.2.
> So why does Kylin report Spark version 2.0.0?
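For context, the frame that actually throws is Kylin's Spark-version dispatch in KylinReflectUtils.getSessionState (top of the "Caused by" trace). The sketch below is a hypothetical illustration of that kind of version gate, not Kylin's real code: the class names and version prefixes here are placeholders. The key point is that such a gate reads the version constant from whichever spark-core jar the JVM loads first, not from SPARK_HOME.

```java
// Hypothetical sketch of a Spark-version gate like the one in
// KylinReflectUtils.getSessionState. The real check reads
// org.apache.spark.SPARK_VERSION, i.e. the version reported by
// whichever spark-core jar is first on the Kylin server's classpath.
// Builder names and the handled version prefixes are placeholders.
public class SparkVersionCheck {

    static String sessionStateBuilderFor(String sparkVersion) {
        if (sparkVersion.startsWith("2.4")) {
            return "SessionStateBuilderForSpark2"; // placeholder name
        } else if (sparkVersion.startsWith("3.1")) {
            return "SessionStateBuilderForSpark3"; // placeholder name
        }
        // Matches the message in the log: any version outside the
        // handled prefixes fails the gate with this exception.
        throw new UnsupportedOperationException(
                "Spark version " + sparkVersion + " not supported");
    }

    public static void main(String[] args) {
        // The configured environment should resolve the 3.x branch.
        System.out.println(sessionStateBuilderFor("3.1.2"));
        // A stray Spark 2.0.0 jar on the classpath would report "2.0.0"
        // here and raise the exact error seen in the stack trace.
        try {
            sessionStateBuilderFor("2.0.0");
        } catch (UnsupportedOperationException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

If that is what is happening, one plausible explanation for the "2.0.0" in the log is that the ResourceManager node carries an old spark-core 2.0.0 jar (for example from another service's lib directory) that shadows the bundled spark-3.1.2-bin-hadoop3.2 jars on Kylin's classpath; this is a hypothesis to verify, not a confirmed cause.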
--
This message was sent by Atlassian Jira
(v8.3.4#803005)