[ https://issues.apache.org/jira/browse/KYLIN-5145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17469038#comment-17469038 ]
qingquanzhang commented on KYLIN-5145:
--------------------------------------

I found the cause of the problem. I added a line to find-spark-dependency.sh:

export SPARK_HOME=/opt/kylin/spark-3.1.1-bin-hadoop3.2

But when I debugged the code, I found that the value of the SPARK_VERSION variable is not the version I set. When I run kylin.sh start, I see the message:

Skip spark which not owned by kylin. SPARK_HOME is /opt/kylin/spark-3.1.1-bin-hadoop3.2 and KYLIN_HOME is /opt/kylin

I am very confused.

> Query Failed: wrong number of arguments while executing SQL
> -----------------------------------------------------------
>
>                 Key: KYLIN-5145
>                 URL: https://issues.apache.org/jira/browse/KYLIN-5145
>             Project: Kylin
>          Issue Type: Bug
>          Components: Query Engine
>    Affects Versions: v4.0.0
>       Environment: spark3.1.1
> hadoop3.1.3
> hive:3.1.2
> kylin4.0.0-spark3
>            Reporter: qingquanzhang
>            Priority: Major
>        Attachments: kylin.log
>
>
> An error occurred when I initiated the query:
> java.lang.IllegalArgumentException: wrong number of arguments while executing
> SQL: "select * from (select count(*) from kylin_sales) limit 50000"
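
Editor's note (not from the original report): the "Skip spark which not owned by kylin" message suggests that the Kylin startup scripts only honor a SPARK_HOME that matches the Spark bundled under KYLIN_HOME, and otherwise fall back to that bundled copy, which would explain why SPARK_VERSION does not reflect the exported value. The lines below are a minimal, hypothetical shell sketch of that kind of check; the paths and variable names mirror the log line above, but the exact logic in kylin.sh / find-spark-dependency.sh may differ.

#!/bin/bash
# Hypothetical sketch only; NOT the actual Kylin script code.
# A check of this shape would print the message quoted above and would
# explain why an externally exported SPARK_HOME is ignored when
# SPARK_VERSION is detected: anything other than ${KYLIN_HOME}/spark is skipped.

KYLIN_HOME=/opt/kylin
SPARK_HOME=/opt/kylin/spark-3.1.1-bin-hadoop3.2   # value exported in find-spark-dependency.sh

if [[ "${SPARK_HOME}" != "${KYLIN_HOME}/spark" ]]; then
    echo "Skip spark which not owned by kylin. SPARK_HOME is ${SPARK_HOME} and KYLIN_HOME is ${KYLIN_HOME}"
    # fall back to the Spark bundled under KYLIN_HOME
    SPARK_HOME="${KYLIN_HOME}/spark"
fi

# SPARK_VERSION would then be read from the (possibly overridden) SPARK_HOME,
# for example from the banner printed by spark-submit --version.
SPARK_VERSION=$("${SPARK_HOME}/bin/spark-submit" --version 2>&1 | grep -m1 -oE 'version [0-9]+\.[0-9]+\.[0-9]+' | awk '{print $2}')
echo "SPARK_VERSION=${SPARK_VERSION}"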