[ https://issues.apache.org/jira/browse/SPARK-17796?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15550548#comment-15550548 ]
Dongjoon Hyun commented on SPARK-17796:
---------------------------------------

Thank you. I see.

> spark HiveThriftServer2 sql AnalysisException: LOAD DATA input path does not
> exist if sql query contains wild card characters
> --------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-17796
>                 URL: https://issues.apache.org/jira/browse/SPARK-17796
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.0
>         Environment: CentOS release 6.6 (Final)
>            Reporter: Tran Quyet Thang
>              Labels: Spark2-HiveThriftServer2, spark2.0.0, sparkSQL
>
> 1. I'm using an ETL tool and connecting to Spark2-HiveThriftServer2 over the
> connection string URL: "jdbc:hive2://10.30.164.132:10000/nat"
> 2. I get the error below if the sql contains wild card characters "*":
> LOAD DATA LOCAL INPATH
> '${SOURCE_DB_FILE}/V_SOURCE_BTS/${YYYYMMDD}/s0_V_SOURCE_BTS_*_part_*' INTO
> TABLE ${SCHEMA_BI}V_SOURCE_BTS;
> Couldn't execute SQL: LOAD DATA LOCAL INPATH
> '/u02/CDR/HAITI/MakeFile/V_SOURCE_BTS/20160927/s0_V_SOURCE_BTS_*_part_*'
> INTO TABLE nat.V_SOURCE_BTS
> 2016/10/05 14:38:43 - V_SOURCE_BTS -
> 2016/10/05 14:38:43 - V_SOURCE_BTS - org.apache.spark.sql.AnalysisException:
> LOAD DATA input path does not exist:
> /u02/CDR/HAITI/MakeFile/V_SOURCE_BTS/20160927/s0_V_SOURCE_BTS_*_part_*;
> 3. The sql query executes without error if I remove the wild cards and
> spell out the file name:
> LOAD DATA LOCAL INPATH
> '/u02/CDR/HAITI/MakeFile/V_SOURCE_BTS/20160927/s0_V_SOURCE_BTS_20160927_part_0000000'
> INTO TABLE ${SCHEMA_BI}V_SOURCE_BTS;
> 4. The problem first appeared in Spark 2.0.0.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
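The workaround in step 3 can be automated on the client side: if LOAD DATA LOCAL INPATH rejects glob characters in this Spark version, the caller can expand the wildcard pattern itself and emit one statement per matching file. A minimal sketch in Python, assuming the file-naming pattern and table name from the report; the helper `expand_load_statements` is hypothetical, not part of any Spark or Hive API:

```python
import glob
import os
import tempfile

def expand_load_statements(pattern, table):
    """Expand a local wildcard pattern client-side and build one
    LOAD DATA LOCAL INPATH statement per matching file, so no glob
    characters ever reach the Thrift server."""
    paths = sorted(glob.glob(pattern))
    if not paths:
        raise FileNotFoundError("no files match pattern: %s" % pattern)
    return ["LOAD DATA LOCAL INPATH '%s' INTO TABLE %s" % (p, table)
            for p in paths]

# Demonstration with temporary files standing in for the export directory.
tmp = tempfile.mkdtemp()
for name in ("s0_V_SOURCE_BTS_20160927_part_0000000",
             "s0_V_SOURCE_BTS_20160927_part_0000001"):
    open(os.path.join(tmp, name), "w").close()

stmts = expand_load_statements(
    os.path.join(tmp, "s0_V_SOURCE_BTS_*_part_*"),
    "nat.V_SOURCE_BTS")
for s in stmts:
    print(s)  # two statements, one per matched file
```

Each returned statement would then be executed individually over the existing JDBC connection instead of the single wildcard statement.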