[ 
https://issues.apache.org/jira/browse/PHOENIX-7516?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

chaijunjie updated PHOENIX-7516:
--------------------------------
    Attachment: PHOENIX-7516.patch

> Support skip to specify schema logic was broken in CsvBulkload Tool
> -------------------------------------------------------------------
>
>                 Key: PHOENIX-7516
>                 URL: https://issues.apache.org/jira/browse/PHOENIX-7516
>             Project: Phoenix
>          Issue Type: Bug
>    Affects Versions: 5.2.1
>            Reporter: chaijunjie
>            Assignee: chaijunjie
>            Priority: Major
>         Attachments: PHOENIX-7516.patch
>
>
> When I tested CsvBulkLoadTool, I found that the new version fails while the 
> old version worked fine:
>  hbase org.apache.phoenix.mapreduce.CsvBulkLoadTool -t MY_SCHEMA.TEST2 -i 
> /tmp/data1.csv
> {code:java}
> 2025-01-24T11:57:41,703 INFO  [main] 
> connectionqueryservice.ConnectionQueryServicesMetricsManager: Created object 
> for NoOp Connection query service metrics manager
> org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table 
> undefined. tableName=MY_SCHEMA.TEST2
>     at 
> org.apache.phoenix.compile.FromCompiler$BaseColumnResolver.createTableRef(FromCompiler.java:799)
>     at 
> org.apache.phoenix.compile.FromCompiler$SingleTableColumnResolver.<init>(FromCompiler.java:448)
>     at 
> org.apache.phoenix.compile.FromCompiler.getResolverForQuery(FromCompiler.java:233)
>     at 
> org.apache.phoenix.compile.FromCompiler.getResolverForQuery(FromCompiler.java:211)
>     at org.apache.phoenix.util.ParseNodeUtil.rewrite(ParseNodeUtil.java:177)
>     at 
> org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:712)
>     at 
> org.apache.phoenix.jdbc.PhoenixStatement$ExecutableSelectStatement.compilePlan(PhoenixStatement.java:685)
>     at 
> org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:360)
>     at 
> org.apache.phoenix.jdbc.PhoenixStatement$1.call(PhoenixStatement.java:341)
>     at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>     at 
> org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:341)
>     at 
> org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:327)
>     at 
> org.apache.phoenix.jdbc.PhoenixStatement.executeQuery(PhoenixStatement.java:2227)
>     at 
> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:251)
>     at 
> org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:187)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:81)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:95)
>     at 
> org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:158)
>  {code}
> PHOENIX-6405 added a check of whether the table is empty for global indexes, 
> but the table name used in that SQL is built incorrectly when the supplied 
> table name includes a schema; it seems we need to change the logic in 
> SchemaUtil.getEscapedTableName.
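> For illustration only, a minimal sketch (not the actual patch) of how the 
> escaped name could be built when the -t argument already contains a schema. 
> It assumes SchemaUtil exposes getSchemaNameFromFullName, 
> getTableNameFromFullName and getEscapedTableName(schema, table) as in current 
> org.apache.phoenix.util.SchemaUtil; verify the exact signatures against the 
> target branch.
> {code:java}
> import org.apache.phoenix.util.SchemaUtil;
> 
> public class EscapedNameSketch {
>     // Sketch: split a possibly schema-qualified name first, then escape each
>     // part, instead of quoting "MY_SCHEMA.TEST2" as one identifier (which is
>     // what leads to TableNotFoundException: tableName=MY_SCHEMA.TEST2).
>     static String escapeQualifiedName(String qualifiedTableName) {
>         String schema = SchemaUtil.getSchemaNameFromFullName(qualifiedTableName);
>         String table = SchemaUtil.getTableNameFromFullName(qualifiedTableName);
>         // Expected to yield "MY_SCHEMA"."TEST2" rather than "MY_SCHEMA.TEST2".
>         return SchemaUtil.getEscapedTableName(schema, table);
>     }
> 
>     public static void main(String[] args) {
>         System.out.println(escapeQualifiedName("MY_SCHEMA.TEST2"));
>     }
> }
> {code}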



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
