Hi there,

This happened to me once; I think it is because PostgreSQL is case-sensitive with quoted identifiers. Unquoted names are folded to lowercase, so your table exists as public.t_bike, while the generated query asks for the quoted uppercase "PUBLIC"."T_BIKE" — a different name, which does not exist.
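A minimal sketch of the folding rule, run against a scratch PostgreSQL database (table and column names taken from your mail):

```sql
-- Unquoted identifiers are folded to lowercase by PostgreSQL,
-- so this creates a table whose real name is public.t_bike:
CREATE TABLE t_bike (id integer, status integer);

-- Works: unquoted names fold to lowercase and match.
SELECT id, status FROM public.t_bike;

-- Also works: quoted names that are already lowercase match.
SELECT "id", "status" FROM "public"."t_bike";

-- Fails with: relation "PUBLIC.T_BIKE" does not exist.
-- Quoting preserves case, and no uppercase T_BIKE exists.
SELECT "ID", "STATUS" FROM "PUBLIC"."T_BIKE";
```

That last statement is exactly the shape of the query Sqoop generates in your log. One workaround some users apply is to create the table and columns with quoted uppercase names in PostgreSQL so they match the generated SQL; otherwise the dialect configuration is the place to look.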


Best regards,
Khalil,
Manager at ISTIC-Devspace

> On May 28, 2019, at 3:27 PM, -张衡 <[email protected]> wrote:
> 
> I have created a database (first) in PostgreSQL, and a table (t_bike) in 
> the database. The table name and field names are all lowercase, and by 
> default they are under the public schema. I also modified the data source 
> configuration in kylin.properties and created 
> $KYLIN_HOME/conf/datasource/postgresql.xml. The configuration is shown in 
> the attached pictures:
> ----------------------------------------
> 
> This is the error I encountered:
> 
> 19/05/28 21:03:21 INFO manager.SqlManager: Executing SQL statement: SELECT 
> "T_BIKE"."ID" AS "T_BIKE_ID", "T_BIKE"."STATUS" AS "T_BIKE_STATUS"
> 
> FROM "PUBLIC"."T_BIKE" AS "T_BIKE"
> 
> WHERE 1 = 1 AND  (1 = 0)
> 
> 19/05/28 21:03:21 ERROR manager.SqlManager: Error executing statement: 
> org.postgresql.util.PSQLException: ERROR: relation "PUBLIC.T_BIKE" does not 
> exist
> 
>   Position: 80
> 
> org.postgresql.util.PSQLException: ERROR: relation "PUBLIC.T_BIKE" does not 
> exist
> 
> ----------------------------------------
> 
> The following is the error log:
> 
> java.io.IOException: OS command error exit with return code: 1, error 
> message: Warning: /root/newapps/sqoop-1.4.7.bin__hadoop-2.6.0/../hcatalog 
> does not exist! HCatalog jobs will fail.
> 
> Please set $HCAT_HOME to the root of your HCatalog installation.
> 
> Warning: /root/newapps/sqoop-1.4.7.bin__hadoop-2.6.0/../accumulo does not 
> exist! Accumulo imports will fail.
> 
> Please set $ACCUMULO_HOME to the root of your Accumulo installation.
> 
> SLF4J: Class path contains multiple SLF4J bindings.
> 
> SLF4J: Found binding in 
> [jar:file:/root/apps/hadoop-2.8.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> 
> SLF4J: Found binding in 
> [jar:file:/root/apps/hbase-1.2.1/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> 
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
> explanation.
> 
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 
> 19/05/28 21:03:20 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
> 
> 19/05/28 21:03:20 WARN tool.BaseSqoopTool: Setting your password on the 
> command-line is insecure. Consider using -P instead.
> 
> 19/05/28 21:03:20 WARN sqoop.ConnFactory: Parameter --driver is set to an 
> explicit driver however appropriate connection manager is not being set (via 
> --connection-manager). Sqoop is going to fall back to 
> org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which 
> connection manager should be used next time.
> 
> 19/05/28 21:03:20 INFO manager.SqlManager: Using default fetchSize of 1000
> 
> 19/05/28 21:03:20 INFO tool.CodeGenTool: Beginning code generation
> 
> 19/05/28 21:03:21 INFO manager.SqlManager: Executing SQL statement: SELECT 
> "T_BIKE"."ID" AS "T_BIKE_ID", "T_BIKE"."STATUS" AS "T_BIKE_STATUS"
> 
> FROM "PUBLIC"."T_BIKE" AS "T_BIKE"
> 
> WHERE 1 = 1 AND  (1 = 0)
> 
> 19/05/28 21:03:21 ERROR manager.SqlManager: Error executing statement: 
> org.postgresql.util.PSQLException: ERROR: relation "PUBLIC.T_BIKE" does not 
> exist
> 
>   Position: 80
> 
> org.postgresql.util.PSQLException: ERROR: relation "PUBLIC.T_BIKE" does not 
> exist
> 
>   Position: 80
> 
>         at 
> org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2440)
> 
>         at 
> org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2183)
> 
>         at 
> org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:308)
> 
>         at 
> org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
> 
>         at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
> 
>         at 
> org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:143)
> 
>         at 
> org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:106)
> 
>         at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:777)
> 
>         at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
> 
>         at 
> org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
> 
>         at 
> org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
> 
>         at 
> org.apache.sqoop.manager.SqlManager.getColumnTypesForQuery(SqlManager.java:253)
> 
>         at 
> org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:336)
> 
>         at 
> org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1872)
> 
>         at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1671)
> 
>         at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
> 
>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:501)
> 
>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
> 
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
> 
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> 
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
> 
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
> 
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
> 
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
> 
> 19/05/28 21:03:21 ERROR tool.ImportTool: Import failed: java.io.IOException: 
> No columns to generate for ClassWriter
> 
>         at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1677)
> 
>         at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
> 
>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:501)
> 
>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
> 
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
> 
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
> 
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
> 
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
> 
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
> 
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
> 
>  
> 
> The command is:
> 
> /root/newapps/sqoop-1.4.7.bin__hadoop-2.6.0/bin/sqoop import 
> -Dorg.apache.sqoop.splitter.allow_text_splitter=true  
> -Dmapreduce.job.queuename=default --connect 
> "jdbc:postgresql://kylin-master:5432/first" --driver org.postgresql.Driver 
> --username postgres --password postgres --query "SELECT \"T_BIKE\".\"ID\" AS 
> \"T_BIKE_ID\", \"T_BIKE\".\"STATUS\" AS \"T_BIKE_STATUS\"
> 
> FROM \"PUBLIC\".\"T_BIKE\" AS \"T_BIKE\"
> 
> WHERE 1 = 1 AND \$CONDITIONS" --target-dir 
> hdfs://kylin-master:9000/kylin/kylin_metadata/kylin-3b69c678-a4e5-f553-ffd6-d6e2e4dd4f48/kylin_intermediate_project6_test1_bb974a70_4e15_8bdc_e9ad_d6d970027e7a
>  --split-by \"T_BIKE\".\"ID\" --boundary-query "SELECT 
> MIN(\"T_BIKE\".\"ID\"), MAX(\"T_BIKE\".\"ID\")
> 
> FROM \"public\".\"T_BIKE\" AS \"T_BIKE\"" --null-string '' 
> --fields-terminated-by '|' --num-mappers 4
> 
>         at 
> org.apache.kylin.common.util.CliCommandExecutor.execute(CliCommandExecutor.java:96)
> 
>         at 
> org.apache.kylin.source.jdbc.sqoop.SqoopCmdStep.sqoopFlatHiveTable(SqoopCmdStep.java:59)
> 
>         at 
> org.apache.kylin.source.jdbc.sqoop.SqoopCmdStep.doWork(SqoopCmdStep.java:70)
> 
>         at 
> org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:166)
> 
>         at 
> org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:71)
> 
>         at 
> org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:166)
> 
>         at 
> org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:114)
> 
>         at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 
>         at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 
>     at java.lang.Thread.run(Thread.java:748)
> 
> ----------------------------------------
> 
> Finally, during these modifications I also found that if I delete the 
> default.xml file in kylin-job-2.6.1.jar, the program reports an error.
> 
> I don't know whether Kylin actually reads 
> $KYLIN_HOME/conf/datasource/postgresql.xml. When using PostgreSQL as a 
> data source I ran into these dialect adaptation problems; I hope experts 
> can give an answer, thank you very much!
> 
> <error.PNG>
> <kylin.properties>
> <kylinweb_serverconfig.PNG>
> <kylinweb_serverconfig.txt>
> <postgresql.xml>
> <t_bike.PNG>
