[ 
https://issues.apache.org/jira/browse/HIVE-26759?focusedWorklogId=827184&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-827184
 ]

ASF GitHub Bot logged work on HIVE-26759:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 18/Nov/22 14:16
            Start Date: 18/Nov/22 14:16
    Worklog Time Spent: 10m 
      Work Description: sonarcloud[bot] commented on PR #3782:
URL: https://github.com/apache/hive/pull/3782#issuecomment-1320050478

   Kudos, SonarCloud [Quality Gate passed](https://sonarcloud.io/dashboard?id=apache_hive&pullRequest=3782)!
   
   [0 Bugs](https://sonarcloud.io/project/issues?id=apache_hive&pullRequest=3782&resolved=false&types=BUG) (rating A)
   [0 Vulnerabilities](https://sonarcloud.io/project/issues?id=apache_hive&pullRequest=3782&resolved=false&types=VULNERABILITY) (rating A)
   [0 Security Hotspots](https://sonarcloud.io/project/security_hotspots?id=apache_hive&pullRequest=3782&resolved=false&types=SECURITY_HOTSPOT) (rating A)
   [0 Code Smells](https://sonarcloud.io/project/issues?id=apache_hive&pullRequest=3782&resolved=false&types=CODE_SMELL) (rating A)
   No Coverage information
   No Duplication information
   
   




Issue Time Tracking
-------------------

    Worklog Id:     (was: 827184)
    Time Spent: 1.5h  (was: 1h 20m)

> ERROR: column "CC_START" does not exist, when Postgres is used as Hive 
> metastore
> --------------------------------------------------------------------------------
>
>                 Key: HIVE-26759
>                 URL: https://issues.apache.org/jira/browse/HIVE-26759
>             Project: Hive
>          Issue Type: Bug
>          Components: Metastore
>    Affects Versions: 4.0.0-alpha-2
>            Reporter: Akshat Mathur
>            Assignee: Akshat Mathur
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 1.5h
>  Remaining Estimate: 0h
>
> This error occurs when Postgres is used as the Hive metastore backend.
> hive-site.xml:
>  
> {code:java}
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
> <configuration>
>     <!-- UPSTREAM -->
>     <property>
>         <name>hive.server2.logging.operation.level</name>
>         <value>NONE</value>
>     </property>
>     <property>
>         <name>hive.log4j.file</name>
>         <value>hive-log4j.properties</value>
>     </property>
>     <property>
>         <name>metastore.log4j.file</name>
>         <value>metastore-log4j.properties</value>
>     </property>
>     <!-- Intellij -->
>     <property>
>         <name>hive.jar.path</name>
>         
> <value>/Users/am/Desktop/work/upstream/hive/ql/target/hive-exec-4.0.0-SNAPSHOT.jar</value>
>         <description>The location of hive_cli.jar that is used when 
> submitting jobs in a separate jvm.</description>
>     </property>
>     <property>
>         <name>hive.hadoop.classpath</name>
>         
> <value>/Users/am/Desktop/work/upstream/hive/ql/target/hive-exec-4.0.0-SNAPSHOT.jar</value>
>     </property>
>     <property>
>         <name>hive.metastore.local</name>
>         <value>false</value>
>     </property>
>     <property>
>         <name>hive.metastore.uris</name>
>         <value>thrift://localhost:9083</value>
>     </property>
>     <property>
>         <name>hive.metastore.warehouse.dir</name>
>         <value>/Users/am/Desktop/work/hivestuff/warehouse</value>
>     </property>
>     <property>
>         <name>hive.server2.metrics.enabled</name>
>         <value>true</value>
>     </property>
>  
>     <property>
>         <name>spark.eventLog.enabled</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>spark.eventLog.dir</name>
>         <value>/tmp/hive</value>
>     </property>
>     <!-- Intellij -->
>     
>     <property>
>         <name>metastore.metastore.event.db.notification.api.auth</name>
>         <value>false</value>
>     </property>
>     <property>
>         <name>hive.metastore.schema.verification</name>
>         <value>false</value>
>     </property>
>     <property>
>         <name>datanucleus.autoCreateTables</name>
>         <value>true</value>
>     </property>
>   
>     <property>
>         <name>hive.exec.scratchdir</name>
>         <value>/tmp/hive-${user.name}</value>
>     </property>
>         <property>
>             <name>javax.jdo.option.ConnectionURL</name>
>             <value>jdbc:postgresql://localhost:5432/hive_metastore</value>
>             <description>JDBC connect string for a JDBC 
> metastore</description>
>         </property>
>         <property>
>             <name>javax.jdo.option.ConnectionDriverName</name>
>             <value>org.postgresql.Driver</value>
>         </property>
>         <property>
>             <name>javax.jdo.option.ConnectionUserName</name>
>             <value>hive</value>
>         </property>
>         <property>
>             <name>javax.jdo.option.ConnectionPassword</name>
>             <value>hive</value>
>         </property>
>     <property>
>         <name>datanucleus.schema.autoCreateAll</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>hive.server2.enable.doAs</name>
>         <value>false</value>
>         <description></description>
>     </property>
>     <property>
>         <name>hive.server2.enable.impersonation</name>
>         <value>false</value>
>         <description></description>
>     </property>
>     <property>
>         <name>dfs.namenode.acls.enabled</name>
>         <value>false</value>
>     </property>
>     <!-- FAIR SCHEDULER -->
>     <!-- These following lines are needed to use ACID features -->
>     <!-- BEGIN -->
>     <!--
>     <property>
>       <name>hive.enforce.bucketing</name>
>       <value>true</value>
>     </property>
>     <property>
>       <name>hive.support.concurrency</name>
>       <value>true</value>
>     </property>
>     <property>
>       <name>hive.exec.dynamic.partition.mode</name>
>       <value>nonstrict</value>
>     </property>
>     <property>
>       <name>hive.txn.manager</name>
>       <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
>     </property>
>     <property>
>       <name>hive.lock.manager</name>
>       <value>org.apache.hadoop.hive.ql.lockmgr.DbLockManager</value>
>     </property>
>     <property>
>       <name>hive.compactor.initiator.on</name>
>       <value>true</value>
>     </property>
>     <property>
>       <name>hive.compactor.worker.threads</name>
>       <value>2</value>
>     </property>
>     -->
>     <!-- END -->
>     <property>
>         <name>hive.server2.webui.explain.output</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>hive.server2.webui.show.graph</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>hive.server2.webui.show.stats</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>hive.server2.webui.max.graph.size</name>
>         <value>40</value>
>     </property>
>     <!-- ACID -->
>     <property>
>         <name>hive.txn.manager</name>
>         <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
>     </property>
>     <property>
>         <name>hive.compactor.initiator.on</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>hive.compactor.worker.threads</name>
>         <value>3</value>
>     </property>
>     <property>
>         <name>metastore.compactor.worker.threads</name>
>         <value>4</value>
>     </property>
>     <property>
>         <name>hive.support.concurrency</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>hive.exec.dynamic.partition.mode</name>
>         <value>nonstrict</value>
>     </property>
>     <property>
>         <name>hive.lock.manager</name>
>         <value>org.apache.hadoop.hive.ql.lockmgr.DbLockManager</value>
>     </property>
>     <property>
>         <name>hive.compactor.crud.query.based</name>
>         <value>true</value>
>     </property>
>     <property>
>         <name>hive.metastore.runworker.in</name>
>         <value>hs2</value>
>     </property>
>     <!-- Random -->
>     <!--     <property>
>             <name>hive.users.in.admin.role</name>
>             <value>karencoppage</value>
>         </property> -->
>     <!--Timestamp-->
>     <!--     <property>
>             <name>hive.parquet.write.int64.timestamp</name>
>             <value>true</value>
>         </property>
>      -->
>     <!--for WebUI explain plan-->
>     <!--     <property>
>             <name>hive.server2.webui.max.historic.queries</name>
>             <value>40</value>
>         </property> -->
>     <!--     <property>
>             <name></name>
>             <value></value>
>         </property>
>      -->
> </configuration>
>  {code}
>  
>  
> The following stack trace appears when the HMS service is started:
> {code:java}
> [Thread-5] ERROR org.apache.hadoop.hive.ql.txn.compactor.Initiator - Initiator loop caught unexpected exception this time through the loop
> org.apache.hadoop.hive.metastore.api.MetaException: Unable to select from transaction database org.postgresql.util.PSQLException: ERROR: column "CC_START" does not exist
>   Position: 1215
>     at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2676)
>     at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2366)
>     at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:356)
>     at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:490)
>     at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:408)
>     at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:181)
>     at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:133)
>     at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeQuery(ProxyPreparedStatement.java:52)
>     at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeQuery(HikariProxyPreparedStatement.java)
>     at org.apache.hadoop.hive.metastore.txn.TxnHandler.showCompact(TxnHandler.java:3894)
>     at org.apache.hadoop.hive.ql.txn.compactor.Initiator.run(Initiator.java:154)
>     at org.apache.hadoop.hive.metastore.txn.TxnHandler.showCompact(TxnHandler.java:3946) ~[classes/:?]
>     at org.apache.hadoop.hive.ql.txn.compactor.Initiator.run(Initiator.java:154) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
> {code}
> The error does not occur when Derby is configured as the HMS backend.
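>  
> For reference, one common way Postgres produces this class of error is identifier case folding: unquoted column names are stored in lower case, while double-quoted names keep their exact case, so a quoted upper-case reference such as "CC_START" will not match a column stored as cc_start. The sketch below only demonstrates that behaviour against the connection settings from the hive-site.xml above (the table name demo_compactions is hypothetical); it is illustrative and does not claim to be the root cause of this ticket.
> {code:java}
> // Minimal sketch of Postgres identifier case folding (assumptions: Postgres is
> // reachable at localhost:5432, database hive_metastore, user/password "hive",
> // as in the hive-site.xml above; the table name demo_compactions is made up).
> import java.sql.Connection;
> import java.sql.DriverManager;
> import java.sql.Statement;
> 
> public class PgIdentifierFoldingDemo {
>     public static void main(String[] args) throws Exception {
>         String url = "jdbc:postgresql://localhost:5432/hive_metastore";
>         try (Connection conn = DriverManager.getConnection(url, "hive", "hive");
>              Statement st = conn.createStatement()) {
>             // Unquoted identifiers are folded to lower case by Postgres,
>             // so this creates a column literally named cc_start.
>             st.execute("CREATE TABLE demo_compactions (cc_start BIGINT)");
>             // A double-quoted, upper-case reference keeps its case and no longer
>             // matches the stored lower-case name; Postgres reports:
>             //   ERROR: column "CC_START" does not exist
>             st.executeQuery("SELECT \"CC_START\" FROM demo_compactions");
>         }
>     }
> }
> {code}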



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
