[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13775935#comment-13775935 ]

Hive QA commented on HIVE-5296:
-------------------------------



{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12604650/HIVE-5296.patch

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/860/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/860/console

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-Build-860/source-prep.txt
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
Reverted 'build-common.xml'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/plan/CreateTableDesc.java'
++ awk '{print $2}'
++ egrep -v '^X|^Performing status on external'
++ svn status --no-ignore
+ rm -rf build data/files/exported_table hcatalog/build hcatalog/core/build hcatalog/storage-handlers/hbase/build hcatalog/server-extensions/build hcatalog/webhcat/svr/build hcatalog/webhcat/java-client/build hcatalog/hcatalog-pig-adapter/build common/src/gen ql/src/test/results/clientpositive/import_exported_table.q.out ql/src/test/queries/clientpositive/import_exported_table.q
+ svn update

Fetching external item into 'hcatalog/src/test/e2e/harness'
External at revision 1525764.

At revision 1525764.
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
The patch does not appear to apply with p0 to p2
+ exit 1
'
{noformat}

This message is automatically generated.
                
> Memory leak: OOM Error after multiple open/closed JDBC connections. 
> --------------------------------------------------------------------
>
>                 Key: HIVE-5296
>                 URL: https://issues.apache.org/jira/browse/HIVE-5296
>             Project: Hive
>          Issue Type: Bug
>          Components: HiveServer2
>    Affects Versions: 0.12.0
>         Environment: Hive 0.12.0, Hadoop 1.1.2, Debian.
>            Reporter: Douglas
>              Labels: hiveserver
>             Fix For: 0.12.0
>
>         Attachments: HIVE-5296.patch
>
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> This error seems to relate to https://issues.apache.org/jira/browse/HIVE-3481
> However, on inspection of the related patch and my built version of Hive
> (patch carried forward to 0.12.0), I am still seeing the described behaviour.
> With multiple connections to HiveServer2, all of which are closed and disposed
> of properly, the Java heap size grows extremely quickly.
> This issue can be recreated using the following code:
> {code}
> import java.sql.DriverManager;
> import java.sql.Connection;
> import java.sql.ResultSet;
> import java.sql.SQLException;
> import java.sql.Statement;
> import java.util.Properties;
> import org.apache.hive.service.cli.HiveSQLException;
> import org.apache.log4j.Logger;
>
> /*
>  * Class which encapsulates the lifecycle of a query or statement.
>  * Provides functionality which allows you to create a connection.
>  */
> public class HiveClient {
>
>     Connection con;
>     Logger logger;
>     private static String driverName = "org.apache.hive.jdbc.HiveDriver";
>     private String db;
>
>     public HiveClient(String db) {
>         logger = Logger.getLogger(HiveClient.class);
>         this.db = db;
>
>         try {
>             Class.forName(driverName);
>         } catch (ClassNotFoundException e) {
>             logger.info("Can't find Hive driver");
>         }
>
>         // GlimmerServer.config is the reporter's own application configuration
>         // object; it is not part of this snippet.
>         String hiveHost = GlimmerServer.config.getString("hive/host");
>         String hivePort = GlimmerServer.config.getString("hive/port");
>         String connectionString = "jdbc:hive2://" + hiveHost + ":" + hivePort + "/default";
>         logger.info(String.format("Attempting to connect to %s", connectionString));
>         try {
>             con = DriverManager.getConnection(connectionString, "", "");
>         } catch (Exception e) {
>             logger.error("Problem instantiating the connection " + e.getMessage());
>         }
>     }
>
>     public int update(String query) {
>         Integer res = 0;
>         Statement stmt = null;
>         try {
>             stmt = con.createStatement();
>             String switchdb = "USE " + db;
>             logger.info(switchdb);
>             stmt.executeUpdate(switchdb);
>             logger.info(query);
>             res = stmt.executeUpdate(query);
>             logger.info("Query passed to server");
>             stmt.close();
>         } catch (HiveSQLException e) {
>             logger.info(String.format("HiveSQLException thrown, this can be valid, " +
>                     "but check the error: %s from the query %s", e.toString(), query));
>         } catch (SQLException e) {
>             logger.error(String.format("Unable to execute query %s. SQLException: %s", query, e));
>         } catch (Exception e) {
>             logger.error(String.format("Unable to execute query %s. Error: %s", query, e));
>         }
>
>         if (stmt != null) {
>             try {
>                 stmt.close();
>             } catch (SQLException e) {
>                 logger.error("Cannot close the statement, potential memory leak " + e);
>             }
>         }
>
>         return res;
>     }
>
>     public void close() {
>         if (con != null) {
>             try {
>                 con.close();
>             } catch (SQLException e) {
>                 logger.info("Problem closing connection " + e);
>             }
>         }
>     }
> }
> {code}
> By creating and closing many HiveClient objects in this way, the heap space
> used by the hiveserver2 runjar process is seen to grow extremely quickly,
> without that space being released.
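>
> (Illustration only, not part of the original report: a minimal driver loop
> along the lines of the following, with a hypothetical class name, is enough
> to exercise the open/close cycle described above while watching the
> hiveserver2 heap, e.g. with jmap or jconsole.)
> {code}
> // Hypothetical reproduction driver: repeatedly constructs and closes
> // HiveClient instances against a running HiveServer2 so the server-side
> // heap growth described above can be observed.
> public class HiveClientLeakRepro {
>     public static void main(String[] args) {
>         for (int i = 0; i < 10000; i++) {
>             HiveClient client = new HiveClient("default");
>             client.close();
>             if (i % 100 == 0) {
>                 System.out.println("Completed " + i + " open/close cycles");
>             }
>         }
>     }
> }
> {code}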

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
