[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-10-03 Thread Hudson (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13785651#comment-13785651 ]

Hudson commented on HIVE-5296:
--

FAILURE: Integrated in Hive-trunk-hadoop2-ptest #123 (See 
[https://builds.apache.org/job/Hive-trunk-hadoop2-ptest/123/])
HIVE-5296: Memory leak: OOM Error after multiple open/closed JDBC connections.  
(Kousuke Saruta via Thejas Nair) (thejas: 
http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1528678)
* 
/hive/trunk/service/src/java/org/apache/hive/service/cli/session/HiveSessionImpl.java


 Memory leak: OOM Error after multiple open/closed JDBC connections. 
 

 Key: HIVE-5296
 URL: https://issues.apache.org/jira/browse/HIVE-5296
 Project: Hive
  Issue Type: Bug
  Components: HiveServer2
Affects Versions: 0.12.0, 0.13.0
 Environment: Hive 0.12.0, Hadoop 1.1.2, Debian.
Reporter: Douglas
Assignee: Kousuke Saruta
  Labels: hiveserver
 Fix For: 0.12.0, 0.13.0

 Attachments: HIVE-5296.1.patch, HIVE-5296.2.patch, HIVE-5296.patch, 
 HIVE-5296.patch, HIVE-5296.patch

   Original Estimate: 168h
  Remaining Estimate: 168h

 This error seems to relate to https://issues.apache.org/jira/browse/HIVE-3481.
 However, on inspection of the related patch and my built version of Hive
 (patch carried forward to 0.12.0), I am still seeing the described behaviour.
 Multiple connections to HiveServer2, all of which are closed and disposed of
 properly, show the Java heap size growing extremely quickly.
 This issue can be recreated using the following code:
 {code}
import java.sql.DriverManager;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Properties;
import org.apache.hive.service.cli.HiveSQLException;
import org.apache.log4j.Logger;

/*
 * Class which encapsulates the lifecycle of a query or statement.
 * Provides functionality which allows you to create a connection.
 */
public class HiveClient {

    Connection con;
    Logger logger;
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";
    private String db;

    public HiveClient(String db)
    {
        logger = Logger.getLogger(HiveClient.class);
        this.db = db;

        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            logger.info("Can't find Hive driver");
        }

        // GlimmerServer is the reporter's application config holder.
        String hiveHost = GlimmerServer.config.getString("hive/host");
        String hivePort = GlimmerServer.config.getString("hive/port");
        String connectionString = "jdbc:hive2://" + hiveHost + ":" + hivePort + "/default";
        logger.info(String.format("Attempting to connect to %s", connectionString));
        try {
            con = DriverManager.getConnection(connectionString, "", "");
        } catch (Exception e) {
            logger.error("Problem instantiating the connection " + e.getMessage());
        }
    }

    public int update(String query)
    {
        Integer res = 0;
        Statement stmt = null;
        try {
            stmt = con.createStatement();
            String switchdb = "USE " + db;
            logger.info(switchdb);
            stmt.executeUpdate(switchdb);
            logger.info(query);
            res = stmt.executeUpdate(query);
            logger.info("Query passed to server");
            stmt.close();
        } catch (HiveSQLException e) {
            logger.info(String.format("HiveSQLException thrown, this can be valid, " +
                    "but check the error: %s from the query %s", e.toString(), query));
        } catch (SQLException e) {
            logger.error(String.format("Unable to execute query %s. SQLException: %s", query, e));
        } catch (Exception e) {
            logger.error(String.format("Unable to execute query %s. Error: %s", query, e));
        }

        if (stmt != null) {
            try {
                stmt.close();
            } catch (SQLException e) {
                logger.error("Cannot close the statement, potential memory leak " + e);
            }
        }

        return res;
    }

    public void close()
    {
        if (con != null) {
            try {
                con.close();
            } catch (SQLException e) {
                logger.info("Problem closing connection " + e);
            }
        }
    }
}
{code}
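The reported symptom — used heap climbing steadily as connections are opened and closed — can be quantified by sampling used heap between batches of work inside the leaking JVM (here, the HiveServer2 process). The sketch below is a self-contained, hypothetical illustration of that sampling using only the JDK; the ballast allocations merely stand in for leaked session objects and are not Hive code:

```java
import java.util.ArrayList;
import java.util.List;

public class HeapWatch {
    // Approximate used heap in bytes; calling gc() first makes successive
    // samples more comparable when looking for monotonic growth (a leak signal).
    static long usedHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        rt.gc();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        List<byte[]> leaked = new ArrayList<>(); // stand-in for un-freed sessions
        long baseline = usedHeapBytes();
        for (int batch = 1; batch <= 3; batch++) {
            leaked.add(new byte[4 * 1024 * 1024]); // each "batch" retains ~4 MB
            long growth = usedHeapBytes() - baseline;
            System.out.println("batch " + batch + ": +" + growth / (1024 * 1024) + " MB");
        }
    }
}
```

In the real scenario the samples would be taken in (or against) the HiveServer2 process, for example alongside a loop of HiveClient open/close calls, or externally with jstat.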


2013-10-03 Thread Hudson (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13785835#comment-13785835 ]

Hudson commented on HIVE-5296:
--

SUCCESS: Integrated in Hive-trunk-hadoop1-ptest #189 (See 
[https://builds.apache.org/job/Hive-trunk-hadoop1-ptest/189/])
HIVE-5296: Memory leak: OOM Error after multiple open/closed JDBC connections.  
(Kousuke Saruta via Thejas Nair) (thejas: 
http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1528678)
* 
/hive/trunk/service/src/java/org/apache/hive/service/cli/session/HiveSessionImpl.java




2013-10-02 Thread Hudson (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13784775#comment-13784775 ]

Hudson commented on HIVE-5296:
--

FAILURE: Integrated in Hive-branch-0.12-hadoop1 #5 (See 
[https://builds.apache.org/job/Hive-branch-0.12-hadoop1/5/])
HIVE-5296: Memory leak: OOM Error after multiple open/closed JDBC connections.  
(Kousuke Saruta via Thejas Nair) (thejas: 
http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1528679)
* 
/hive/branches/branch-0.12/service/src/java/org/apache/hive/service/cli/session/HiveSessionImpl.java




2013-10-02 Thread Hudson (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13784787#comment-13784787 ]

Hudson commented on HIVE-5296:
--

FAILURE: Integrated in Hive-trunk-hadoop2 #471 (See 
[https://builds.apache.org/job/Hive-trunk-hadoop2/471/])
HIVE-5296: Memory leak: OOM Error after multiple open/closed JDBC connections.  
(Kousuke Saruta via Thejas Nair) (thejas: 
http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1528678)
* 
/hive/trunk/service/src/java/org/apache/hive/service/cli/session/HiveSessionImpl.java




2013-10-02 Thread Hudson (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13784792#comment-13784792 ]

Hudson commented on HIVE-5296:
--

FAILURE: Integrated in Hive-trunk-h0.21 #2375 (See 
[https://builds.apache.org/job/Hive-trunk-h0.21/2375/])
HIVE-5296: Memory leak: OOM Error after multiple open/closed JDBC connections.  
(Kousuke Saruta via Thejas Nair) (thejas: 
http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1528678)
* 
/hive/trunk/service/src/java/org/apache/hive/service/cli/session/HiveSessionImpl.java




2013-10-01 Thread Douglas (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13782799#comment-13782799 ]

Douglas commented on HIVE-5296:
---

Ok thanks -- In this case, my system was opening a lot of connections with fewer 
file handles ([HIVE-4501|https://issues.apache.org/jira/browse/HIVE-4501] covers 
the file-handle memory leak). Some of these connections/queries also threw 
exceptions, which probably exacerbated the problem. I'll watch the other issue, 
but if I see steady heap usage over time as the total number of connections 
increases, we can mark this resolved.
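The exceptions mentioned above matter because an exception thrown mid-query can skip a plain close() call. Since Java 7, try-with-resources guarantees close even when the body throws; the self-contained sketch below demonstrates that guarantee with a stand-in FakeConnection (a hypothetical class for illustration, not the Hive driver's):

```java
public class CloseOnError {
    // Minimal AutoCloseable stand-in for a JDBC Connection.
    static class FakeConnection implements AutoCloseable {
        static int openCount = 0;           // how many connections are still open
        FakeConnection() { openCount++; }
        void query() { throw new RuntimeException("query failed"); }
        @Override public void close() { openCount--; }
    }

    public static void main(String[] args) {
        for (int i = 0; i < 3; i++) {
            try (FakeConnection con = new FakeConnection()) {
                con.query();                // always throws
            } catch (RuntimeException e) {
                // con has already been closed by the time we get here
            }
        }
        System.out.println(FakeConnection.openCount); // 0: nothing left open
    }
}
```

The same shape applies directly to java.sql.Connection and Statement, both of which implement AutoCloseable, so client code like HiveClient.update() cannot leak statements on exception paths.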



2013-09-30 Thread Douglas (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13781985#comment-13781985 ]

Douglas commented on HIVE-5296:
---

Once again, thanks for your interest in this. I've pushed the latest patch to 
my production system and will keep an eye on heap space over the next couple of 
days.






2013-09-30 Thread Kousuke Saruta (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13782075#comment-13782075 ]

Kousuke Saruta commented on HIVE-5296:
--

Hi [~doug.mcilwraith],
My patch addresses one of the two already-known causes of the memory leak,
so you should also watch HIVE-4501.



2013-09-27 Thread Hive QA (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13779724#comment-13779724 ]

Hive QA commented on HIVE-5296:
---



{color:green}Overall{color}: +1 all checks pass

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12605371/HIVE-5296.2.patch

{color:green}SUCCESS:{color} +1 3175 tests passed

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/929/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/929/console

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
{noformat}

This message is automatically generated.

[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-27 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13779793#comment-13779793
 ] 

Hive QA commented on HIVE-5296:
---



{color:green}Overall{color}: +1 all checks pass

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12605371/HIVE-5296.2.patch

{color:green}SUCCESS:{color} +1 3176 tests passed

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/931/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/931/console

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
{noformat}

This message is automatically generated.


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-26 Thread Vaibhav Gumashta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13779075#comment-13779075
 ] 

Vaibhav Gumashta commented on HIVE-5296:


[~k.saruta] I've added a couple of comments in the review board. Thanks!


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-26 Thread Vaibhav Gumashta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13779078#comment-13779078
 ] 

Vaibhav Gumashta commented on HIVE-5296:


cc [~thejas]


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-26 Thread Kousuke Saruta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13779459#comment-13779459
 ] 

Kousuke Saruta commented on HIVE-5296:
--

[~vgumashta] Thank you for your comment. I will submit next patch soon.


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-25 Thread Kousuke Saruta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13777901#comment-13777901
 ] 

Kousuke Saruta commented on HIVE-5296:
--

The error above may be caused by HIVE-5360. So, I've re-submitted a patch.


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-25 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13778126#comment-13778126
 ] 

Hive QA commented on HIVE-5296:
---



{color:green}Overall{color}: +1 all checks pass

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12605075/HIVE-5296.1.patch

{color:green}SUCCESS:{color} +1 3162 tests passed

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/901/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/901/console

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
{noformat}

This message is automatically generated.


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-25 Thread Vaibhav Gumashta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13778253#comment-13778253
 ] 

Vaibhav Gumashta commented on HIVE-5296:


+1 (non-binding).

[~k.saruta] Thanks for catching this!


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-25 Thread Vaibhav Gumashta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13778323#comment-13778323
 ] 

Vaibhav Gumashta commented on HIVE-5296:


[~k.saruta] on another look, I have a few comments. I'll post them to RB.


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-24 Thread Douglas (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13776290#comment-13776290
 ] 

Douglas commented on HIVE-5296:
---

Thanks everybody for your interest in this. I'll apply the patch in my 
production environment and report back shortly as to whether this resolves the 
original issue.




[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-24 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13776722#comment-13776722
 ] 

Hive QA commented on HIVE-5296:
---



{color:red}Overall{color}: -1 at least one tests failed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12604739/HIVE-5296.patch

{color:red}ERROR:{color} -1 due to 4 failed/errored test(s), 3149 tests executed
*Failed tests:*
{noformat}
org.apache.hive.hcatalog.templeton.tool.TestJobIDParser.testParseHive
org.apache.hive.hcatalog.templeton.tool.TestJobIDParser.testParseJar
org.apache.hive.hcatalog.templeton.tool.TestJobIDParser.testParsePig
org.apache.hive.hcatalog.templeton.tool.TestJobIDParser.testParseStreaming
{noformat}

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/871/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/871/console

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests failed with: TestsFailedException: 4 tests failed
{noformat}

This message is automatically generated.


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-24 Thread Kousuke Saruta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13777098#comment-13777098
 ] 

Kousuke Saruta commented on HIVE-5296:
--

I found that using FileSystem.closeAll is a bad idea, and the FileSystem$Cache problem 
will be addressed in HIVE-4501, so I will instead address the other problem: opHandle 
is not released when an exception occurs while executing a query or command.
I'll submit a modified patch soon.


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-23 Thread Kousuke Saruta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13775564#comment-13775564
 ] 

Kousuke Saruta commented on HIVE-5296:
--

I found that the number of Hashtable$Entry instances keeps increasing. This has two 
causes, as follows.

1. opHandle is not released if an exception is thrown while executing a query or 
command.
When an exception is thrown while executing a query or command, the operation handle 
object is never released from the map (OperationManager#handleToOperation), because 
opHandleSet.add(opHandle) is not executed in HiveSessionImpl (HiveServer2 side) and 
execResp.getOperationHandle() is not executed in HiveStatement (JDBC client side).

{code}
  public OperationHandle executeStatementInternal()
      throws HiveSQLException {
    acquire();
    try {
      ExecuteStatementOperation operation = getOperationManager()
          .newExecuteStatementOperation(getSession(), statement, confOverlay, runAsync);
      operation.run();   // <--- Throws exception and cannot get handle.
      OperationHandle opHandle = operation.getHandle();
      opHandleSet.add(opHandle);
      return opHandle;
    } finally {
      release();
    }
  }
{code}

{code}
  public boolean execute(String sql) throws SQLException {

    ...

    try {
      closeClientOperation();
      TExecuteStatementReq execReq = new TExecuteStatementReq(sessHandle, sql);
      execReq.setConfOverlay(sessConf);
      TExecuteStatementResp execResp = client.ExecuteStatement(execReq);
      Utils.verifySuccessWithInfo(execResp.getStatus());
      // <--- Throws exception and cannot get handle.
      stmtHandle = execResp.getOperationHandle();
    } catch (SQLException eS) {
      throw eS;
    } catch (Exception ex) {
      throw new SQLException(ex.toString(), "08S01", ex);
    }
    ...
{code}
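One way to avoid this kind of leak can be sketched with a minimal, self-contained model (the class and field names below are hypothetical stand-ins, not Hive's actual code or the actual patch): register the handle, run the operation, and release the handle again if run() throws, so nothing is left behind in the manager's map.

```java
import java.util.HashSet;
import java.util.Set;

// Minimal model of the leak-free pattern: if run() throws, the handle
// created for the operation must still be released, otherwise it stays
// in the manager's map forever.
public class HandleLifecycleSketch {

    static final class Operation {
        final Object handle = new Object();
        void run() { throw new RuntimeException("query failed"); }
    }

    // Stand-in for OperationManager#handleToOperation.
    static final Set<Object> handleToOperation = new HashSet<>();

    static Object execute(Operation op) {
        handleToOperation.add(op.handle);        // register first...
        try {
            op.run();
            return op.handle;
        } catch (RuntimeException e) {
            handleToOperation.remove(op.handle); // ...and release on failure
            throw e;
        }
    }

    public static void main(String[] args) {
        try {
            execute(new Operation());
        } catch (RuntimeException ignored) {
            // expected: the sketch operation always fails
        }
        // The failed operation's handle was released, so the map is empty.
        System.out.println(handleToOperation.size());
    }
}
```

The same effect can be obtained by catching the exception at the point where the handle would otherwise be lost and closing the operation there before rethrowing.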




2. FileSystem$Cache keeps growing.
When we call FileSystem#get, the FileSystem object is cached in FileSystem$Cache.
The cache is implemented using a HashMap, and the equality of Key is implemented as 
follows.

{code}
  /** {@inheritDoc} */
  public int hashCode() {
    return (scheme + authority).hashCode() + ugi.hashCode() + (int)unique;
  }

  static boolean isEqual(Object a, Object b) {
    return a == b || (a != null && a.equals(b));
  }

  /** {@inheritDoc} */
  public boolean equals(Object obj) {
    if (obj == this) {
      return true;
    }
    if (obj != null && obj instanceof Key) {
      Key that = (Key)obj;
      return isEqual(this.scheme, that.scheme)
          && isEqual(this.authority, that.authority)
          && isEqual(this.ugi, that.ugi)
          && (this.unique == that.unique);
    }
    return false;
  }
{code}

Key contains a UserGroupInformation, and two UserGroupInformation objects are equal 
only when the Subject objects they contain are the same instance (identity, not 
equivalence).
{code}
  public boolean equals(Object o) {
if (o == this) {
  return true;
} else if (o == null || getClass() != o.getClass()) {
  return false;
} else {
  return subject == ((UserGroupInformation) o).subject;
}
  }
{code}

In HiveServer2, UserGroupInformation is obtained via 
UserGroupInformation#createRemoteUser or UserGroupInformation#createProxyUser.
Those methods create new Subject objects, so the cache never gets a hit.

If FileSystem.closeAll or FileSystem#close is called, the FileSystem object 
is removed from the cache.


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-23 Thread Vaibhav Gumashta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13775734#comment-13775734
 ] 

Vaibhav Gumashta commented on HIVE-5296:


[~k.saruta] Thanks for looking at this. Can you post the patch to review board 
or phabricator as well?


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-23 Thread Kousuke Saruta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13775822#comment-13775822
 ] 

Kousuke Saruta commented on HIVE-5296:
--

Hi, [~vgumashta] I've posted my patch to the review board.

 

[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-23 Thread Hive QA (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13775935#comment-13775935
 ] 

Hive QA commented on HIVE-5296:
---



{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12604650/HIVE-5296.patch

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/860/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/860/console

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests failed with: NonZeroExitCodeException: Command 'bash 
/data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and 
output '+ [[ -n '' ]]
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost 
-Dhttp.proxyPort=3128'
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost 
-Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-Build-860/source-prep.txt
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
Reverted 'build-common.xml'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/plan/CreateTableDesc.java'
++ awk '{print $2}'
++ egrep -v '^X|^Performing status on external'
++ svn status --no-ignore
+ rm -rf build data/files/exported_table hcatalog/build hcatalog/core/build 
hcatalog/storage-handlers/hbase/build hcatalog/server-extensions/build 
hcatalog/webhcat/svr/build hcatalog/webhcat/java-client/build 
hcatalog/hcatalog-pig-adapter/build common/src/gen 
ql/src/test/results/clientpositive/import_exported_table.q.out 
ql/src/test/queries/clientpositive/import_exported_table.q
+ svn update

Fetching external item into 'hcatalog/src/test/e2e/harness'
External at revision 1525764.

At revision 1525764.
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh 
/data/hive-ptest/working/scratch/build.patch
The patch does not appear to apply with p0 to p2
+ exit 1
'
{noformat}

This message is automatically generated.


[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-19 Thread Kousuke Saruta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanelfocusedCommentId=13772541#comment-13772541
 ] 

Kousuke Saruta commented on HIVE-5296:
--

I found a cause of the memory leak.
I will describe the details and propose a fix later.

 Memory leak: OOM Error after multiple open/closed JDBC connections. 
 

 Key: HIVE-5296
 URL: https://issues.apache.org/jira/browse/HIVE-5296
 Project: Hive
  Issue Type: Bug
  Components: HiveServer2
Affects Versions: 0.12.0
 Environment: Hive 0.12.0, Hadoop 1.1.2, Debian.
Reporter: Douglas
  Labels: hiveserver
 Fix For: 0.12.0

   Original Estimate: 168h
  Remaining Estimate: 168h

 This error seems to relate to https://issues.apache.org/jira/browse/HIVE-3481
 However, on inspection of the related patch and my built version of Hive 
 (patch carried forward to 0.12.0), I am still seeing the described behaviour.
 Multiple connections to Hiveserver2, all of which are closed and disposed of 
 properly show the Java heap size to grow extremely quickly. 
 This issue can be recreated using the following code
{code}
import java.sql.DriverManager;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Properties;

import org.apache.hive.service.cli.HiveSQLException;
import org.apache.log4j.Logger;

/*
 * Class which encapsulates the lifecycle of a query or statement.
 * Provides functionality which allows you to create a connection.
 */
public class HiveClient {

    Connection con;
    Logger logger;
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";
    private String db;

    public HiveClient(String db)
    {
        logger = Logger.getLogger(HiveClient.class);
        this.db = db;

        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            logger.info("Can't find Hive driver");
        }

        // GlimmerServer.config is application-specific configuration.
        String hiveHost = GlimmerServer.config.getString("hive/host");
        String hivePort = GlimmerServer.config.getString("hive/port");
        String connectionString = "jdbc:hive2://" + hiveHost + ":" + hivePort + "/default";
        logger.info(String.format("Attempting to connect to %s", connectionString));
        try {
            con = DriverManager.getConnection(connectionString, "", "");
        } catch (Exception e) {
            logger.error("Problem instantiating the connection: " + e.getMessage());
        }
    }

    public int update(String query)
    {
        int res = 0;
        Statement stmt = null;
        try {
            stmt = con.createStatement();
            String switchdb = "USE " + db;
            logger.info(switchdb);
            stmt.executeUpdate(switchdb);
            logger.info(query);
            res = stmt.executeUpdate(query);
            logger.info("Query passed to server");
            stmt.close();
        } catch (HiveSQLException e) {
            logger.info(String.format("HiveSQLException thrown, this can be valid, " +
                    "but check the error: %s from the query %s", e.toString(), query));
        } catch (SQLException e) {
            logger.error(String.format("Unable to execute query %s. SQLException: %s", query, e));
        } catch (Exception e) {
            logger.error(String.format("Unable to execute query %s. Error: %s", query, e));
        }

        if (stmt != null) {
            try {
                stmt.close();
            } catch (SQLException e) {
                logger.error("Cannot close the statement, potential memory leak: " + e);
            }
        }

        return res;
    }

    public void close()
    {
        if (con != null) {
            try {
                con.close();
            } catch (SQLException e) {
                logger.info("Problem closing connection: " + e);
            }
        }
    }
}
{code}
And by creating and closing many HiveClient objects. The heap space used by the HiveServer2 process then grows rapidly.

[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-17 Thread Douglas (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13769320#comment-13769320
 ] 

Douglas commented on HIVE-5296:
---

Hi Kousuke, 

Thanks for your interest. Here are the answers to your questions:

1) This is most definitely the HiveServer2 process. I validated this by 
tracking the heap space utilised by the HiveServer2 process over time, as 
connections were opened and closed. 

2) The queries that were being executed were for the most part LOAD DATA 
INPATH:

{code}
int returnCode = hc.update("LOAD DATA INPATH \'" + fileName + "\' " +
        "OVERWRITE INTO TABLE " + targetTable +
        " partition (dt=\'" + cal.getTimeInMillis() + "\')");
logger.info(this.getClass() + " returned with value " + returnCode);
{code}

These were a mix of successes and exceptions. I've yet to validate whether the leak 
occurs in all instances, or only in those cases where HiveServer2 throws 
exceptions.

3) I've not had the time to dig into the hiveserver code as yet to find the 
offending object, but if I do get the chance, I will certainly post my findings 
and a patch.
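This is not Hive's actual code, but the general shape of such a server-side leak — per-session state registered on open and never deregistered on close — can be sketched as follows (all class and field names here are hypothetical, for illustration only):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the leak pattern (not actual Hive code):
// per-session state is registered on open but never removed on close,
// so the server's heap grows with every open/close cycle.
public class LeakDemo {
    // Stand-in for server-side per-session bookkeeping.
    static final List<byte[]> HANDLES = new ArrayList<>();

    static void openSession() {
        HANDLES.add(new byte[1024]); // state registered for the session
    }

    static void closeSession() {
        // Bug: the session's entry in HANDLES is never removed,
        // so it stays strongly reachable after "close".
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            openSession();
            closeSession();
        }
        // All 1000 "closed" sessions are still retained on the heap.
        System.out.println("retained handles: " + HANDLES.size());
    }
}
```

A heap histogram (e.g. from jmap) of a server leaking this way would show the retained objects' class counts growing across connection cycles.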



[jira] [Commented] (HIVE-5296) Memory leak: OOM Error after multiple open/closed JDBC connections.

2013-09-16 Thread Kousuke Saruta (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-5296?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13768919#comment-13768919
 ] 

Kousuke Saruta commented on HIVE-5296:
--

I have some questions.

1. Does the memory leak occur on the server side (the HiveServer2 process) or the client side?
2. What queries did you execute?
3. If you have already identified it, could you tell me which objects increase?
