[jira] [Commented] (HIVE-2015) Eliminate bogus Datanucleus.Plugin Bundle ERROR log messages

2012-05-23 Thread Carl Steinbach (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-2015?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13281867#comment-13281867
 ] 

Carl Steinbach commented on HIVE-2015:
--

Looks like we could also set datanucleus.plugin.pluginRegistryBundleCheck=NONE 
in HiveConf and hive-default.xml.template:

From 
http://www.datanucleus.org/products/datanucleus/persistence_properties.html:
{noformat}
datanucleus.plugin.pluginRegistryBundleCheck
Description  Defines what happens when plugin bundles are found and are 
duplicated
Range of Values  EXCEPTION | LOG | NONE
{noformat}
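
For reference, a minimal sketch of what that setting could look like as a hive-default.xml.template entry (standard Hadoop-style configuration XML; the property name and allowed values come from the DataNucleus docs quoted above, while the description text here is only illustrative):
{code:xml}
<property>
  <name>datanucleus.plugin.pluginRegistryBundleCheck</name>
  <value>NONE</value>
  <description>Defines what DataNucleus does (EXCEPTION, LOG, or NONE) when
  duplicated plugin bundles are found on the classpath.</description>
</property>
{code}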


 Eliminate bogus Datanucleus.Plugin Bundle ERROR log messages
 

 Key: HIVE-2015
 URL: https://issues.apache.org/jira/browse/HIVE-2015
 Project: Hive
  Issue Type: Bug
  Components: Diagnosability, Metastore
Reporter: Carl Steinbach

 Every time I start up the Hive CLI with logging enabled I'm treated to the 
 following ERROR log messages courtesy of DataNucleus:
 {code}
 DEBUG metastore.ObjectStore: datanucleus.plugin.pluginRegistryBundleCheck = 
 LOG 
 ERROR DataNucleus.Plugin: Bundle org.eclipse.jdt.core requires 
 org.eclipse.core.resources but it cannot be resolved. 
 ERROR DataNucleus.Plugin: Bundle org.eclipse.jdt.core requires 
 org.eclipse.core.runtime but it cannot be resolved. 
 ERROR DataNucleus.Plugin: Bundle org.eclipse.jdt.core requires 
 org.eclipse.text but it cannot be resolved.
 {code}
 Here's where this comes from:
 * The bin/hive scripts cause Hive to inherit Hadoop's classpath.
 * Hadoop's classpath includes $HADOOP_HOME/lib/core-3.1.1.jar, an Eclipse 
 library.
 * core-3.1.1.jar includes a plugin.xml file defining an OSGi plugin.
 * At startup, DataNucleus scans the classpath looking for OSGi plugins and 
 attempts to initialize any that it finds, including the Eclipse OSGi 
 plugins located in core-3.1.1.jar.
 * Initialization of the OSGi plugin in core-3.1.1.jar fails because of 
 unresolved dependencies.
 * We see ERROR messages telling us that DataNucleus failed to initialize a 
 plugin that we don't care about in the first place.
 I can think of two options for solving this problem:
 # Rewrite the scripts in $HIVE_HOME/bin so that they don't inherit ALL of 
 Hadoop's CLASSPATH.
 # Replace DataNucleus's NonManagedPluginRegistry with our own implementation 
 that does nothing.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: 
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira




[jira] [Commented] (HIVE-2015) Eliminate bogus Datanucleus.Plugin Bundle ERROR log messages

2012-05-23 Thread Zhenxiao Luo (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-2015?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13281922#comment-13281922
 ] 

Zhenxiao Luo commented on HIVE-2015:


@Carl: I set datanucleus.plugin.pluginRegistryBundleCheck=NONE in both 
HiveConf.java and hive-default.xml. The error messages still appear on the 
first command in the CLI. I am checking other possible solutions.





[jira] [Commented] (HIVE-2015) Eliminate bogus Datanucleus.Plugin Bundle ERROR log messages

2012-05-23 Thread Zhenxiao Luo (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-2015?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13281936#comment-13281936
 ] 

Zhenxiao Luo commented on HIVE-2015:


Setting datanucleus.plugin.pluginRegistryBundleCheck has no effect on this bug.

According to 
http://www.datanucleus.org/products/datanucleus/persistence_properties.html:
{noformat}
datanucleus.plugin.pluginRegistryBundleCheck
Description  Defines what happens when plugin bundles are found and are 
duplicated
Range of Values  EXCEPTION | LOG | NONE
{noformat}

It seems this property only controls what happens (EXCEPTION, LOG, or NONE) 
when plugin bundles are found and are duplicated. When a bundle's dependencies 
are unresolved rather than duplicated, the ERROR still shows up on the first 
CLI command.

I also tested setting it to EXCEPTION: no exception was thrown, and the same 
error messages appeared.





[jira] [Commented] (HIVE-2015) Eliminate bogus Datanucleus.Plugin Bundle ERROR log messages

2011-06-24 Thread Carl Steinbach (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-2015?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13054291#comment-13054291
 ] 

Carl Steinbach commented on HIVE-2015:
--

@Andy: Has the DataNucleus 3.0 release gone final yet? If not, do you have any 
idea when this will happen? Is there any chance that this fix will be 
backported to the 2.x series?





[jira] [Commented] (HIVE-2015) Eliminate bogus Datanucleus.Plugin Bundle ERROR log messages

2011-03-25 Thread Andy Jefferson (JIRA)

[ 
https://issues.apache.org/jira/browse/HIVE-2015?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13011155#comment-13011155
 ] 

Andy Jefferson commented on HIVE-2015:
--

Or just use a recent DataNucleus release (3.0 Mx), which by default omits 
checks on OSGi dependencies.
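
For illustration, a hedged sketch of what picking up a DataNucleus 3.0 milestone might look like with Maven-style coordinates (the exact milestone version below is an assumption, and Hive's own build may declare this dependency differently):
{code:xml}
<!-- Illustrative only: bump datanucleus-core to a 3.0 milestone, which by
     default no longer checks OSGi bundle dependencies. The version shown
     is an assumption; use whatever milestone is current. -->
<dependency>
  <groupId>org.datanucleus</groupId>
  <artifactId>datanucleus-core</artifactId>
  <version>3.0.0-m6</version>
</dependency>
{code}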

PS: If you're having issues like this with third-party software, I'd expect 
people to go to that third-party project and register an issue there (to get 
an option to turn something off, etc.), rather than rely on that project's 
developers happening across issues like this in a web trawl.
