[ https://issues.apache.org/jira/browse/HIVE-1536?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12898655#action_12898655 ]

Sean Flatley commented on HIVE-1536:
------------------------------------

The log entries can be found at the end of this comment.  I did find an error 
entry in the Eclipse console reporting that JAVA_HOME is not set, so it does 
appear to be a configuration issue:

    [junit] PREHOOK: query: select a,b,c,d,f as e,f*2 from testHiveJdbcDriverTable limit 1
    [junit] PREHOOK: type: QUERY
    [junit] PREHOOK: Input: default@testhivejdbcdrivertable
    [junit] PREHOOK: Output: file:/tmp/sean/hive_2010-08-14_17-32-24_783_1792704952460259887/-mr-10000
    [junit] Total MapReduce jobs = 1
    [junit] Launching Job 1 out of 1
    [junit] Number of reduce tasks is set to 0 since there's no reduce operator
    [junit] Error: JAVA_HOME is not set.
    [junit] FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
 
I do have JAVA_HOME set and exported in my .bashrc file.
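
Since bin/hadoop prints "Error: JAVA_HOME is not set." when the variable is 
missing from the environment it inherits, one possibility is that Eclipse (or 
the Ant JVM it forks) was launched without the values exported in .bashrc, 
which interactive shells source but GUI-launched processes do not necessarily 
inherit.  A minimal diagnostic sketch (a hypothetical helper class, not part 
of Hive or the patch) to dump what the forked test JVM actually sees:

    // Hypothetical diagnostic, assuming the failure is environmental:
    // MapRedTask spawns bin/hadoop from this JVM, and bin/hadoop exits
    // with "Error: JAVA_HOME is not set." when the variable is absent
    // from the environment it inherits.
    public class EnvCheck {
        public static void main(String[] args) {
            System.out.println("JAVA_HOME   = " + System.getenv("JAVA_HOME"));
            System.out.println("HADOOP_HOME = " + System.getenv("HADOOP_HOME"));
            System.out.println("PATH        = " + System.getenv("PATH"));
        }
    }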

Here are the hive.log entries.

2010-08-14 17:32:24,929 INFO  ql.Driver (Driver.java:execute(425)) - Starting command: select a,b,c,d,f as e,f*2 from testHiveJdbcDriverTable limit 1
2010-08-14 17:32:24,935 ERROR SessionState (SessionState.java:printError(277)) - PREHOOK: query: select a,b,c,d,f as e,f*2 from testHiveJdbcDriverTable limit 1
2010-08-14 17:32:24,937 ERROR SessionState (SessionState.java:printError(277)) - PREHOOK: type: QUERY
2010-08-14 17:32:24,938 ERROR SessionState (SessionState.java:printError(277)) - PREHOOK: Input: default@testhivejdbcdrivertable
2010-08-14 17:32:24,939 ERROR SessionState (SessionState.java:printError(277)) - PREHOOK: Output: file:/tmp/sean/hive_2010-08-14_17-32-24_783_1792704952460259887/-mr-10000
2010-08-14 17:32:24,940 INFO  ql.Driver (SessionState.java:printInfo(268)) - Total MapReduce jobs = 1
2010-08-14 17:32:24,942 INFO  ql.Driver (SessionState.java:printInfo(268)) - Launching Job 1 out of 1
2010-08-14 17:32:24,947 INFO  exec.MapRedTask (SessionState.java:printInfo(268)) - Number of reduce tasks is set to 0 since there's no reduce operator
2010-08-14 17:32:24,967 INFO  exec.MapRedTask (MapRedTask.java:execute(152)) - Generating plan file file:/tmp/sean/hive_2010-08-14_17-32-24_783_1792704952460259887/-local-10002/plan.xml

2010-08-14 17:32:25,588 INFO  exec.MapRedTask (MapRedTask.java:execute(173)) - 
Executing: /home/sean/projects/hive/build/hadoopcore/hadoop-0.20.0/bin/hadoop 
jar /home/sean/projects/hive/build/ql/hive-exec-0.7.0.jar 
org.apache.hadoop.hive.ql.exec.ExecDriver -libjars 
file:///home/sean/projects/hive/build/jdbc/test/test-udfs.jar  -plan 
file:/tmp/sean/hive_2010-08-14_17-32-24_783_1792704952460259887/-local-10002/plan.xml
 -nolog -jobconf datanucleus.connectionPoolingType=DBCP -jobconf 
hive.exec.script.allow.partial.consumption=false -jobconf 
hive.query.id=sean_20100814173232_f49d776c-7273-4995-8cc3-f7aceabdbde3 -jobconf 
hive.hwi.listen.port=9999 -jobconf hive.map.aggr=true -jobconf 
hive.map.aggr.hash.min.reduction=0.5 -jobconf 
datanucleus.plugin.pluginRegistryBundleCheck=LOG -jobconf 
hive.exec.reducers.bytes.per.reducer=1000000000 -jobconf hive.optimize.cp=true 
-jobconf hive.exec.dynamic.partition.mode=strict -jobconf 
hive.merge.size.smallfiles.avgsize=16000000 -jobconf 
datanucleus.cache.level2.type=SOFT -jobconf hive.exec.max.created.files=100000 
-jobconf hive.script.serde=org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe 
-jobconf hive.fileformat.check=true -jobconf 
hive.exec.max.dynamic.partitions.pernode=100 -jobconf 
hive.enforce.sorting=false -jobconf hive.optimize.ppd=true -jobconf 
hive.optimize.groupby=true -jobconf hive.enforce.bucketing=false -jobconf 
javax.jdo.option.ConnectionUserName=APP -jobconf 
hive.mapred.reduce.tasks.speculative.execution=true -jobconf 
mapred.job.name=select+a%2Cb%2Cc%2Cd%2Cf+as+e%2Cf*2+from+testHiveJ...1%28Stage-1%29
 -jobconf javax.jdo.option.DetachAllOnCommit=true -jobconf 
hive.mapred.local.mem=0 -jobconf datanucleus.cache.level2=false -jobconf 
hive.session.id=sean_201008141732 -jobconf 
fs.pfile.impl=org.apache.hadoop.fs.ProxyLocalFileSystem -jobconf 
hive.script.operator.id.env.var=HIVE_SCRIPT_OPERATOR_ID -jobconf 
hive.archive.har.parentdir.settable=false -jobconf 
hadoop.job.ugi=sean%2Csean%2Cadm%2Cdialout%2Ccdrom%2Cplugdev%2Clpadmin%2Cadmin%2Csambashare
 -jobconf test.src.dir=file%3A%2F%2F%24%7Bbuild.dir%7D%2Fsrc%2Ftest -jobconf 
hive.metastore.server.max.threads=100000 -jobconf hive.udtf.auto.progress=false 
-jobconf hive.hwi.war.file=lib%2Fhive-hwi-%40VERSION%40.war -jobconf 
datanucleus.validateTables=false -jobconf hive.exec.compress.output=false 
-jobconf hive.test.mode.prefix=test_ -jobconf 
hive.mapjoin.bucket.cache.size=100 -jobconf 
test.log.dir=%24%7Bbuild.dir%7D%2Ftest%2Flogs -jobconf 
test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf 
datanucleus.validateConstraints=false -jobconf 
hive.metastore.server.tcp.keepalive=true -jobconf mapred.reduce.tasks=-1 
-jobconf 
hive.query.string=select+a%2Cb%2Cc%2Cd%2Cf+as+e%2Cf*2+from+testHiveJdbcDriverTable+limit+1
 -jobconf hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat 
-jobconf hive.task.progress=false -jobconf 
hive.jar.path=%24%7Bbuild.dir.hive%7D%2Fql%2Fhive-exec-%24%7Bversion%7D.jar 
-jobconf hive.metastore.ds.retry.interval=1000 -jobconf 
javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver 
-jobconf hive.skewjoin.mapjoin.map.tasks=10000 -jobconf 
hive.mapjoin.maxsize=100000 -jobconf hive.archive.enabled=false -jobconf 
hive.aux.jars.path=file%3A%2F%2F%2Fhome%2Fsean%2Fprojects%2Fhive%2Fbuild%2Fjdbc%2Ftest%2Ftest-udfs.jar
 -jobconf hive.exec.dynamic.partition=false -jobconf 
hive.exec.pre.hooks=org.apache.hadoop.hive.ql.hooks.PreExecutePrinter%2C+org.apache.hadoop.hive.ql.hooks.EnforceReadOnlyTables
 -jobconf hive.optimize.skewjoin=false -jobconf 
hive.groupby.mapaggr.checkinterval=100000 -jobconf hive.test.mode=false 
-jobconf hive.exec.parallel=false -jobconf 
hive.exec.counters.pull.interval=1000 -jobconf hive.default.fileformat=TextFile 
-jobconf hive.exec.max.dynamic.partitions=1000 -jobconf 
fs.har.impl=org.apache.hadoop.hive.shims.HiveHarFileSystem -jobconf 
hive.test.mode.samplefreq=32 -jobconf hive.metastore.ds.retry.attempts=1 
-jobconf javax.jdo.option.NonTransactionalRead=true -jobconf 
hive.script.auto.progress=false -jobconf hive.merge.mapredfiles=false -jobconf 
javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue
 -jobconf hive.exec.compress.intermediate=false -jobconf 
hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore 
-jobconf hive.map.aggr.hash.percentmemory=0.5 -jobconf 
hive.hwi.listen.host=0.0.0.0 -jobconf 
datanucleus.transactionIsolation=read-committed -jobconf 
hive.merge.size.per.task=256000000 -jobconf datanucleus.autoCreateSchema=true 
-jobconf 
hive.exec.post.hooks=org.apache.hadoop.hive.ql.hooks.PostExecutePrinter 
-jobconf hive.groupby.skewindata=false -jobconf hive.metastore.local=true 
-jobconf hive.skewjoin.mapjoin.min.split=33554432 -jobconf 
hadoop.tmp.dir=%24%7Bbuild.dir.hive%7D%2Ftest%2Fhadoop-%24%7Buser.name%7D 
-jobconf hive.mapred.mode=nonstrict -jobconf hive.optimize.pruner=true -jobconf 
hive.skewjoin.key=100000 -jobconf 
hive.default.partition.name=__HIVE_DEFAULT_PARTITION__ -jobconf 
hive.hbase.wal.enabled=true -jobconf datanucleus.validateColumns=false -jobconf 
datanucleus.identifierFactory=datanucleus -jobconf 
hive.querylog.location=%24%7Bbuild.dir%7D%2Ftmp -jobconf 
hive.optimize.reducededuplication=true -jobconf hive.exec.reducers.max=999 
-jobconf 
javax.jdo.PersistenceManagerFactoryClass=org.datanucleus.jdo.JDOPersistenceManagerFactory
 -jobconf hive.heartbeat.interval=1000 -jobconf hive.join.cache.size=25000 
-jobconf hive.metastore.warehouse.dir=%24%7Btest.warehouse.dir%7D -jobconf 
datanucleus.autoStartMechanismMode=checked -jobconf 
javax.jdo.option.ConnectionPassword=mine -jobconf 
hive.metastore.connect.retries=5 -jobconf hive.exec.mode.local.auto=false 
-jobconf hive.mapjoin.cache.numrows=25000 -jobconf 
hive.exec.parallel.thread.number=8 -jobconf datanucleus.storeManagerType=rdbms 
-jobconf 
hive.script.recordreader=org.apache.hadoop.hive.ql.exec.TextRecordReader 
-jobconf hive.exec.scratchdir=%24%7Bbuild.dir%7D%2Fscratchdir -jobconf 
hive.metastore.metadb.dir=file%3A%2F%2F%24%7Bbuild.dir%7D%2Ftest%2Fdata%2Fmetadb%2F
 -jobconf hive.metastore.server.min.threads=200 -jobconf 
hive.script.recordwriter=org.apache.hadoop.hive.ql.exec.TextRecordWriter 
-jobconf hive.merge.mapfiles=true -jobconf hive.exec.script.maxerrsize=100000 
-jobconf 
test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q
 -jobconf hive.join.emit.interval=1000 -jobconf hive.added.jars.path= -jobconf 
mapred.system.dir=%2Fhome%2Fsean%2Fprojects%2Fhive%2Fbuild%2Ftest%2Fhadoop-sean%2Fmapred%2Fsystem%2F1753935500
 -jobconf 
mapred.local.dir=%2Fhome%2Fsean%2Fprojects%2Fhive%2Fbuild%2Ftest%2Fhadoop-sean%2Fmapred%2Flocal%2F-2115807459

2010-08-14 17:32:25,633 ERROR exec.MapRedTask (MapRedTask.java:execute(224)) - Execution failed with exit status: 1
2010-08-14 17:32:25,635 ERROR ql.Driver (SessionState.java:printError(277)) - FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask



> Add support for JDBC PreparedStatements
> ---------------------------------------
>
>                 Key: HIVE-1536
>                 URL: https://issues.apache.org/jira/browse/HIVE-1536
>             Project: Hadoop Hive
>          Issue Type: Improvement
>          Components: Drivers
>            Reporter: Sean Flatley
>
> As a result of a sprint in which we used Pentaho Data Integration with the 
> Hive database, we have updated the driver.  Many PreparedStatement methods 
> have been implemented (see the usage sketch after this quoted description). 
> A patch will be attached tomorrow with a summary of changes.
> Note:  A checkout of Hive/trunk was performed and the TestJdbcDriver test 
> case was run.  This was done before any modifications were made to the 
> checked-out project.  The testResultSetMetaData test failed:
> java.sql.SQLException: Query returned non-zero code: 9, cause: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
>       at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:189)
>       at org.apache.hadoop.hive.jdbc.TestJdbcDriver.testResultSetMetaData(TestJdbcDriver.java:530)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at junit.framework.TestCase.runTest(TestCase.java:154)
>       at junit.framework.TestCase.runBare(TestCase.java:127)
>       at junit.framework.TestResult$1.protect(TestResult.java:106)
>       at junit.framework.TestResult.runProtected(TestResult.java:124)
>       at junit.framework.TestResult.run(TestResult.java:109)
>       at junit.framework.TestCase.run(TestCase.java:118)
>       at junit.framework.TestSuite.runTest(TestSuite.java:208)
>       at junit.framework.TestSuite.run(TestSuite.java:203)
>       at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
>       at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
>       at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
> A co-worker did the same and the tests passed.  Both environments were 
> Ubuntu with Hadoop 0.20.2.
> Tests we added to TestJdbcDriver were successful.
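
For context, here is a hedged sketch of the intended PreparedStatement usage 
through the Hive JDBC driver (the connection URL, table, and bound values are 
illustrative assumptions, not taken from the patch):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class HivePreparedStatementSketch {
        public static void main(String[] args) throws Exception {
            // Driver class and URL form used by the pre-HiveServer2 JDBC driver;
            // host, port, and database here are illustrative.
            Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:hive://localhost:10000/default", "", "");
            PreparedStatement ps = con.prepareStatement(
                    "select a, b, c from testHiveJdbcDriverTable where a = ?");
            ps.setInt(1, 1); // bind the single ? parameter
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                System.out.println(rs.getInt(1) + "\t" + rs.getString(2));
            }
            rs.close();
            ps.close();
            con.close();
        }
    }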

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
