TID: [0] [BAM] [2012-11-04 22:22:08,613] ERROR {org.apache.hadoop.mapred.MapTask} - IO error in map input file file:/var/BAM/wso2bam-2.0.1/repository/data/hive/warehouse-1234/appserverstatsperdaydatafetcher {org.apache.hadoop.mapred.MapTask}
TID: [0] [BAM] [2012-11-04 22:22:08,650] ERROR {org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBRecordReader} - Failed to close {org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBRecordReader}
org.h2.jdbc.JdbcSQLException: Database is already closed (to disable automatic closing at VM shutdown, add ";DB_CLOSE_ON_EXIT=FALSE" to the db URL) [90121-140]
    at org.h2.message.DbException.getJdbcSQLException(DbException.java:327)
    at org.h2.message.DbException.get(DbException.java:167)
    at org.h2.message.DbException.get(DbException.java:144)
    at org.h2.message.DbException.get(DbException.java:133)
    at org.h2.jdbc.JdbcConnection.checkClosed(JdbcConnection.java:1348)
    at org.h2.jdbc.JdbcConnection.checkClosedForWrite(JdbcConnection.java:1333)
    at org.h2.jdbc.JdbcConnection.commit(JdbcConnection.java:413)
    at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBRecordReader.close(DBRecordReader.java:175)
    at org.apache.hadoop.hive.ql.io.HiveRecordReader.doClose(HiveRecordReader.java:50)
    at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.close(HiveContextAwareRecordReader.java:96)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:254)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:439)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:371)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:210)
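The JdbcSQLException above carries its own remedy hint: H2 closed the embedded database in a JVM shutdown hook before the DBRecordReader tried to commit. The message suggests appending `;DB_CLOSE_ON_EXIT=FALSE` to the H2 connection URL. A minimal sketch of that change, assuming the relevant JDBC URL is configured in the BAM datasource settings (the log does not show where the URL is defined, so the path below is a placeholder):

```
# before (placeholder path, not shown in the log)
jdbc:h2:/path/to/database

# after: disable H2's automatic close at VM shutdown
jdbc:h2:/path/to/database;DB_CLOSE_ON_EXIT=FALSE
```

With this setting, the database must then be closed explicitly (e.g. via `SHUTDOWN` or closing the last connection), since the automatic shutdown-hook close is disabled.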
TID: [0] [BAM] [2012-11-04 22:22:08,880] ERROR {org.apache.hadoop.hive.ql.exec.ExecDriver} - Ended Job = job_local_0001 with errors {org.apache.hadoop.hive.ql.exec.ExecDriver}
TID: [0] [BAM] [2012-11-04 22:22:08,884] ERROR {org.apache.hadoop.hive.ql.exec.ExecDriver} - Error during job, obtaining debugging information... {org.apache.hadoop.hive.ql.exec.ExecDriver}
adoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:189)
    at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:325)
    at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:225)
    at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
    at java.util.concurrent.FutureTask.run(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
TID: [0] [BAM] [2012-11-04 22:22:09,557] ERROR {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} - Error while executing script : service_stats_848 {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask}
org.wso2.carbon.analytics.hive.exception.HiveExecutionException: Error while executing Hive script.Query returned non-zero code: 9, cause: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
    at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl.execute(HiveExecutorServiceImpl.java:110)
    at org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask.execute(HiveScriptExecutorTask.java:60)
    at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:56)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
    at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
    at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
    at java.util.concurrent.FutureTask.run(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
TID: [0] [BAM] [2012-11-04 22:24:00,003] INFO {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} - Running script executor task for script esb_stats_0. [Sun Nov 04 22:24:00 CST 2012] {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask}
TID: [0] [BAM] [2012-11-04 22:24:00,009] INFO {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} - Running script executor task for script service_stats_848. [Sun Nov 04 22:24:00 CST 2012] {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask}
TID: [0] [BAM] [2012-11-04 22:24:07,697] INFO {org.wso2.carbon.cassandra.server.CarbonCassandraAuthenticator} - The key is not present in the cache... {org.wso2.carbon.cassandra.server.CarbonCassandraAuthenticator}
TID: [0] [BAM] [2012-11-04 22:25:15,219] ERROR {org.apache.hadoop.mapred.MapTask} - IO error in map input file file:/var/BAM/wso2bam-2.0.1/repository/data/hive/warehouse-1234/esbmediationstatsperdaydatafetcher {org.apache.hadoop.mapred.MapTask}
TID: [0] [BAM] [2012-11-04 22:25:15,224] ERROR {org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBRecordReader} - Failed to close {org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBRecordReader}
org.h2.jdbc.JdbcSQLException: Database is already closed (to disable automatic closing at VM shutdown, add ";DB_CLOSE_ON_EXIT=FALSE" to the db URL) [90121-140]
    at org.h2.message.DbException.getJdbcSQLException(DbException.java:327)
    at org.h2.message.DbException.get(DbException.java:167)
    at org.h2.message.DbException.get(DbException.java:144)
    at org.h2.message.DbException.get(DbException.java:133)
    at org.h2.jdbc.JdbcConnection.checkClosed(JdbcConnection.java:1348)
    at org.h2.jdbc.JdbcConnection.checkClosedForWrite(JdbcConnection.java:1333)
    at org.h2.jdbc.JdbcConnection.commit(JdbcConnection.java:413)
    at org.wso2.carbon.hadoop.hive.jdbc.storage.db.DBRecordReader.close(DBRecordReader.java:175)
    at org.apache.hadoop.hive.ql.io.HiveRecordReader.doClose(HiveRecordReader.java:50)
    at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.close(HiveContextAwareRecordReader.java:96)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:254)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:439)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:371)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:210)
TID: [0] [BAM] [2012-11-04 22:25:15,606] ERROR {org.apache.hadoop.hive.ql.exec.ExecDriver} - Ended Job = job_local_0001 with errors {org.apache.hadoop.hive.ql.exec.ExecDriver}
TID: [0] [BAM] [2012-11-04 22:25:15,610] ERROR {org.apache.hadoop.hive.ql.exec.ExecDriver} - Error during job, obtaining debugging information... {org.apache.hadoop.hive.ql.exec.ExecDriver}
hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:189)
    at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:325)
    at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:225)
    at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
    at java.util.concurrent.FutureTask.run(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
TID: [0] [BAM] [2012-11-04 22:25:16,271] ERROR {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} - Error while executing script : esb_stats_0 {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask}
org.wso2.carbon.analytics.hive.exception.HiveExecutionException: Error while executing Hive script.Query returned non-zero code: 9, cause: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
    at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl.execute(HiveExecutorServiceImpl.java:110)
    at org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask.execute(HiveScriptExecutorTask.java:60)
    at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:56)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
    at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
    at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
    at java.util.concurrent.FutureTask.run(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.lang.Thread.run(Unknown Source)
TID: [0] [BAM] [2012-11-04 22:27:00,007] INFO {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} - Running script executor task for script esb_stats_0. [Sun Nov 04 22:27:00 CST 2012] {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask}
TID: [0] [BAM] [2012-11-04 22:27:00,009] INFO {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} - Running script executor task for script service_stats_848. [Sun Nov 04 22:27:00 CST 2012] {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask}
TID: [0] [BAM] [2012-11-04 22:27:07,936] INFO {org.wso2.carbon.cassandra.server.CarbonCassandraAuthenticator} - The key is not present in the cache... {org.wso2.carbon.cassandra.server.CarbonCassandraAuthenticator}
Regards,
Ing. Jorge Infante Osorio.
CDAE.
Fac. 5.
UCI.
In a perfect world, pizzas would be health food, laptops would charge from a wireless power source, and every JAR would be an OSGi bundle.
10th ANNIVERSARY OF THE FOUNDING OF THE UNIVERSITY OF INFORMATICS SCIENCES...
CONNECTED TO THE FUTURE, CONNECTED TO THE REVOLUTION
http://www.uci.cu
http://www.facebook.com/universidad.uci
http://www.flickr.com/photos/universidad_uci
_______________________________________________
Dev mailing list
[email protected]
http://wso2.org/cgi-bin/mailman/listinfo/dev