Hi Pubudu, thank you for bringing this up. This is resolved with the following JIRA [1].
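For anyone who hits this before picking up the fix, the workaround Pubudu describes below can tide you over: add the absolute path of the bundled commons-codec jar to the add-to-spark-classpath.conf file in the spark configuration folder. A minimal sketch, assuming the file takes one absolute jar path per line and that DAS is installed under /opt/wso2das (both are assumptions; adjust to your environment):

```
# add-to-spark-classpath.conf
# Illustrative entry only -- use the actual path of the jar in your pack.
/opt/wso2das/repository/components/plugins/commons-codec_1.4.0.wso2v1.jar
```

After editing the file, restart the server so the Spark executors pick up the new classpath.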
[1] https://wso2.org/jira/browse/DAS-161

On Fri, Sep 18, 2015 at 3:16 PM, Pubudu Gunatilaka <[email protected]> wrote:

> Hi Niranda,
>
> I am running DAS with HBase and getting the following error message when
> running a Spark query. I was able to fix this issue by adding the complete
> file path of commons-codec_1.4.0.wso2v1.jar to the
> add-to-spark-classpath.conf file in the spark folder. Please look into the
> issue.
>
> TID: [-1234] [] [2015-09-18 09:36:28,060] INFO {org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor} - Executed query: CREATE TEMPORARY TABLE memberstatus USING CarbonAnalytics OPTIONS (tableName "STREAMING_DATA")
> Time Elapsed: 0.548 seconds. {org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor}
> TID: [-1] [] [2015-09-18 09:36:31,979] WARN {org.apache.spark.scheduler.TaskSetManager} - Lost task 0.0 in stage 0.0 (TID 0, 10.244.83.30): java.lang.NoClassDefFoundError: org/apache/commons/codec/binary/Hex
> at org.apache.hadoop.hbase.util.MD5Hash.getMD5AsHex(MD5Hash.java:64)
> at org.apache.hadoop.hbase.HRegionInfo.createRegionName(HRegionInfo.java:485)
> at org.apache.hadoop.hbase.HRegionInfo.createRegionName(HRegionInfo.java:415)
> at org.apache.hadoop.hbase.HRegionInfo.<init>(HRegionInfo.java:340)
> at org.apache.hadoop.hbase.HRegionInfo.convert(HRegionInfo.java:1036)
> at org.apache.hadoop.hbase.HRegionInfo.parseFrom(HRegionInfo.java:1106)
> at org.apache.hadoop.hbase.HRegionInfo.parseFromOrNull(HRegionInfo.java:1073)
> at org.apache.hadoop.hbase.MetaTableAccessor.getHRegionInfo(MetaTableAccessor.java:795)
> at org.apache.hadoop.hbase.MetaTableAccessor.getRegionLocations(MetaTableAccessor.java:728)
> at org.apache.hadoop.hbase.MetaTableAccessor$1.visit(MetaTableAccessor.java:342)
> at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:606)
> at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:365)
> at
> org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:281)
> at org.wso2.carbon.analytics.datasource.hbase.HBaseAnalyticsRecordStore.tableExists(HBaseAnalyticsRecordStore.java:136)
> at org.wso2.carbon.analytics.datasource.hbase.HBaseAnalyticsRecordStore.get(HBaseAnalyticsRecordStore.java:352)
> at org.wso2.carbon.analytics.dataservice.core.AnalyticsDataServiceImpl.initIndexedTableStore(AnalyticsDataServiceImpl.java:218)
> at org.wso2.carbon.analytics.dataservice.core.AnalyticsDataServiceImpl.<init>(AnalyticsDataServiceImpl.java:153)
> at org.wso2.carbon.analytics.dataservice.core.AnalyticsServiceHolder.checkAndPopulateCustomAnalyticsDS(AnalyticsServiceHolder.java:75)
> at org.wso2.carbon.analytics.dataservice.core.AnalyticsServiceHolder.getAnalyticsDataService(AnalyticsServiceHolder.java:64)
> at org.wso2.carbon.analytics.spark.core.internal.ServiceHolder.getAnalyticsDataService(ServiceHolder.java:81)
> at org.wso2.carbon.analytics.spark.core.rdd.AnalyticsRDD.compute(AnalyticsRDD.java:82)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
> at org.apache.spark.scheduler.Task.run(Task.scala:70)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> Caused by:
> java.lang.ClassNotFoundException: org.apache.commons.codec.binary.Hex
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> ... 35 more
> {org.apache.spark.scheduler.TaskSetManager}
> TID: [-1] [] [2015-09-18 09:36:32,364] ERROR {org.apache.spark.scheduler.TaskSchedulerImpl} - Lost executor 0 on 10.244.83.30: remote Rpc client disassociated {org.apache.spark.scheduler.TaskSchedulerImpl}
> TID: [-1] [] [2015-09-18 09:36:32,356] WARN {akka.remote.ReliableDeliverySupervisor} - Association with remote system [akka.tcp://[email protected]:13500] has failed, address is now gated for [5000] ms. Reason is: [Disassociated]. {akka.remote.ReliableDeliverySupervisor}
> TID: [-1] [] [2015-09-18 09:36:32,377] WARN {org.apache.spark.scheduler.TaskSetManager} - Lost task 0.1 in stage 0.0 (TID 1, 10.244.83.30): ExecutorLostFailure (executor 0 lost) {org.apache.spark.scheduler.TaskSetManager}
> TID: [-1] [] [2015-09-18 09:36:32,361] WARN {akka.remote.ReliableDeliverySupervisor} - Association with remote system [akka.tcp://[email protected]:13500] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
> {akka.remote.ReliableDeliverySupervisor}
> TID: [-1] [] [2015-09-18 09:36:32,385] ERROR {org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend} - Asked to remove non-existent executor 0 {org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend}
> TID: [-1] [] [2015-09-18 09:36:39,569] ERROR {org.apache.spark.scheduler.TaskSchedulerImpl} - Lost executor 1 on 10.244.83.30: remote Rpc client disassociated {org.apache.spark.scheduler.TaskSchedulerImpl}
> TID: [-1] [] [2015-09-18 09:36:39,569] WARN {akka.remote.ReliableDeliverySupervisor} - Association with remote system [akka.tcp://[email protected]:13500] has failed, address is now gated for [5000] ms. Reason is: [Disassociated]. {akka.remote.ReliableDeliverySupervisor}
> TID: [-1] [] [2015-09-18 09:36:39,569] WARN {akka.remote.ReliableDeliverySupervisor} - Association with remote system [akka.tcp://[email protected]:13500] has failed, address is now gated for [5000] ms. Reason is: [Disassociated]. {akka.remote.ReliableDeliverySupervisor}
> TID: [-1] [] [2015-09-18 09:36:39,596] WARN {org.apache.spark.scheduler.TaskSetManager} - Lost task 0.3 in stage 0.0 (TID 3, 10.244.83.30): ExecutorLostFailure (executor 1 lost) {org.apache.spark.scheduler.TaskSetManager}
> TID: [-1] [] [2015-09-18 09:36:39,598] ERROR {org.apache.spark.scheduler.TaskSetManager} - Task 0 in stage 0.0 failed 4 times; aborting job {org.apache.spark.scheduler.TaskSetManager}
> TID: [-1234] [] [2015-09-18 09:36:39,614] INFO {org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor} - Executed query: SELECT * from memberstatus
> Time Elapsed: 11.55 seconds.
> {org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor}
> TID: [-1234] [] [2015-09-18 09:36:39,614] ERROR {org.apache.axis2.rpc.receivers.RPCMessageReceiver} - Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 10.244.83.30): ExecutorLostFailure (executor 1 lost)
> Driver stacktrace: {org.apache.axis2.rpc.receivers.RPCMessageReceiver}
> java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.axis2.rpc.receivers.RPCUtil.invokeServiceClass(RPCUtil.java:212)
> at org.apache.axis2.rpc.receivers.RPCMessageReceiver.invokeBusinessLogic(RPCMessageReceiver.java:117)
> at org.apache.axis2.receivers.AbstractInOutMessageReceiver.invokeBusinessLogic(AbstractInOutMessageReceiver.java:40)
> at org.apache.axis2.receivers.AbstractMessageReceiver.receive(AbstractMessageReceiver.java:110)
> at org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:180)
> at org.apache.axis2.transport.local.LocalTransportReceiver.processMessage(LocalTransportReceiver.java:169)
> at org.apache.axis2.transport.local.LocalTransportReceiver.processMessage(LocalTransportReceiver.java:82)
> at org.wso2.carbon.core.transports.local.CarbonLocalTransportSender.finalizeSendWithToAddress(CarbonLocalTransportSender.java:45)
> at org.apache.axis2.transport.local.LocalTransportSender.invoke(LocalTransportSender.java:77)
> at org.apache.axis2.engine.AxisEngine.send(AxisEngine.java:442)
> at org.apache.axis2.description.OutInAxisOperationClient.send(OutInAxisOperation.java:430)
> at org.apache.axis2.description.OutInAxisOperationClient.executeImpl(OutInAxisOperation.java:225)
> at
> org.apache.axis2.client.OperationClient.execute(OperationClient.java:149)
> at org.wso2.carbon.analytics.spark.admin.stub.AnalyticsProcessorAdminServiceStub.execute(AnalyticsProcessorAdminServiceStub.java:912)
> at org.wso2.carbon.analytics.spark.ui.client.AnalyticsExecutionClient.executeScriptContent(AnalyticsExecutionClient.java:67)
> at org.apache.jsp.spark_002dmanagement.executeScript_005fajaxprocessor_jsp._jspService(executeScript_005fajaxprocessor_jsp.java:110)
> at org.apache.jasper.runtime.HttpJspBase.service(HttpJspBase.java:70)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:432)
> at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:395)
> at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:339)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> at org.wso2.carbon.ui.JspServlet.service(JspServlet.java:155)
> at org.wso2.carbon.ui.TilesJspServlet.service(TilesJspServlet.java:80)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> at org.eclipse.equinox.http.helper.ContextPathServletAdaptor.service(ContextPathServletAdaptor.java:37)
> at org.eclipse.equinox.http.servlet.internal.ServletRegistration.service(ServletRegistration.java:61)
> at org.eclipse.equinox.http.servlet.internal.ProxyServlet.processAlias(ProxyServlet.java:128)
> at org.eclipse.equinox.http.servlet.internal.ProxyServlet.service(ProxyServlet.java:68)
> at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
> at org.wso2.carbon.tomcat.ext.servlet.DelegationServlet.service(DelegationServlet.java:68)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
> at
> org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.wso2.carbon.tomcat.ext.filter.CharacterSetFilter.doFilter(CharacterSetFilter.java:61)
> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
> at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
> at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
> at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
> at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:504)
> at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
> at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
> at org.wso2.carbon.tomcat.ext.valves.CompositeValve.continueInvocation(CompositeValve.java:99)
> at org.wso2.carbon.tomcat.ext.valves.CarbonTomcatValve$1.invoke(CarbonTomcatValve.java:47)
> at org.wso2.carbon.webapp.mgt.TenantLazyLoaderValve.invoke(TenantLazyLoaderValve.java:57)
> at org.wso2.carbon.event.receiver.core.internal.tenantmgt.TenantLazyLoaderValve.invoke(TenantLazyLoaderValve.java:48)
> at org.wso2.carbon.tomcat.ext.valves.TomcatValveContainer.invokeValves(TomcatValveContainer.java:47)
> at org.wso2.carbon.tomcat.ext.valves.CompositeValve.invoke(CompositeValve.java:62)
> at org.wso2.carbon.tomcat.ext.valves.CarbonStuckThreadDetectionValve.invoke(CarbonStuckThreadDetectionValve.java:159)
> at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
> at org.wso2.carbon.tomcat.ext.valves.CarbonContextCreatorValve.invoke(CarbonContextCreatorValve.java:57)
> at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
> at
> org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:421)
> at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1074)
> at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:611)
> at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1739)
> at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1698)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 10.244.83.30): ExecutorLostFailure (executor 1 lost)
> Driver stacktrace:
> at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1273)
> at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1264)
> at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1263)
> at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
> at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1263)
> at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
> at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
> at scala.Option.foreach(Option.scala:236)
> at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
> at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1457)
> at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1418)
> at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
> TID: [-1] [] [2015-09-18 09:36:39,600] ERROR {org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend} - Asked to remove non-existent executor 1 {org.apache.spark.scheduler.cluster.SparkDeploySchedulerBackend}
>
> Thank you!
> --
> *Pubudu Gunatilaka*
> Software Engineer
> WSO2, Inc.: http://wso2.com
> lean.enterprise.middleware
> mobile: +94 77 4078049

--
*Niranda Perera*
Software Engineer, WSO2 Inc.
Mobile: +94-71-554-8430
Twitter: @n1r44 <https://twitter.com/N1R44>
https://pythagoreanscript.wordpress.com/
_______________________________________________
Dev mailing list
[email protected]
http://wso2.org/cgi-bin/mailman/listinfo/dev
