Vinod Rohilla created CARBONDATA-922:
----------------------------------------
Summary: SparkContext stopped while executing the load query during performance testing
Key: CARBONDATA-922
URL: https://issues.apache.org/jira/browse/CARBONDATA-922
Project: CarbonData
Issue Type: Bug
Components: data-load
Environment: Spark 2.1
Reporter: Vinod Rohilla
Priority: Minor
Attachments: Error Logs
The SparkContext stopped while executing the load query during performance testing; subsequent LOAD DATA statements then fail with "Cannot call methods on a stopped SparkContext" (see the error logs below).
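For context, the sketch below shows roughly how the performance suite creates the CarbonSession and issues the failing load. It is reconstructed from the stack traces and the parsed command in the logs; the builder options and the store path are assumptions, while the metastore path, table name, and input file come from the log.

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.CarbonSession._  // adds getOrCreateCarbonSession to the builder

// Assumed wiring: the real suite goes through com.huawei.spark.SessionManager and
// StartDataLoadTest (see the stack traces below); master/deploy-mode come from
// spark-submit (YARN client mode, per YarnClientSchedulerBackend in the logs).
val carbon = SparkSession.builder()
  .appName("CarbonData performance suite")
  .getOrCreateCarbonSession(
    "hdfs://88.99.61.21:65110/tmp/perfsuite2/carbon.store",      // store path (assumed)
    "hdfs://88.99.61.21:65110/tmp/perfsuite2/carbon.metastore")  // metastore path (from the lock-file log)

// One LOAD DATA per CSV file; the statement below is the one that fails
// (FILEHEADER elided here; the full column list is in the parsed command in the logs).
carbon.sql(
  """LOAD DATA INPATH 'hdfs://hacluster/benchmarks/CarbonData/data/datafile_15.csv'
    |INTO TABLE oscon_new_1
    |OPTIONS('DELIMITER'=',', 'QUOTECHAR'='"', 'FILEHEADER'='...',
    |        'BAD_RECORDS_ACTION'='FORCE', 'BAD_RECORDS_LOGGER_ENABLE'='FALSE')""".stripMargin)
{code}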
Error Logs:
17/04/12 05:39:52 WARN TransportChannelHandler: Exception in connection from /88.99.61.24:43216
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:192)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:221)
    at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:899)
    at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:275)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
17/04/12 05:39:52 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /88.99.61.24:43216 is closed
17/04/12 05:39:52 ERROR YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(0,0,Map()) to AM was unsuccessful
java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:192)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:221)
    at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:899)
    at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:275)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
17/04/12 05:39:52 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
services=List(),
started=false)
17/04/12 05:39:52 ERROR Utils: Uncaught exception in thread Yarn application state monitor
org.apache.spark.SparkException: Exception thrown in awaitResult
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
    at org.apache.spark.scheduler.cluster.CoarseGrainedSchedulerBackend.requestTotalExecutors(CoarseGrainedSchedulerBackend.scala:512)
    at org.apache.spark.scheduler.cluster.YarnSchedulerBackend.stop(YarnSchedulerBackend.scala:93)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.stop(YarnClientSchedulerBackend.scala:151)
    at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:467)
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1588)
    at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1826)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1283)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1825)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend$MonitorThread.run(YarnClientSchedulerBackend.scala:108)
Caused by: java.io.IOException: Connection reset by peer
    at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:192)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:221)
    at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:899)
    at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:275)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:652)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:575)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:489)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:451)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
17/04/12 05:39:52 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/04/12 05:39:52 INFO HdfsFileLock: main Deleted the lock file hdfs://88.99.61.21:65110/tmp/perfsuite2/carbon.metastore/default/oscon_new_1/meta.lock
17/04/12 05:39:52 INFO LoadTable: main Table MetaData Unlocked Successfully after data load
%d [%thread] %-5level %logger - %msg%n java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.sql.CarbonSession$CarbonBuilder.getOrCreateCarbonSession(CarbonSession.scala:74)
com.huawei.spark.SessionManager.setCarbonSparkSession(SessionManager.java:34)
com.huawei.spark.SessionManager.<init>(SessionManager.java:23)
com.huawei.utils.Utilities.<init>(Utilities.java:10)
com.huawei.utils.Utilities.getInstance(Utilities.java:18)
com.huawei.performancesuite.StartDataLoadTest.setUp(StartDataLoadTest.java:43)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.internal.runners.MethodRoadie.runBefores(MethodRoadie.java:122)
org.junit.internal.runners.MethodRoadie.runBeforesThenTestThenAfters(MethodRoadie.java:86)
org.junit.internal.runners.MethodRoadie.runTest(MethodRoadie.java:77)
org.junit.internal.runners.MethodRoadie.run(MethodRoadie.java:42)
org.junit.internal.runners.JUnit4ClassRunner.invokeTestMethod(JUnit4ClassRunner.java:88)
org.junit.internal.runners.JUnit4ClassRunner.runMethods(JUnit4ClassRunner.java:51)
org.junit.runners.Parameterized$TestClassRunnerForParameters.run(Parameterized.java:98)
org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
org.junit.runners.Parameterized.access$000(Parameterized.java:55)
org.junit.runners.Parameterized$1.run(Parameterized.java:131)
The currently active SparkContext was created at:
org.apache.spark.sql.CarbonSession$CarbonBuilder.getOrCreateCarbonSession(CarbonSession.scala:74)
com.huawei.spark.SessionManager.setCarbonSparkSession(SessionManager.java:34)
com.huawei.spark.SessionManager.<init>(SessionManager.java:23)
com.huawei.utils.Utilities.<init>(Utilities.java:10)
com.huawei.utils.Utilities.getInstance(Utilities.java:18)
com.huawei.performancesuite.StartDataLoadTest.setUp(StartDataLoadTest.java:43)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.internal.runners.MethodRoadie.runBefores(MethodRoadie.java:122)
org.junit.internal.runners.MethodRoadie.runBeforesThenTestThenAfters(MethodRoadie.java:86)
org.junit.internal.runners.MethodRoadie.runTest(MethodRoadie.java:77)
org.junit.internal.runners.MethodRoadie.run(MethodRoadie.java:42)
org.junit.internal.runners.JUnit4ClassRunner.invokeTestMethod(JUnit4ClassRunner.java:88)
org.junit.internal.runners.JUnit4ClassRunner.runMethods(JUnit4ClassRunner.java:51)
org.junit.runners.Parameterized$TestClassRunnerForParameters.run(Parameterized.java:98)
org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
org.junit.runners.Parameterized.access$000(Parameterized.java:55)
org.junit.runners.Parameterized$1.run(Parameterized.java:131)
    at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:100) ~[spark-core_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1408) ~[spark-core_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.rdd.NewHadoopRDD.<init>(NewHadoopRDD.scala:78) ~[spark-core_2.11-2.1.0.jar:2.1.0]
    at org.apache.carbondata.spark.util.GlobalDictionaryUtil$.loadDataFrame(GlobalDictionaryUtil.scala:381) ~[carbondata_2.11-1.1.0-incubating-SNAPSHOT-shade-hadoop2.2.0.jar:1.1.0-incubating-SNAPSHOT]
    at org.apache.carbondata.spark.util.GlobalDictionaryUtil$$anonfun$8.apply(GlobalDictionaryUtil.scala:718) ~[carbondata_2.11-1.1.0-incubating-SNAPSHOT-shade-hadoop2.2.0.jar:1.1.0-incubating-SNAPSHOT]
    at org.apache.carbondata.spark.util.GlobalDictionaryUtil$$anonfun$8.apply(GlobalDictionaryUtil.scala:718) ~[carbondata_2.11-1.1.0-incubating-SNAPSHOT-shade-hadoop2.2.0.jar:1.1.0-incubating-SNAPSHOT]
    at scala.Option.getOrElse(Option.scala:121) ~[scala-library-2.11.8.jar:?]
    at org.apache.carbondata.spark.util.GlobalDictionaryUtil$.generateGlobalDictionary(GlobalDictionaryUtil.scala:718) ~[carbondata_2.11-1.1.0-incubating-SNAPSHOT-shade-hadoop2.2.0.jar:1.1.0-incubating-SNAPSHOT]
    at org.apache.spark.sql.execution.command.LoadTable.run(carbonTableSchema.scala:558) ~[carbondata_2.11-1.1.0-incubating-SNAPSHOT-shade-hadoop2.2.0.jar:2.1.0]
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) ~[spark-core_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:87) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:87) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:592) ~[spark-sql_2.11-2.1.0.jar:2.1.0]
    at com.huawei.spark.SessionManager.sql(SessionManager.java:42) ~[automation.jar:?]
    at com.huawei.performancesuite.StartDataLoadTest.testDataLoad(StartDataLoadTest.java:64) [automation.jar:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_121]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_121]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_121]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_121]
    at org.junit.internal.runners.TestMethod.invoke(TestMethod.java:59) [automation.jar:?]
    at org.junit.internal.runners.MethodRoadie.runTestMethod(MethodRoadie.java:98) [automation.jar:?]
    at org.junit.internal.runners.MethodRoadie$2.run(MethodRoadie.java:79) [automation.jar:?]
    at org.junit.internal.runners.MethodRoadie.runBeforesThenTestThenAfters(MethodRoadie.java:87) [automation.jar:?]
    at org.junit.internal.runners.MethodRoadie.runTest(MethodRoadie.java:77) [automation.jar:?]
    at org.junit.internal.runners.MethodRoadie.run(MethodRoadie.java:42) [automation.jar:?]
    at org.junit.internal.runners.JUnit4ClassRunner.invokeTestMethod(JUnit4ClassRunner.java:88) [automation.jar:?]
    at org.junit.internal.runners.JUnit4ClassRunner.runMethods(JUnit4ClassRunner.java:51) [automation.jar:?]
    at org.junit.runners.Parameterized$TestClassRunnerForParameters.run(Parameterized.java:98) [automation.jar:?]
    at org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33) [automation.jar:?]
    at org.junit.runners.Parameterized.access$000(Parameterized.java:55) [automation.jar:?]
    at org.junit.runners.Parameterized$1.run(Parameterized.java:131) [automation.jar:?]
    at org.junit.internal.runners.ClassRoadie.runUnprotected(ClassRoadie.java:27) [automation.jar:?]
    at org.junit.internal.runners.ClassRoadie.runProtected(ClassRoadie.java:37) [automation.jar:?]
    at org.junit.runners.Parameterized.run(Parameterized.java:129) [automation.jar:?]
    at org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33) [automation.jar:?]
    at org.junit.internal.runners.CompositeRunner.run(CompositeRunner.java:28) [automation.jar:?]
    at org.junit.runner.JUnitCore.run(JUnitCore.java:130) [automation.jar:?]
    at org.junit.runner.JUnitCore.run(JUnitCore.java:109) [automation.jar:?]
    at org.junit.runner.JUnitCore.run(JUnitCore.java:100) [automation.jar:?]
    at org.junit.runner.JUnitCore.runClasses(JUnitCore.java:60) [automation.jar:?]
    at com.huawei.performancesuite.StartDataLoadSuite.main(StartDataLoadSuite.java:17) [automation.jar:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_121]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_121]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_121]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_121]
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738) [spark-core_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187) [spark-core_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212) [spark-core_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126) [spark-core_2.11-2.1.0.jar:2.1.0]
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) [spark-core_2.11-2.1.0.jar:2.1.0]
17/04/12 05:39:52 INFO CarbonSparkSqlParser: Parsing command: load data inpath
'hdfs://hacluster/benchmarks/CarbonData/data/datafile_15.csv' into table
oscon_new_1 options('DELIMITER'=',',
'QUOTECHAR'='"','FILEHEADER'='ACTIVE_AREA_ID, ACTIVE_CHECK_DY,
ACTIVE_CHECK_HOUR, ACTIVE_CHECK_MM, ACTIVE_CHECK_TIME, ACTIVE_CHECK_YR,
ACTIVE_CITY, ACTIVE_COUNTRY, ACTIVE_DISTRICT, ACTIVE_EMUI_VERSION,
ACTIVE_FIRMWARE_VER, ACTIVE_NETWORK, ACTIVE_OS_VERSION, ACTIVE_PROVINCE, BOM,
CHECK_DATE, CHECK_DY, CHECK_HOUR, CHECK_MM, CHECK_YR, CUST_ADDRESS_ID,
CUST_AGE, CUST_BIRTH_COUNTRY, CUST_BIRTH_DY, CUST_BIRTH_MM, CUST_BIRTH_YR,
CUST_BUY_POTENTIAL, CUST_CITY, CUST_STATE, CUST_COUNTRY, CUST_COUNTY,
CUST_EMAIL_ADDR, CUST_LAST_RVW_DATE, CUST_FIRST_NAME, CUST_ID, CUST_JOB_TITLE,
CUST_LAST_NAME, CUST_LOGIN, CUST_NICK_NAME, CUST_PRFRD_FLG, CUST_SEX,
CUST_STREET_NAME, CUST_STREET_NO, CUST_SUITE_NO, CUST_ZIP, DELIVERY_CITY,
DELIVERY_STATE, DELIVERY_COUNTRY, DELIVERY_DISTRICT, DELIVERY_PROVINCE,
DEVICE_NAME, INSIDE_NAME, ITM_BRAND, ITM_BRAND_ID, ITM_CATEGORY,
ITM_CATEGORY_ID, ITM_CLASS, ITM_CLASS_ID, ITM_COLOR, ITM_CONTAINER,
ITM_FORMULATION, ITM_MANAGER_ID, ITM_MANUFACT, ITM_MANUFACT_ID, ITM_ID,
ITM_NAME, ITM_REC_END_DATE, ITM_REC_START_DATE, LATEST_AREAID, LATEST_CHECK_DY,
LATEST_CHECK_HOUR, LATEST_CHECK_MM, LATEST_CHECK_TIME, LATEST_CHECK_YR,
LATEST_CITY, LATEST_COUNTRY, LATEST_DISTRICT, LATEST_EMUI_VERSION,
LATEST_FIRMWARE_VER, LATEST_NETWORK, LATEST_OS_VERSION, LATEST_PROVINCE,
OL_ORDER_DATE, OL_ORDER_NO, OL_RET_ORDER_NO, OL_RET_DATE, OL_SITE,
OL_SITE_DESC, PACKING_DATE, PACKING_DY, PACKING_HOUR, PACKING_LIST_NO,
PACKING_MM, PACKING_YR, PRMTION_ID, PRMTION_NAME, PRM_CHANNEL_CAT,
PRM_CHANNEL_DEMO, PRM_CHANNEL_DETAILS, PRM_CHANNEL_DMAIL, PRM_CHANNEL_EMAIL,
PRM_CHANNEL_EVENT, PRM_CHANNEL_PRESS, PRM_CHANNEL_RADIO, PRM_CHANNEL_TV,
PRM_DSCNT_ACTIVE, PRM_END_DATE, PRM_PURPOSE, PRM_START_DATE, PRODUCT_ID,
PROD_BAR_CODE, PROD_BRAND_NAME, PRODUCT_NAME, PRODUCT_MODEL, PROD_MODEL_ID,
PROD_COLOR, PROD_SHELL_COLOR, PROD_CPU_CLOCK, PROD_IMAGE, PROD_LIVE, PROD_LOC,
PROD_LONG_DESC, PROD_RAM, PROD_ROM, PROD_SERIES, PROD_SHORT_DESC, PROD_THUMB,
PROD_UNQ_DEVICE_ADDR, PROD_UNQ_MDL_ID, PROD_UPDATE_DATE, PROD_UQ_UUID,
SHP_CARRIER, SHP_CODE, SHP_CONTRACT, SHP_MODE_ID, SHP_MODE, STR_ORDER_DATE,
STR_ORDER_NO, TRACKING_NO, WH_CITY, WH_COUNTRY, WH_COUNTY, WH_ID, WH_NAME,
WH_STATE, WH_STREET_NAME, WH_STREET_NO, WH_STREET_TYPE, WH_SUITE_NO, WH_ZIP,
CUST_DEP_COUNT, CUST_VEHICLE_COUNT, CUST_ADDRESS_CNT, CUST_CRNT_CDEMO_CNT,
CUST_CRNT_HDEMO_CNT, CUST_CRNT_ADDR_DM, CUST_FIRST_SHIPTO_CNT,
CUST_FIRST_SALES_CNT, CUST_GMT_OFFSET, CUST_DEMO_CNT, CUST_INCOME,
PROD_UNLIMITED, PROD_OFF_PRICE, PROD_UNITS, TOTAL_PRD_COST, TOTAL_PRD_DISC,
PROD_WEIGHT, REG_UNIT_PRICE, EXTENDED_AMT, UNIT_PRICE_DSCNT_PCT, DSCNT_AMT,
PROD_STD_CST, TOTAL_TX_AMT, FREIGHT_CHRG, WAITING_PERIOD, DELIVERY_PERIOD,
ITM_CRNT_PRICE, ITM_UNITS, ITM_WSLE_CST, ITM_SIZE, PRM_CST,
PRM_RESPONSE_TARGET, PRM_ITM_DM, SHP_MODE_CNT, WH_GMT_OFFSET, WH_SQ_FT,
STR_ORD_QTY, STR_WSLE_CST, STR_LIST_PRICE, STR_SALES_PRICE, STR_EXT_DSCNT_AMT,
STR_EXT_SALES_PRICE, STR_EXT_WSLE_CST, STR_EXT_LIST_PRICE, STR_EXT_TX,
STR_COUPON_AMT, STR_NET_PAID, STR_NET_PAID_INC_TX, STR_NET_PRFT,
STR_SOLD_YR_CNT, STR_SOLD_MM_CNT, STR_SOLD_ITM_CNT, STR_TOTAL_CUST_CNT,
STR_AREA_CNT, STR_DEMO_CNT, STR_OFFER_CNT, STR_PRM_CNT, STR_TICKET_CNT,
STR_NET_PRFT_DM_A, STR_NET_PRFT_DM_B, STR_NET_PRFT_DM_C, STR_NET_PRFT_DM_D,
STR_NET_PRFT_DM_E, STR_RET_STR_ID, STR_RET_REASON_CNT, STR_RET_TICKET_NO,
STR_RTRN_QTY, STR_RTRN_AMT, STR_RTRN_TX, STR_RTRN_AMT_INC_TX, STR_RET_FEE,
STR_RTRN_SHIP_CST, STR_RFNDD_CSH, STR_REVERSED_CHRG, STR_STR_CREDIT,
STR_RET_NET_LOSS, STR_RTRNED_YR_CNT, STR_RTRN_MM_CNT, STR_RET_ITM_CNT,
STR_RET_CUST_CNT, STR_RET_AREA_CNT, STR_RET_OFFER_CNT, STR_RET_PRM_CNT,
STR_RET_NET_LOSS_DM_A, STR_RET_NET_LOSS_DM_B, STR_RET_NET_LOSS_DM_C,
STR_RET_NET_LOSS_DM_D, OL_ORD_QTY, OL_WSLE_CST, OL_LIST_PRICE, OL_SALES_PRICE,
OL_EXT_DSCNT_AMT, OL_EXT_SALES_PRICE, OL_EXT_WSLE_CST, OL_EXT_LIST_PRICE,
OL_EXT_TX, OL_COUPON_AMT, OL_EXT_SHIP_CST, OL_NET_PAID, OL_NET_PAID_INC_TX,
OL_NET_PAID_INC_SHIP, OL_NET_PAID_INC_SHIP_TX, OL_NET_PRFT, OL_SOLD_YR_CNT,
OL_SOLD_MM_CNT, OL_SHIP_DATE_CNT, OL_ITM_CNT, OL_BILL_CUST_CNT,
OL_BILL_AREA_CNT, OL_BILL_DEMO_CNT, OL_BILL_OFFER_CNT, OL_SHIP_CUST_CNT,
OL_SHIP_AREA_CNT, OL_SHIP_DEMO_CNT, OL_SHIP_OFFER_CNT, OL_WEB_PAGE_CNT,
OL_WEB_SITE_CNT, OL_SHIP_MODE_CNT, OL_WH_CNT, OL_PRM_CNT, OL_NET_PRFT_DM_A,
OL_NET_PRFT_DM_B, OL_NET_PRFT_DM_C, OL_NET_PRFT_DM_D, OL_RET_RTRN_QTY,
OL_RTRN_AMT, OL_RTRN_TX, OL_RTRN_AMT_INC_TX, OL_RET_FEE, OL_RTRN_SHIP_CST,
OL_RFNDD_CSH, OL_REVERSED_CHRG, OL_ACCOUNT_CREDIT, OL_RTRNED_YR_CNT,
OL_RTRNED_MM_CNT, OL_RTRITM_CNT, OL_RFNDD_CUST_CNT, OL_RFNDD_AREA_CNT,
OL_RFNDD_DEMO_CNT, OL_RFNDD_OFFER_CNT, OL_RTRNING_CUST_CNT,
OL_RTRNING_AREA_CNT, OL_RTRNING_DEMO_CNT, OL_RTRNING_OFFER_CNT,
OL_RTRWEB_PAGE_CNT, OL_REASON_CNT, OL_NET_LOSS, OL_NET_LOSS_DM_A,
OL_NET_LOSS_DM_B,
OL_NET_LOSS_DM_C','BAD_RECORDS_ACTION'='FORCE','BAD_RECORDS_LOGGER_ENABLE'='FALSE');
17/04/12 05:39:52 INFO MemoryStore: MemoryStore cleared
17/04/12 05:39:52 INFO BlockManager: BlockManager stopped
17/04/12 05:39:52 INFO BlockManagerMaster: BlockManagerMaster stopped
17/04/12 05:39:52 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/04/12 05:39:52 INFO CarbonLateDecodeRule: main Skip CarbonOptimizer
17/04/12 05:39:52 INFO SparkContext: Successfully stopped SparkContext
17/04/12 05:39:52 INFO HdfsFileLock: main HDFS lock path:hdfs://88.99.61.21:65110/tmp/perfsuite2/carbon.metastore/default/oscon_new_1/meta.lock
17/04/12 05:39:52 INFO LoadTable: main Successfully able to get the table metadata file lock
17/04/12 05:39:52 INFO LoadTable: main Initiating Direct Load for the Table : (default.oscon_new_1)
17/04/12 05:39:52 INFO GlobalDictionaryUtil$: main Generate global dictionary from source data files!
17/04/12 05:39:52 ERROR GlobalDictionaryUtil$: main generate global dictionary failed
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.sql.CarbonSession$CarbonBuilder.getOrCreateCarbonSession(CarbonSession.scala:74)
com.huawei.spark.SessionManager.setCarbonSparkSession(SessionManager.java:34)
com.huawei.spark.SessionManager.<init>(SessionManager.java:23)
com.huawei.utils.Utilities.<init>(Utilities.java:10)
com.huawei.utils.Utilities.getInstance(Utilities.java:18)
com.huawei.performancesuite.StartDataLoadTest.setUp(StartDataLoadTest.java:43)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.junit.internal.runners.MethodRoadie.runBefores(MethodRoadie.java:122)
org.junit.internal.runners.MethodRoadie.runBeforesThenTestThenAfters(MethodRoadie.java:86)
org.junit.internal.runners.MethodRoadie.runTest(MethodRoadie.java:77)
org.junit.internal.runners.MethodRoadie.run(MethodRoadie.java:42)
org.junit.internal.runners.JUnit4ClassRunner.invokeTestMethod(JUnit4ClassRunner.java:88)
org.junit.internal.runners.JUnit4ClassRunner.runMethods(JUnit4ClassRunner.java:51)
org.junit.runners.Parameterized$TestClassRunnerForParameters.run(Parameterized.java:98)
org.junit.internal.runners.CompositeRunner.runChildren(CompositeRunner.java:33)
org.junit.runners.Parameterized.access$000(Parameterized.java:55)
org.junit.runners.Parameterized$1.run(Parameterized.java:131)
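The sequence in the logs is: the connection to the YARN AM is reset, the Yarn application state monitor stops the SparkContext, and every subsequent LOAD then fails inside GlobalDictionaryUtil.generateGlobalDictionary with "Cannot call methods on a stopped SparkContext". Below is a minimal defensive sketch for the test harness; it is an assumption on my part (not part of the suite or of CarbonData), it only keeps the remaining loads running and does not address why the AM connection was reset.

{code:scala}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.CarbonSession._

// Hypothetical retry wrapper for the perf suite: if the active SparkContext was stopped
// (as after the AM connection reset above), rebuild the CarbonSession once and retry.
object LoadRunner {
  private def newSession(): SparkSession =
    SparkSession.builder()
      .appName("CarbonData performance suite")
      .getOrCreateCarbonSession(
        "hdfs://88.99.61.21:65110/tmp/perfsuite2/carbon.store",      // assumed store path
        "hdfs://88.99.61.21:65110/tmp/perfsuite2/carbon.metastore")  // from the lock-file log

  private var session: SparkSession = newSession()

  def runLoad(loadSql: String): Unit =
    try session.sql(loadSql)
    catch {
      case e: IllegalStateException if e.getMessage.contains("stopped SparkContext") =>
        session = newSession()  // the old session is unusable; get a fresh context
        session.sql(loadSql)
    }
}
{code}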