Build failed in Jenkins: carbondata-master-spark-2.1 #3298

2019-01-06 Thread Apache Jenkins Server
See 


Changes:

[manishgupta88] [CARBONDATA-3223] Fixed Wrong Datasize and Indexsize 
calculation for old

--
[...truncated 9.61 MB...]
at org.scalatest.tools.Runner$.main(Runner.scala:860)
at org.scalatest.tools.Runner.main(Runner.scala)
null
- CarbonReaderExample
2019-01-07 07:51:34 AUDIT audit:72 - {"time":"January 6, 2019 11:51:34 PM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3483729492157696","opStatus":"START"}
2019-01-07 07:51:34 AUDIT audit:93 - {"time":"January 6, 2019 11:51:34 PM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3483729492157696","opStatus":"SUCCESS","opTime":"74 
ms","table":"default.hive_carbon_example","extraInfo":{"bad_record_path":"","local_dictionary_enable":"true","external":"false","sort_columns":"","comment":""}}
2019-01-07 07:51:34 AUDIT audit:72 - {"time":"January 6, 2019 11:51:34 PM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3483729583634793","opStatus":"START"}
2019-01-07 07:51:34 AUDIT audit:93 - {"time":"January 6, 2019 11:51:34 PM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3483729583634793","opStatus":"SUCCESS","opTime":"135 
ms","table":"default.hive_carbon_example","extraInfo":{"SegmentId":"0","DataSize":"921.0B","IndexSize":"498.0B"}}
+---+---------+--------+
| id|     name|  salary|
+---+---------+--------+
|  1|  'liang'|    20.0|
|  2|'anubhav'|     2.0|
+---+---------+--------+

OK
OK
OK
OK
+----++-----------++--------+
| ID || NAME      || SALARY |
+----++-----------++--------+
| 1  || 'liang'   || 20.0   |
+----++-----------++--------+
| 2  || 'anubhav' || 2.0    |
+----++-----------++--------+
**Total Number Of Rows Fetched ** 2
OK
+------------+
| NAME       |
+------------+
| 'liang'    |
+------------+
| 'anubhav'  |
+------------+
 ** Total Rows Fetched When Querying The Individual Columns ** 2
OK
+--------++----++-----------+
| Salary || ID || NAME      |
+--------++----++-----------+
| 20.0   || 1  || 'liang'   |
+--------++----++-----------+
| 2.0    || 2  || 'anubhav' |
+--------++----++-----------+
 ** Total Rows Fetched When Querying The Out Of Order Columns ** 2
- HiveExample
Exception encountered when invoking run on a nested suite - Cannot call 
methods on a stopped SparkContext.
This stopped SparkContext was created at:

org.apache.spark.sql.test.util.QueryTest.<init>(QueryTest.scala:115)
org.apache.carbondata.examplesCI.RunExamples.<init>(RunExamples.scala:35)
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(Constructor.java:423)
java.lang.Class.newInstance(Class.java:442)
org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:69)
org.scalatest.tools.DiscoverySuite$$anonfun$1.apply(DiscoverySuite.scala:38)
org.scalatest.tools.DiscoverySuite$$anonfun$1.apply(DiscoverySuite.scala:37)
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
scala.collection.Iterator$class.foreach(Iterator.scala:893)
scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
scala.collection.AbstractIterable.foreach(Iterable.scala:54)
scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
scala.collection.AbstractTraversable.map(Traversable.scala:104)
org.scalatest.tools.DiscoverySuite.<init>(DiscoverySuite.scala:37)
org.scalatest.tools.Runner$.genDiscoSuites$1(Runner.scala:2390)

The currently active SparkContext was created at:

(No active SparkContext.)
  *** ABORTED ***
  java.lang.IllegalStateException: Cannot call methods on a stopped 
SparkContext.
This stopped SparkContext was created at:

org.apache.spark.sql.test.util.QueryTest.<init>(QueryTest.scala:115)
org.apache.carbondata.examplesCI.RunExamples.<init>(RunExamples.scala:35)
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(Constructor.java:423)
java.lang.Class.newInstance(Class.java:442)

Build failed in Jenkins: carbondata-master-spark-2.1 » Apache CarbonData :: Examples #3298

2019-01-06 Thread Apache Jenkins Server
See 


--
[...truncated 267.67 KB...]
++

time for query on table with lucene datamap table:0.275
time for query on table without lucene datamap table:0.159
+-----------------+-------------+
|               id|         name|
+-----------------+-------------+
|which test1 good4|who and name7|
|which test1 good4|who and name6|
|which test1 good5|who and name5|
|which test1 good4|who and name7|
|which test1 good4|who and name6|
|which test1 good5|who and name5|
+-----------------+-------------+

+-----------------+-------------+
|               id|         name|
+-----------------+-------------+
|which test1 good4|who and name7|
|which test1 good4|who and name6|
|which test1 good5|who and name5|
|which test1 good4|who and name7|
|which test1 good4|who and name6|
|which test1 good5|who and name5|
+-----------------+-------------+

2019-01-07 07:51:33 AUDIT audit:72 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"DROP 
TABLE","opId":"3483727789029193","opStatus":"START"}
2019-01-07 07:51:33 AUDIT audit:93 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"DROP 
TABLE","opId":"3483727789029193","opStatus":"SUCCESS","opTime":"110 
ms","table":"default.persontable","extraInfo":{}}
- LuceneDataMapExample
2019-01-07 07:51:33 AUDIT audit:72 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3483727912963201","opStatus":"START"}
2019-01-07 07:51:33 AUDIT audit:93 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3483727912963201","opStatus":"SUCCESS","opTime":"71 
ms","table":"default.origin_table","extraInfo":{"bad_record_path":"","local_dictionary_enable":"true","external":"false","sort_columns":"","comment":""}}
2019-01-07 07:51:33 AUDIT audit:72 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3483727989539608","opStatus":"START"}
2019-01-07 07:51:33 ERROR DataLoadExecutor:55 - Data Load is partially success 
for table origin_table
2019-01-07 07:51:33 AUDIT audit:93 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3483727989539608","opStatus":"SUCCESS","opTime":"144 
ms","table":"default.origin_table","extraInfo":{"SegmentId":"0","DataSize":"2.85KB","IndexSize":"1.38KB"}}
2019-01-07 07:51:33 AUDIT audit:72 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3483728139302096","opStatus":"START"}
2019-01-07 07:51:33 ERROR DataLoadExecutor:55 - Data Load is partially success 
for table origin_table
2019-01-07 07:51:33 AUDIT audit:93 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3483728139302096","opStatus":"SUCCESS","opTime":"164 
ms","table":"default.origin_table","extraInfo":{"SegmentId":"1","DataSize":"2.85KB","IndexSize":"1.38KB"}}
2019-01-07 07:51:33 AUDIT audit:72 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3483728310575987","opStatus":"START"}
2019-01-07 07:51:33 ERROR DataLoadExecutor:55 - Data Load is partially success 
for table origin_table
2019-01-07 07:51:33 AUDIT audit:93 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3483728310575987","opStatus":"SUCCESS","opTime":"173 
ms","table":"default.origin_table","extraInfo":{"SegmentId":"2","DataSize":"2.85KB","IndexSize":"1.38KB"}}
2019-01-07 07:51:33 AUDIT audit:72 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3483728489888238","opStatus":"START"}
2019-01-07 07:51:33 ERROR DataLoadExecutor:55 - Data Load is partially success 
for table origin_table
2019-01-07 07:51:33 AUDIT audit:93 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3483728489888238","opStatus":"SUCCESS","opTime":"171 
ms","table":"default.origin_table","extraInfo":{"SegmentId":"3","DataSize":"2.85KB","IndexSize":"1.38KB"}}
+--------+
|count(1)|
+--------+
|      40|
+--------+

2019-01-07 07:51:33 AUDIT audit:72 - {"time":"January 6, 2019 11:51:33 PM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3483728731515952","opStatus":"START"}
2019-01-07 07:51:34 AUDIT audit:93 - {"time":"January 6, 2019 11:51:34 PM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3483728731515952","opStatus":"SUCCESS","opTime":"62 
ms","table":"default.external_table","extraInfo":{"bad_record_path":"","_filelevelformat":"false","local_dictionary_enable":"true","external":"true","_external":"true","sort_columns":"","comment":""}}
+--------+
|count(1)|
+--------+
|      40|
+--------+

2019-01-07 07:51:34 AUDIT audit:72 - {"time":"January 6, 2019 11:51:34 PM 

Jenkins build is still unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark2 #3298

2019-01-06 Thread Apache Jenkins Server
See 




carbondata git commit: [CARBONDATA-3223] Fixed Wrong Datasize and Indexsize calculation for old store using Show Segments

2019-01-06 Thread manishgupta88
Repository: carbondata
Updated Branches:
  refs/heads/master 923dab1b5 -> 72da33495


[CARBONDATA-3223] Fixed Wrong Datasize and Indexsize calculation for old store 
using Show Segments

Problem: A table created and loaded on an older version (1.1) showed a data-size
and index-size of 0B when refreshed on a newer version. This happened because
when the data-size came back as "null" we did not compute it and instead
assigned it the value 0 directly.

Solution: Show the old datasize and indexsize as NA.

Also refactored the setQuerySegment code for better readability.

This closes #3047
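The null-handling change can be sketched as follows. This is a minimal, hypothetical illustration (the class and method names `SegmentSizeSketch`, `parseSize`, and `render` are invented, not CarbonData APIs); it only mirrors the idea in the patch of replacing the 0 fallback with a -1 sentinel that can be rendered as NA:

```java
// Sketch of the fix: a null stored size (as written by an old-version table
// status file) becomes a -1 sentinel instead of 0, so a display layer such as
// SHOW SEGMENTS can report "NA" rather than a misleading "0B".
public class SegmentSizeSketch {
    static long parseSize(String raw) {
        // Old stores may persist no size at all; -1 marks "unknown".
        return raw == null ? -1L : Long.parseLong(raw);
    }

    static String render(long size) {
        // The sentinel is rendered as NA instead of a fake zero size.
        return size == -1L ? "NA" : size + "B";
    }

    public static void main(String[] args) {
        System.out.println(render(parseSize(null)));   // old store -> NA
        System.out.println(render(parseSize("921")));  // new store -> 921B
    }
}
```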


Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/72da3349
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/72da3349
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/72da3349

Branch: refs/heads/master
Commit: 72da33495362fdbf4cd0e24331ca77a1fab470f6
Parents: 923dab1
Author: manishnalla1994 
Authored: Wed Jan 2 18:00:36 2019 +0530
Committer: manishgupta88 
Committed: Mon Jan 7 11:33:06 2019 +0530

--
 .../hadoop/api/CarbonInputFormat.java   | 25 +++-
 .../org/apache/carbondata/api/CarbonStore.scala |  4 ++--
 .../org/apache/spark/sql/CarbonCountStar.scala  |  2 +-
 3 files changed, 22 insertions(+), 9 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/carbondata/blob/72da3349/hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonInputFormat.java
--
diff --git a/hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonInputFormat.java b/hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonInputFormat.java
index 24691f2..26144e2 100644
--- a/hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonInputFormat.java
+++ b/hadoop/src/main/java/org/apache/carbondata/hadoop/api/CarbonInputFormat.java
@@ -277,12 +277,7 @@
   public static void setQuerySegment(Configuration conf, AbsoluteTableIdentifier identifier) {
     String dbName = identifier.getCarbonTableIdentifier().getDatabaseName().toLowerCase();
     String tbName = identifier.getCarbonTableIdentifier().getTableName().toLowerCase();
-    String segmentNumbersFromProperty = CarbonProperties.getInstance()
-        .getProperty(CarbonCommonConstants.CARBON_INPUT_SEGMENTS + dbName + "." + tbName, "*");
-    if (!segmentNumbersFromProperty.trim().equals("*")) {
-      CarbonInputFormat.setSegmentsToAccess(conf,
-          Segment.toSegmentList(segmentNumbersFromProperty.split(","), null));
-    }
+    getQuerySegmentToAccess(conf, dbName, tbName);
   }
 
   /**
@@ -827,4 +822,22 @@
     }
     return projectColumns.toArray(new String[projectColumns.size()]);
   }
+
+  private static void getQuerySegmentToAccess(Configuration conf, String dbName, String tableName) {
+    String segmentNumbersFromProperty = CarbonProperties.getInstance()
+        .getProperty(CarbonCommonConstants.CARBON_INPUT_SEGMENTS + dbName + "." + tableName, "*");
+    if (!segmentNumbersFromProperty.trim().equals("*")) {
+      CarbonInputFormat.setSegmentsToAccess(conf,
+          Segment.toSegmentList(segmentNumbersFromProperty.split(","), null));
+    }
+  }
+
+  /**
+   * Set `CARBON_INPUT_SEGMENTS` from property to configuration
+   */
+  public static void setQuerySegment(Configuration conf, CarbonTable carbonTable) {
+    String tableName = carbonTable.getTableName();
+    getQuerySegmentToAccess(conf, carbonTable.getDatabaseName(), tableName);
+  }
+
 }
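The property lookup that the diff above centralizes can be illustrated with a small, hypothetical sketch. The class and method names here (`QuerySegmentSketch`, `segmentsToAccess`) are invented for illustration; only the "*" wildcard semantics mirror what `getQuerySegmentToAccess` does with the `CARBON_INPUT_SEGMENTS` property:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the segment-restriction logic: "*" means query all segments
// (no restriction is set on the configuration), while a comma-separated
// list of segment ids limits the scan to just those segments.
public class QuerySegmentSketch {
    static List<String> segmentsToAccess(String property) {
        if (property.trim().equals("*")) {
            return null; // no restriction: read every segment
        }
        return Arrays.asList(property.split(","));
    }

    public static void main(String[] args) {
        System.out.println(segmentsToAccess("*"));      // prints: null (all segments)
        System.out.println(segmentsToAccess("0,1,3"));  // prints: [0, 1, 3]
    }
}
```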

http://git-wip-us.apache.org/repos/asf/carbondata/blob/72da3349/integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala
--
diff --git a/integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala b/integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala
index da9d4c2..11db430 100644
--- a/integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala
+++ b/integration/spark-common/src/main/scala/org/apache/carbondata/api/CarbonStore.scala
@@ -107,8 +107,8 @@ object CarbonStore {
       (indices.asScala.map(_.getFile_size).sum, FileFactory.getCarbonFile(indexPath).getSize)
     } else {
       // for batch segment, we can get the data size from table status file directly
-      (if (load.getDataSize == null) 0L else load.getDataSize.toLong,
-        if (load.getIndexSize == null) 0L else load.getIndexSize.toLong)
+      (if (load.getDataSize == null) -1L else load.getDataSize.toLong,
+        if (load.getIndexSize == null) -1L else load.getIndexSize.toLong)
     }
 
   if (showHistory) {


Build failed in Jenkins: carbondata-master-spark-2.1 » Apache CarbonData :: Examples #3297

2019-01-06 Thread Apache Jenkins Server
See 


--
[...truncated 267.08 KB...]
|which test1 good6|who and name7|
|which test1 good2|who and name4|
|which test1 good0|who and name0|
|which test1 good2|who and name0|
|which test1 good4|who and name7|
|which test1 good2|who and name7|
|which test1 good8|who and name6|
|which test1 good0|who and name6|
|which test1 good3|who and name6|
+-----------------+-------------+

+-----------------+-------------+
|               id|         name|
+-----------------+-------------+
|which test1 good6|who and name7|
|which test1 good2|who and name4|
|which test1 good0|who and name0|
|which test1 good2|who and name0|
|which test1 good4|who and name7|
|which test1 good2|who and name7|
|which test1 good8|who and name6|
|which test1 good0|who and name6|
|which test1 good3|who and name6|
|which test1 good6|who and name7|
+-----------------+-------------+

2019-01-06 09:22:59 AUDIT audit:72 - {"time":"January 6, 2019 1:22:59 AM 
PST","username":"jenkins","opName":"DROP 
TABLE","opId":"3402814375595190","opStatus":"START"}
2019-01-06 09:22:59 AUDIT audit:93 - {"time":"January 6, 2019 1:22:59 AM 
PST","username":"jenkins","opName":"DROP 
TABLE","opId":"3402814375595190","opStatus":"SUCCESS","opTime":"119 
ms","table":"default.persontable","extraInfo":{}}
- LuceneDataMapExample
2019-01-06 09:22:59 AUDIT audit:72 - {"time":"January 6, 2019 1:22:59 AM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3402814510818183","opStatus":"START"}
2019-01-06 09:22:59 AUDIT audit:93 - {"time":"January 6, 2019 1:22:59 AM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3402814510818183","opStatus":"SUCCESS","opTime":"66 
ms","table":"default.origin_table","extraInfo":{"bad_record_path":"","local_dictionary_enable":"true","external":"false","sort_columns":"","comment":""}}
2019-01-06 09:22:59 AUDIT audit:72 - {"time":"January 6, 2019 1:22:59 AM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3402814582584365","opStatus":"START"}
2019-01-06 09:22:59 ERROR DataLoadExecutor:55 - Data Load is partially success 
for table origin_table
2019-01-06 09:22:59 AUDIT audit:93 - {"time":"January 6, 2019 1:22:59 AM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3402814582584365","opStatus":"SUCCESS","opTime":"174 
ms","table":"default.origin_table","extraInfo":{"SegmentId":"0","DataSize":"2.85KB","IndexSize":"1.38KB"}}
2019-01-06 09:22:59 AUDIT audit:72 - {"time":"January 6, 2019 1:22:59 AM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3402814763327265","opStatus":"START"}
2019-01-06 09:23:00 ERROR DataLoadExecutor:55 - Data Load is partially success 
for table origin_table
2019-01-06 09:23:00 AUDIT audit:93 - {"time":"January 6, 2019 1:23:00 AM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3402814763327265","opStatus":"SUCCESS","opTime":"179 
ms","table":"default.origin_table","extraInfo":{"SegmentId":"1","DataSize":"2.85KB","IndexSize":"1.38KB"}}
2019-01-06 09:23:00 AUDIT audit:72 - {"time":"January 6, 2019 1:23:00 AM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3402814948772466","opStatus":"START"}
2019-01-06 09:23:00 ERROR DataLoadExecutor:55 - Data Load is partially success 
for table origin_table
2019-01-06 09:23:00 AUDIT audit:93 - {"time":"January 6, 2019 1:23:00 AM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3402814948772466","opStatus":"SUCCESS","opTime":"167 
ms","table":"default.origin_table","extraInfo":{"SegmentId":"2","DataSize":"2.85KB","IndexSize":"1.38KB"}}
2019-01-06 09:23:00 AUDIT audit:72 - {"time":"January 6, 2019 1:23:00 AM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3402815122928463","opStatus":"START"}
2019-01-06 09:23:00 ERROR DataLoadExecutor:55 - Data Load is partially success 
for table origin_table
2019-01-06 09:23:00 AUDIT audit:93 - {"time":"January 6, 2019 1:23:00 AM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3402815122928463","opStatus":"SUCCESS","opTime":"184 
ms","table":"default.origin_table","extraInfo":{"SegmentId":"3","DataSize":"2.85KB","IndexSize":"1.38KB"}}
+--------+
|count(1)|
+--------+
|      40|
+--------+

2019-01-06 09:23:00 AUDIT audit:72 - {"time":"January 6, 2019 1:23:00 AM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3402815377909665","opStatus":"START"}
2019-01-06 09:23:00 AUDIT audit:93 - {"time":"January 6, 2019 1:23:00 AM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3402815377909665","opStatus":"SUCCESS","opTime":"65 
ms","table":"default.external_table","extraInfo":{"bad_record_path":"","_filelevelformat":"false","local_dictionary_enable":"true","external":"true","_external":"true","sort_columns":"","comment":""}}
+--------+
|count(1)|
+--------+
|      40|
+--------+

2019-01-06 09:23:00 AUDIT audit:72 - {"time":"January 6, 2019 1:23:00 AM 
PST","username":"jenkins","opName":"LOAD 

Jenkins build is still unstable: carbondata-master-spark-2.1 » Apache CarbonData :: Spark2 #3297

2019-01-06 Thread Apache Jenkins Server
See 




Build failed in Jenkins: carbondata-master-spark-2.1 #3297

2019-01-06 Thread Apache Jenkins Server
See 


--
[...truncated 8.42 MB...]
at 
org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:29)
at org.scalatest.Suite$class.run(Suite.scala:1421)
at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:29)
at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
at 
org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2563)
at 
org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$3.apply(Runner.scala:2557)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:2557)
at 
org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1044)
at 
org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1043)
at 
org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:2722)
at 
org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1043)
at org.scalatest.tools.Runner$.main(Runner.scala:860)
at org.scalatest.tools.Runner.main(Runner.scala)
null
- CarbonReaderExample
2019-01-06 09:23:01 AUDIT audit:72 - {"time":"January 6, 2019 1:23:01 AM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3402816196525200","opStatus":"START"}
2019-01-06 09:23:01 AUDIT audit:93 - {"time":"January 6, 2019 1:23:01 AM 
PST","username":"jenkins","opName":"CREATE 
TABLE","opId":"3402816196525200","opStatus":"SUCCESS","opTime":"81 
ms","table":"default.hive_carbon_example","extraInfo":{"bad_record_path":"","local_dictionary_enable":"true","external":"false","sort_columns":"","comment":""}}
2019-01-06 09:23:01 AUDIT audit:72 - {"time":"January 6, 2019 1:23:01 AM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3402816295182332","opStatus":"START"}
2019-01-06 09:23:01 AUDIT audit:93 - {"time":"January 6, 2019 1:23:01 AM 
PST","username":"jenkins","opName":"LOAD 
DATA","opId":"3402816295182332","opStatus":"SUCCESS","opTime":"153 
ms","table":"default.hive_carbon_example","extraInfo":{"SegmentId":"0","DataSize":"921.0B","IndexSize":"498.0B"}}
+---+---------+--------+
| id|     name|  salary|
+---+---------+--------+
|  1|  'liang'|    20.0|
|  2|'anubhav'|     2.0|
+---+---------+--------+

OK
OK
OK
OK
+----++-----------++--------+
| ID || NAME      || SALARY |
+----++-----------++--------+
| 1  || 'liang'   || 20.0   |
+----++-----------++--------+
| 2  || 'anubhav' || 2.0    |
+----++-----------++--------+
**Total Number Of Rows Fetched ** 2
OK
+------------+
| NAME       |
+------------+
| 'liang'    |
+------------+
| 'anubhav'  |
+------------+
 ** Total Rows Fetched When Querying The Individual Columns ** 2
OK
+--------++----++-----------+
| Salary || ID || NAME      |
+--------++----++-----------+
| 20.0   || 1  || 'liang'   |
+--------++----++-----------+
| 2.0    || 2  || 'anubhav' |
+--------++----++-----------+
 ** Total Rows Fetched When Querying The Out Of Order Columns ** 2
- HiveExample
Exception encountered when invoking run on a nested suite - Cannot call 
methods on a stopped SparkContext.
This stopped SparkContext was created at:

org.apache.spark.sql.test.util.QueryTest.<init>(QueryTest.scala:115)
org.apache.carbondata.examplesCI.RunExamples.<init>(RunExamples.scala:35)
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(Constructor.java:423)
java.lang.Class.newInstance(Class.java:442)
org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:69)
org.scalatest.tools.DiscoverySuite$$anonfun$1.apply(DiscoverySuite.scala:38)
org.scalatest.tools.DiscoverySuite$$anonfun$1.apply(DiscoverySuite.scala:37)
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
scala.collection.Iterator$class.foreach(Iterator.scala:893)
scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
scala.collection.AbstractIterable.foreach(Iterable.scala:54)
scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
scala.collection.AbstractTraversable.map(Traversable.scala:104)
org.scalatest.tools.DiscoverySuite.<init>(DiscoverySuite.scala:37)