[GitHub] carbondata issue #1934: [CARBONDATA-2133] Fixed Exception displays after per...

2018-02-13 Thread anubhav100
Github user anubhav100 commented on the issue:

https://github.com/apache/carbondata/pull/1934
  
retest sdv please


---


[GitHub] carbondata issue #1934: [CARBONDATA-2133] Fixed Exception displays after per...

2018-02-13 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/1934
  
SDV Build Fail, Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3541/



---


[GitHub] carbondata issue #1978: [CARBONDATA-2182]added one more params called extraP...

2018-02-13 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1978
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2503/



---


[GitHub] carbondata issue #1978: [CARBONDATA-2182]added one more params called extraP...

2018-02-13 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1978
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3743/



---


[GitHub] carbondata issue #1934: [CARBONDATA-2133] Fixed Exception displays after per...

2018-02-13 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1934
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2502/



---


[GitHub] carbondata issue #1978: [CARBONDATA-2182]added one more params called extraP...

2018-02-13 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/1978
  
SDV Build Success , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3540/



---


[GitHub] carbondata issue #1934: [CARBONDATA-2133] Fixed Exception displays after per...

2018-02-13 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1934
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3742/



---


[GitHub] carbondata issue #1934: [CARBONDATA-2133] Fixed Exception displays after per...

2018-02-13 Thread anubhav100
Github user anubhav100 commented on the issue:

https://github.com/apache/carbondata/pull/1934
  
@manishgupta88 I have done the changes, please check.


---


[GitHub] carbondata pull request #1934: [CARBONDATA-2133] Fixed Exception displays af...

2018-02-13 Thread anubhav100
Github user anubhav100 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1934#discussion_r168088520
  
--- Diff: integration/spark2/src/test/scala/org/apache/spark/carbondata/restructure/AlterTableValidationTestCase.scala ---
@@ -544,6 +546,13 @@ class AlterTableValidationTestCase extends Spark2QueryTest with BeforeAndAfterAll
     sql("drop table if exists restructure1")
     sql("drop table if exists restructure")
   }
+  test("test alter command for boolean data type with correct default measure value") {
+    sql("create table testalterwithboolean(id int,name string) stored by 'carbondata' ")
+    sql("insert into testalterwithboolean values(1,'anubhav')")
+    sql(
+      "alter table testalterwithboolean add columns(booleanfield boolean) tblproperties('default.value.booleanfield'='true')")
+    checkAnswer(sql("select * from testalterwithboolean"), Seq(Row(1, "anubhav", true)))
+  }
--- End diff --

done


---


[GitHub] carbondata pull request #1939: [CARBONDATA-2139] Optimize CTAS documentation...

2018-02-13 Thread manishgupta88
Github user manishgupta88 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1939#discussion_r168085698
  
--- Diff: integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateTableAsSelect.scala ---
@@ -170,10 +175,277 @@ class TestCreateTableAsSelect extends QueryTest with BeforeAndAfterAll {
     }
   }
 
+  test("test create table as select with where clause in select from parquet table that does not return data") {
+    sql("DROP TABLE IF EXISTS ctas_select_where_parquet")
+    sql(
+      """
+        | CREATE TABLE ctas_select_where_parquet
+        | STORED BY 'carbondata'
+        | as select * FROM parquet_ctas_test
+        | where key=300""".stripMargin)
+    checkAnswer(sql("SELECT * FROM ctas_select_where_parquet"),
+      sql("SELECT * FROM parquet_ctas_test where key=300"))
+  }
+
+  test("test create table as select with where clause in select from hive/orc table that does not return data") {
+    sql("DROP TABLE IF EXISTS ctas_select_where_orc")
+    sql(
+      """
+        | CREATE TABLE ctas_select_where_orc
+        | STORED BY 'carbondata'
+        | AS SELECT * FROM orc_ctas_test
+        | where key=300""".stripMargin)
+    checkAnswer(sql("SELECT * FROM ctas_select_where_orc"),
+      sql("SELECT * FROM orc_ctas_test where key=300"))
+  }
+
+  test("test create table as select with select from same carbon table name with if not exists clause") {
+    sql("drop table if exists ctas_same_table_name")
+    sql("CREATE TABLE ctas_same_table_name(key INT, value STRING) STORED BY 'carbondata'")
+    checkExistence(sql("SHOW TABLES"), true, "ctas_same_table_name")
+    sql(
+      """
+        | CREATE TABLE IF NOT EXISTS ctas_same_table_name
+        | STORED BY 'carbondata'
+        | AS SELECT * FROM ctas_same_table_name
+      """.stripMargin)
+    intercept[Exception] {
+      sql(
+        """
+          | CREATE TABLE ctas_same_table_name
+          | STORED BY 'carbondata'
+          | AS SELECT * FROM ctas_same_table_name
+        """.stripMargin)
+    }
+  }
+
+  test("test create table as select with select from same carbon table name with if not exists clause and source table not exists") {
+    sql("DROP TABLE IF EXISTS ctas_same_table_name")
+    checkExistence(sql("SHOW TABLES"), false, "ctas_same_table_name")
+    intercept[Exception] {
+      sql(
+        """
+          | CREATE TABLE IF NOT EXISTS ctas_same_table_name
+          | STORED BY 'carbondata'
+          | AS SELECT * FROM ctas_same_table_name
+        """.stripMargin)
+    }
+  }
+
+  test("test create table as select with select from same carbon table name with if not exists clause and source table exists") {
+    sql("DROP TABLE IF EXISTS ctas_same_table_name")
+    sql("DROP TABLE IF EXISTS ctas_if_table_name")
+    sql("CREATE TABLE ctas_same_table_name(key INT, value STRING) STORED BY 'carbondata'")
+    sql(
+      """
+        | CREATE TABLE IF NOT EXISTS ctas_if_table_name
+        | STORED BY 'carbondata'
+        | AS SELECT * FROM ctas_same_table_name
+      """.stripMargin)
+    checkExistence(sql("show tables"), true, "ctas_if_table_name")
+  }
+
+  test("add example for documentation") {
+    sql("DROP TABLE IF EXISTS target_table")
+    sql("DROP TABLE IF EXISTS source_table")
+    // create carbon table and insert data
+    sql(
+      """
+        | CREATE TABLE source_table(
+        | id INT,
+        | name STRING,
+        | city STRING,
+        | age INT)
+        | STORED AS parquet
+        | """.stripMargin)
+    sql("INSERT INTO source_table SELECT 1,'bob','shenzhen',27")
+    sql("INSERT INTO source_table SELECT 2,'david','shenzhen',31")
+    sql(
+      """
+        | CREATE TABLE target_table
+        | STORED BY 'carbondata'
+        | AS
+        |   SELECT city,avg(age) FROM source_table group by city
+      """.stripMargin)
+    // results:
+    // sql("SELECT * FROM target_table").show
+    // +--------+--------+
+    // |    city|avg(age)|
+    // +--------+--------+
+    // |shenzhen|    29.0|
+    // +--------+--------+
+    checkAnswer(sql("SELECT * FROM target_table"), Seq(Row("shenzhen", 29)))
+  }
+
+  test("test create table as select with sum,count,min,max") {
+    sql("DROP TABLE IF EXISTS target_table")
+    sql("DROP TABLE IF EXISTS source_table")
+    // create carbon table and insert data
+    sql(
+      """
+        | CREATE TABLE source_table(
+        | id INT,


---

[GitHub] carbondata pull request #1939: [CARBONDATA-2139] Optimize CTAS documentation...

2018-02-13 Thread manishgupta88
Github user manishgupta88 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1939#discussion_r168085120
  
--- Diff: .gitignore ---
@@ -15,4 +15,5 @@ target/
 .project
 .classpath
 metastore_db/
-derby.log
\ No newline at end of file
+derby.log
+integration/spark-common-test/src/test/resources/Data
--- End diff --

Remove this file; it is not required to be committed.


---


[GitHub] carbondata issue #1978: [CARBONDATA-2182]added one more params called extraP...

2018-02-13 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1978
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2501/



---


[GitHub] carbondata pull request #1939: [CARBONDATA-2139] Optimize CTAS documentation...

2018-02-13 Thread manishgupta88
Github user manishgupta88 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1939#discussion_r168085211
  
--- Diff: integration/spark-common-test/src/test/scala/org/apache/carbondata/spark/testsuite/createTable/TestCreateTableAsSelect.scala ---
@@ -53,6 +55,9 @@ class TestCreateTableAsSelect extends QueryTest with BeforeAndAfterAll {
     sql("DROP TABLE IF EXISTS parquet_ctas_test")
     sql("DROP TABLE IF EXISTS orc_ctas_test")
     createTablesAndInsertData
+    CarbonProperties.getInstance().
+      addProperty(CarbonCommonConstants.COMPACTION_SEGMENT_LEVEL_THRESHOLD,
+        CarbonCommonConstants.DEFAULT_SEGMENT_LEVEL_THRESHOLD)
--- End diff --

I could not find any test case for compaction. Please remove this property, as it is required only for the compaction case.
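
If a compaction test is added later, the property could be scoped to that test instead of the suite setup. A rough sketch, assuming the suite's existing imports and the constants shown in the diff above; the "2,2" threshold, the table name, and the MINOR compaction statement are illustrative only:

    test("test ctas table minor compaction") {
      // scope the threshold to this test only and restore the default afterwards
      CarbonProperties.getInstance().addProperty(
        CarbonCommonConstants.COMPACTION_SEGMENT_LEVEL_THRESHOLD, "2,2")
      try {
        // ... create a CTAS table, load at least two segments, then:
        // sql("ALTER TABLE ctas_compaction_test COMPACT 'MINOR'")
      } finally {
        CarbonProperties.getInstance().addProperty(
          CarbonCommonConstants.COMPACTION_SEGMENT_LEVEL_THRESHOLD,
          CarbonCommonConstants.DEFAULT_SEGMENT_LEVEL_THRESHOLD)
      }
    }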


---


[GitHub] carbondata issue #1978: [CARBONDATA-2182]added one more params called extraP...

2018-02-13 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1978
  
Build Failed with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3741/



---


[GitHub] carbondata issue #1934: [CARBONDATA-2133] Fixed Exception displays after per...

2018-02-13 Thread manishgupta88
Github user manishgupta88 commented on the issue:

https://github.com/apache/carbondata/pull/1934
  
@anubhav100 ... In your PR description above there is no need to mention the code details; those will be reviewed as part of your PR. Kindly remove the code details from the PR description and follow the PR description template.


---


[GitHub] carbondata pull request #1934: [CARBONDATA-2133] Fixed Exception displays af...

2018-02-13 Thread manishgupta88
Github user manishgupta88 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1934#discussion_r168083518
  
--- Diff: integration/spark2/src/test/scala/org/apache/spark/carbondata/restructure/AlterTableValidationTestCase.scala ---
@@ -544,6 +546,13 @@ class AlterTableValidationTestCase extends Spark2QueryTest with BeforeAndAfterAll
     sql("drop table if exists restructure1")
     sql("drop table if exists restructure")
   }
+  test("test alter command for boolean data type with correct default measure value") {
+    sql("create table testalterwithboolean(id int,name string) stored by 'carbondata' ")
+    sql("insert into testalterwithboolean values(1,'anubhav')")
+    sql(
+      "alter table testalterwithboolean add columns(booleanfield boolean) tblproperties('default.value.booleanfield'='true')")
+    checkAnswer(sql("select * from testalterwithboolean"), Seq(Row(1, "anubhav", true)))
+  }
--- End diff --

Please add one more test case for the boolean type where the default value is not provided.
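
A minimal sketch of what such a test could look like, assuming the new column is expected to come back as null for existing rows when no default value is given (that expectation is an assumption, not something stated in this thread):

    test("test alter command for boolean data type without default value") {
      sql("drop table if exists testalterwithbooleanwithoutdefault")
      sql("create table testalterwithbooleanwithoutdefault(id int, name string) stored by 'carbondata'")
      sql("insert into testalterwithbooleanwithoutdefault values(1,'anubhav')")
      // no 'default.value' tblproperty: existing rows should get null for the new column
      sql("alter table testalterwithbooleanwithoutdefault add columns(booleanfield boolean)")
      checkAnswer(sql("select * from testalterwithbooleanwithoutdefault"), Seq(Row(1, "anubhav", null)))
    }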


---


[GitHub] carbondata pull request #1934: [CARBONDATA-2133] Fixed Exception displays af...

2018-02-13 Thread manishgupta88
Github user manishgupta88 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/1934#discussion_r168083426
  
--- Diff: core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---
@@ -720,21 +721,22 @@ private static String parseStringToBigDecimal(String value, ColumnSchema columnS
     return null;
   }
 
+  public static DataTypeConverter getDataTypeConverter() {
+    if (converter == null) {
+      converter = new DataTypeConverterImpl();
+    }
+    return converter;
+  }
+
--- End diff --

This change does not belong to this PR; the code is already present in the same class. Kindly remove all the changes that do not belong to this PR.


---


[GitHub] carbondata issue #1973: [CARBONDATA-2163][CARBONDATA-2164] Remove spark depe...

2018-02-13 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1973
  
Build Failed with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2500/



---


[GitHub] carbondata issue #1973: [CARBONDATA-2163][CARBONDATA-2164] Remove spark depe...

2018-02-13 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/1973
  
SDV Build Fail, Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3539/



---


[GitHub] carbondata issue #1973: [CARBONDATA-2163][CARBONDATA-2164] Remove spark depe...

2018-02-13 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1973
  
Build Failed with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3740/



---


[GitHub] carbondata issue #1978: [CARBONDATA-2182]added one more params called extraP...

2018-02-13 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/1978
  
SDV Build Success , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3538/



---


[GitHub] carbondata issue #1978: [CARBONDATA-2182]added one more params called extraP...

2018-02-13 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1978
  
Build Success with Spark 2.2.1, Please check CI 
http://88.99.58.216:8080/job/ApacheCarbonPRBuilder/2499/



---


[GitHub] carbondata issue #1978: [CARBONDATA-2182]added one more params called extraP...

2018-02-13 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/1978
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder1/3739/



---


[GitHub] carbondata pull request #1978: [CARBONDATA-2182]added one more params called...

2018-02-13 Thread akashrn5
GitHub user akashrn5 opened a pull request:

https://github.com/apache/carbondata/pull/1978

[CARBONDATA-2182]added one more params called extraParams in SessionParams 
and add carbonSessionInfo to CarbonEnvInitPreEvent

Add one more param called extraParams in SessionParams for session-level operations, and pass the carbonSessionInfo to the event, so that the user can save session-level information in carbonSessionInfo.
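
Roughly, the intent is along these lines. This is an illustrative sketch only; the class shape and the addExtraParam/getExtraParam names are assumptions for explanation, not the actual signatures introduced by this PR:

    // Illustrative sketch of session-scoped extra parameters, not the CarbonData implementation.
    import scala.collection.mutable

    class SessionParamsSketch {
      private val properties = mutable.Map[String, String]()   // existing string properties
      private val extraParams = mutable.Map[String, AnyRef]()  // new: arbitrary session-level objects

      def addProperty(key: String, value: String): Unit = properties += (key -> value)
      def addExtraParam(key: String, value: AnyRef): Unit = extraParams += (key -> value)
      def getExtraParam(key: String): Option[AnyRef] = extraParams.get(key)
    }

    // A listener for CarbonEnvInitPreEvent could then stash state on the session, e.g.
    // carbonSessionInfo.getSessionParams.addExtraParam("pluginState", state)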


Be sure to do all of the following checklist to help us incorporate 
your contribution quickly and easily:

 - [x] Any interfaces changed?
 NA
 - [x] Any backward compatibility impacted?
 NA
 - [x] Document update required?
NA
 - [x] Testing done
manual testing
Please provide details on 
- Whether new unit test cases have been added or why no new tests 
are required?
- How it is tested? Please attach test report.
- Is it a performance related change? Please attach the performance 
test report.
- Any additional information to help reviewers in testing this 
change.
   
 - [x] For large changes, please consider breaking it into sub-tasks under 
an umbrella JIRA. 
NA


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/akashrn5/incubator-carbondata extra

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/carbondata/pull/1978.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1978


commit adfdad79debe8990a5de6b2d5c63b970a360cd0a
Author: akashrn5 
Date:   2018-02-13T12:29:47Z

added one more params called extraParams in SessionParams




---


[jira] [Created] (CARBONDATA-2182) add one more param called ExtraParmas in SessionParams for session Level operations

2018-02-13 Thread Akash R Nilugal (JIRA)
Akash R Nilugal created CARBONDATA-2182:
---

 Summary: add one more param called ExtraParmas in SessionParams for session Level operations
 Key: CARBONDATA-2182
 URL: https://issues.apache.org/jira/browse/CARBONDATA-2182
 Project: CarbonData
  Issue Type: Bug
Reporter: Akash R Nilugal
Assignee: Akash R Nilugal


Add one more param called extraParams in SessionParams for session-level operations.





[GitHub] carbondata issue #1971: [CARBONDATA-2161]Compacted Segment of Streaming Tabl...

2018-02-13 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/1971
  
SDV Build Success , Please check CI 
http://144.76.159.231:8080/job/ApacheSDVTests/3537/



---


[jira] [Closed] (CARBONDATA-2002) Streaming segment status is not getting updated to finished or success

2018-02-13 Thread Geetika Gupta (JIRA)

 [ https://issues.apache.org/jira/browse/CARBONDATA-2002?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Geetika Gupta closed CARBONDATA-2002.
-
Resolution: Fixed

> Streaming segment status is not getting updated to finished or success
> --
>
> Key: CARBONDATA-2002
> URL: https://issues.apache.org/jira/browse/CARBONDATA-2002
> Project: CarbonData
>  Issue Type: Bug
>  Components: data-load
>Affects Versions: 1.3.0
> Environment: spark2.1
>Reporter: Geetika Gupta
>Priority: Major
> Fix For: 1.4.0
>
> Attachments: 2000_UniqData.csv
>
>
> I created a streaming table and loaded data into it using the following 
> commands on spark shell:
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.CarbonSession._
> import org.apache.carbondata.core.util.CarbonProperties
> import org.apache.spark.sql.streaming.{ProcessingTime, StreamingQuery}
> val carbon = SparkSession.builder().config(sc.getConf) 
> .getOrCreateCarbonSession("hdfs://localhost:54311/newCarbonStore","/tmp")
> import org.apache.carbondata.core.constants.CarbonCommonConstants
> import org.apache.carbondata.core.util.CarbonProperties
> CarbonProperties.getInstance().addProperty(CarbonCommonConstants.CARBON_BAD_RECORDS_ACTION,
>  "FORCE")
> carbon.sql("drop table if exists uniqdata_stream")
> carbon.sql("create table uniqdata_stream(CUST_ID int,CUST_NAME String,DOB 
> timestamp,DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 
> bigint,DECIMAL_COLUMN1 decimal(30,10),DECIMAL_COLUMN2 
> decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 
> int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES 
> ('TABLE_BLOCKSIZE'= '256 MB', 'streaming'='true')");
> import carbon.sqlContext.implicits._
> import org.apache.spark.sql.types._
> val uniqdataSch = StructType(
> Array(StructField("CUST_ID", IntegerType),StructField("CUST_NAME", 
> StringType),StructField("DOB", TimestampType), StructField("DOJ", 
> TimestampType), StructField("BIGINT_COLUMN1", LongType), 
> StructField("BIGINT_COLUMN2", LongType), StructField("DECIMAL_COLUMN1", 
> org.apache.spark.sql.types.DecimalType(30, 10)), 
> StructField("DECIMAL_COLUMN2", 
> org.apache.spark.sql.types.DecimalType(36,10)), StructField("Double_COLUMN1", 
> DoubleType), StructField("Double_COLUMN2", DoubleType), 
> StructField("INTEGER_COLUMN1", IntegerType)))
> val streamDf = carbon.readStream
> .schema(uniqdataSch)
> .option("sep", ",")
> .csv("file:///home/geetika/Downloads/uniqdata")
> val qry = streamDf.writeStream.format("carbondata").trigger(ProcessingTime("5 
> seconds"))
>  .option("checkpointLocation","/stream/uniq")
> .option("dbName", "default")
> .option("tableName", "uniqdata_stream")
> .start()
>   qry.awaitTermination()
> //Press ctrl+c to terminate
> start the spark shell again
>  import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.CarbonSession._
> val carbon = SparkSession.builder().config(sc.getConf) 
> .getOrCreateCarbonSession("hdfs://localhost:54311/newCarbonStore","/tmp")
> carbon.sql("show segments for table uniqdata_stream").show
> It shows the following output:
> +-----------------+---------+--------------------+-------------+---------+-----------+
> |SegmentSequenceId|   Status|     Load Start Time|Load End Time|Merged To|File Format|
> +-----------------+---------+--------------------+-------------+---------+-----------+
> |                0|Streaming|2018-01-05 18:23:...|         null|       NA|     ROW_V1|
> +-----------------+---------+--------------------+-------------+---------+-----------+
> Status for the segment is not updated





[jira] [Commented] (CARBONDATA-2002) Streaming segment status is not getting updated to finished or success

2018-02-13 Thread Geetika Gupta (JIRA)

[ https://issues.apache.org/jira/browse/CARBONDATA-2002?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16361973#comment-16361973 ]

Geetika Gupta commented on CARBONDATA-2002:
---

As per the documentation [http://carbondata.apache.org/streaming-guide.html], the streaming segment is updated once it reaches the max segment size, so this bug can be closed.
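
For reference, the knobs the streaming guide describes for this behaviour look roughly like the following (a sketch only; property names and the 'close_streaming' compaction type should be verified against the guide and the installed version, and `carbon` is the CarbonSession from the reproduction steps in the issue description):

    // Sketch, per the streaming guide linked above.
    import org.apache.carbondata.core.util.CarbonProperties

    // Handoff from a streaming segment to a columnar segment happens once the
    // segment reaches this size, provided auto handoff is enabled.
    CarbonProperties.getInstance()
      .addProperty("carbon.streaming.segment.max.size", "1024000000")
    CarbonProperties.getInstance()
      .addProperty("carbon.streaming.auto.handoff.enabled", "true")

    // After the streaming query has been stopped, streaming segments can also be
    // closed explicitly so their status moves past "Streaming".
    carbon.sql("ALTER TABLE uniqdata_stream COMPACT 'close_streaming'")
    carbon.sql("SHOW SEGMENTS FOR TABLE uniqdata_stream").show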






[GitHub] carbondata issue #1971: [CARBONDATA-2161]Compacted Segment of Streaming Tabl...

2018-02-13 Thread BJangir
Github user BJangir commented on the issue:

https://github.com/apache/carbondata/pull/1971
  
retest sdv please


---