[GitHub] carbondata issue #2966: [WIP] test and check no sort by default

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2966
  
Build Failed with Spark 2.3.1, Please check CI 
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9867/



---


[GitHub] carbondata issue #2966: [WIP] test and check no sort by default

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2966
  
Build Failed with Spark 2.2.1, Please check CI 
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1818/



---


[GitHub] carbondata issue #2966: [WIP] test and check no sort by default

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2966
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1607/



---


[GitHub] carbondata pull request #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbo...

2018-11-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/carbondata/pull/2961


---


[GitHub] carbondata issue #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbonSessio...

2018-11-30 Thread zzcclp
Github user zzcclp commented on the issue:

https://github.com/apache/carbondata/pull/2961
  
LGTM


---


[GitHub] carbondata pull request #2967: [CARBONDATA-3140]Block create like table comm...

2018-11-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/carbondata/pull/2967


---


[jira] [Resolved] (CARBONDATA-3140) Block create table like command for carbon table

2018-11-30 Thread Ravindra Pesala (JIRA)


 [ 
https://issues.apache.org/jira/browse/CARBONDATA-3140?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ravindra Pesala resolved CARBONDATA-3140.
-
   Resolution: Fixed
Fix Version/s: 1.5.1

> Block create table like command for carbon table
> 
>
> Key: CARBONDATA-3140
> URL: https://issues.apache.org/jira/browse/CARBONDATA-3140
> Project: CarbonData
>  Issue Type: Bug
>Reporter: Akash R Nilugal
>Assignee: Akash R Nilugal
>Priority: Minor
> Fix For: 1.5.1
>
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> After running a CREATE TABLE LIKE command with a carbon table as the source 
> table and then dropping the new table, the source table is getting dropped in Spark 2.1.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] carbondata issue #2967: [CARBONDATA-3140]Block create like table command in ...

2018-11-30 Thread ravipesala
Github user ravipesala commented on the issue:

https://github.com/apache/carbondata/pull/2967
  
LGTM


---


[GitHub] carbondata issue #2968: [CARBONDATA-3141] Removed Carbon Table Detail Comman...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2968
  
Build Success with Spark 2.2.1, Please check CI 
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1817/



---


[GitHub] carbondata issue #2968: [CARBONDATA-3141] Removed Carbon Table Detail Comman...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2968
  
Build Success with Spark 2.3.1, Please check CI 
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9866/



---


[GitHub] carbondata issue #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbonSessio...

2018-11-30 Thread xuchuanyin
Github user xuchuanyin commented on the issue:

https://github.com/apache/carbondata/pull/2961
  
LGTM


---




[GitHub] carbondata pull request #2964: [HOTFIX] Fix ArrayOutOfBound exception when d...

2018-11-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/carbondata/pull/2964


---


[GitHub] carbondata issue #2964: [HOTFIX] Fix ArrayOutOfBound exception when duplicat...

2018-11-30 Thread kunal642
Github user kunal642 commented on the issue:

https://github.com/apache/carbondata/pull/2964
  
LGTM


---


[GitHub] carbondata issue #2968: [CARBONDATA-3141] Removed Carbon Table Detail Comman...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2968
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1606/



---


[GitHub] carbondata issue #2914: [CARBONDATA-3093] Provide property builder for carbo...

2018-11-30 Thread xubo245
Github user xubo245 commented on the issue:

https://github.com/apache/carbondata/pull/2914
  
@xuchuanyin I rebased about 5 hours ago; it is already rebased onto the latest master.


---


[GitHub] carbondata issue #2967: [CARBONDATA-3140]Block create like table command in ...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2967
  
Build Success with Spark 2.3.1, Please check CI 
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9865/



---


[GitHub] carbondata pull request #2968: [CARBONDATA-3141] Removed Carbon Table Detail...

2018-11-30 Thread praveenmeenakshi56
GitHub user praveenmeenakshi56 opened a pull request:

https://github.com/apache/carbondata/pull/2968

[CARBONDATA-3141] Removed Carbon Table Detail Command Test case

### What has been changed?
Removed the Carbon Table Detail Command test case, as it is not used, yet it 
has to be modified every time the metadata or anything else changes.

 - [ ] Any interfaces changed?
 NA
 - [ ] Any backward compatibility impacted?
 NA
 - [ ] Document update required?
NA
 - [ ] Testing done
Please provide details on 
- Whether new unit test cases have been added or why no new tests 
are required?
- How it is tested? Please attach test report.
- Is it a performance related change? Please attach the performance 
test report.
- Any additional information to help reviewers in testing this 
change.
Test case removed
 - [ ] For large changes, please consider breaking it into sub-tasks under 
an umbrella JIRA. 
NA


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/praveenmeenakshi56/carbondata tabledetail

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/carbondata/pull/2968.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2968


commit 2f003fdebe83149440c665b13537c01b567424a3
Author: praveenmeenakshi56 
Date:   2018-11-30T12:35:08Z

Removed Carbon Table Detail Test case




---


[jira] [Created] (CARBONDATA-3141) Remove Carbon Table Detail Test Case

2018-11-30 Thread Praveen M P (JIRA)
Praveen M P created CARBONDATA-3141:
---

 Summary: Remove Carbon Table Detail Test Case
 Key: CARBONDATA-3141
 URL: https://issues.apache.org/jira/browse/CARBONDATA-3141
 Project: CarbonData
  Issue Type: Test
Reporter: Praveen M P
Assignee: Praveen M P






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] carbondata issue #2967: [CARBONDATA-3140]Block create like table command in ...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2967
  
Build Success with Spark 2.2.1, Please check CI 
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1816/



---


[GitHub] carbondata pull request #2965: [Documentation] Editorial review

2018-11-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/carbondata/pull/2965


---


[GitHub] carbondata issue #2965: [Documentation] Editorial review

2018-11-30 Thread kunal642
Github user kunal642 commented on the issue:

https://github.com/apache/carbondata/pull/2965
  
LGTM


---


[GitHub] carbondata issue #2967: [CARBONDATA-3140]Block create like table command in ...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2967
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1605/



---


[GitHub] carbondata issue #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbonSessio...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2961
  
Build Success with Spark 2.2.1, Please check CI 
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1814/



---


[GitHub] carbondata issue #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbonSessio...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2961
  
Build Success with Spark 2.3.1, Please check CI 
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9863/



---


[GitHub] carbondata issue #2967: [CARBONDATA-3140]Block create like table command in ...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2967
  
Build Failed with Spark 2.2.1, Please check CI 
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1815/



---


[GitHub] carbondata issue #2967: [CARBONDATA-3140]Block create like table command in ...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2967
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1604/



---


[GitHub] carbondata issue #2967: [CARBONDATA-3140]Block create like table command in ...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2967
  
Build Failed with Spark 2.3.1, Please check CI 
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9864/



---


[GitHub] carbondata issue #2966: [WIP] test and check no sort by default

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2966
  
Build Failed with Spark 2.2.1, Please check CI 
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1813/



---


[jira] [Resolved] (CARBONDATA-3064) Support separate audit log

2018-11-30 Thread Ravindra Pesala (JIRA)


 [ 
https://issues.apache.org/jira/browse/CARBONDATA-3064?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ravindra Pesala resolved CARBONDATA-3064.
-
Resolution: Fixed

> Support separate audit log
> --
>
> Key: CARBONDATA-3064
> URL: https://issues.apache.org/jira/browse/CARBONDATA-3064
> Project: CarbonData
>  Issue Type: Improvement
>Reporter: Jacky Li
>Assignee: Jacky Li
>Priority: Major
> Fix For: 1.5.1
>
>  Time Spent: 8h 20m
>  Remaining Estimate: 0h
>
> Currently, CarbonData writes audit log entries together with other log levels 
> in one log file, so it is not easy for users to check the audit trail. Sometimes 
> the audit information is also incomplete, since it depends on each Command 
> invoking the Logger in its run function. 
> To improve this, I propose a new audit log implementation as follows:
> 1. Separate the audit log from the normal log, so users can configure log4j to 
> write the audit log to a separate file
> 2. The audit log should have a common format that includes at least: time, 
> username, operation name, an operation id that identifies the operation, status 
> (success or failure), and other extra information such as data loading size and time 
> spent
> 3. The audit log should be in JSON format to enable analytics tool support in 
> the future.
> For example, the audit log will look like the following:
> {"time":"2018-10-31 15:02:12","username":"anonymous","opName":"CREATE 
> TABLE","opId":"115794874155743","opStatus":"START"}
> {"time":"2018-10-31 15:02:12","username":"anonymous","opName":"CREATE 
> TABLE","opId":"115794874155743","opStatus":"SUCCESS","opTime":"542 
> ms","tableId":"default.t1","extraInfo":{"external":"false"}}
> {"time":"2018-10-31 15:02:15","username":"anonymous","opName":"INSERT 
> INTO","opId":"115797876187366","opStatus":"START"}
> {"time":"2018-10-31 15:02:19","username":"anonymous","opName":"INSERT 
> INTO","opId":"115797876187366","opStatus":"SUCCESS","opTime":”4043 
> ms","tableId":"default.t1","extraInfo":{"SegmentId":"0","DataSize":"403.0B","IndexSize":"246.0B"}}
> {"time":"2018-10-31 15:02:33","username":"anonymous","opName":"DROP 
> TABLE","opId":"115816322828613","opStatus":"START"}
> {"time":"2018-10-31 15:02:34","username":"anonymous","opName":"DROP 
> TABLE","opId":"115816322828613","opStatus":"SUCCESS","opTime":"131 
> ms","tableId":"default.t1","extraInfo":{}}
> {"time":"2018-10-31 15:02:49","username":"anonymous","opName":"SHOW 
> SEGMENTS","opId":"115831939703565","opStatus":"START"}
> {"time":"2018-10-31 15:02:49","username":"anonymous","opName":"SHOW 
> SEGMENTS","opId":"115831939703565","opStatus":"SUCCESS","opTime":"30 
> ms","tableId":"default.t2","extraInfo":{}}
> {"time":"2018-10-31 15:03:54","username":"anonymous","opName":"INSERT 
> OVERWRITE","opId":"115896869484042","opStatus":"START"}
> {"time":"2018-10-31 15:03:56","username":"anonymous","opName":"INSERT 
> OVERWRITE","opId":"115896869484042","opStatus":"SUCCESS","opTime":"2039 
> ms","tableId":"default.t2","extraInfo":{"SegmentId":"0","DataSize":"403.0B","IndexSize":"246.0B"}}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
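The common-format audit record proposed in the issue above can be sketched in Java. The field names (`time`, `username`, `opName`, `opId`, `opStatus`, `extraInfo`) are taken from the JSON examples in the issue; the class and method names here are hypothetical illustrations, not the actual CarbonData Auditor implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Illustrative sketch only: builds one audit line in the proposed JSON shape.
public class AuditLogSketch {
  static String auditLine(String time, String user, String opName,
                          String opId, String opStatus, Map<String, String> extra) {
    // Render the extraInfo map as a flat JSON object of string values.
    String extraJson = extra.entrySet().stream()
        .map(e -> "\"" + e.getKey() + "\":\"" + e.getValue() + "\"")
        .collect(Collectors.joining(",", "{", "}"));
    return String.format(
        "{\"time\":\"%s\",\"username\":\"%s\",\"opName\":\"%s\","
            + "\"opId\":\"%s\",\"opStatus\":\"%s\",\"extraInfo\":%s}",
        time, user, opName, opId, opStatus, extraJson);
  }

  public static void main(String[] args) {
    Map<String, String> extra = new LinkedHashMap<>();
    extra.put("external", "false");
    System.out.println(auditLine("2018-10-31 15:02:12", "anonymous",
        "CREATE TABLE", "115794874155743", "SUCCESS", extra));
  }
}
```

One line per record, as in the examples, keeps the file greppable and easy to feed into JSON-aware analytics tools later.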


[GitHub] carbondata pull request #2967: [CARBONDATA-3140]Block create like table comm...

2018-11-30 Thread akashrn5
GitHub user akashrn5 opened a pull request:

https://github.com/apache/carbondata/pull/2967

[CARBONDATA-3140]Block create like table command in carbon

### Why this PR?
When a table is created with CREATE TABLE LIKE using a carbon table as the 
source table and the new table is then dropped, the source table gets deleted in 
Spark 2.1 (it works fine in other versions). This PR blocks the CREATE TABLE LIKE command for carbon tables.


Be sure to do all of the following checklist to help us incorporate 
your contribution quickly and easily:

 - [ ] Any interfaces changed?
 
 - [ ] Any backward compatibility impacted?
 
 - [ ] Document update required?

 - [ ] Testing done
Please provide details on 
- Whether new unit test cases have been added or why no new tests 
are required?
- How it is tested? Please attach test report.
- Is it a performance related change? Please attach the performance 
test report.
- Any additional information to help reviewers in testing this 
change.
   
 - [ ] For large changes, please consider breaking it into sub-tasks under 
an umbrella JIRA. 



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/akashrn5/incubator-carbondata like

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/carbondata/pull/2967.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2967


commit cc8dbbf337976bae9e3a4a43b317c853b783fcff
Author: akashrn5 
Date:   2018-11-30T10:31:34Z

Block create like table command in carbon




---


[GitHub] carbondata issue #2967: [CARBONDATA-3140]Block create like table command in ...

2018-11-30 Thread akashrn5
Github user akashrn5 commented on the issue:

https://github.com/apache/carbondata/pull/2967
  
@KanakaKumar @ravipesala please review


---


[GitHub] carbondata pull request #2967: [CARBONDATA-3140]Block create like table comm...

2018-11-30 Thread akashrn5
Github user akashrn5 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2967#discussion_r237817147
  
--- Diff: 
core/src/main/java/org/apache/carbondata/core/memory/UnsafeMemoryManager.java 
---
@@ -129,7 +129,7 @@ private synchronized MemoryBlock 
allocateMemory(MemoryType memoryType, String ta
   memoryBlock = MemoryAllocator.HEAP.allocate(memoryRequested);
   if (LOGGER.isDebugEnabled()) {
 LOGGER.debug(String
-.format("Creating onheap working Memory block (%d) with size: 
", memoryBlock.size()));
+.format("Creating onheap working Memory block with size: 
(%d)", memoryBlock.size()));
--- End diff --

corrected the error message here
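The placeholder mix-up in that log message is easy to reproduce. A minimal sketch using the exact format strings from the diff above (the demo class name is hypothetical):

```java
// Demonstrates why the %d placeholder had to move in the diff above:
// before the fix the size lands inside the parentheses and the
// "with size:" label is left dangling with no value after it.
public class FormatFixDemo {
  public static void main(String[] args) {
    long size = 1024L;
    // Before: "Creating onheap working Memory block (1024) with size: "
    String before = String.format(
        "Creating onheap working Memory block (%d) with size: ", size);
    // After: "Creating onheap working Memory block with size: (1024)"
    String after = String.format(
        "Creating onheap working Memory block with size: (%d)", size);
    System.out.println(before);
    System.out.println(after);
  }
}
```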


---


[jira] [Created] (CARBONDATA-3140) Block create table like command for carbon table

2018-11-30 Thread Akash R Nilugal (JIRA)
Akash R Nilugal created CARBONDATA-3140:
---

 Summary: Block create table like command for carbon table
 Key: CARBONDATA-3140
 URL: https://issues.apache.org/jira/browse/CARBONDATA-3140
 Project: CarbonData
  Issue Type: Bug
Reporter: Akash R Nilugal
Assignee: Akash R Nilugal


After running a CREATE TABLE LIKE command with a carbon table as the source 
table and then dropping the new table, the source table is getting dropped in Spark 2.1.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (CARBONDATA-3105) some table property does not support modification

2018-11-30 Thread Ravindra Pesala (JIRA)


 [ 
https://issues.apache.org/jira/browse/CARBONDATA-3105?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ravindra Pesala updated CARBONDATA-3105:

Fix Version/s: (was: 1.5.1)
   NONE

> some table property does not support modification
> -
>
> Key: CARBONDATA-3105
> URL: https://issues.apache.org/jira/browse/CARBONDATA-3105
> Project: CarbonData
>  Issue Type: Improvement
>  Components: data-load
>Affects Versions: 1.5.0
>Reporter: wangsen
>Assignee: wangsen
>Priority: Minor
> Fix For: NONE
>
>
> Some table properties do not support modification via ALTER TABLE, for example:
> ALTER TABLE sdr_carbon_flow_ter_1day_test SET 
> TBLPROPERTIES('COMPACTION_LEVEL_THRESHOLD'='6,0','AUTO_LOAD_MERGE'='true');
> Error: java.lang.RuntimeException: Alter table newProperties operation 
> failed: Error: Invalid option(s): auto_load_merge (state=,code=0)



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (CARBONDATA-3079) Improve the C++ SDK read performance by merging column in JNI

2018-11-30 Thread Ravindra Pesala (JIRA)


 [ 
https://issues.apache.org/jira/browse/CARBONDATA-3079?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ravindra Pesala updated CARBONDATA-3079:

Fix Version/s: (was: 1.5.1)
   NONE

> Improve the C++ SDK read performance by merging column in JNI
> -
>
> Key: CARBONDATA-3079
> URL: https://issues.apache.org/jira/browse/CARBONDATA-3079
> Project: CarbonData
>  Issue Type: Sub-task
>Reporter: xubo245
>Assignee: xubo245
>Priority: Major
> Fix For: NONE
>
>  Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> Improve the C++ SDK read performance by merging column in JNI



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Updated] (CARBONDATA-3073) Support other interface in carbon writer of C++ SDK

2018-11-30 Thread Ravindra Pesala (JIRA)


 [ 
https://issues.apache.org/jira/browse/CARBONDATA-3073?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ravindra Pesala updated CARBONDATA-3073:

Fix Version/s: (was: 1.5.1)
   NONE

> Support  other interface in carbon writer of C++ SDK
> 
>
> Key: CARBONDATA-3073
> URL: https://issues.apache.org/jira/browse/CARBONDATA-3073
> Project: CarbonData
>  Issue Type: Sub-task
>Affects Versions: 1.5.1
>Reporter: xubo245
>Assignee: xubo245
>Priority: Major
> Fix For: NONE
>
>  Time Spent: 4h 50m
>  Remaining Estimate: 0h
>
> When users create a table and write data with the C++ SDK, they sometimes 
> need to configure table properties, so we should support configuring 
> TableProperties in the carbon writer of the C++ SDK.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Resolved] (CARBONDATA-3083) Null values are getting replaced by 0 after update operation.

2018-11-30 Thread Ravindra Pesala (JIRA)


 [ 
https://issues.apache.org/jira/browse/CARBONDATA-3083?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ravindra Pesala resolved CARBONDATA-3083.
-
Resolution: Fixed

> Null values are getting replaced by 0 after update operation.
> -
>
> Key: CARBONDATA-3083
> URL: https://issues.apache.org/jira/browse/CARBONDATA-3083
> Project: CarbonData
>  Issue Type: Bug
>Affects Versions: 1.5.1
>Reporter: Kunal Kapoor
>Assignee: Kunal Kapoor
>Priority: Major
> Fix For: 1.5.1
>
>  Time Spent: 1h 50m
>  Remaining Estimate: 0h
>
> create table negativeTable(intCol int, stringCol string, shortCol short) 
> stored by 'carbondata'
> load data inpath 'hdfs://hacluster/user/dataWithNegativeValues.csv' into 
> table negativeTable 
> options('delimiter'=',','fileheader'='intCol,stringCol,shortCol','bad_records_action'='force')
> select * from negativeTable
> insert into negativeTable select 0,null,-10
> insert into negativeTable select null,'inserted',20
> select * from negativeTable
> update negativeTable set (intCol) = (5) where intCol=0
> select * from negativeTable



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] carbondata issue #2966: [WIP] test and check no sort by default

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2966
  
Build Failed with Spark 2.3.1, Please check CI 
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9862/



---


[GitHub] carbondata issue #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbonSessio...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2961
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1603/



---


[GitHub] carbondata pull request #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbo...

2018-11-30 Thread zygitup
Github user zygitup commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2961#discussion_r237807538
  
--- Diff: 
integration/spark2/src/main/scala/org/apache/spark/sql/CarbonSession.scala ---
@@ -37,6 +38,8 @@ import 
org.apache.carbondata.core.constants.CarbonCommonConstants
 import org.apache.carbondata.core.util.{CarbonProperties, 
CarbonSessionInfo, ThreadLocalSessionInfo}
 import org.apache.carbondata.streaming.CarbonStreamingQueryListener
 
+
+
--- End diff --

OK. It has been removed.


---


[jira] [Resolved] (CARBONDATA-3095) Optimize the documentation of SDK/CSDK

2018-11-30 Thread Kunal Kapoor (JIRA)


 [ 
https://issues.apache.org/jira/browse/CARBONDATA-3095?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Kunal Kapoor resolved CARBONDATA-3095.
--
   Resolution: Fixed
Fix Version/s: 1.5.1

> Optimize the documentation of SDK/CSDK
> --
>
> Key: CARBONDATA-3095
> URL: https://issues.apache.org/jira/browse/CARBONDATA-3095
> Project: CarbonData
>  Issue Type: Sub-task
>Affects Versions: 1.5.1
>Reporter: xubo245
>Assignee: xubo245
>Priority: Major
> Fix For: 1.5.1
>
>  Time Spent: 5h
>  Remaining Estimate: 0h
>




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] carbondata issue #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbonSessio...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2961
  
Build Success with Spark 2.2.1, Please check CI 
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1812/



---


[GitHub] carbondata issue #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbonSessio...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2961
  
Build Success with Spark 2.3.1, Please check CI 
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9861/



---


[GitHub] carbondata pull request #2915: [CARBONDATA-3095] Optimize the documentation ...

2018-11-30 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/carbondata/pull/2915


---


[GitHub] carbondata issue #2966: [WIP] test and check no sort by default

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2966
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1602/



---


[GitHub] carbondata issue #2915: [CARBONDATA-3095] Optimize the documentation of SDK/...

2018-11-30 Thread sraghunandan
Github user sraghunandan commented on the issue:

https://github.com/apache/carbondata/pull/2915
  
LGTM


---


[GitHub] carbondata issue #2914: [CARBONDATA-3093] Provide property builder for carbo...

2018-11-30 Thread xuchuanyin
Github user xuchuanyin commented on the issue:

https://github.com/apache/carbondata/pull/2914
  
Have you rebased onto the latest master code and rechecked? 18 days have 
passed since your last commit.


---


[GitHub] carbondata issue #2966: [WIP] test and check no sort by default

2018-11-30 Thread ajantha-bhat
Github user ajantha-bhat commented on the issue:

https://github.com/apache/carbondata/pull/2966
  
retest this please


---


[GitHub] carbondata issue #2914: [CARBONDATA-3093] Provide property builder for carbo...

2018-11-30 Thread xubo245
Github user xubo245 commented on the issue:

https://github.com/apache/carbondata/pull/2914
  
@jackylk @ravipesala @kunal642 @KanakaKumar @chenliang613 @sraghunandan CI 
pass, please review.


---


[GitHub] carbondata pull request #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbo...

2018-11-30 Thread zzcclp
Github user zzcclp commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2961#discussion_r237783794
  
--- Diff: 
integration/spark2/src/main/scala/org/apache/spark/sql/CarbonSession.scala ---
@@ -248,7 +251,7 @@ object CarbonSession {
 
 session = new CarbonSession(sparkContext, None, !enableInMemCatlog)
 val carbonProperties = CarbonProperties.getInstance()
-if (storePath != null) {
+if (storePath != null && StringUtils.isNotBlank(storePath)) {
--- End diff --

Remove 'storePath != null'; StringUtils.isNotBlank already includes this 
check.
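The reviewer's point is that commons-lang's `StringUtils.isNotBlank` returns false for null as well as for empty and whitespace-only strings, so a preceding null check is redundant. A minimal plain-Java sketch of that behavior (the class here is an illustrative stand-in, not the commons-lang source):

```java
// Illustrates the semantics of StringUtils.isNotBlank: null, empty, and
// whitespace-only inputs all return false, so `storePath != null &&` adds nothing.
public class BlankCheckSketch {
  static boolean isNotBlank(CharSequence cs) {
    if (cs == null || cs.length() == 0) {
      return false;                      // null and "" are blank
    }
    for (int i = 0; i < cs.length(); i++) {
      if (!Character.isWhitespace(cs.charAt(i))) {
        return true;                     // found a non-whitespace character
      }
    }
    return false;                        // whitespace-only is blank
  }

  public static void main(String[] args) {
    System.out.println(isNotBlank(null));          // null is handled internally
    System.out.println(isNotBlank("   "));         // whitespace-only
    System.out.println(isNotBlank("/store/path")); // a real store path
  }
}
```

So `if (StringUtils.isNotBlank(storePath))` alone expresses the suggested condition.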


---


[GitHub] carbondata pull request #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbo...

2018-11-30 Thread zzcclp
Github user zzcclp commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2961#discussion_r237782924
  
--- Diff: 
integration/spark2/src/main/scala/org/apache/spark/sql/CarbonSession.scala ---
@@ -37,6 +38,8 @@ import 
org.apache.carbondata.core.constants.CarbonCommonConstants
 import org.apache.carbondata.core.util.{CarbonProperties, 
CarbonSessionInfo, ThreadLocalSessionInfo}
 import org.apache.carbondata.streaming.CarbonStreamingQueryListener
 
+
+
--- End diff --

remove these two lines.


---


[GitHub] carbondata issue #2951: [SDV] Add datasource testcases for Spark File Format

2018-11-30 Thread kunal642
Github user kunal642 commented on the issue:

https://github.com/apache/carbondata/pull/2951
  
@shivamasn Please add a description for the PR, and attach the test report in 
the description.


---


[GitHub] carbondata pull request #2951: [SDV] Add datasource testcases for Spark File...

2018-11-30 Thread kunal642
Github user kunal642 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2951#discussion_r237781038
  
--- Diff: 
integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/datasource/CreateTableUsingSparkCarbonFileFormatTestCase.scala
 ---
@@ -0,0 +1,342 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.cluster.sdv.generated.datasource
+
+import java.io.File
+import java.text.SimpleDateFormat
+import java.util.{Date, Random}
+import scala.collection.JavaConverters._
+import org.apache.commons.io.FileUtils
+import org.apache.commons.lang.RandomStringUtils
+import org.scalatest.BeforeAndAfterAll
+import org.apache.spark.util.SparkUtil
+import org.apache.carbondata.core.datastore.filesystem.CarbonFile
+import org.apache.carbondata.core.datastore.impl.FileFactory
+import org.apache.carbondata.core.metadata.datatype.DataTypes
+import org.apache.carbondata.core.util.{CarbonProperties, CarbonUtil, 
DataFileFooterConverter}
+import org.apache.carbondata.sdk.file.{CarbonWriter, Field, Schema}
+import org.apache.spark.sql.Row
+import org.apache.spark.sql.common.util.QueryTest
+import org.apache.carbondata.core.constants.CarbonCommonConstants
+import org.apache.carbondata.core.datamap.DataMapStoreManager
+import org.apache.carbondata.core.metadata.AbsoluteTableIdentifier
+import org.apache.carbondata.core.metadata.blocklet.DataFileFooter
+
+class CreateTableUsingSparkCarbonFileFormatTestCase extends QueryTest with 
BeforeAndAfterAll {
+
+  override def beforeAll(): Unit = {
+sql("DROP TABLE IF EXISTS sdkOutputTable")
+  }
+
+  override def afterAll(): Unit = {
+sql("DROP TABLE IF EXISTS sdkOutputTable")
+  }
+
+  var writerPath = new File(this.getClass.getResource("/").getPath
++
+"../." +
+
"./src/test/resources/SparkCarbonFileFormat/WriterOutput/")
+.getCanonicalPath
+  //getCanonicalPath gives path with \, but the code expects /.
+  writerPath = writerPath.replace("\\", "/");
+
+  def buildTestData(): Any = {
+
+FileUtils.deleteDirectory(new File(writerPath))
+
+val schema = new StringBuilder()
+  .append("[ \n")
+  .append("   {\"name\":\"string\"},\n")
+  .append("   {\"age\":\"int\"},\n")
+  .append("   {\"height\":\"double\"}\n")
+  .append("]")
+  .toString()
+
+try {
+  val builder = CarbonWriter.builder()
+  val writer =
+
builder.outputPath(writerPath).withCsvInput(Schema.parseJson(schema)).writtenBy("CreateTableUsingSparkCarbonFileFormatTestCase").build()
+  var i = 0
+  while (i < 100) {
+writer.write(Array[String]("robot" + i, String.valueOf(i), 
String.valueOf(i.toDouble / 2)))
+i += 1
+  }
+  writer.close()
+} catch {
+  case _: Throwable => None
+}
+  }
+
+  def cleanTestData() = {
+FileUtils.deleteDirectory(new File(writerPath))
+  }
+
+  def deleteIndexFile(path: String, extension: String) : Unit = {
+val file: CarbonFile = FileFactory
+  .getCarbonFile(path, FileFactory.getFileType(path))
+
+for (eachDir <- file.listFiles) {
+  if (!eachDir.isDirectory) {
+if (eachDir.getName.endsWith(extension)) {
+  CarbonUtil.deleteFoldersAndFilesSilent(eachDir)
+}
+  } else {
+deleteIndexFile(eachDir.getPath, extension)
+  }
+}
+  }
+
+  test("Running SQL directly and read carbondata files (sdk Writer Output) 
using the SparkCarbonFileFormat ") {
+buildTestData()
+assert(new File(writerPath).exists())
+sql("DROP TABLE IF EXISTS sdkOutputTable")
+
+//data source file format
+if (SparkUtil.isSparkVersionEqualTo("2.1")) {
+  //data 

[GitHub] carbondata pull request #2951: [SDV] Add datasource testcases for Spark File...

2018-11-30 Thread kunal642
Github user kunal642 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2951#discussion_r237779541
  
--- Diff: 
integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/datasource/CreateTableUsingSparkCarbonFileFormatTestCase.scala
 ---
@@ -0,0 +1,342 @@
[quoted diff identical to the excerpt above]

[GitHub] carbondata pull request #2951: [SDV] Add datasource testcases for Spark File...

2018-11-30 Thread kunal642
Github user kunal642 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2951#discussion_r237779383
  
--- Diff: 
integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/datasource/CreateTableUsingSparkCarbonFileFormatTestCase.scala
 ---
@@ -0,0 +1,342 @@
[quoted diff identical to the excerpt above]

[GitHub] carbondata pull request #2951: [SDV] Add datasource testcases for Spark File...

2018-11-30 Thread kunal642
Github user kunal642 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2951#discussion_r237780471
  
--- Diff: 
integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/datasource/CreateTableUsingSparkCarbonFileFormatTestCase.scala
 ---
@@ -0,0 +1,342 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.carbondata.cluster.sdv.generated.datasource
--- End diff --

Please format all the newly added code


---
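
The test under review assembles its SDK schema by hand-concatenating a JSON string with a `StringBuilder`. A less error-prone construction builds the structure first and serializes it; the Python sketch below is purely illustrative of that alternative, not part of the patch:

```python
import json

# The three fields the quoted test declares by hand:
# [{"name":"string"}, {"age":"int"}, {"height":"double"}]
fields = [("name", "string"), ("age", "int"), ("height", "double")]

# Build the list of single-entry objects, then let the serializer
# handle quoting and escaping instead of appending string fragments.
schema_json = json.dumps([{col: typ} for col, typ in fields])
```

Round-tripping through `json.loads` confirms the string parses to the intended structure, which is exactly the guarantee the hand-built `StringBuilder` version lacks.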


[GitHub] carbondata pull request #2951: [SDV] Add datasource testcases for Spark File...

2018-11-30 Thread kunal642
Github user kunal642 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2951#discussion_r237779458
  
--- Diff: 
integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/datasource/CreateTableUsingSparkCarbonFileFormatTestCase.scala
 ---
@@ -0,0 +1,342 @@
[quoted diff identical to the excerpt above]

[GitHub] carbondata pull request #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbo...

2018-11-30 Thread xubo245
Github user xubo245 commented on a diff in the pull request:

https://github.com/apache/carbondata/pull/2961#discussion_r237782459
  
--- Diff: 
integration/spark2/src/main/scala/org/apache/spark/sql/CarbonSession.scala ---
@@ -180,7 +183,7 @@ object CarbonSession {
   val userSuppliedContext: Option[SparkContext] =
 getValue("userSuppliedContext", 
builder).asInstanceOf[Option[SparkContext]]
 
-  if (metaStorePath != null) {
+  if (metaStorePath != null && StringUtils.isNotBlank(metaStorePath)) {
--- End diff --

`StringUtils.isNotBlank` already handles the null check, so the explicit `metaStorePath != null` guard is redundant.


---


[GitHub] carbondata issue #2961: [CARBONDATA-3119] Fixing the getOrCreateCarbonSessio...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2961
  
Build Success with Spark 2.1.0, Please check CI 
http://136.243.101.176:8080/job/ApacheCarbonPRBuilder2.1/1601/



---


[GitHub] carbondata issue #2914: [CARBONDATA-3093] Provide property builder for carbo...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2914
  
Build Success with Spark 2.3.1, Please check CI 
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9859/



---


[GitHub] carbondata issue #2966: [WIP] test and check no sort by default

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2966
  
Build Failed with Spark 2.3.1, Please check CI 
http://136.243.101.176:8080/job/carbondataprbuilder2.3/9860/



---


[GitHub] carbondata issue #2914: [CARBONDATA-3093] Provide property builder for carbo...

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2914
  
Build Success with Spark 2.2.1, Please check CI 
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1810/



---


[GitHub] carbondata issue #2966: [WIP] test and check no sort by default

2018-11-30 Thread CarbonDataQA
Github user CarbonDataQA commented on the issue:

https://github.com/apache/carbondata/pull/2966
  
Build Failed with Spark 2.2.1, Please check CI 
http://95.216.28.178:8080/job/ApacheCarbonPRBuilder1/1811/



---