Re: why there are no joins in the official benchmark test

2017-02-06 Thread Liang Chen
Hi, We are testing based on the TPC-H/TPC-DS benchmarks; the report will be shared soon. Regards, Liang 2017-02-07 1:28 GMT-05:00 Yinwei Li <251469...@qq.com>: > Hi all, > > In the Apache CarbonData Performance Benchmark (0.1.0) there are no joins in > any of the SQLs; what is the main reason? > > I want to

Re: [ANNOUNCE] Apache CarbonData 1.0.0-incubating released

2017-02-06 Thread Liang Chen
Hi relatall, It looks like AWS Athena doesn't support the CarbonData format currently. Maybe you can try your ad hoc queries on CarbonData (HDFS) + Spark directly. Regards, Liang 2017-02-06 19:06 GMT-05:00 : > Hi, > I have data stored in S3 and use AWS Athena to do ad-hoc
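Not the sender's exact setup, just a minimal sketch of what an ad hoc query against a CarbonData store on HDFS could look like from spark-shell (the import path, store location, and table name are illustrative assumptions):

    import org.apache.spark.sql.CarbonContext

    // point a CarbonContext at the HDFS store that holds the carbon tables
    val cc = new CarbonContext(sc, "hdfs://localhost:9000/opt/CarbonStore")

    // from here on it is ordinary Spark SQL
    cc.sql("SELECT country, count(*) AS cnt FROM sales GROUP BY country").show()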

Re: Discussion about getting execution duration of a query when using spark-shell + carbondata

2017-02-06 Thread Liang Chen
Hi, I used the method below in the spark shell for a demo, for your reference: import org.apache.spark.sql.catalyst.util._ benchmark { carbondf.filter($"name" === "Allen" and $"gender" === "Male" and $"province" === "NB" and $"singler" === "false").count } Regards, Liang 2017-02-06 22:07 GMT-05:00
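Spelled out as a fuller spark-shell sketch (assuming a CarbonData DataFrame named carbondf is already loaded; the column names are the ones from Liang's demo data):

    // helper that runs a block and prints its elapsed time
    import org.apache.spark.sql.catalyst.util._
    // for the $"col" syntax; the stock spark-shell usually imports this already
    import sqlContext.implicits._

    benchmark {
      carbondf.filter($"name" === "Allen" and
                      $"gender" === "Male" and
                      $"province" === "NB" and
                      $"singler" === "false").count
    }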

Re: [ANNOUNCE] Apache CarbonData 1.0.0-incubating released

2017-02-06 Thread relatall
Hi, I have data stored in S3 and use AWS Athena to do ad-hoc queries. How can I leverage CarbonData for my business, please? On Sun, Feb 5, 2017 at 5:27 PM, Liang Chen wrote: > Hi xiaoqiao > > Very happy to see that you will keep contributing to CarbonData, "Double >

Re: Error while loading - Table is locked for updation. Please try after some time ( Spark 1.6.2 )

2017-02-06 Thread manish gupta
Hi Sanoj, Can you please try the things below. 1. Remove the carbon.properties file and let the system take all the default values. In the logs you shared I can see that while creating the CarbonContext it prints the carbon.properties file path and all the properties in it. So give

Re: Error while loading - Table is locked for updation. Please try after some time ( Spark 1.6.2 )

2017-02-06 Thread Sanoj M George
Hi Yinwei, Tried this; it is using the new store path, but still getting the same error. Thanks On Mon, Feb 6, 2017 at 1:38 PM, Yinwei Li <251469...@qq.com> wrote: > Hi Sanoj, > > maybe you can try initializing the CarbonContext by setting the parameter storePath > as follows: > > scala> val

Re: Error while loading - Table is locked for updation. Please try after some time ( Spark 1.6.2 )

2017-02-06 Thread Sanoj M George
Hi Manish, Could not find any .lock files in the carbon store. I am getting the error while running spark-shell, and did not try the thrift server. However, as you can see from the attached logs, it is taking the default store location (not the one from carbon.properties) scala> cc.storePath res0: String =
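A quick sanity check along the same lines (just a sketch; cc is the CarbonContext created in the shell, and the expected path is the one from carbon.properties in this thread):

    // print the store path the running CarbonContext actually resolved
    println(cc.storePath)

    // compare it with the value configured in carbon.properties,
    // e.g. hdfs://localhost:9000/opt/CarbonStore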

[GitHub] incubator-carbondata-site pull request #13: Fixed Minor UI issues in the web...

2017-02-06 Thread PallaviSingh1992
GitHub user PallaviSingh1992 opened a pull request: https://github.com/apache/incubator-carbondata-site/pull/13 Fixed Minor UI issues in the website Fixed the following issues in the website: - icons not displaying properly - updated the copyright @2016 to @2017 - fixed

Re: Error while loading - Table is locked for updation. Please try after some time ( Spark 1.6.2 )

2017-02-06 Thread Yinwei Li
Hi Sanoj, maybe you can try initializing the CarbonContext by setting the parameter storePath as follows: scala> val storePath = "hdfs://localhost:9000/home/hadoop/carbondata/bin/carbonshellstore" scala> val cc = new CarbonContext(sc, storePath) --
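The same suggestion as a small self-contained spark-shell sketch (the import path is an assumption for the Spark 1.x integration; the HDFS store path is just the example from this thread):

    import org.apache.spark.sql.CarbonContext

    // pass an explicit store path instead of whatever carbon.properties resolves to
    val storePath = "hdfs://localhost:9000/home/hadoop/carbondata/bin/carbonshellstore"
    val cc = new CarbonContext(sc, storePath)

    // confirm the context picked up the path we passed in
    println(cc.storePath)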

Re: Error while loading - Table is locked for updation. Please try after some time ( Spark 1.6.2 )

2017-02-06 Thread manish gupta
Hi Sanoj, Please check if there is any file with a .lock extension in the carbon store. Also, when you start the thrift server, the carbon store location will be printed in the thrift server logs. Please validate whether there is any mismatch between the store location provided by you and the store location getting
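If it helps, a small spark-shell sketch for spotting leftover .lock files under the store (this uses the plain Hadoop FileSystem API rather than anything CarbonData-specific, and the store path is the one mentioned in this thread):

    import org.apache.hadoop.fs.{FileSystem, Path}

    val store = new Path("hdfs://localhost:9000/opt/CarbonStore")
    val fs = store.getFileSystem(sc.hadoopConfiguration)

    // walk the store recursively and print anything ending in .lock
    val files = fs.listFiles(store, true)
    while (files.hasNext) {
      val status = files.next()
      if (status.getPath.getName.endsWith(".lock")) println(status.getPath)
    }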

Re: Error while loading - Table is locked for updation. Please try after some time ( Spark 1.6.2 )

2017-02-06 Thread Sanoj M George
Thanks Raghunandan. Checked the thread, but it seems this error is due to something else. Below are the parameters that I changed: carbon.properties: carbon.storelocation=hdfs://localhost:9000/opt/CarbonStore carbon.ddl.base.hdfs.url=hdfs://localhost:9000/opt/data