Re: error occur when I load data to s3

2018-09-04 Thread aaron
Hi kunalkapoor, I'd like to give you more of the debug log, as below.


application/x-www-form-urlencoded; charset=utf-8
Tue, 04 Sep 2018 06:45:10 GMT
/aa-sdk-test2/carbon-data/example/LockFiles/concurrentload.lock"
18/09/04 14:45:10 DEBUG request: Sending Request: GET
https://aa-sdk-test2.s3.us-east-1.amazonaws.com
/carbon-data/example/LockFiles/concurrentload.lock Headers: (Authorization:
AWS AKIAIAQX5F5B2MLQPRGQ:Ap8rHsiPQPYUdcBb2Ojb/MA9q+I=, User-Agent:
aws-sdk-java/1.7.4 Mac_OS_X/10.13.6
Java_HotSpot(TM)_64-Bit_Server_VM/25.144-b01/1.8.0_144, Range: bytes=0--1,
Date: Tue, 04 Sep 2018 06:45:10 GMT, Content-Type:
application/x-www-form-urlencoded; charset=utf-8, ) 
18/09/04 14:45:10 DEBUG PoolingClientConnectionManager: Connection request:
[route: {s}->https://aa-sdk-test2.s3.us-east-1.amazonaws.com:443][total kept
alive: 1; route allocated: 1 of 15; total allocated: 1 of 15]
18/09/04 14:45:10 DEBUG PoolingClientConnectionManager: Connection leased:
[id: 1][route:
{s}->https://aa-sdk-test2.s3.us-east-1.amazonaws.com:443][total kept alive:
0; route allocated: 1 of 15; total allocated: 1 of 15]
18/09/04 14:45:10 DEBUG SdkHttpClient: Stale connection check
18/09/04 14:45:10 DEBUG RequestAddCookies: CookieSpec selected: default
18/09/04 14:45:10 DEBUG RequestAuthCache: Auth cache not set in the context
18/09/04 14:45:10 DEBUG RequestProxyAuthentication: Proxy auth state:
UNCHALLENGED
18/09/04 14:45:10 DEBUG SdkHttpClient: Attempt 1 to execute request
18/09/04 14:45:10 DEBUG DefaultClientConnection: Sending request: GET
/carbon-data/example/LockFiles/concurrentload.lock HTTP/1.1
18/09/04 14:45:10 DEBUG wire:  >> "GET
/carbon-data/example/LockFiles/concurrentload.lock HTTP/1.1[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  >> "Host:
aa-sdk-test2.s3.us-east-1.amazonaws.com[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  >> "Authorization: AWS
AKIAIAQX5F5B2MLQPRGQ:Ap8rHsiPQPYUdcBb2Ojb/MA9q+I=[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  >> "User-Agent: aws-sdk-java/1.7.4
Mac_OS_X/10.13.6
Java_HotSpot(TM)_64-Bit_Server_VM/25.144-b01/1.8.0_144[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  >> "Range: bytes=0--1[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  >> "Date: Tue, 04 Sep 2018 06:45:10
GMT[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  >> "Content-Type:
application/x-www-form-urlencoded; charset=utf-8[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  >> "Connection: Keep-Alive[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  >> "[\r][\n]"
18/09/04 14:45:10 DEBUG headers: >> GET
/carbon-data/example/LockFiles/concurrentload.lock HTTP/1.1
18/09/04 14:45:10 DEBUG headers: >> Host:
aa-sdk-test2.s3.us-east-1.amazonaws.com
18/09/04 14:45:10 DEBUG headers: >> Authorization: AWS
AKIAIAQX5F5B2MLQPRGQ:Ap8rHsiPQPYUdcBb2Ojb/MA9q+I=
18/09/04 14:45:10 DEBUG headers: >> User-Agent: aws-sdk-java/1.7.4
Mac_OS_X/10.13.6 Java_HotSpot(TM)_64-Bit_Server_VM/25.144-b01/1.8.0_144
18/09/04 14:45:10 DEBUG headers: >> Range: bytes=0--1
18/09/04 14:45:10 DEBUG headers: >> Date: Tue, 04 Sep 2018 06:45:10 GMT
18/09/04 14:45:10 DEBUG headers: >> Content-Type:
application/x-www-form-urlencoded; charset=utf-8
18/09/04 14:45:10 DEBUG headers: >> Connection: Keep-Alive
18/09/04 14:45:10 DEBUG wire:  << "HTTP/1.1 200 OK[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  << "x-amz-id-2:
ooaOvIUsvupOOYOCVRY7y4TUanV9xJbcAqfd+w31xAkGRptm1blE5E5yMobmKsmRyGj9crhGCao=[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  << "x-amz-request-id:
A1AD0240EBDD2234[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  << "Date: Tue, 04 Sep 2018 06:45:11
GMT[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  << "Last-Modified: Tue, 04 Sep 2018 06:45:05
GMT[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  << "ETag:
"d41d8cd98f00b204e9800998ecf8427e"[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  << "Accept-Ranges: bytes[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  << "Content-Type:
application/octet-stream[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  << "Content-Length: 0[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  << "Server: AmazonS3[\r][\n]"
18/09/04 14:45:10 DEBUG wire:  << "[\r][\n]"
18/09/04 14:45:10 DEBUG DefaultClientConnection: Receiving response:
HTTP/1.1 200 OK
18/09/04 14:45:10 DEBUG headers: << HTTP/1.1 200 OK
18/09/04 14:45:10 DEBUG headers: << x-amz-id-2:
ooaOvIUsvupOOYOCVRY7y4TUanV9xJbcAqfd+w31xAkGRptm1blE5E5yMobmKsmRyGj9crhGCao=
18/09/04 14:45:10 DEBUG headers: << x-amz-request-id: A1AD0240EBDD2234
18/09/04 14:45:10 DEBUG headers: << Date: Tue, 04 Sep 2018 06:45:11 GMT
18/09/04 14:45:10 DEBUG headers: << Last-Modified: Tue, 04 Sep 2018 06:45:05
GMT
18/09/04 14:45:10 DEBUG headers: << ETag: "d41d8cd98f00b204e9800998ecf8427e"
18/09/04 14:45:10 DEBUG headers: << Accept-Ranges: bytes
18/09/04 14:45:10 DEBUG headers: << Content-Type: application/octet-stream
18/09/04 14:45:10 DEBUG headers: << Content-Length: 0
18/09/04 14:45:10 DEBUG headers: << Server: AmazonS3
18/09/04 14:45:10 DEBUG SdkHttpClient: Connection can be kept alive
indefinitely
18/09/04 14:45:10 DEBUG request: Received successful response: 200, AWS
Request ID: A1AD0240EBDD2234
18/09/04 14:45:10 DEBUG 
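
(For what it's worth, the same lock object can also be inspected from the CLI,
assuming the aws CLI is configured with the same credentials:

aws s3api head-object --bucket aa-sdk-test2 --key
carbon-data/example/LockFiles/concurrentload.lock

It should report the same zero-byte object as the Content-Length: 0 response
above.)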

Re: error occur when I load data to s3

2018-09-03 Thread Kunal Kapoor
Ok. Let me have a look

On Tue, Sep 4, 2018, 8:22 AM aaron <949835...@qq.com> wrote:

> Hi kunalkapoor,
>It seems the error is not fixed yet. Do you have any idea?
>
> thanks
> aaron
>
> aaron:2.2.1 aaron$ spark-shell --executor-memory 4g --driver-memory 2g
> Ivy Default Cache set to: /Users/aaron/.ivy2/cache
> The jars for the packages stored in: /Users/aaron/.ivy2/jars
> :: loading settings :: url =
>
> jar:file:/usr/local/Cellar/apache-spark/2.2.1/lib/apache-carbondata-1.5.0-SNAPSHOT-bin-spark2.2.1-hadoop2.7.2.jar!/org/apache/ivy/core/settings/ivysettings.xml
> com.amazonaws#aws-java-sdk added as a dependency
> org.apache.hadoop#hadoop-aws added as a dependency
> com.databricks#spark-avro_2.11 added as a dependency
> :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
> confs: [default]
> found com.amazonaws#aws-java-sdk;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-support;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-core;1.10.75.1 in central
> found commons-logging#commons-logging;1.1.3 in central
> found org.apache.httpcomponents#httpclient;4.3.6 in local-m2-cache
> found org.apache.httpcomponents#httpcore;4.3.3 in local-m2-cache
> found commons-codec#commons-codec;1.6 in local-m2-cache
> found com.fasterxml.jackson.core#jackson-databind;2.5.3 in central
> found com.fasterxml.jackson.core#jackson-annotations;2.5.0 in
> central
> found com.fasterxml.jackson.core#jackson-core;2.5.3 in central
> found
> com.fasterxml.jackson.dataformat#jackson-dataformat-cbor;2.5.3 in
> central
> found joda-time#joda-time;2.8.1 in central
> found com.amazonaws#aws-java-sdk-simpledb;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-simpleworkflow;1.10.75.1 in
> central
> found com.amazonaws#aws-java-sdk-storagegateway;1.10.75.1 in
> central
> found com.amazonaws#aws-java-sdk-route53;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-s3;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-kms;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-importexport;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-sts;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-sqs;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-rds;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-redshift;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-elasticbeanstalk;1.10.75.1 in
> central
> found com.amazonaws#aws-java-sdk-glacier;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-sns;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-iam;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-datapipeline;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-elasticloadbalancing;1.10.75.1 in
> central
> found com.amazonaws#aws-java-sdk-emr;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-elasticache;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-elastictranscoder;1.10.75.1 in
> central
> found com.amazonaws#aws-java-sdk-ec2;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-dynamodb;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-cloudtrail;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-cloudwatch;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-logs;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-events;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-cognitoidentity;1.10.75.1 in
> central
> found com.amazonaws#aws-java-sdk-cognitosync;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-directconnect;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-cloudformation;1.10.75.1 in
> central
> found com.amazonaws#aws-java-sdk-cloudfront;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-kinesis;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-opsworks;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-ses;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-autoscaling;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-cloudsearch;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-cloudwatchmetrics;1.10.75.1 in
> central
> found com.amazonaws#aws-java-sdk-swf-libraries;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-codedeploy;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-codepipeline;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-config;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-lambda;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-ecs;1.10.75.1 in central
> found com.amazonaws#aws-java-sdk-ecr;1.10.75.1 in central
> found 

Re: error occur when I load data to s3

2018-09-03 Thread aaron
Hi kunalkapoor,
   It seems the error is not fixed yet. Do you have any idea?

thanks
aaron

aaron:2.2.1 aaron$ spark-shell --executor-memory 4g --driver-memory 2g
Ivy Default Cache set to: /Users/aaron/.ivy2/cache
The jars for the packages stored in: /Users/aaron/.ivy2/jars
:: loading settings :: url =
jar:file:/usr/local/Cellar/apache-spark/2.2.1/lib/apache-carbondata-1.5.0-SNAPSHOT-bin-spark2.2.1-hadoop2.7.2.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.amazonaws#aws-java-sdk added as a dependency
org.apache.hadoop#hadoop-aws added as a dependency
com.databricks#spark-avro_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found com.amazonaws#aws-java-sdk;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-support;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-core;1.10.75.1 in central
found commons-logging#commons-logging;1.1.3 in central
found org.apache.httpcomponents#httpclient;4.3.6 in local-m2-cache
found org.apache.httpcomponents#httpcore;4.3.3 in local-m2-cache
found commons-codec#commons-codec;1.6 in local-m2-cache
found com.fasterxml.jackson.core#jackson-databind;2.5.3 in central
found com.fasterxml.jackson.core#jackson-annotations;2.5.0 in central
found com.fasterxml.jackson.core#jackson-core;2.5.3 in central
found com.fasterxml.jackson.dataformat#jackson-dataformat-cbor;2.5.3 in
central
found joda-time#joda-time;2.8.1 in central
found com.amazonaws#aws-java-sdk-simpledb;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-simpleworkflow;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-storagegateway;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-route53;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-s3;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-kms;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-importexport;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-sts;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-sqs;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-rds;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-redshift;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-elasticbeanstalk;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-glacier;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-sns;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-iam;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-datapipeline;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-elasticloadbalancing;1.10.75.1 in 
central
found com.amazonaws#aws-java-sdk-emr;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-elasticache;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-elastictranscoder;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-ec2;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-dynamodb;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-cloudtrail;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-cloudwatch;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-logs;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-events;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-cognitoidentity;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-cognitosync;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-directconnect;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-cloudformation;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-cloudfront;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-kinesis;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-opsworks;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-ses;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-autoscaling;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-cloudsearch;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-cloudwatchmetrics;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-swf-libraries;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-codedeploy;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-codepipeline;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-config;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-lambda;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-ecs;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-ecr;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-cloudhsm;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-ssm;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-workspaces;1.10.75.1 in central
found com.amazonaws#aws-java-sdk-machinelearning;1.10.75.1 in central
   

Re: error occur when I load data to s3

2018-09-03 Thread xuchuanyin
Did you build carbon with -Pbuild-with-format? It introduced the Map datatype
and changed the thrift definitions, so you need to enable that profile when
you build.
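
For example, a rebuild with that profile enabled would look something like
this (just a sketch; the other profile names depend on your checkout):

mvn clean package -DskipTests -Pspark-2.2 -Pbuild-with-format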

Re: error occur when I load data to s3

2018-09-03 Thread aaron
Compile failed.

My env is,

aaron:carbondata aaron$ java -version
java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) 64-Bit Server VM (build 25.144-b01, mixed mode)
aaron:carbondata aaron$ mvn -v
Apache Maven 3.5.2 (138edd61fd100ec658bfa2d307c43b76940a5d7d;
2017-10-18T15:58:13+08:00)
Maven home: /usr/local/Cellar/maven/3.5.2/libexec
Java version: 1.8.0_144, vendor: Oracle Corporation
Java home:
/Library/Java/JavaVirtualMachines/jdk1.8.0_144.jdk/Contents/Home/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "mac os x", version: "10.13.6", arch: "x86_64", family: "mac"
aaron:carbondata aaron$ scala -version
Scala code runner version 2.11.8 -- Copyright 2002-2016, LAMP/EPFL

Error info is,

[ERROR] COMPILATION ERROR : 
[INFO] -
[ERROR]
/Users/aaron/workspace/carbondata/core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java:[2230,12]
an enum switch case label must be the unqualified name of an enumeration
constant
[ERROR]
/Users/aaron/workspace/carbondata/core/src/main/java/org/apache/carbondata/core/metadata/converter/ThriftWrapperSchemaConverterImpl.java:[160,51]
cannot find symbol
  symbol:   variable MAP
  location: class org.apache.carbondata.format.DataType
[ERROR]
/Users/aaron/workspace/carbondata/core/src/main/java/org/apache/carbondata/core/metadata/converter/ThriftWrapperSchemaConverterImpl.java:[501,12]
an enum switch case label must be the unqualified name of an enumeration
constant
[INFO] 3 errors 
[INFO] -
[INFO]

[INFO] Reactor Summary:
[INFO] 
[INFO] Apache CarbonData :: Parent  SUCCESS [  3.251
s]
[INFO] Apache CarbonData :: Common  SUCCESS [  9.868
s]
[INFO] Apache CarbonData :: Core .. FAILURE [  5.734
s]
[INFO] Apache CarbonData :: Processing  SKIPPED
[INFO] Apache CarbonData :: Hadoop  SKIPPED
[INFO] Apache CarbonData :: Streaming . SKIPPED
[INFO] Apache CarbonData :: Store SDK . SKIPPED
[INFO] Apache CarbonData :: Spark Datasource .. SKIPPED
[INFO] Apache CarbonData :: Spark Common .. SKIPPED
[INFO] Apache CarbonData :: Search  SKIPPED
[INFO] Apache CarbonData :: Lucene Index DataMap .. SKIPPED
[INFO] Apache CarbonData :: Bloom Index DataMap ... SKIPPED
[INFO] Apache CarbonData :: Spark2  SKIPPED
[INFO] Apache CarbonData :: Spark Common Test . SKIPPED
[INFO] Apache CarbonData :: DataMap Examples .. SKIPPED
[INFO] Apache CarbonData :: Assembly .. SKIPPED
[INFO] Apache CarbonData :: Hive .. SKIPPED
[INFO] Apache CarbonData :: presto  SKIPPED
[INFO] Apache CarbonData :: Spark2 Examples ... SKIPPED
[INFO]

[INFO] BUILD FAILURE
[INFO]

[INFO] Total time: 19.595 s
[INFO] Finished at: 2018-09-04T09:06:59+08:00
[INFO] Final Memory: 56M/583M
[INFO]

[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-compiler-plugin:3.2:compile (default-compile)
on project carbondata-core: Compilation failure: Compilation failure: 
[ERROR]
/Users/aaron/workspace/carbondata/core/src/main/java/org/apache/carbondata/core/util/CarbonUtil.java:[2230,12]
an enum switch case label must be the unqualified name of an enumeration
constant
[ERROR]
/Users/aaron/workspace/carbondata/core/src/main/java/org/apache/carbondata/core/metadata/converter/ThriftWrapperSchemaConverterImpl.java:[160,51]
cannot find symbol
[ERROR]   symbol:   variable MAP
[ERROR]   location: class org.apache.carbondata.format.DataType
[ERROR]
/Users/aaron/workspace/carbondata/core/src/main/java/org/apache/carbondata/core/metadata/converter/ThriftWrapperSchemaConverterImpl.java:[501,12]
an enum switch case label must be the unqualified name of an enumeration
constant
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the
command
[ERROR]   mvn  -rf :carbondata-core



--
Sent from: 
http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/


Re: error occur when I load data to s3

2018-09-03 Thread aaron
Thanks, I will have a try!



--
Sent from: 
http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/


Re: error occur when I load data to s3

2018-09-03 Thread aaron
Thanks, I will have a try.



--
Sent from: 
http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/


Re: error occur when I load data to s3

2018-09-03 Thread Liang Chen
Hi kunal

Can you list all the S3 issue PRs? We may need to make a 1.4.2 patch release,
because aaron plans to use carbondata in production this month.

To aaron: first, please try master and see if it solves your problems.

Regards
Liang

kunalkapoor wrote
> Hi aaron,
> Many issues like this have been identified in the 1.4 version. Most of the
> issues have been fixed in the master code and will be released in the 1.5
> version.
> The remaining fixes are in progress.
> Can you try the same scenario on 1.5 (the master branch)?
> 
> Thanks
> Kunal Kapoor
> 
> On Mon, Sep 3, 2018, 5:57 AM aaron <

> 949835961@

>> wrote:
> 
>> *Update: with aws-java-sdk and hadoop-aws at the versions below,
>> authorization works.
>> com.amazonaws:aws-java-sdk:1.10.75.1,org.apache.hadoop:hadoop-aws:2.7.3*
>>
>> *But we still cannot load data; the exception is the same.
>> carbon.sql("LOAD DATA INPATH
>> 'hdfs://localhost:9000/usr/carbon-s3/sample.csv' INTO TABLE
>> test_s3_table")*
>>
>> 18/09/02 21:49:47 ERROR CarbonLoaderUtil: main Unable to unlock Table
>> lock
>> for tabledefault.test_s3_table during table status updation
>> 18/09/02 21:49:47 ERROR CarbonLoadDataCommand: main
>> java.lang.ArrayIndexOutOfBoundsException
>> at java.lang.System.arraycopy(Native Method)
>> at
>> java.io.BufferedOutputStream.write(BufferedOutputStream.java:128)
>> at
>> org.apache.hadoop.fs.s3a.S3AOutputStream.write(S3AOutputStream.java:164)
>> at
>>
>> org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:58)
>> at java.io.DataOutputStream.write(DataOutputStream.java:107)
>> at
>>
>> org.apache.carbondata.core.datastore.filesystem.S3CarbonFile.getDataOutputStream(S3CarbonFile.java:111)
>> at
>>
>> org.apache.carbondata.core.datastore.filesystem.S3CarbonFile.getDataOutputStreamUsingAppend(S3CarbonFile.java:93)
>> at
>>
>> org.apache.carbondata.core.datastore.impl.FileFactory.getDataOutputStreamUsingAppend(FileFactory.java:276)
>> at
>> org.apache.carbondata.core.locks.S3FileLock.lock(S3FileLock.java:96)
>> at
>>
>> org.apache.carbondata.core.locks.AbstractCarbonLock.lockWithRetries(AbstractCarbonLock.java:41)
>> at
>>
>> org.apache.carbondata.core.locks.AbstractCarbonLock.lockWithRetries(AbstractCarbonLock.java:59)
>> at
>>
>> org.apache.carbondata.processing.util.CarbonLoaderUtil.recordNewLoadMetadata(CarbonLoaderUtil.java:247)
>> at
>>
>> org.apache.carbondata.processing.util.CarbonLoaderUtil.recordNewLoadMetadata(CarbonLoaderUtil.java:204)
>> at
>>
>> org.apache.carbondata.processing.util.CarbonLoaderUtil.readAndUpdateLoadProgressInTableMeta(CarbonLoaderUtil.java:437)
>> at
>>
>> org.apache.carbondata.processing.util.CarbonLoaderUtil.readAndUpdateLoadProgressInTableMeta(CarbonLoaderUtil.java:446)
>> at
>>
>> org.apache.spark.sql.execution.command.management.CarbonLoadDataCommand.processData(CarbonLoadDataCommand.scala:263)
>> at
>>
>> org.apache.spark.sql.execution.command.AtomicRunnableCommand.run(package.scala:92)
>> at
>>
>> org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
>> at
>>
>> org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
>> at
>>
>> org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
>> at org.apache.spark.sql.Dataset.<init>(Dataset.scala:183)
>> at
>>
>> org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:107)
>> at
>>
>> org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:96)
>> at
>> org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:154)
>> at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:94)
>> at
>>
>> $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:34)
>> at
>>
>> $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:39)
>> at
>> $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:41)
>> at
>> $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:43)
>> at
>> $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:45)
>> at $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:47)
>> at $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:49)
>> at $line25.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:51)
>> at $line25.$read$$iw$$iw$$iw$$iw.<init>(<console>:53)
>> at $line25.$read$$iw$$iw$$iw.<init>(<console>:55)
>> at $line25.$read$$iw$$iw.<init>(<console>:57)
>> at $line25.$read$$iw.<init>(<console>:59)
>> at $line25.$read.<init>(<console>:61)
>> at $line25.$read$.<init>(<console>:65)
>> at $line25.$read$.<clinit>(<console>)
>> at $line25.$eval$.$print$lzycompute(<console>:7)
>> at $line25.$eval$.$print(<console>:6)
>>  

Re: error occur when I load data to s3

2018-09-03 Thread Kunal Kapoor
Hi aaron,
Many issues like this have been identified in the 1.4 version. Most of the
issues have been fixed in the master code and will be released in the 1.5
version.
The remaining fixes are in progress.
Can you try the same scenario on 1.5 (the master branch)?
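
A rough sequence would be (profile names assumed here; adjust them to your
Spark version):

git clone https://github.com/apache/carbondata.git
cd carbondata
mvn clean package -DskipTests -Pspark-2.2 -Dspark.version=2.2.1

Then put the built assembly jar on the Spark classpath as before.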

Thanks
Kunal Kapoor

On Mon, Sep 3, 2018, 5:57 AM aaron <949835...@qq.com> wrote:

> *Update: with aws-java-sdk and hadoop-aws at the versions below,
> authorization works.
> com.amazonaws:aws-java-sdk:1.10.75.1,org.apache.hadoop:hadoop-aws:2.7.3*
>
> *But we still cannot load data; the exception is the same.
> carbon.sql("LOAD DATA INPATH
> 'hdfs://localhost:9000/usr/carbon-s3/sample.csv' INTO TABLE
> test_s3_table")*
>
> 18/09/02 21:49:47 ERROR CarbonLoaderUtil: main Unable to unlock Table lock
> for tabledefault.test_s3_table during table status updation
> 18/09/02 21:49:47 ERROR CarbonLoadDataCommand: main
> java.lang.ArrayIndexOutOfBoundsException
> at java.lang.System.arraycopy(Native Method)
> at
> java.io.BufferedOutputStream.write(BufferedOutputStream.java:128)
> at
> org.apache.hadoop.fs.s3a.S3AOutputStream.write(S3AOutputStream.java:164)
> at
>
> org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:58)
> at java.io.DataOutputStream.write(DataOutputStream.java:107)
> at
>
> org.apache.carbondata.core.datastore.filesystem.S3CarbonFile.getDataOutputStream(S3CarbonFile.java:111)
> at
>
> org.apache.carbondata.core.datastore.filesystem.S3CarbonFile.getDataOutputStreamUsingAppend(S3CarbonFile.java:93)
> at
>
> org.apache.carbondata.core.datastore.impl.FileFactory.getDataOutputStreamUsingAppend(FileFactory.java:276)
> at
> org.apache.carbondata.core.locks.S3FileLock.lock(S3FileLock.java:96)
> at
>
> org.apache.carbondata.core.locks.AbstractCarbonLock.lockWithRetries(AbstractCarbonLock.java:41)
> at
>
> org.apache.carbondata.core.locks.AbstractCarbonLock.lockWithRetries(AbstractCarbonLock.java:59)
> at
>
> org.apache.carbondata.processing.util.CarbonLoaderUtil.recordNewLoadMetadata(CarbonLoaderUtil.java:247)
> at
>
> org.apache.carbondata.processing.util.CarbonLoaderUtil.recordNewLoadMetadata(CarbonLoaderUtil.java:204)
> at
>
> org.apache.carbondata.processing.util.CarbonLoaderUtil.readAndUpdateLoadProgressInTableMeta(CarbonLoaderUtil.java:437)
> at
>
> org.apache.carbondata.processing.util.CarbonLoaderUtil.readAndUpdateLoadProgressInTableMeta(CarbonLoaderUtil.java:446)
> at
>
> org.apache.spark.sql.execution.command.management.CarbonLoadDataCommand.processData(CarbonLoadDataCommand.scala:263)
> at
>
> org.apache.spark.sql.execution.command.AtomicRunnableCommand.run(package.scala:92)
> at
>
> org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
> at
>
> org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
> at
>
> org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
> at org.apache.spark.sql.Dataset.<init>(Dataset.scala:183)
> at
>
> org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:107)
> at
>
> org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:96)
> at
> org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:154)
> at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:94)
> at
>
> $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:34)
> at
>
> $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:39)
> at
> $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:41)
> at
> $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:43)
> at
> $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:45)
> at $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:47)
> at $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:49)
> at $line25.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:51)
> at $line25.$read$$iw$$iw$$iw$$iw.<init>(<console>:53)
> at $line25.$read$$iw$$iw$$iw.<init>(<console>:55)
> at $line25.$read$$iw$$iw.<init>(<console>:57)
> at $line25.$read$$iw.<init>(<console>:59)
> at $line25.$read.<init>(<console>:61)
> at $line25.$read$.<init>(<console>:65)
> at $line25.$read$.<clinit>(<console>)
> at $line25.$eval$.$print$lzycompute(<console>:7)
> at $line25.$eval$.$print(<console>:6)
> at $line25.$eval.$print()
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
> at
> scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
> at
>
> 

Re: error occur when I load data to s3

2018-09-02 Thread aaron
*Update: with aws-java-sdk and hadoop-aws at the versions below,
authorization works.
com.amazonaws:aws-java-sdk:1.10.75.1,org.apache.hadoop:hadoop-aws:2.7.3*
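
(For reference, these coordinates are supplied through spark.jars.packages;
the equivalent command line would be something like the following, exact
invocation assumed:

spark-shell --packages
com.amazonaws:aws-java-sdk:1.10.75.1,org.apache.hadoop:hadoop-aws:2.7.3)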

*But we still cannot load data; the exception is the same.
carbon.sql("LOAD DATA INPATH
'hdfs://localhost:9000/usr/carbon-s3/sample.csv' INTO TABLE test_s3_table")*

18/09/02 21:49:47 ERROR CarbonLoaderUtil: main Unable to unlock Table lock
for tabledefault.test_s3_table during table status updation
18/09/02 21:49:47 ERROR CarbonLoadDataCommand: main 
java.lang.ArrayIndexOutOfBoundsException
at java.lang.System.arraycopy(Native Method)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:128)
at 
org.apache.hadoop.fs.s3a.S3AOutputStream.write(S3AOutputStream.java:164)
at
org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:58)
at java.io.DataOutputStream.write(DataOutputStream.java:107)
at
org.apache.carbondata.core.datastore.filesystem.S3CarbonFile.getDataOutputStream(S3CarbonFile.java:111)
at
org.apache.carbondata.core.datastore.filesystem.S3CarbonFile.getDataOutputStreamUsingAppend(S3CarbonFile.java:93)
at
org.apache.carbondata.core.datastore.impl.FileFactory.getDataOutputStreamUsingAppend(FileFactory.java:276)
at org.apache.carbondata.core.locks.S3FileLock.lock(S3FileLock.java:96)
at
org.apache.carbondata.core.locks.AbstractCarbonLock.lockWithRetries(AbstractCarbonLock.java:41)
at
org.apache.carbondata.core.locks.AbstractCarbonLock.lockWithRetries(AbstractCarbonLock.java:59)
at
org.apache.carbondata.processing.util.CarbonLoaderUtil.recordNewLoadMetadata(CarbonLoaderUtil.java:247)
at
org.apache.carbondata.processing.util.CarbonLoaderUtil.recordNewLoadMetadata(CarbonLoaderUtil.java:204)
at
org.apache.carbondata.processing.util.CarbonLoaderUtil.readAndUpdateLoadProgressInTableMeta(CarbonLoaderUtil.java:437)
at
org.apache.carbondata.processing.util.CarbonLoaderUtil.readAndUpdateLoadProgressInTableMeta(CarbonLoaderUtil.java:446)
at
org.apache.spark.sql.execution.command.management.CarbonLoadDataCommand.processData(CarbonLoadDataCommand.scala:263)
at
org.apache.spark.sql.execution.command.AtomicRunnableCommand.run(package.scala:92)
at
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
at
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
at
org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:183)
at
org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:107)
at
org.apache.spark.sql.CarbonSession$$anonfun$sql$1.apply(CarbonSession.scala:96)
at 
org.apache.spark.sql.CarbonSession.withProfiler(CarbonSession.scala:154)
at org.apache.spark.sql.CarbonSession.sql(CarbonSession.scala:94)
at
$line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:34)
at
$line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:39)
at
$line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:41)
at
$line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:43)
at $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:45)
at $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:47)
at $line25.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:49)
at $line25.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:51)
at $line25.$read$$iw$$iw$$iw$$iw.<init>(<console>:53)
at $line25.$read$$iw$$iw$$iw.<init>(<console>:55)
at $line25.$read$$iw$$iw.<init>(<console>:57)
at $line25.$read$$iw.<init>(<console>:59)
at $line25.$read.<init>(<console>:61)
at $line25.$read$.<init>(<console>:65)
at $line25.$read$.<clinit>(<console>)
at $line25.$eval$.$print$lzycompute(<console>:7)
at $line25.$eval$.$print(<console>:6)
at $line25.$eval.$print()
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
at 
scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
at
scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
at
scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
at
scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at
scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at
scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)

error occur when I load data to s3

2018-08-29 Thread aaron
Hi dear community, could anybody kindly tell me what happened?

*Env*:

1.spark 2.2.1 + carbon1.4.1
2.spark.jars.packages 
com.amazonaws:aws-java-sdk:1.7.4,org.apache.hadoop:hadoop-aws:2.7.2
3.spark.driver.extraClassPath
file:///usr/local/Cellar/apache-spark/2.2.1/lib/*
spark.executor.extraClassPath
file:///usr/local/Cellar/apache-spark/2.2.1/lib/* 
The lib folder includes the jars below:
-rw-r--r--@ 1 aaron  staff52M Aug 29 20:50
apache-carbondata-1.4.1-bin-spark2.2.1-hadoop2.7.2.jar
-rw-r--r--  1 aaron  staff   764K Aug 29 21:33 httpclient-4.5.4.jar
-rw-r--r--  1 aaron  staff   314K Aug 29 21:40 httpcore-4.4.jar


*Code*:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.CarbonSession._
import org.apache.spark.sql.catalyst.util._
import org.apache.carbondata.core.util.CarbonProperties
import org.apache.carbondata.core.constants.CarbonCommonConstants
CarbonProperties.getInstance().addProperty(CarbonCommonConstants.LOCK_TYPE,
"HDFSLOCK")
val carbon =
SparkSession.builder().config(sc.getConf).config("spark.hadoop.fs.s3a.impl",
"org.apache.hadoop.fs.s3a.S3AFileSystem").config("spark.hadoop.fs.s3a.access.key",
"xxx").config("spark.hadoop.fs.s3a.secret.key",
"xxx").getOrCreateCarbonSession("hdfs://localhost:9000/usr/carbon-meta")

carbon.sql("CREATE TABLE IF NOT EXISTS test_s3_table(id string, name string,
city string, age Int) STORED BY 'carbondata' LOCATION
's3a://key:password@aaron-s3-poc/'")
carbon.sql("LOAD DATA INPATH
'hdfs://localhost:9000/usr/carbon-s3/sample.csv' INTO TABLE test_s3_table")

*s3 files,*

aws s3 ls s3://aaron-s3-poc/ --human --recursive
2018-08-29 22:13:320 Bytes LockFiles/tablestatus.lock
2018-08-29 21:41:36  616 Bytes Metadata/schema


*Issue 1,* when I create the table, carbondata raises the exception
"com.amazonaws.AmazonClientException: Unable to load AWS credentials from
any provider in the chain" even though
a. I set the related properties in spark-defaults.conf, like
spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem  
spark.hadoop.fs.s3a.awsAccessKeyId=xxx
spark.hadoop.fs.s3a.awsSecretAccessKey=xxx
spark.hadoop.fs.s3a.access.key=xxx
spark.hadoop.fs.s3a.secret.key=xxx
b. I configured them in code:
val carbon =
SparkSession.builder().config(sc.getConf).config("spark.hadoop.fs.s3a.impl",
"org.apache.hadoop.fs.s3a.S3AFileSystem").config("spark.hadoop.fs.s3a.access.key",
"xxx").config("spark.hadoop.fs.s3a.secret.key",
"xxx").getOrCreateCarbonSession("hdfs://localhost:9000/usr/carbon-meta")
c. I passed the same configuration via spark-submit.
Finally, I succeeded when I put the credentials into the LOCATION,
's3a://key:password@aaron-s3-poc/', but it's very strange. Who could tell
me why?
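
One variant I have not tried yet is setting the keys directly on the
SparkContext's Hadoop configuration before the Carbon session is created, so
that S3A sees them at filesystem-initialization time (a sketch only, using
the same imports as above):

sc.hadoopConfiguration.set("fs.s3a.impl",
"org.apache.hadoop.fs.s3a.S3AFileSystem")
sc.hadoopConfiguration.set("fs.s3a.access.key", "xxx")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "xxx")
val carbon = SparkSession.builder().config(sc.getConf)
  .getOrCreateCarbonSession("hdfs://localhost:9000/usr/carbon-meta")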


*Issue 2,* Load data failed

scala> carbon.sql("LOAD DATA INPATH
'hdfs://localhost:9000/usr/carbon-s3/sample.csv' INTO TABLE test_s3_table")
18/08/29 22:13:35 ERROR CarbonLoaderUtil: main Unable to unlock Table lock
for tabledefault.test_s3_table during table status updation
18/08/29 22:13:35 ERROR CarbonLoadDataCommand: main 
java.lang.ArrayIndexOutOfBoundsException
at java.lang.System.arraycopy(Native Method)
at java.io.BufferedOutputStream.write(BufferedOutputStream.java:128)
at 
org.apache.hadoop.fs.s3a.S3AOutputStream.write(S3AOutputStream.java:164)
at
org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:58)
at java.io.DataOutputStream.write(DataOutputStream.java:107)
at
org.apache.carbondata.core.datastore.filesystem.S3CarbonFile.getDataOutputStream(S3CarbonFile.java:111)
at
org.apache.carbondata.core.datastore.filesystem.S3CarbonFile.getDataOutputStreamUsingAppend(S3CarbonFile.java:93)
at
org.apache.carbondata.core.datastore.impl.FileFactory.getDataOutputStreamUsingAppend(FileFactory.java:276)
at org.apache.carbondata.core.locks.S3FileLock.lock(S3FileLock.java:96)
at
org.apache.carbondata.core.locks.AbstractCarbonLock.lockWithRetries(AbstractCarbonLock.java:41)
at
org.apache.carbondata.core.locks.AbstractCarbonLock.lockWithRetries(AbstractCarbonLock.java:59)
at
org.apache.carbondata.processing.util.CarbonLoaderUtil.recordNewLoadMetadata(CarbonLoaderUtil.java:247)
at
org.apache.carbondata.processing.util.CarbonLoaderUtil.recordNewLoadMetadata(CarbonLoaderUtil.java:204)
at
org.apache.carbondata.processing.util.CarbonLoaderUtil.readAndUpdateLoadProgressInTableMeta(CarbonLoaderUtil.java:437)
at
org.apache.carbondata.processing.util.CarbonLoaderUtil.readAndUpdateLoadProgressInTableMeta(CarbonLoaderUtil.java:446)
at
org.apache.spark.sql.execution.command.management.CarbonLoadDataCommand.processData(CarbonLoadDataCommand.scala:263)
at
org.apache.spark.sql.execution.command.AtomicRunnableCommand.run(package.scala:92)
at
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
at
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
at