Re: [ANNOUNCE] New Apache Hadoop Committer - Ayush Saxena

2019-07-15 Thread lqjacklee
Congratulations.

On Mon, Jul 15, 2019 at 6:09 PM HarshaKiran Reddy Boreddy <
bharsh...@gmail.com> wrote:

> Congratulations Ayush!!!
>
>
> -- Harsha
>
> On Mon, Jul 15, 2019, 2:15 PM Vinayakumar B 
> wrote:
>
> > general@ is in bcc. Please bcc (and not cc) general@ if you want to
> > include that list.
> >
> > It's my pleasure to announce that Ayush Saxena has been elected as a
> > committer on the Apache Hadoop project, recognising his continued
> > contributions to the project.
> >
> > Please join me in congratulating him.
> >
> > Hearty Congratulations & Welcome aboard Ayush!
> >
> >
> > Vinayakumar B
> > (On behalf of the Hadoop PMC)
> >
>


How to test the hadoop-aws module

2019-06-27 Thread lqjacklee
Dear sir,


I need to test functionality in the hadoop-aws module, but I do not have
my own AWS credentials, so I wonder whether the project could provide a
test account. Besides, I notice that this might be doable in the education
region, but I am not sure. How can I do that? Thanks a lot.
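
For anyone in the same situation: the hadoop-aws integration tests are
designed to run against a bucket and credentials each contributor supplies;
there is no shared project account. A minimal sketch of the setup, following
the module's testing guide (the bucket name is a placeholder, and the
property names are worth double-checking against the testing.md of the
version you build):

# create the (git-ignored) test configuration; bucket name is hypothetical
cat > hadoop-tools/hadoop-aws/src/test/resources/auth-keys.xml <<'EOF'
<configuration>
  <property>
    <name>test.fs.s3a.name</name>
    <value>s3a://my-test-bucket</value>
  </property>
  <property>
    <name>fs.contract.test.fs.s3a</name>
    <value>s3a://my-test-bucket</value>
  </property>
  <property>
    <name>fs.s3a.access.key</name>
    <value>YOUR_ACCESS_KEY</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>YOUR_SECRET_KEY</value>
  </property>
</configuration>
EOF

# run the S3A integration tests against that bucket
cd hadoop-tools/hadoop-aws && mvn verify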


Re: ping

2019-04-19 Thread lqjacklee
Pong

On Fri, Apr 12, 2019 at 8:03 AM Steve Loughran 
wrote:

> Just checking to see if this list is live, as I've seen nothing since April
> 6. While I enjoy the silence, I worry.
>
> -steve
>


[jira] [Resolved] (HADOOP-16121) Cannot build in dev docker environment

2019-03-10 Thread lqjacklee (JIRA)


 [ https://issues.apache.org/jira/browse/HADOOP-16121?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

lqjacklee resolved HADOOP-16121.

Resolution: Resolved
  Assignee: lqjacklee

> Cannot build in dev docker environment
> --
>
> Key: HADOOP-16121
> URL: https://issues.apache.org/jira/browse/HADOOP-16121
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 3.3.0
> Environment: Darwin lqjacklee-MacBook-Pro.local 18.2.0 Darwin Kernel 
> Version 18.2.0: Mon Nov 12 20:24:46 PST 2018; 
> root:xnu-4903.231.4~2/RELEASE_X86_64 x86_64
>Reporter: lqjacklee
>Assignee: lqjacklee
>Priority: Minor
>
> Steps to reproduce:
>  
> 1, run the docker daemon
> 2, run ./start-build-env.sh
> 3, mvn clean package -DskipTests 
>  
> Response from the command line : 
>  
> [ERROR] Plugin org.apache.maven.plugins:maven-surefire-plugin:2.17 or one of 
> its dependencies could not be resolved: Failed to read artifact descriptor 
> for org.apache.maven.plugins:maven-surefire-plugin:jar:2.17: Could not 
> transfer artifact org.apache.maven.plugins:maven-surefire-plugin:pom:2.17 
> from/to central (https://repo.maven.apache.org/maven2): 
> /home/liu/.m2/repository/org/apache/maven/plugins/maven-surefire-plugin/2.17/maven-surefire-plugin-2.17.pom.part.lock
>  (No such file or directory) -> [Help 1] 
>  
> Attempted solutions:
> a, sudo chmod -R 775 ${USER_HOME}/.m2/
> b, sudo chown -R ${USER_NAME} ${USER_HOME}/.m2
>  
> After trying these steps, the build still fails.
>  
> c, sudo mvn clean package -DskipTests. But this way, will the files (pom,
> jar) be downloaded twice?






[jira] [Created] (HADOOP-16175) DynamoDBLocal Support

2019-03-08 Thread lqjacklee (JIRA)
lqjacklee created HADOOP-16175:
--

 Summary: DynamoDBLocal Support
 Key: HADOOP-16175
 URL: https://issues.apache.org/jira/browse/HADOOP-16175
 Project: Hadoop Common
  Issue Type: New Feature
  Components: common
Reporter: lqjacklee
Assignee: lqjacklee


DynamoDB Local
(https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html)
provides a way for users/developers to run a local DynamoDB environment
without depending on AWS.
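
For reference, a minimal sketch of running DynamoDB Local today, assuming
the official amazon/dynamodb-local Docker image (the region used below is a
placeholder; local endpoints accept dummy credentials):

# start DynamoDB Local on port 8000
docker run -d -p 8000:8000 amazon/dynamodb-local

# confirm it responds
AWS_ACCESS_KEY_ID=dummy AWS_SECRET_ACCESS_KEY=dummy \
  aws dynamodb list-tables --endpoint-url http://localhost:8000 --region ap-southeast-1

Tests that create S3Guard tables could then be pointed at that endpoint
instead of the AWS service.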






[jira] [Created] (HADOOP-16123) Lack of protoc

2019-02-19 Thread lqjacklee (JIRA)
lqjacklee created HADOOP-16123:
--

 Summary: Lack of protoc 
 Key: HADOOP-16123
 URL: https://issues.apache.org/jira/browse/HADOOP-16123
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.3.0
Reporter: lqjacklee
Assignee: Steve Loughran


While building the source code, I performed the steps below:

 

1, run docker daemon 

2, ./start-build-env.sh

3, sudo mvn clean install -DskipTests -Pnative 

The command responds with:

[ERROR] Failed to execute goal 
org.apache.hadoop:hadoop-maven-plugins:3.3.0-SNAPSHOT:protoc (compile-protoc) 
on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 
'protoc --version' did not return a version -> 

[Help 1]

However, when executing the command whereis protoc:

liu@a65d187055f9:~/hadoop$ whereis protoc
protoc: /opt/protobuf/bin/protoc

 

The PATH value:
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/opt/cmake/bin:/opt/protobuf/bin

 

liu@a65d187055f9:~/hadoop$ protoc --version
libprotoc 2.5.0
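
A plausible explanation (my assumption, not verified here): sudo resets PATH
to its built-in secure_path, which does not include /opt/protobuf/bin, so the
plugin's 'protoc --version' probe fails under sudo even though the login
shell finds protoc. Two ways around it:

# the build container is set up for the unprivileged user; no sudo needed
mvn clean install -DskipTests -Pnative

# or, if sudo really is required, carry the current PATH through
sudo env "PATH=$PATH" mvn clean install -DskipTests -Pnative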

 

 






[jira] [Created] (HADOOP-16121) Cannot build in dev docker environment

2019-02-19 Thread lqjacklee (JIRA)
lqjacklee created HADOOP-16121:
--

 Summary: Cannot build in dev docker environment
 Key: HADOOP-16121
 URL: https://issues.apache.org/jira/browse/HADOOP-16121
 Project: Hadoop Common
  Issue Type: Bug
  Components: build
Affects Versions: 3.3.0
 Environment: Darwin lqjacklee-MacBook-Pro.local 18.2.0 Darwin Kernel 
Version 18.2.0: Mon Nov 12 20:24:46 PST 2018; 
root:xnu-4903.231.4~2/RELEASE_X86_64 x86_64
Reporter: lqjacklee
Assignee: Steve Loughran


Steps to reproduce:

 

1, run the docker daemon

2, run ./start-build-env.sh

3, mvn clean package -DskipTests 

 

Response from the command line : 

 

[ERROR] Plugin org.apache.maven.plugins:maven-surefire-plugin:2.17 or one of 
its dependencies could not be resolved: Failed to read artifact descriptor for 
org.apache.maven.plugins:maven-surefire-plugin:jar:2.17: Could not transfer 
artifact org.apache.maven.plugins:maven-surefire-plugin:pom:2.17 from/to 
central (https://repo.maven.apache.org/maven2): 
/home/liu/.m2/repository/org/apache/maven/plugins/maven-surefire-plugin/2.17/maven-surefire-plugin-2.17.pom.part.lock
 (No such file or directory) -> [Help 1] 

 

Attempted solutions:

a, sudo chmod -R 775 ${USER_HOME}/.m2/

b, sudo chown -R ${USER_NAME} ${USER_HOME}/.m2

 

After trying these steps, the build still fails.

 

c, sudo mvn clean package -DskipTests. But this way, will the files (pom,
jar) be downloaded twice?
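
On question (c): most likely yes. Under sudo, Maven typically resolves
${user.home} to root's home rather than /home/liu, so artifacts already
cached in /home/liu/.m2 get downloaded again into root's repository. A sketch
of the usual fix, keeping the build as the normal user (liu here, per the log
path; adjust the user and group as needed):

# return the repository to the build user, then build without sudo
sudo chown -R liu:liu /home/liu/.m2
mvn clean package -DskipTests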






Re: getting Yetus and github to be friends

2019-01-03 Thread lqjacklee
Cool, thanks!

On Thu, Jan 3, 2019 at 6:28 PM Steve Loughran 
wrote:

>
>
> > On 3 Jan 2019, at 01:22, lqjacklee  wrote:
> >
> > Thanks Steve, I like the GitHub PR style. When should we start to make
> > the change?
>
>
> Step 1 is having Yetus review github PRs associated with a JIRA. Without
> that part of the process, we can't begin to use github for reviewing.
>
> For a long time we were on Yetus 0.5.0, which I believe lagged here. But
> since September, Hadoop has been on Yetus 0.8.0 - the latest release
> (HADOOP-14761). If there's more in there for GitHub PR review, we should
> enable it. And if we need more, well, it's an ASF project: we all have the
> right to submit patches there too, and the duty to test them first.
>
> -steve
>
>
> >
> > On Thu, Jan 3, 2019 at 7:07 AM Steve Loughran 
> > wrote:
> >
> >>
> >> The new gitbox repo apparently does two-way linking from GitHub: you can
> >> commit a PR there and it'll make its way back.
> >>
> >> This could be really slick - and make a big change to our review process.
> >>
> >> Before we can go near it though, we need to get Yetus doing its review &
> >> test of github PRs, which is not working right now
> >>
> >> What will it take to do that? And that means not "what does AW have to
> >> do", but "how can we help get this done?"
> >>
> >> -steve
> >>
> >>
> >>
> >>
>
>


Re: getting Yetus and github to be friends

2019-01-02 Thread lqjacklee
Thanks Steve, I like the GitHub PR style. When should we start to make the
change?

On Thu, Jan 3, 2019 at 7:07 AM Steve Loughran 
wrote:

>
> The new gitbox repo apparently does two-way linking from GitHub: you can
> commit a PR there and it'll make its way back.
>
> This could be really slick - and make a big change to our review process.
>
> Before we can go near it though, we need to get Yetus doing its review &
> test of github PRs, which is not working right now
>
> What will it take to do that? And that means not "what does AW have to
> do", but "how can we help get this done?"
>
> -steve
>
>
>
>


[jira] [Created] (HADOOP-16024) org.apache.hadoop.security.ssl.TestSSLFactory#testServerWeakCiphers failed

2018-12-30 Thread lqjacklee (JIRA)
lqjacklee created HADOOP-16024:
--

 Summary: 
org.apache.hadoop.security.ssl.TestSSLFactory#testServerWeakCiphers failed
 Key: HADOOP-16024
 URL: https://issues.apache.org/jira/browse/HADOOP-16024
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Reporter: lqjacklee


The cipher suite enabled locally is TLS_ECDHE_RSA_WITH_RC4_128_SHA; however,
I found that it is excluded.

The stack trace:

java.lang.AssertionError: Expected to find 'no cipher suites in common' but got unexpected exception: javax.net.ssl.SSLHandshakeException: No appropriate protocol (protocol is disabled or cipher suites are inappropriate)
 at sun.security.ssl.Handshaker.activate(Handshaker.java:509)
 at sun.security.ssl.SSLEngineImpl.kickstartHandshake(SSLEngineImpl.java:714)
 at sun.security.ssl.SSLEngineImpl.writeAppRecord(SSLEngineImpl.java:1212)
 at sun.security.ssl.SSLEngineImpl.wrap(SSLEngineImpl.java:1165)
 at javax.net.ssl.SSLEngine.wrap(SSLEngine.java:469)
 at org.apache.hadoop.security.ssl.TestSSLFactory.wrap(TestSSLFactory.java:248)
 at org.apache.hadoop.security.ssl.TestSSLFactory.testServerWeakCiphers(TestSSLFactory.java:220)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:498)
 at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
 at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
 at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
 at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
 at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
 at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
 at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
 at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
 at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
 at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
 at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
 at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
 at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
 at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
 at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
 at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
 at org.junit.runner.JUnitCore.run(JUnitCore.java:160)
 at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
 at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
 at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
 at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)

 at org.apache.hadoop.test.GenericTestUtils.assertExceptionContains(GenericTestUtils.java:350)
 at org.apache.hadoop.test.GenericTestUtils.assertExceptionContains(GenericTestUtils.java:327)
 at org.apache.hadoop.security.ssl.TestSSLFactory.testServerWeakCiphers(TestSSLFactory.java:240)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:498)
 at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
 at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
 at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
 at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
 at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
 at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
 at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
 at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
 at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
 at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
 at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
 at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
 at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
 at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
 at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26
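
A likely cause (an assumption, not confirmed in this report): recent JDKs
disable RC4-based suites via the jdk.tls.disabledAlgorithms property in the
JRE's java.security file, so the handshake aborts with "No appropriate
protocol" before cipher negotiation ever produces the expected "no cipher
suites in common" message. A quick check, assuming a JDK 8 layout:

# list the algorithms the local JDK refuses to negotiate
grep -n "jdk.tls.disabledAlgorithms" "$JAVA_HOME/jre/lib/security/java.security"

If RC4 appears in that list, this failure mode is expected for
TLS_ECDHE_RSA_WITH_RC4_128_SHA.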

[jira] [Created] (HADOOP-15991) testMultipartUpload

2018-12-09 Thread lqjacklee (JIRA)
lqjacklee created HADOOP-15991:
--

 Summary: testMultipartUpload
 Key: HADOOP-15991
 URL: https://issues.apache.org/jira/browse/HADOOP-15991
 Project: Hadoop Common
  Issue Type: Bug
  Components: common
Affects Versions: 3.2.0
Reporter: lqjacklee


2018-12-10 09:58:56,482 [Thread-746] INFO contract.AbstractFSContractTestBase 
(AbstractFSContractTestBase.java:setup(184)) - Test filesystem = 
s3a://jack-testlambda implemented by S3AFileSystem{uri=s3a://jack-testlambda, 
workingDir=s3a://jack-testlambda/user/liuquan, inputPolicy=normal, 
partSize=104857600, enableMultiObjectsDelete=true, maxKeys=5000, 
readAhead=65536, blockSize=33554432, multiPartThreshold=2147483647, 
serverSideEncryptionAlgorithm='NONE', 
blockFactory=org.apache.hadoop.fs.s3a.S3ADataBlocks$DiskBlockFactory@3111299f, 
metastore=DynamoDBMetadataStore{region=ap-southeast-1, tableName=test-h}, 
authoritative=false, useListV1=false, magicCommitter=false, 
boundedExecutor=BlockingThreadPoolExecutorService{SemaphoredDelegatingExecutor{permitCount=25,
 available=25, waiting=0}, activeCount=0}, 
unboundedExecutor=java.util.concurrent.ThreadPoolExecutor@78c596aa[Running, 
pool size = 10, active threads = 0, queued tasks = 0, completed tasks = 20], 
credentials=AWSCredentialProviderList[refcount= 2: 
[SimpleAWSCredentialsProvider, EnvironmentVariableCredentialsProvider, 
com.amazonaws.auth.InstanceProfileCredentialsProvider@4a214309], statistics 
{18901220 bytes read, 18912956 bytes written, 580 read ops, 0 large read ops, 
843 write ops}, metrics {{Context=s3aFileSystem} 
{s3aFileSystemId=94fd09ed-8145-4d1e-b11b-678151785e0b} 
{bucket=jack-testlambda} {stream_opened=28} {stream_close_operations=28} 
{stream_closed=28} {stream_aborted=0} {stream_seek_operations=0} 
{stream_read_exceptions=0} {stream_forward_seek_operations=0} 
{stream_backward_seek_operations=0} {stream_bytes_skipped_on_seek=0} 
{stream_bytes_backwards_on_seek=0} {stream_bytes_read=18901220} 
{stream_read_operations=2734} {stream_read_fully_operations=0} 
{stream_read_operations_incomplete=2171} {stream_bytes_read_in_close=0} 
{stream_bytes_discarded_in_abort=0} {files_created=49} {files_copied=20} 
{files_copied_bytes=9441684} {files_deleted=79} 
{fake_directories_deleted=630} {directories_created=98} 
{directories_deleted=19} {ignored_errors=71} {op_copy_from_local_file=0} 
{op_create=55} {op_create_non_recursive=6} {op_delete=74} {op_exists=80} 
{op_get_file_checksum=8} {op_get_file_status=923} {op_glob_status=19} 
{op_is_directory=0} {op_is_file=0} {op_list_files=8} 
{op_list_located_status=0} {op_list_status=51} {op_mkdirs=114} {op_open=28} 
{op_rename=20} {object_copy_requests=0} {object_delete_requests=198} 
{object_list_requests=337} {object_continue_list_requests=0} 
{object_metadata_requests=560} {object_multipart_aborted=0} 
{object_put_bytes=18904764} {object_put_requests=147} 
{object_put_requests_completed=147} {stream_write_failures=0} 
{stream_write_block_uploads=0} {stream_write_block_uploads_committed=0} 
{stream_write_block_uploads_aborted=0} {stream_write_total_time=0} 
{stream_write_total_data=18904764} {committer_commits_created=0} 
{committer_commits_completed=0} {committer_jobs_completed=0} 
{committer_jobs_failed=0} {committer_tasks_completed=0} 
{committer_tasks_failed=0} {committer_bytes_committed=0} 
{committer_bytes_uploaded=0} {committer_commits_failed=0} 
{committer_commits_aborted=0} {committer_commits_reverted=0} 
{committer_magic_files_created=0} 
{s3guard_metadatastore_put_path_request=166} 
{s3guard_metadatastore_initialization=1} {s3guard_metadatastore_retry=0} 
{s3guard_metadatastore_throttled=0} {store_io_throttled=0} 
{object_put_requests_active=0} {object_put_bytes_pending=0} 
{stream_write_block_uploads_active=0} {stream_write_block_uploads_pending=49} 
{stream_write_block_uploads_data_pending=0} 
{S3guard_metadatastore_put_path_latencyNumOps=1} 
{S3guard_metadatastore_put_path_latency50thPercentileLatency=427507288} 
{S3guard_metadatastore_put_path_latency75thPercentileLatency=427507288} 
{S3guard_metadatastore_put_path_latency90thPercentileLatency=427507288} 
{S3guard_metadatastore_put_path_latency95thPercentileLatency=427507288} 
{S3guard_metadatastore_put_path_latency99thPercentileLatency=427507288} 
{S3guard_metadatastore_throttle_rateNumEvents=0} 
{S3guard_metadatastore_throttle_rate50thPercentileFrequency (Hz)=0} 
{S3guard_metadatastore_throttle_rate75thPercentileFrequency (Hz)=0} 
{S3guard_metadatastore_throttle_rate90thPercentileFrequency (Hz)=0} 
{S3guard_metadatastore_throttle_rate95thPercentileFrequency (Hz)=0} 
{S3guard_metadatastore_throttle_rate99thPercentileFrequency (Hz)=0} }}
2018-12-10 10:01:50,127 [Thread-746] INFO s3a.ITestS3AContractMultipartUploader 
(ITestS3AContractMultipartUploader.java:teardown(108
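
The log is truncated mid-teardown. To rerun just this suite against your own
bucket, a sketch assuming credentials are already configured in auth-keys.xml
as in the hadoop-aws testing guide (ITest* cases run under the Maven failsafe
plugin; the -Dtest=none/-DfailIfNoTests pairing is a common Maven idiom for
skipping the unit-test phase):

cd hadoop-tools/hadoop-aws
mvn verify -Dit.test=ITestS3AContractMultipartUploader -Dtest=none -DfailIfNoTests=false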