[
https://issues.apache.org/jira/browse/IMPALA-9797?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17233996#comment-17233996
]
Tim Armstrong commented on IMPALA-9797:
---------------------------------------
[~boroknagyz] maybe we should close as cannot repro?
> TestAcid.test_acid_negative failed when inserting data via Hive
> ---------------------------------------------------------------
>
> Key: IMPALA-9797
> URL: https://issues.apache.org/jira/browse/IMPALA-9797
> Project: IMPALA
> Issue Type: Bug
> Reporter: Quanlong Huang
> Assignee: Zoltán Borók-Nagy
> Priority: Critical
> Labels: broken-build
> Attachments: hive-server2.log
>
>
> Saw the test fail as follows:
> {code}
> query_test/test_acid.py:65: in test_acid_negative
>     self.run_test_case('QueryTest/acid-negative', vector, use_db=unique_database)
> common/impala_test_suite.py:656: in run_test_case
>     result = exec_fn(query, user=test_section.get('USER', '').strip() or None)
> common/impala_test_suite.py:610: in __exec_in_hive
>     result = h.execute(query, user=user)
> common/impala_connection.py:334: in execute
>     r = self.__fetch_results(handle, profile_format=profile_format)
> common/impala_connection.py:441: in __fetch_results
>     cursor._wait_to_finish()
> /data/jenkins/workspace/impala-cdh-7.2.0.0-core-s3/repos/Impala/infra/python/env/lib/python2.7/site-packages/impala/hiveserver2.py:412: in _wait_to_finish
>     raise OperationalError(resp.errorMessage)
> E   OperationalError: Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Your endpoint configuration is wrong; For more details see: http://wiki.apache.org/hadoop/UnsetHostnameOrPort
> {code}
> In HiveServer2's logs, it looks like the query failed due to a bad configuration:
> {code}
> 2020-05-19T11:58:21,805 INFO [HiveServer2-Background-Pool: Thread-66] common.LogUtils: Thread context registration is done.
> 2020-05-19T11:58:21,813 INFO [HiveServer2-Background-Pool: Thread-66] reexec.ReExecDriver: Execution #1 of query
> 2020-05-19T11:58:21,816 INFO [HiveServer2-Background-Pool: Thread-66] lockmgr.DbTxnManager: Setting lock request transaction to txnid:2619 for queryId=jenkins_20200519115820_bf6d1287-9e24-46c8-93d0-80c9c449fcd8
> 2020-05-19T11:58:21,822 INFO [HiveServer2-Background-Pool: Thread-66] lockmgr.DbLockManager: Requesting: queryId=jenkins_20200519115820_bf6d1287-9e24-46c8-93d0-80c9c449fcd8 LockRequest(component:[LockComponent(type:SHARED_READ, level:TABLE, dbname:test_acid_negative_44c33b5e, tablename:acid, operationType:INSERT, isTransactional:true, isDynamicPartitionWrite:false)], txnid:2619, user:jenkins, hostname:impala-ec2-centos74-m5-4xlarge-ondemand-11ab.vpc.cloudera.com, agentInfo:jenkins_20200519115820_bf6d1287-9e24-46c8-93d0-80c9c449fcd8)
> 2020-05-19T11:58:21,848 INFO [HiveServer2-Background-Pool: Thread-66] lockmgr.DbLockManager: Response to queryId=jenkins_20200519115820_bf6d1287-9e24-46c8-93d0-80c9c449fcd8 LockResponse(lockid:1330, state:ACQUIRED)
> 2020-05-19T11:58:21,858 INFO [HiveServer2-Background-Pool: Thread-66] ql.Driver: Executing command(queryId=jenkins_20200519115820_bf6d1287-9e24-46c8-93d0-80c9c449fcd8): insert into acid values (1), (2), (3)
> 2020-05-19T11:58:21,858 INFO [HiveServer2-Background-Pool: Thread-66] ql.Driver: Query ID = jenkins_20200519115820_bf6d1287-9e24-46c8-93d0-80c9c449fcd8
> 2020-05-19T11:58:21,858 INFO [HiveServer2-Background-Pool: Thread-66] ql.Driver: Total jobs = 1
> 2020-05-19T11:58:21,879 INFO [HiveServer2-Background-Pool: Thread-66] ql.Driver: Launching Job 1 out of 1
> 2020-05-19T11:58:21,879 INFO [HiveServer2-Background-Pool: Thread-66] ql.Driver: Starting task [Stage-1:MAPRED] in serial mode
> ......
> 2020-05-19T12:18:29,556 ERROR [HiveServer2-Background-Pool: Thread-66] operation.Operation: Error running hive query:
> org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask. Your endpoint configuration is wrong; For more details see: http://wiki.apache.org/hadoop/UnsetHostnameOrPort
>     at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:362) ~[hive-service-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:241) ~[hive-service-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) ~[hive-service-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:322) ~[hive-service-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_144]
>     at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_144]
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1876) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:340) ~[hive-service-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_144]
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_144]
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_144]
>     at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_144]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_144]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_144]
>     at java.lang.Thread.run(Thread.java:748) [?:1.8.0_144]
> Caused by: java.net.ConnectException: Your endpoint configuration is wrong; For more details see: http://wiki.apache.org/hadoop/UnsetHostnameOrPort
>     at sun.reflect.GeneratedConstructorAccessor65.newInstance(Unknown Source) ~[?:?]
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_144]
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_144]
>     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:751) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1557) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.Client.call(Client.java:1499) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.Client.call(Client.java:1396) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at com.sun.proxy.$Proxy80.getNewApplication(Unknown Source) ~[?:?]
>     at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:274) ~[hadoop-yarn-common-3.1.1.7.2.0.0-185.jar:?]
>     at sun.reflect.GeneratedMethodAccessor40.invoke(Unknown Source) ~[?:?]
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_144]
>     at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_144]
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at com.sun.proxy.$Proxy81.getNewApplication(Unknown Source) ~[?:?]
>     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:270) ~[hadoop-yarn-client-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:278) ~[hadoop-yarn-client-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.tez.client.TezYarnClient.createApplication(TezYarnClient.java:71) ~[tez-api-0.9.1.7.2.0.0-185.jar:0.9.1.7.2.0.0-185]
>     at org.apache.tez.client.TezClient.createApplication(TezClient.java:1153) ~[tez-api-0.9.1.7.2.0.0-185.jar:0.9.1.7.2.0.0-185]
>     at org.apache.tez.client.TezClient.start(TezClient.java:401) ~[tez-api-0.9.1.7.2.0.0-185.jar:0.9.1.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.startSessionAndContainers(TezSessionState.java:535) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:373) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:298) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezSessionPoolSession.open(TezSessionPoolSession.java:106) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezTask.ensureSessionHasResources(TezTask.java:403) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:209) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:213) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:105) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Executor.launchTask(Executor.java:359) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Executor.launchTasks(Executor.java:330) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Executor.runTasks(Executor.java:246) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Executor.execute(Executor.java:109) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:721) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:488) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:482) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:166) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:225) ~[hive-service-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     ... 13 more
> Caused by: java.net.ConnectException: Connection refused
>     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:1.8.0_144]
>     at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[?:1.8.0_144]
>     at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:699) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:803) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:413) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1627) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.Client.call(Client.java:1443) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.Client.call(Client.java:1396) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at com.sun.proxy.$Proxy80.getNewApplication(Unknown Source) ~[?:?]
>     at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:274) ~[hadoop-yarn-common-3.1.1.7.2.0.0-185.jar:?]
>     at sun.reflect.GeneratedMethodAccessor40.invoke(Unknown Source) ~[?:?]
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_144]
>     at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_144]
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) ~[hadoop-common-3.1.1.7.2.0.0-185.jar:?]
>     at com.sun.proxy.$Proxy81.getNewApplication(Unknown Source) ~[?:?]
>     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:270) ~[hadoop-yarn-client-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:278) ~[hadoop-yarn-client-3.1.1.7.2.0.0-185.jar:?]
>     at org.apache.tez.client.TezYarnClient.createApplication(TezYarnClient.java:71) ~[tez-api-0.9.1.7.2.0.0-185.jar:0.9.1.7.2.0.0-185]
>     at org.apache.tez.client.TezClient.createApplication(TezClient.java:1153) ~[tez-api-0.9.1.7.2.0.0-185.jar:0.9.1.7.2.0.0-185]
>     at org.apache.tez.client.TezClient.start(TezClient.java:401) ~[tez-api-0.9.1.7.2.0.0-185.jar:0.9.1.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.startSessionAndContainers(TezSessionState.java:535) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:373) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:298) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezSessionPoolSession.open(TezSessionPoolSession.java:106) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezTask.ensureSessionHasResources(TezTask.java:403) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.tez.TezTask.execute(TezTask.java:209) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:213) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:105) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Executor.launchTask(Executor.java:359) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Executor.launchTasks(Executor.java:330) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Executor.runTasks(Executor.java:246) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Executor.execute(Executor.java:109) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:721) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:488) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:482) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:166) ~[hive-exec-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:225) ~[hive-service-3.1.3000.7.2.0.0-185.jar:3.1.3000.7.2.0.0-185]
>     ... 13 more
> {code}
> I have uploaded HiveServer2's log file as an attachment.