Hi,
  The storage config is correct, and Kerberos security is enabled. So, please
check the Java stack trace to make sure this is not an authentication problem.
  Is it possible to use the HDFS client to connect to HDFS and read the CSV file directly?
  Is the DataNode port open and reachable from the Drillbit hosts?
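  For example, here is a minimal sketch of such a check using the Hadoop FileSystem API. The NameNode URI, Kerberos principal, and keytab path below are placeholders (not values from your cluster), so please substitute the ones from your storage config:

    // Reads a few lines of the CSV with the plain HDFS client. This forces the
    // client to fetch blocks from the DataNodes, which is the step that fails
    // with "Could not obtain block" in your query.
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.nio.charset.StandardCharsets;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class HdfsReadCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Use the same fs.defaultFS as in your Drill storage config.
            conf.set("fs.defaultFS", "hdfs://<namenode_host>:8020");
            conf.set("dfs.client.use.datanode.hostname", "true");

            // Only needed because Kerberos is enabled; principal and keytab
            // path are placeholders.
            conf.set("hadoop.security.authentication", "kerberos");
            UserGroupInformation.setConfiguration(conf);
            UserGroupInformation.loginUserFromKeytab(
                    "drill@EXAMPLE.COM", "/path/to/drill.keytab");

            try (FileSystem fs = FileSystem.get(conf);
                 BufferedReader reader = new BufferedReader(new InputStreamReader(
                         fs.open(new Path("/tmp/2015-summary.csv")),
                         StandardCharsets.UTF_8))) {
                for (int i = 0; i < 5; i++) {
                    String line = reader.readLine();
                    if (line == null) break;
                    System.out.println(line);
                }
            }
        }
    }

  If this fails with the same "Could not obtain block" error, the problem is between the client and the DataNodes (ports, hostnames, or Kerberos), not in Drill.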

> On Feb 26, 2021, at 9:52 PM, Mehmet - <[email protected]> wrote:
> 
> Hi,
> 
> 1. Drill version: 1.18.0
> 2. HDFS Version: Hadoop 3.0-cdh6.3.3
> 3. Storage config: https://paste.ubuntu.com/p/5Dk9jVCxYr/
> 4. drill-env.sh file: https://paste.ubuntu.com/p/MGNG4zhbrk/
> 
> Thank you.
> BR.
> 
> On Fri, Feb 26, 2021 at 16:14, luoc <[email protected]> wrote:
> 
>> Hi,
>>  That does not seem like an issue with Drill.
>> Could you please provide more information to help:
>> 1. Drill version
>> 2. HDFS version
>> 3. Storage config
>> 
>>> On Feb 26, 2021, at 3:32 PM, Mehmet - <[email protected]> wrote:
>>> 
>>> Hi Team,
>>> 
>>> I have a problem with an HDFS query on Drill. When I run "SHOW FILES IN
>>> root.`tmp/`", I can list the files correctly.
>>> But when I run a select query like "SELECT * FROM root.`tmp/`", it throws
>>> the error below.
>>> Notes:
>>> - I have already checked the state of HDFS health (via dfsadmin and the
>>> HDFS UI) and there is no corruption or block error.
>>> - The Drillbits are on the same cluster as Hadoop, so I think a network
>>> problem is unlikely.
>>> - I have also set dfs.client.use.datanode.hostname to true (
>>> https://stackoverflow.com/a/55290406/7894534 )
>>> 
>>> org.apache.drill.common.exceptions.UserRemoteException: DATA_READ ERROR:
>>> Could not obtain block: BP-2026912985-<namenode_ip>-
>>> 1569935018133:blk_1073842201_101390 file=/tmp/2015-summary.csv
>>> File Path: hdfs://<drillbit_ip>:8020/tmp/2015-summary.csv
>>> Fragment: 0:0 [Error Id: 466835bd-6512-4854-b231-eaa439eba6f2 on
>>> <drillbit_ip>:31010]
>>> 
>>> Thank you.
>>> --
>>> Mehmet ERSOY
>> 
>> 
> 
> -- 
> Mehmet ERSOY
