Hi Eugene,

Thank you for the reply. I didn't configure these properties explicitly; they should fall back to default values, right? As far as I understand, these properties set the local storage directories for the Hadoop file system. I can reach the text resource file both with the hadoop fs command and from a Spark session (see the commands below). I also tried adding these properties, pointing at reachable paths, to the config file, but the error still occurs, so they don't seem to be the root cause.
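For reference, these are roughly the checks I ran (commands reproduced from memory; the path is the same one as in my DQ config):

    hadoop fs -ls hdfs:///griffin/src/
    hadoop fs -cat hdfs:///griffin/src/src.txt

and from spark-shell:

    val df = spark.read.text("hdfs:///griffin/src/src.txt")
    df.show()

Both print the file contents without any error, so the file itself seems reachable.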
Thanks,
Yuchen Zhang

From: Eugene Liu <[email protected]>
Sent: Friday, February 22, 2019 3:17 PM
To: Yuchen Zhang <[email protected]>; [email protected]
Subject: Re: can't load text-dir data source

Yuchen, could you check your Hadoop config, /apache/hadoop/etc/hadoop/hdfs-site.xml, for properties like these, and whether the folders are valid?

<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:///data/hadoop-data/nn</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///data/hadoop-data/dn</value>
</property>
<property>
  <name>dfs.namenode.checkpoint.dir</name>
  <value>file:///data/hadoop-data/snn</value>
</property>

________________________________
From: Yuchen Zhang <[email protected]>
Sent: Friday, February 22, 2019 3:10 PM
To: [email protected]
Subject: can't load text-dir data source

Hi there,

I'm new to Apache Griffin and the Hadoop ecosystem and am trying to set up a Griffin test environment. For now, I'm trying to run an accuracy measure against a text data source on Hadoop, but I get the errors below:

2019-02-19 09:21:06 WARN DataSource:36 - load data source [src] fails
2019-02-19 09:21:06 WARN DataSource:36 - load data source [tgt] fails

However, the data source can be loaded by a SparkSession:

spark.read.text("hdfs:///griffin/src/src.txt")

Could anyone help me figure out whether there is a problem in my configuration?

Here's my text resource file:
[inline image attachment]

Here's the DQ config:
[inline image attachment]

Thanks.
