I seem to have gone backwards?  I started over with a new VM and started two
docker containers, one each for the namenode and datanode:

[root@sgt-pepper ~]# docker run -it --network host -e
NAMENODE_HOST=sgt-pepper apache/incubator-crail:1.2 namenode
20/02/10 20:32:44 INFO crail: initalizing namenode
20/02/10 20:32:44 INFO crail: crail.version 3101
20/02/10 20:32:44 INFO crail: crail.directorydepth 16
20/02/10 20:32:44 INFO crail: crail.tokenexpiration 10
20/02/10 20:32:44 INFO crail: crail.blocksize 1048576
20/02/10 20:32:44 INFO crail: crail.cachelimit 0
20/02/10 20:32:44 INFO crail: crail.cachepath /dev/hugepages/cache
20/02/10 20:32:44 INFO crail: crail.user crail
20/02/10 20:32:44 INFO crail: crail.shadowreplication 1
20/02/10 20:32:44 INFO crail: crail.debug false
20/02/10 20:32:44 INFO crail: crail.statistics true
20/02/10 20:32:44 INFO crail: crail.rpctimeout 1000
20/02/10 20:32:44 INFO crail: crail.datatimeout 1000
20/02/10 20:32:44 INFO crail: crail.buffersize 1048576
20/02/10 20:32:44 INFO crail: crail.slicesize 524288
20/02/10 20:32:44 INFO crail: crail.singleton true
20/02/10 20:32:44 INFO crail: crail.regionsize 1073741824
20/02/10 20:32:44 INFO crail: crail.directoryrecord 512
20/02/10 20:32:44 INFO crail: crail.directoryrandomize true
20/02/10 20:32:44 INFO crail: crail.cacheimpl
org.apache.crail.memory.MappedBufferCache
20/02/10 20:32:44 INFO crail: crail.locationmap
20/02/10 20:32:44 INFO crail: crail.namenode.address
crail://sgt-pepper:9060?id=0&size=1
20/02/10 20:32:44 INFO crail: crail.namenode.blockselection roundrobin
20/02/10 20:32:44 INFO crail: crail.namenode.fileblocks 16
20/02/10 20:32:44 INFO crail: crail.namenode.rpctype
org.apache.crail.namenode.rpc.tcp.TcpNameNode
20/02/10 20:32:44 INFO crail: crail.namenode.log
20/02/10 20:32:44 INFO crail: crail.storage.types
org.apache.crail.storage.tcp.TcpStorageTier
20/02/10 20:32:44 INFO crail: crail.storage.classes 1
20/02/10 20:32:44 INFO crail: crail.storage.rootclass 0
20/02/10 20:32:44 INFO crail: crail.storage.keepalive 2
20/02/10 20:32:44 INFO crail: round robin block selection
20/02/10 20:32:45 INFO narpc: new NaRPC server group v1.5.0, queueDepth 32,
messageSize 512, nodealy true, cores 1
20/02/10 20:32:45 INFO crail: crail.namenode.tcp.queueDepth 32
20/02/10 20:32:45 INFO crail: crail.namenode.tcp.messageSize 512
20/02/10 20:32:45 INFO crail: crail.namenode.tcp.cores 1
20/02/10 20:35:36 INFO crail: new connection from /10.114.222.82:37328
20/02/10 20:35:36 INFO narpc: adding new channel to selector, from /
10.114.222.82:37328
20/02/10 20:35:36 INFO crail: adding datanode /10.114.222.82:50020 of type
0 to storage class 0

[root@sgt-pepper ~]# docker run -it --network host -e
NAMENODE_HOST=sgt-pepper apache/incubator-crail:1.2 datanode
20/02/10 20:35:36 INFO crail: crail.version 3101
20/02/10 20:35:36 INFO crail: crail.directorydepth 16
20/02/10 20:35:36 INFO crail: crail.tokenexpiration 10
20/02/10 20:35:36 INFO crail: crail.blocksize 1048576
20/02/10 20:35:36 INFO crail: crail.cachelimit 0
20/02/10 20:35:36 INFO crail: crail.cachepath /dev/hugepages/cache
20/02/10 20:35:36 INFO crail: crail.user crail
20/02/10 20:35:36 INFO crail: crail.shadowreplication 1
20/02/10 20:35:36 INFO crail: crail.debug false
20/02/10 20:35:36 INFO crail: crail.statistics true
20/02/10 20:35:36 INFO crail: crail.rpctimeout 1000
20/02/10 20:35:36 INFO crail: crail.datatimeout 1000
20/02/10 20:35:36 INFO crail: crail.buffersize 1048576
20/02/10 20:35:36 INFO crail: crail.slicesize 524288
20/02/10 20:35:36 INFO crail: crail.singleton true
20/02/10 20:35:36 INFO crail: crail.regionsize 1073741824
20/02/10 20:35:36 INFO crail: crail.directoryrecord 512
20/02/10 20:35:36 INFO crail: crail.directoryrandomize true
20/02/10 20:35:36 INFO crail: crail.cacheimpl
org.apache.crail.memory.MappedBufferCache
20/02/10 20:35:36 INFO crail: crail.locationmap
20/02/10 20:35:36 INFO crail: crail.namenode.address crail://sgt-pepper:9060
20/02/10 20:35:36 INFO crail: crail.namenode.blockselection roundrobin
20/02/10 20:35:36 INFO crail: crail.namenode.fileblocks 16
20/02/10 20:35:36 INFO crail: crail.namenode.rpctype
org.apache.crail.namenode.rpc.tcp.TcpNameNode
20/02/10 20:35:36 INFO crail: crail.namenode.log
20/02/10 20:35:36 INFO crail: crail.storage.types
org.apache.crail.storage.tcp.TcpStorageTier
20/02/10 20:35:36 INFO crail: crail.storage.classes 1
20/02/10 20:35:36 INFO crail: crail.storage.rootclass 0
20/02/10 20:35:36 INFO crail: crail.storage.keepalive 2
20/02/10 20:35:36 INFO narpc: new NaRPC server group v1.5.0, queueDepth 16,
messageSize 2097152, nodealy false, cores 1
20/02/10 20:35:36 INFO crail: crail.storage.tcp.interface eth0
20/02/10 20:35:36 INFO crail: crail.storage.tcp.port 50020
20/02/10 20:35:36 INFO crail: crail.storage.tcp.storagelimit 1073741824
20/02/10 20:35:36 INFO crail: crail.storage.tcp.allocationsize 1073741824
20/02/10 20:35:36 INFO crail: crail.storage.tcp.datapath /dev/hugepages/data
20/02/10 20:35:36 INFO crail: crail.storage.tcp.queuedepth 16
20/02/10 20:35:36 INFO crail: crail.storage.tcp.cores 1
20/02/10 20:35:36 INFO crail: running TCP storage server, address /
10.114.222.82:50020
20/02/10 20:35:36 INFO narpc: new NaRPC client group v1.5.0, queueDepth 32,
messageSize 512, nodealy true
20/02/10 20:35:36 INFO crail: crail.namenode.tcp.queueDepth 32
20/02/10 20:35:36 INFO crail: crail.namenode.tcp.messageSize 512
20/02/10 20:35:36 INFO crail: crail.namenode.tcp.cores 1
20/02/10 20:35:36 INFO crail: connected to namenode(s) sgt-pepper/
10.114.222.82:9060
20/02/10 20:35:36 INFO crail: datanode statistics, freeBlocks 1024
...

This seems OK so far, since we see *adding datanode /10.114.222.82:50020 of
type 0 to storage class 0* in the namenode log.
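
As an extra sanity check (a hedged sketch; it assumes the iproute2 `ss` tool
is on the VM, and falls back to a message if the ports aren't found), one can
confirm both daemons are actually listening before pointing a client at them:

```shell
# Sanity check: are the namenode (9060) and datanode (50020) listening?
# 'ss' comes from iproute2; 'netstat -ltn' works too if ss is not installed.
ss -ltn | grep -E ':(9060|50020)' || echo "expected listeners not found"
```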

Then I download the Non-Official Binary Release from
https://crail.apache.org/download/, extract it, and create the 4 conf files
and the revised log4j.properties file:
[root@sgt-pepper conf]# ls -atl
total 40
drwxr-xr-x. 2 kube kube 4096 Feb 10 14:14 .
-rw-r--r--. 1 kube kube  569 Feb 10 14:14 log4j.properties
drwxr-xr-x. 8 root root 4096 Feb 10 13:52 ..
-rw-r--r--. 1 kube kube 1211 Sep 25  2018 core-site.xml
-rw-r--r--. 1 kube kube 1211 Sep 25  2018 core-site.xml.template
-rw-r--r--. 1 kube kube  125 Sep 25  2018 crail-env.sh
-rw-r--r--. 1 kube kube  125 Sep 25  2018 crail-env.sh.template
-rw-r--r--. 1 kube kube  296 Sep 25  2018 crail-site.conf
-rw-r--r--. 1 kube kube  296 Sep 25  2018 crail-site.conf.template
-rw-r--r--. 1 kube kube  568 Feb 28  2018 log4j.properties.orig
-rw-r--r--. 1 kube kube    0 Feb 28  2018 slaves
-rw-r--r--. 1 kube kube    0 Feb 28  2018 slaves.template
[root@sgt-pepper conf]# diff log4j.properties log4j.properties.orig
2c2
< log4j.rootCategory=DEBUG, console
---
> log4j.rootCategory=INFO, console

I set up CRAIL_HOME:

[root@sgt-pepper apache-crail-1.2-incubating]# export CRAIL_HOME=$PWD
[root@sgt-pepper apache-crail-1.2-incubating]# env | grep CRAIL
CRAIL_HOME=/usr/local/apache-crail-1.2-incubating
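
Incidentally, the *HADOOP_HOME or hadoop.home.dir are not set* trace the CLI
prints is logged at DEBUG and is harmless on Linux (the variable only really
matters for winutils on Windows). If it's distracting, my guess is that
pointing HADOOP_HOME at any existing directory, e.g. CRAIL_HOME itself,
should quiet it:

```shell
# Hypothetical workaround: any existing directory satisfies the check on
# Linux; fall back to the install path if CRAIL_HOME is unset.
export HADOOP_HOME="${CRAIL_HOME:-/usr/local/apache-crail-1.2-incubating}"
echo "$HADOOP_HOME"
```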

=====

Now I try using the CLI:

[root@sgt-pepper apache-crail-1.2-incubating]# $CRAIL_HOME/bin/crail fs -ls
/
20/02/10 14:54:51 DEBUG Shell: Failed to detect a valid hadoop home
directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:326)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:351)
at
org.apache.hadoop.util.GenericOptionsParser.preProcessForWindows(GenericOptionsParser.java:440)
at
org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:486)
at
org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
at
org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:64)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
20/02/10 14:54:51 DEBUG Shell: setsid exited with exit code 0
20/02/10 14:54:51 DEBUG Configuration: parsing URL
jar:file:/usr/local/apache-crail-1.2-incubating/jars/hadoop-common-2.7.3.jar!/core-default.xml
20/02/10 14:54:51 DEBUG Configuration: parsing input stream
sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@52d455b8
20/02/10 14:54:52 DEBUG Configuration: parsing URL
file:/usr/local/apache-crail-1.2-incubating/conf/core-site.xml
20/02/10 14:54:52 DEBUG Configuration: parsing input stream
java.io.BufferedInputStream@71c7db30
20/02/10 14:54:52 DEBUG MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=,
sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of
successful kerberos logins and latency (milliseconds)])
20/02/10 14:54:52 DEBUG MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=,
sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of
failed kerberos logins and latency (milliseconds)])
20/02/10 14:54:52 DEBUG MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with
annotation @org.apache.hadoop.metrics2.annotation.Metric(about=,
sampleName=Ops, always=false, type=DEFAULT, valueName=Time,
value=[GetGroups])
20/02/10 14:54:52 DEBUG MetricsSystemImpl: UgiMetrics, User and group
related metrics
20/02/10 14:54:52 DEBUG KerberosName: Kerberos krb5 configuration not
found, setting default realm to empty
20/02/10 14:54:52 DEBUG Groups:  Creating new Groups object
20/02/10 14:54:52 DEBUG NativeCodeLoader: Trying to load the custom-built
native-hadoop library...
20/02/10 14:54:52 DEBUG NativeCodeLoader: Failed to load native-hadoop with
error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
20/02/10 14:54:52 DEBUG NativeCodeLoader:
java.library.path=/usr/local/apache-crail-1.2-incubating/bin/../lib::/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
20/02/10 14:54:52 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
20/02/10 14:54:52 DEBUG PerformanceAdvisory: Falling back to shell based
20/02/10 14:54:52 DEBUG JniBasedUnixGroupsMappingWithFallback: Group
mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
20/02/10 14:54:52 DEBUG Groups: Group mapping
impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
cacheTimeout=300000; warningDeltaMs=5000
20/02/10 14:54:52 DEBUG UserGroupInformation: hadoop login
20/02/10 14:54:52 DEBUG UserGroupInformation: hadoop login commit
20/02/10 14:54:52 DEBUG UserGroupInformation: using local
user:UnixPrincipal: root
20/02/10 14:54:52 DEBUG UserGroupInformation: Using user: "UnixPrincipal:
root" with name root
20/02/10 14:54:52 DEBUG UserGroupInformation: User entry: "root"
20/02/10 14:54:52 DEBUG UserGroupInformation: UGI loginUser:root
(auth:SIMPLE)
20/02/10 14:54:52 INFO crail: CrailHadoopFileSystem construction
20/02/10 14:54:52 INFO crail: creating singleton crail file system
20/02/10 14:54:52 INFO crail: crail.version 3101
20/02/10 14:54:52 INFO crail: crail.directorydepth 16
20/02/10 14:54:52 INFO crail: crail.tokenexpiration 10
20/02/10 14:54:52 INFO crail: crail.blocksize 1048576
20/02/10 14:54:52 INFO crail: crail.cachelimit 1073741824
20/02/10 14:54:52 INFO crail: crail.cachepath /dev/hugepages/cache
20/02/10 14:54:52 INFO crail: crail.user crail
20/02/10 14:54:52 INFO crail: crail.shadowreplication 1
20/02/10 14:54:52 INFO crail: crail.debug false
20/02/10 14:54:52 INFO crail: crail.statistics true
20/02/10 14:54:52 INFO crail: crail.rpctimeout 1000
20/02/10 14:54:52 INFO crail: crail.datatimeout 1000
20/02/10 14:54:52 INFO crail: crail.buffersize 1048576
20/02/10 14:54:52 INFO crail: crail.slicesize 524288
20/02/10 14:54:52 INFO crail: crail.singleton true
20/02/10 14:54:52 INFO crail: crail.regionsize 1073741824
20/02/10 14:54:52 INFO crail: crail.directoryrecord 512
20/02/10 14:54:52 INFO crail: crail.directoryrandomize true
20/02/10 14:54:52 INFO crail: crail.cacheimpl
org.apache.crail.memory.MappedBufferCache
20/02/10 14:54:52 INFO crail: crail.locationmap
20/02/10 14:54:52 INFO crail: crail.namenode.address crail://localhost:9060
20/02/10 14:54:52 INFO crail: crail.namenode.blockselection roundrobin
20/02/10 14:54:52 INFO crail: crail.namenode.fileblocks 16
20/02/10 14:54:52 INFO crail: crail.namenode.rpctype
org.apache.crail.namenode.rpc.tcp.TcpNameNode
20/02/10 14:54:52 INFO crail: crail.namenode.log
20/02/10 14:54:52 INFO crail: crail.storage.types
org.apache.crail.storage.tcp.TcpStorageTier
20/02/10 14:54:52 INFO crail: crail.storage.classes 1
20/02/10 14:54:52 INFO crail: crail.storage.rootclass 0
20/02/10 14:54:52 INFO crail: crail.storage.keepalive 2
20/02/10 14:54:52 INFO crail: buffer cache, allocationCount 1, bufferCount
1024
20/02/10 14:54:52 INFO narpc: new NaRPC client group v1.5.0, queueDepth 16,
messageSize 2097152, nodealy false
20/02/10 14:54:52 INFO crail: crail.storage.tcp.interface eth0
20/02/10 14:54:52 INFO crail: crail.storage.tcp.port 50020
20/02/10 14:54:52 INFO crail: crail.storage.tcp.storagelimit 1073741824
20/02/10 14:54:52 INFO crail: crail.storage.tcp.allocationsize 1073741824
20/02/10 14:54:52 INFO crail: crail.storage.tcp.datapath /dev/hugepages/data
20/02/10 14:54:52 INFO crail: crail.storage.tcp.queuedepth 16
20/02/10 14:54:52 INFO crail: crail.storage.tcp.cores 1
20/02/10 14:54:52 INFO narpc: new NaRPC client group v1.5.0, queueDepth 32,
messageSize 512, nodealy true
20/02/10 14:54:52 INFO crail: crail.namenode.tcp.queueDepth 32
20/02/10 14:54:52 INFO crail: crail.namenode.tcp.messageSize 512
20/02/10 14:54:52 INFO crail: crail.namenode.tcp.cores 1
ls: java.io.IOException: java.net.ConnectException: Connection refused

=====

Since I get "Connection refused," I try changing "localhost" to "sgt-pepper"
in the conf files:

[root@sgt-pepper conf]# grep sgt-pepper *
core-site.xml:    <value>crail://sgt-pepper:9060</value>
crail-site.conf:crail.namenode.address            crail://sgt-pepper:9060
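
The substitution itself is just a sed one-liner; here is a sketch rehearsed
on a scratch copy first (the /tmp path and the sample line are hypothetical),
which can then be pointed at $CRAIL_HOME/conf/core-site.xml and
crail-site.conf:

```shell
# Rehearse the hostname substitution on a throwaway copy of the conf line.
printf 'crail.namenode.address            crail://localhost:9060\n' > /tmp/crail-site.conf.test
sed -i 's|crail://localhost:9060|crail://sgt-pepper:9060|' /tmp/crail-site.conf.test
grep -c 'sgt-pepper' /tmp/crail-site.conf.test   # prints 1
```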

Then I try using the CLI again:

[root@sgt-pepper apache-crail-1.2-incubating]# $CRAIL_HOME/bin/crail fs -ls
/
20/02/10 15:00:27 DEBUG Shell: Failed to detect a valid hadoop home
directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:326)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:351)
at
org.apache.hadoop.util.GenericOptionsParser.preProcessForWindows(GenericOptionsParser.java:440)
at
org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:486)
at
org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
at
org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:64)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
20/02/10 15:00:27 DEBUG Shell: setsid exited with exit code 0
20/02/10 15:00:27 DEBUG Configuration: parsing URL
jar:file:/usr/local/apache-crail-1.2-incubating/jars/hadoop-common-2.7.3.jar!/core-default.xml
20/02/10 15:00:27 DEBUG Configuration: parsing input stream
sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@52d455b8
20/02/10 15:00:27 DEBUG Configuration: parsing URL
file:/usr/local/apache-crail-1.2-incubating/conf/core-site.xml
20/02/10 15:00:27 DEBUG Configuration: parsing input stream
java.io.BufferedInputStream@71c7db30
20/02/10 15:00:27 DEBUG MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=,
sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of
successful kerberos logins and latency (milliseconds)])
20/02/10 15:00:28 DEBUG MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=,
sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of
failed kerberos logins and latency (milliseconds)])
20/02/10 15:00:28 DEBUG MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with
annotation @org.apache.hadoop.metrics2.annotation.Metric(about=,
sampleName=Ops, always=false, type=DEFAULT, valueName=Time,
value=[GetGroups])
20/02/10 15:00:28 DEBUG MetricsSystemImpl: UgiMetrics, User and group
related metrics
20/02/10 15:00:28 DEBUG KerberosName: Kerberos krb5 configuration not
found, setting default realm to empty
20/02/10 15:00:28 DEBUG Groups:  Creating new Groups object
20/02/10 15:00:28 DEBUG NativeCodeLoader: Trying to load the custom-built
native-hadoop library...
20/02/10 15:00:28 DEBUG NativeCodeLoader: Failed to load native-hadoop with
error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
20/02/10 15:00:28 DEBUG NativeCodeLoader:
java.library.path=/usr/local/apache-crail-1.2-incubating/bin/../lib::/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
20/02/10 15:00:28 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
20/02/10 15:00:28 DEBUG PerformanceAdvisory: Falling back to shell based
20/02/10 15:00:28 DEBUG JniBasedUnixGroupsMappingWithFallback: Group
mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
20/02/10 15:00:28 DEBUG Groups: Group mapping
impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
cacheTimeout=300000; warningDeltaMs=5000
20/02/10 15:00:28 DEBUG UserGroupInformation: hadoop login
20/02/10 15:00:28 DEBUG UserGroupInformation: hadoop login commit
20/02/10 15:00:28 DEBUG UserGroupInformation: using local
user:UnixPrincipal: root
20/02/10 15:00:28 DEBUG UserGroupInformation: Using user: "UnixPrincipal:
root" with name root
20/02/10 15:00:28 DEBUG UserGroupInformation: User entry: "root"
20/02/10 15:00:28 DEBUG UserGroupInformation: UGI loginUser:root
(auth:SIMPLE)
20/02/10 15:00:28 INFO crail: CrailHadoopFileSystem construction
20/02/10 15:00:28 INFO crail: creating singleton crail file system
20/02/10 15:00:28 INFO crail: crail.version 3101
20/02/10 15:00:28 INFO crail: crail.directorydepth 16
20/02/10 15:00:28 INFO crail: crail.tokenexpiration 10
20/02/10 15:00:28 INFO crail: crail.blocksize 1048576
20/02/10 15:00:28 INFO crail: crail.cachelimit 1073741824
20/02/10 15:00:28 INFO crail: crail.cachepath /dev/hugepages/cache
20/02/10 15:00:28 INFO crail: crail.user crail
20/02/10 15:00:28 INFO crail: crail.shadowreplication 1
20/02/10 15:00:28 INFO crail: crail.debug false
20/02/10 15:00:28 INFO crail: crail.statistics true
20/02/10 15:00:28 INFO crail: crail.rpctimeout 1000
20/02/10 15:00:28 INFO crail: crail.datatimeout 1000
20/02/10 15:00:28 INFO crail: crail.buffersize 1048576
20/02/10 15:00:28 INFO crail: crail.slicesize 524288
20/02/10 15:00:28 INFO crail: crail.singleton true
20/02/10 15:00:28 INFO crail: crail.regionsize 1073741824
20/02/10 15:00:28 INFO crail: crail.directoryrecord 512
20/02/10 15:00:28 INFO crail: crail.directoryrandomize true
20/02/10 15:00:28 INFO crail: crail.cacheimpl
org.apache.crail.memory.MappedBufferCache
20/02/10 15:00:28 INFO crail: crail.locationmap
20/02/10 15:00:28 INFO crail: crail.namenode.address crail://sgt-pepper:9060
20/02/10 15:00:28 INFO crail: crail.namenode.blockselection roundrobin
20/02/10 15:00:28 INFO crail: crail.namenode.fileblocks 16
20/02/10 15:00:28 INFO crail: crail.namenode.rpctype
org.apache.crail.namenode.rpc.tcp.TcpNameNode
20/02/10 15:00:28 INFO crail: crail.namenode.log
20/02/10 15:00:28 INFO crail: crail.storage.types
org.apache.crail.storage.tcp.TcpStorageTier
20/02/10 15:00:28 INFO crail: crail.storage.classes 1
20/02/10 15:00:28 INFO crail: crail.storage.rootclass 0
20/02/10 15:00:28 INFO crail: crail.storage.keepalive 2
20/02/10 15:00:28 INFO crail: buffer cache, allocationCount 1, bufferCount
1024
20/02/10 15:00:28 INFO narpc: new NaRPC client group v1.5.0, queueDepth 16,
messageSize 2097152, nodealy false
20/02/10 15:00:28 INFO crail: crail.storage.tcp.interface eth0
20/02/10 15:00:28 INFO crail: crail.storage.tcp.port 50020
20/02/10 15:00:28 INFO crail: crail.storage.tcp.storagelimit 1073741824
20/02/10 15:00:28 INFO crail: crail.storage.tcp.allocationsize 1073741824
20/02/10 15:00:28 INFO crail: crail.storage.tcp.datapath /dev/hugepages/data
20/02/10 15:00:28 INFO crail: crail.storage.tcp.queuedepth 16
20/02/10 15:00:28 INFO crail: crail.storage.tcp.cores 1
20/02/10 15:00:28 INFO narpc: new NaRPC client group v1.5.0, queueDepth 32,
messageSize 512, nodealy true
20/02/10 15:00:28 INFO crail: crail.namenode.tcp.queueDepth 32
20/02/10 15:00:28 INFO crail: crail.namenode.tcp.messageSize 512
20/02/10 15:00:28 INFO crail: crail.namenode.tcp.cores 1
20/02/10 15:00:28 INFO crail: connected to namenode(s) sgt-pepper/
10.114.222.82:9060
20/02/10 15:00:28 INFO crail: CrailHadoopFileSystem fs initialization done..
ls: /
20/02/10 15:00:28 INFO crail: Closing CrailHadoopFileSystem
20/02/10 15:00:28 INFO crail: Closing CrailFS singleton
20/02/10 15:00:28 INFO crail: mapped client cache closed

Seems better, but not really.

[root@sgt-pepper apache-crail-1.2-incubating]# $CRAIL_HOME/bin/crail fs
-mkdir /foobar
20/02/10 15:06:14 DEBUG Shell: Failed to detect a valid hadoop home
directory
java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:326)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:351)
at
org.apache.hadoop.util.GenericOptionsParser.preProcessForWindows(GenericOptionsParser.java:440)
at
org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:486)
at
org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
at
org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:64)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
20/02/10 15:06:14 DEBUG Shell: setsid exited with exit code 0
20/02/10 15:06:14 DEBUG Configuration: parsing URL
jar:file:/usr/local/apache-crail-1.2-incubating/jars/hadoop-common-2.7.3.jar!/core-default.xml
20/02/10 15:06:14 DEBUG Configuration: parsing input stream
sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@52d455b8
20/02/10 15:06:14 DEBUG Configuration: parsing URL
file:/usr/local/apache-crail-1.2-incubating/conf/core-site.xml
20/02/10 15:06:14 DEBUG Configuration: parsing input stream
java.io.BufferedInputStream@71c7db30
20/02/10 15:06:14 DEBUG MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=,
sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of
successful kerberos logins and latency (milliseconds)])
20/02/10 15:06:14 DEBUG MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=,
sampleName=Ops, always=false, type=DEFAULT, valueName=Time, value=[Rate of
failed kerberos logins and latency (milliseconds)])
20/02/10 15:06:14 DEBUG MutableMetricsFactory: field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with
annotation @org.apache.hadoop.metrics2.annotation.Metric(about=,
sampleName=Ops, always=false, type=DEFAULT, valueName=Time,
value=[GetGroups])
20/02/10 15:06:14 DEBUG MetricsSystemImpl: UgiMetrics, User and group
related metrics
20/02/10 15:06:14 DEBUG KerberosName: Kerberos krb5 configuration not
found, setting default realm to empty
20/02/10 15:06:14 DEBUG Groups:  Creating new Groups object
20/02/10 15:06:14 DEBUG NativeCodeLoader: Trying to load the custom-built
native-hadoop library...
20/02/10 15:06:14 DEBUG NativeCodeLoader: Failed to load native-hadoop with
error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
20/02/10 15:06:14 DEBUG NativeCodeLoader:
java.library.path=/usr/local/apache-crail-1.2-incubating/bin/../lib::/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
20/02/10 15:06:14 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
20/02/10 15:06:14 DEBUG PerformanceAdvisory: Falling back to shell based
20/02/10 15:06:14 DEBUG JniBasedUnixGroupsMappingWithFallback: Group
mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
20/02/10 15:06:14 DEBUG Groups: Group mapping
impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
cacheTimeout=300000; warningDeltaMs=5000
20/02/10 15:06:14 DEBUG UserGroupInformation: hadoop login
20/02/10 15:06:14 DEBUG UserGroupInformation: hadoop login commit
20/02/10 15:06:14 DEBUG UserGroupInformation: using local
user:UnixPrincipal: root
20/02/10 15:06:14 DEBUG UserGroupInformation: Using user: "UnixPrincipal:
root" with name root
20/02/10 15:06:14 DEBUG UserGroupInformation: User entry: "root"
20/02/10 15:06:14 DEBUG UserGroupInformation: UGI loginUser:root
(auth:SIMPLE)
20/02/10 15:06:14 INFO crail: CrailHadoopFileSystem construction
20/02/10 15:06:14 INFO crail: creating singleton crail file system
20/02/10 15:06:14 INFO crail: crail.version 3101
20/02/10 15:06:14 INFO crail: crail.directorydepth 16
20/02/10 15:06:14 INFO crail: crail.tokenexpiration 10
20/02/10 15:06:14 INFO crail: crail.blocksize 1048576
20/02/10 15:06:14 INFO crail: crail.cachelimit 1073741824
20/02/10 15:06:14 INFO crail: crail.cachepath /dev/hugepages/cache
20/02/10 15:06:14 INFO crail: crail.user crail
20/02/10 15:06:14 INFO crail: crail.shadowreplication 1
20/02/10 15:06:14 INFO crail: crail.debug false
20/02/10 15:06:14 INFO crail: crail.statistics true
20/02/10 15:06:14 INFO crail: crail.rpctimeout 1000
20/02/10 15:06:14 INFO crail: crail.datatimeout 1000
20/02/10 15:06:14 INFO crail: crail.buffersize 1048576
20/02/10 15:06:14 INFO crail: crail.slicesize 524288
20/02/10 15:06:14 INFO crail: crail.singleton true
20/02/10 15:06:14 INFO crail: crail.regionsize 1073741824
20/02/10 15:06:14 INFO crail: crail.directoryrecord 512
20/02/10 15:06:14 INFO crail: crail.directoryrandomize true
20/02/10 15:06:14 INFO crail: crail.cacheimpl
org.apache.crail.memory.MappedBufferCache
20/02/10 15:06:14 INFO crail: crail.locationmap
20/02/10 15:06:14 INFO crail: crail.namenode.address crail://sgt-pepper:9060
20/02/10 15:06:14 INFO crail: crail.namenode.blockselection roundrobin
20/02/10 15:06:14 INFO crail: crail.namenode.fileblocks 16
20/02/10 15:06:14 INFO crail: crail.namenode.rpctype
org.apache.crail.namenode.rpc.tcp.TcpNameNode
20/02/10 15:06:14 INFO crail: crail.namenode.log
20/02/10 15:06:14 INFO crail: crail.storage.types
org.apache.crail.storage.tcp.TcpStorageTier
20/02/10 15:06:14 INFO crail: crail.storage.classes 1
20/02/10 15:06:14 INFO crail: crail.storage.rootclass 0
20/02/10 15:06:14 INFO crail: crail.storage.keepalive 2
20/02/10 15:06:14 INFO crail: buffer cache, allocationCount 1, bufferCount
1024
20/02/10 15:06:14 INFO narpc: new NaRPC client group v1.5.0, queueDepth 16,
messageSize 2097152, nodealy false
20/02/10 15:06:14 INFO crail: crail.storage.tcp.interface eth0
20/02/10 15:06:14 INFO crail: crail.storage.tcp.port 50020
20/02/10 15:06:14 INFO crail: crail.storage.tcp.storagelimit 1073741824
20/02/10 15:06:14 INFO crail: crail.storage.tcp.allocationsize 1073741824
20/02/10 15:06:14 INFO crail: crail.storage.tcp.datapath /dev/hugepages/data
20/02/10 15:06:14 INFO crail: crail.storage.tcp.queuedepth 16
20/02/10 15:06:14 INFO crail: crail.storage.tcp.cores 1
20/02/10 15:06:15 INFO narpc: new NaRPC client group v1.5.0, queueDepth 32,
messageSize 512, nodealy true
20/02/10 15:06:15 INFO crail: crail.namenode.tcp.queueDepth 32
20/02/10 15:06:15 INFO crail: crail.namenode.tcp.messageSize 512
20/02/10 15:06:15 INFO crail: crail.namenode.tcp.cores 1
20/02/10 15:06:15 INFO crail: connected to namenode(s) sgt-pepper/
10.114.222.82:9060
20/02/10 15:06:15 INFO crail: CrailHadoopFileSystem fs initialization done..
mkdir: java.util.concurrent.ExecutionException: java.io.IOException: Map
failed
20/02/10 15:06:15 INFO crail: Closing CrailHadoopFileSystem
20/02/10 15:06:15 INFO crail: Closing CrailFS singleton
20/02/10 15:06:15 INFO crail: mapped client cache closed

I'm thinking that *mkdir: java.util.concurrent.ExecutionException:
java.io.IOException: Map failed* is not good.
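
A hedged guess at the cause: the client's buffer cache is memory-mapped out
of crail.cachepath (/dev/hugepages/cache per the logs above), so
*java.io.IOException: Map failed* would fit the mmap not being satisfiable,
typically because hugetlbfs isn't mounted or no hugepages are reserved on the
VM. The checks below are a diagnostic sketch (the sizing in the last line is
a hypothetical example, not a recommendation):

```shell
# The client maps ~1 GiB (crail.cachelimit) from /dev/hugepages/cache, so the
# host needs hugepages reserved and a hugetlbfs mount in place.
grep -i hugepages /proc/meminfo        # HugePages_Total should be > 0
mount | grep -i hugetlbfs              # /dev/hugepages should appear here
# To reserve 1 GiB as 2 MiB pages (as root; hypothetical sizing):
# echo 512 > /proc/sys/vm/nr_hugepages
```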

Sorry for the long append.  Probably something really dumb, but I'm
scratching my head...

Lou.


On Mon, Feb 10, 2020 at 12:13 PM David Crespi <
david.cre...@storedgesystems.com> wrote:

> Looks like both cmds are working, but it’s not really finding the
> datastore.
> You’re getting info into the namenode, but not the datanode (which means
> you’re really not writing to the datanode).  The
> First warning you can ignore (WARN NativeCodeLoader: Unable to load
> native-hadoop library for your platform)
>
> you may want to turn on debug to see more of what’s going on.
> Edit /crail/conf/log4j.properties and change from INFO to DEBUG.
>
> You should probably also look at the individual docker logs.
> docker logs YourNamenodeName & docker logs YourDatanodeName
>
> may give you some more hints of what’s happening.
>
>
> Regards,
>       David
>
>
> From: Lou DeGenaro <lou.degen...@gmail.com>
> Sent: Monday, February 10, 2020 8:12 AM
> To: dev@crail.apache.org
> Subject: Re: iobench
>
> OK, I take it back.  Having trouble with crail fs.
>
> =====
>
> [root@abbey-road ~]# $CRAIL_HOME/bin/crail fs -mkdir /foobar
> 20/02/10 10:07:16 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> ...
> 20/02/10 10:07:16 INFO crail: connected to namenode(s) abbey-road/
> 10.114.222.23:9060
> 20/02/10 10:07:16 INFO crail: CrailHadoopFileSystem fs initialization
> done..
> mkdir: java.util.concurrent.ExecutionException: java.io.IOException: Map
> failed
> 20/02/10 10:07:16 INFO crail: Closing CrailHadoopFileSystem
> 20/02/10 10:07:16 INFO crail: Closing CrailFS singleton
> 20/02/10 10:07:16 INFO crail: mapped client cache closed
>
> [root@abbey-road ~]# $CRAIL_HOME/bin/crail fs -rmdir /foobar
> 20/02/10 10:08:55 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> ...20/02/10 10:08:55 INFO crail: CrailHadoopFileSystem fs initialization
> done..
> rmdir: /foobar
> 20/02/10 10:08:55 INFO crail: Closing CrailHadoopFileSystem
> 20/02/10 10:08:55 INFO crail: Closing CrailFS singleton
> 20/02/10 10:08:55 INFO crail: mapped client cache closed
>
> [root@abbey-road ~]# $CRAIL_HOME/bin/crail fs -mkdir /foobar
> 20/02/10 10:09:43 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> ...
> 20/02/10 10:09:43 INFO crail: CrailHadoopFileSystem fs initialization
> done..
> mkdir: `/foobar': File exists
> 20/02/10 10:09:43 INFO crail: Closing CrailHadoopFileSystem
> 20/02/10 10:09:43 INFO crail: Closing CrailFS singleton
> 20/02/10 10:09:43 INFO crail: mapped client cache closed
>
> =====
>
> Seems that mkdir works, but rmdir does not?  Also, IOException was reported
> on mkdir, which seems worrisome.
>
> Lou.
>
> On Mon, Feb 10, 2020 at 8:16 AM Lou DeGenaro <lou.degen...@gmail.com>
> wrote:
>
> > David,
> >
> > Thanks.  I'm able to use *crail fs* successfully to create/list/delete.
> > Will try to increase available storage space next...
> >
> > Lou.
> >
> > On Fri, Feb 7, 2020 at 6:37 PM David Crespi <
> > david.cre...@storedgesystems.com> wrote:
> >
> >> Oh, and it may have already written something on the datastore, so you’d
> >> need
> >> to check it and remove it with the fs command anyway.
> >>
> >> crail fs -ls -R /
> >>
> >> would show you what you have there.
> >>
> >> Regards,
> >>       David
> >>
> >> (C) 714-476-2692
> >>
> >> From: Lou DeGenaro <lou.degen...@gmail.com>
> >> Sent: Friday, February 7, 2020 12:15 PM
> >> To: dev@crail.apache.org
> >> Subject: iobench
> >>
> >> Still a noob.  Got namenode and datanode running as docker images on my
> >> VM.  I'm looking for the simplest example of writing something then
> >> reading
> >> something.
> >>
> >> [root@abbey-road conf]# $CRAIL_HOME/bin/crail iobench -t write -f
> >> /filename
> >> -s 1024 -k 1
> >> 20/02/07 14:10:22 INFO crail: creating singleton crail file system
> >> 20/02/07 14:10:23 INFO crail: crail.version 3101
> >> 20/02/07 14:10:23 INFO crail: crail.directorydepth 16
> >> 20/02/07 14:10:23 INFO crail: crail.tokenexpiration 10
> >> 20/02/07 14:10:23 INFO crail: crail.blocksize 1048576
> >> 20/02/07 14:10:23 INFO crail: crail.cachelimit 1073741824
> >> 20/02/07 14:10:23 INFO crail: crail.cachepath /dev/hugepages/cache
> >> 20/02/07 14:10:23 INFO crail: crail.user crail
> >> 20/02/07 14:10:23 INFO crail: crail.shadowreplication 1
> >> 20/02/07 14:10:23 INFO crail: crail.debug false
> >> 20/02/07 14:10:23 INFO crail: crail.statistics true
> >> 20/02/07 14:10:23 INFO crail: crail.rpctimeout 1000
> >> 20/02/07 14:10:23 INFO crail: crail.datatimeout 1000
> >> 20/02/07 14:10:23 INFO crail: crail.buffersize 1048576
> >> 20/02/07 14:10:23 INFO crail: crail.slicesize 524288
> >> 20/02/07 14:10:23 INFO crail: crail.singleton true
> >> 20/02/07 14:10:23 INFO crail: crail.regionsize 1073741824
> >> 20/02/07 14:10:23 INFO crail: crail.directoryrecord 512
> >> 20/02/07 14:10:23 INFO crail: crail.directoryrandomize true
> >> 20/02/07 14:10:23 INFO crail: crail.cacheimpl
> >> org.apache.crail.memory.MappedBufferCache
> >> 20/02/07 14:10:23 INFO crail: crail.locationmap
> >> 20/02/07 14:10:23 INFO crail: crail.namenode.address
> >> crail://abbey-road:9060
> >> 20/02/07 14:10:23 INFO crail: crail.namenode.blockselection roundrobin
> >> 20/02/07 14:10:23 INFO crail: crail.namenode.fileblocks 16
> >> 20/02/07 14:10:23 INFO crail: crail.namenode.rpctype
> >> org.apache.crail.namenode.rpc.tcp.TcpNameNode
> >> 20/02/07 14:10:23 INFO crail: crail.namenode.log
> >> 20/02/07 14:10:23 INFO crail: crail.storage.types
> >> org.apache.crail.storage.tcp.TcpStorageTier
> >> 20/02/07 14:10:23 INFO crail: crail.storage.classes 1
> >> 20/02/07 14:10:23 INFO crail: crail.storage.rootclass 0
> >> 20/02/07 14:10:23 INFO crail: crail.storage.keepalive 2
> >> 20/02/07 14:10:23 INFO crail: buffer cache, allocationCount 1, bufferCount 1024
> >> 20/02/07 14:10:23 INFO narpc: new NaRPC client group v1.5.0, queueDepth 16, messageSize 2097152, nodealy false
> >> 20/02/07 14:10:23 INFO crail: crail.storage.tcp.interface eth0
> >> 20/02/07 14:10:23 INFO crail: crail.storage.tcp.port 50020
> >> 20/02/07 14:10:23 INFO crail: crail.storage.tcp.storagelimit 1073741824
> >> 20/02/07 14:10:23 INFO crail: crail.storage.tcp.allocationsize 1073741824
> >> 20/02/07 14:10:23 INFO crail: crail.storage.tcp.datapath
> >> /dev/hugepages/data
> >> 20/02/07 14:10:23 INFO crail: crail.storage.tcp.queuedepth 16
> >> 20/02/07 14:10:23 INFO crail: crail.storage.tcp.cores 1
> >> 20/02/07 14:10:23 INFO narpc: new NaRPC client group v1.5.0, queueDepth 32, messageSize 512, nodealy true
> >> 20/02/07 14:10:23 INFO crail: crail.namenode.tcp.queueDepth 32
> >> 20/02/07 14:10:23 INFO crail: crail.namenode.tcp.messageSize 512
> >> 20/02/07 14:10:23 INFO crail: crail.namenode.tcp.cores 1
> >> 20/02/07 14:10:23 INFO crail: connected to namenode(s) abbey-road/
> >> 10.114.222.23:9060
> >> write, filename /filename, size 1024, loop 1, storageClass 0, locationClass 0, buffered true
> >> Exception in thread "main" java.io.IOException: Map failed
> >>         at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:938)
> >>         at org.apache.crail.memory.MappedBufferCache.allocateRegion(MappedBufferCache.java:94)
> >>         at org.apache.crail.memory.BufferCache.allocateBuffer(BufferCache.java:95)
> >>         at org.apache.crail.core.CoreDataStore.allocateBuffer(CoreDataStore.java:482)
> >>         at org.apache.crail.tools.CrailBenchmark.write(CrailBenchmark.java:87)
> >>         at org.apache.crail.tools.CrailBenchmark.main(CrailBenchmark.java:1070)
> >> Caused by: java.lang.OutOfMemoryError: Map failed
> >>         at sun.nio.ch.FileChannelImpl.map0(Native Method)
> >>         at sun.nio.ch.FileChannelImpl.map(FileChannelImpl.java:935)
> >>         ... 5 more
> >>
> >>
>
>
