This is an automated email from the ASF dual-hosted git repository.

guoyp pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/griffin.git


The following commit(s) were added to refs/heads/master by this push:
     new fec14c9  Update dev-env-build.md
fec14c9 is described below

commit fec14c9507444e39e3c3e157e4437a92e5d9113d
Author: iyuriysoft <[email protected]>
AuthorDate: Wed Jan 23 06:42:25 2019 +0800

    Update dev-env-build.md
    
    Author: iyuriysoft <[email protected]>
    
    Closes #477 from iyuriysoft/patch-1.
---
 griffin-doc/dev/dev-env-build.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/griffin-doc/dev/dev-env-build.md b/griffin-doc/dev/dev-env-build.md
index f61b55e..2448eec 100644
--- a/griffin-doc/dev/dev-env-build.md
+++ b/griffin-doc/dev/dev-env-build.md
@@ -78,6 +78,8 @@ You can test your measure JAR built in the docker container, using the existed s
 
 For debug purpose, you'd better install hadoop, spark, hive locally, so you can test your program more quickly.
 
+Note: If you run Hadoop in pseudo-distributed mode on macOS, you have to update hdfs-site.xml, namely, comment out the parameters **dfs.namenode.servicerpc-address** and **dfs.namenode.rpc-address**; otherwise you may get an error like: "java.net.ConnectException: Call From mycomputer/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused"
+
 ## Deploy on docker container
 Firstly, in the griffin directory, build your packages at once.
 ```
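For reference, the hdfs-site.xml change the note above describes might look like the following fragment. The property names come from the diff itself; the values shown are illustrative placeholders, not taken from the commit:

```xml
<!-- hdfs-site.xml: comment out these two properties for
     pseudo-distributed mode on macOS, per the note above.
     The values here are illustrative examples only. -->
<configuration>
  <!--
  <property>
    <name>dfs.namenode.servicerpc-address</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address</name>
    <value>localhost:9000</value>
  </property>
  -->
</configuration>
```

After editing the file, restart the HDFS daemons so the NameNode picks up the change.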
