Hi Chris, sorry for the late response. Since this is a very simple cluster configuration, it looks like a network problem.
Q. Could you share your network environment?
Q. Is the master or slave running on a virtual machine?

Best regards,
Hyunsik

On Fri, Sep 26, 2014 at 9:26 AM, Christian Schwabe <[email protected]> wrote:

> Hello guys,
>
> My current problem is that the Tajo worker on the slave cannot connect to
> the master. I need more information about
> http://tajo.apache.org/docs/current/configuration/cluster_setup.html#settings
> What do these parameters mean?
>
> I've attached the worker log from the slave, where only one worker runs.
> I've attached the tajo-site.xml from the slave.
> Is any setting incorrect?
>
> IP addresses:
> - Master :: 192.168.178.101
> - Slave :: 192.168.178.39
>
> Hopefully you can help me.
>
> Best regards,
> Chris
>
> On 26.09.2014 12:56:55, Christian Schwabe wrote:
>
>> Hello guys,
>>
>> Sorry for spamming. I found the solution: the path of the Tajo home
>> directory was not the same on both machines. When I now start Tajo on
>> the master in HA mode, the worker on the slave machine starts, too.
>> BUT in the web UI for the master I don't see the second worker alive.
>>
>> Warm regards,
>> Chris
>>
>> On 26.09.2014 12:31:58, Christian Schwabe wrote:
>>
>>> Hello guys,
>>>
>>> Next try to get an answer. Hadoop is still running successfully on
>>> both machines, master and slave. I start Tajo in HA mode on the
>>> master; in the web UI I see one query master, one worker, and one
>>> master. Is that correct so far?
>>> Now I start the Tajo worker on the slave machine with 'sh
>>> tajo-daemon.sh start worker', and the worker starts successfully. The
>>> conf/masters and conf/slaves files exist on both the master machine
>>> and the slave machine.
>>>
>>> However, I see no second worker in the web UI on the master. What is
>>> still wrong? Thanks for any advice.
>>>
>>> Best regards,
>>> Chris
>>>
>>> On 23.09.2014 22:58:25, Christian Schwabe wrote:
>>>
>>>> Hello guys,
>>>>
>>>> Some days later I have already made some progress. HDFS already runs
>>>> successfully in the background. I followed these instructions:
>>>> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
>>>>
>>>> TajoMaster already starts in HA mode, but how do I configure Tajo in
>>>> detail to use HDFS? I read the documentation at
>>>> http://tajo.apache.org/docs/current/configuration/cluster_setup.html,
>>>> but I don't know how to configure Tajo to store files in HDFS.
>>>> I also don't know how to set up the second machine to start only a
>>>> worker and connect it to the master.
>>>> I have already applied the masters and workers files.
>>>>
>>>> MacBook1
>>>>
>>>> workers content:
>>>> localhost
>>>> 192.168.178.39 // second worker
>>>>
>>>> masters content:
>>>> localhost
>>>>
>>>> MacBook2
>>>>
>>>> workers content:
>>>> localhost
>>>> 192.168.178.101
>>>>
>>>> masters content:
>>>> 192.168.178.101
>>>>
>>>> Actually, I am missing a complete guide in the documentation for
>>>> adding a second worker to a master. Once I get this working with your
>>>> help, I will write these instructions up and make them available in
>>>> the documentation. I promise!
>>>>
>>>> Hopefully you can help me.
>>>>
>>>> Best regards,
>>>> Chris
>>>>
>>>> On 22.09.2014 at 08:46, Christian Schwabe
>>>> <[email protected]> wrote:
>>>>
>>>>> Hello guys,
>>>>>
>>>>> Can someone help me set up a cluster with a second worker?
>>>>>
>>>>> Best regards,
>>>>> Chris
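[Editor's note for readers of this thread: the "worker can't connect to master" symptom usually comes down to the master RPC addresses in tajo-site.xml still pointing at localhost, so a worker started on the slave tries to register with itself. A minimal sketch of the relevant settings, assuming the IP addresses given in the thread and the default port numbers from the cluster_setup page linked above — this is not the poster's actual file:]

```xml
<!-- tajo-site.xml — sketch; should be identical on master and slave.
     Ports are the Tajo defaults; the IP is the master from this thread. -->
<configuration>
  <!-- Workers register with the master over these addresses, so they
       must name the master's real IP, not localhost. -->
  <property>
    <name>tajo.master.umbilical-rpc.address</name>
    <value>192.168.178.101:26001</value>
  </property>
  <property>
    <name>tajo.master.client-rpc.address</name>
    <value>192.168.178.101:26002</value>
  </property>
  <property>
    <name>tajo.resource-tracker.rpc.address</name>
    <value>192.168.178.101:26003</value>
  </property>
  <property>
    <name>tajo.catalog.client-rpc.address</name>
    <value>192.168.178.101:26005</value>
  </property>
</configuration>
```

With these left at their localhost defaults on the slave, the worker daemon starts cleanly but never shows up in the master's web UI, which matches the symptom described above.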
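[Editor's note on the HDFS question: Tajo keeps its warehouse under tajo.rootdir, so storing data in HDFS means pointing that setting at an HDFS URI. A sketch, assuming the NameNode runs on the master machine; the port (9000) is an assumption and must match fs.defaultFS (or fs.default.name in older Hadoop) in core-site.xml:]

```xml
<!-- tajo-site.xml fragment — sketch; port must match the Hadoop
     NameNode address configured in core-site.xml. -->
<property>
  <name>tajo.rootdir</name>
  <value>hdfs://192.168.178.101:9000/tajo</value>
</property>
```

Separately, the masters/workers files quoted above mix localhost with real IPs across the two machines; listing each host exactly once by its real IP, with identical files on both machines, avoids a worker registering under two names or not at all.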
