Hello Sergii,
After digging in deeper, we noticed that the cobbler container is not running.
We attempted to restart docker-cobbler through supervisorctl; the cobbler shell
via dockerctl becomes available for a short time, but the container only runs
for a few seconds before it stops again due to errors. The commands we ran are
listed below for reference.
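For reference, this is roughly what we have been running on the master node (exact invocations are approximate):
supervisorctl status docker-cobbler    # check the supervised state of the cobbler container
supervisorctl restart docker-cobbler   # restart attempt through supervisord
dockerctl shell cobbler                # only usable for the few seconds the container stays up
dockerctl logs cobbler                 # where the log excerpt further down was captured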
Please let us know whether this is recoverable and what the best course of action would be.
Also, would it be possible to build a Mirantis OpenStack master on a different
server and add the existing nodes to it, so that we can create a new environment
without having to bring down the slave nodes?
Here is a portion of the cobbler log, captured via dockerctl and copied and pasted below:
Stopping cobbler daemon: [FAILED]
Starting dnsmasq: [ OK ]
Traceback (most recent call last):
  File "/usr/bin/cobblerd", line 76, in main
    api = cobbler_api.BootAPI(is_cobblerd=True)
  File "/usr/lib/python2.6/site-packages/cobbler/api.py", line 130, in __init__
    self.deserialize()
  File "/usr/lib/python2.6/site-packages/cobbler/api.py", line 898, in deserialize
    return self._config.deserialize()
  File "/usr/lib/python2.6/site-packages/cobbler/config.py", line 266, in deserialize
    raise CX("serializer: error loading collection %s. Check /etc/cobbler/modules.conf" % item.collection_type())
CX: 'serializer: error loading collection distro. Check /etc/cobbler/modules.conf'
Stopping httpd: [FAILED]
Stopping xinetd: [FAILED]
Info: Loading facts in /etc/puppet/modules/nagios/lib/facter/disks.rb
Info: Loading facts in /etc/puppet/modules/nagios/lib/facter/mountpoints.rb
Info: Loading facts in /etc/puppet/modules/corosync/lib/facter/pacemaker_hostname.rb
Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_persistent_version.rb
Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/ip6tables_version.rb
Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_version.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/ceph_osd.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/cinder_conf.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/glance_api_conf.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/ceph_conf.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/nova_compute.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/keystone_conf.rb
Info: Loading facts in /etc/puppet/modules/osnailyfacter/lib/facter/naily.rb
Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/root_home.rb
Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/pe_version.rb
Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/puppet_vardir.rb
Info: Loading facts in /etc/puppet/modules/galera/lib/facter/galera_gcomm_empty.rb
Info: Loading facts in /etc/puppet/modules/galera/lib/facter/mysql_log_file_size_real.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/cacrl.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/cacert.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/hostprivkey.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/localacacert.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/certname.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/puppet_semantic_version.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/hostcert.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/cakey.rb
Info: Loading facts in /etc/puppet/modules/concat/lib/facter/concat_basedir.rb
Info: Loading facts in /etc/puppet/modules/mmm/lib/facter/ipaddresses.rb
Info: Loading facts in /etc/puppet/modules/swift/lib/facter/swift_mountpoints.rb
Info: Loading facts in /etc/puppet/modules/nailgun/lib/facter/fuel_version.rb
Info: Loading facts in /etc/puppet/modules/nailgun/lib/facter/generate_fuel_key.rb
Info: Loading facts in /etc/puppet/modules/l23network/lib/facter/ovs_vlan_splinters.rb
Info: Loading facts in /etc/puppet/modules/l23network/lib/facter/fqdn_hostname.rb
Info: Loading facts in /etc/puppet/modules/l23network/lib/facter/check_kern_module.rb
Info: Loading facts in /etc/puppet/modules/l23network/lib/facter/default_route.rb
Info: Loading facts in /etc/puppet/modules/l23network/lib/facter/openvswitch.rb
Info: Loading facts in /etc/puppet/modules/neutron/lib/facter/defaultroute.rb
ls: cannot access /dev/sda?: No such file or directory
ls: cannot access /dev/sda?: No such file or directory
Warning: Config file /etc/puppet/hiera.yaml not found, using Hiera defaults
Notice: Compiled catalog for 1cd07e2e6b90.englab.brocade.com in environment production in 2.91 seconds
Info: Loading facts in /etc/puppet/modules/nagios/lib/facter/disks.rb
Info: Loading facts in /etc/puppet/modules/nagios/lib/facter/mountpoints.rb
Info: Loading facts in /etc/puppet/modules/corosync/lib/facter/pacemaker_hostname.rb
Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_persistent_version.rb
Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/ip6tables_version.rb
Info: Loading facts in /etc/puppet/modules/firewall/lib/facter/iptables_version.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/ceph_osd.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/cinder_conf.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/glance_api_conf.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/ceph_conf.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/nova_compute.rb
Info: Loading facts in /etc/puppet/modules/ceph/lib/facter/keystone_conf.rb
Info: Loading facts in /etc/puppet/modules/osnailyfacter/lib/facter/naily.rb
Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/root_home.rb
Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/pe_version.rb
Info: Loading facts in /etc/puppet/modules/stdlib/lib/facter/puppet_vardir.rb
Info: Loading facts in /etc/puppet/modules/galera/lib/facter/galera_gcomm_empty.rb
Info: Loading facts in /etc/puppet/modules/galera/lib/facter/mysql_log_file_size_real.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/cacrl.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/cacert.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/hostprivkey.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/localacacert.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/certname.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/puppet_semantic_version.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/hostcert.rb
Info: Loading facts in /etc/puppet/modules/puppet/lib/facter/cakey.rb
Info: Loading facts in /etc/puppet/modules/concat/lib/facter/concat_basedir.rb
Info: Loading facts in /etc/puppet/modules/mmm/lib/facter/ipaddresses.rb
Info: Loading facts in /etc/puppet/modules/swift/lib/facter/swift_mountpoints.rb
Info: Loading facts in /etc/puppet/modules/nailgun/lib/facter/fuel_version.rb
Info: Loading facts in /etc/puppet/modules/nailgun/lib/facter/generate_fuel_key.rb
Info: Loading facts in /etc/puppet/modules/l23network/lib/facter/ovs_vlan_splinters.rb
Info: Loading facts in /etc/puppet/modules/l23network/lib/facter/fqdn_hostname.rb
Info: Loading facts in /etc/puppet/modules/l23network/lib/facter/check_kern_module.rb
Info: Loading facts in /etc/puppet/modules/l23network/lib/facter/default_route.rb
Info: Loading facts in /etc/puppet/modules/l23network/lib/facter/openvswitch.rb
Info: Loading facts in /etc/puppet/modules/neutron/lib/facter/defaultroute.rb
ls: cannot access /dev/sda?: No such file or directory
ls: cannot access /dev/sda?: No such file or directory
Info: Applying configuration version '1407408243'
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[ssh]/Exec[access_to_cobbler_tcp_port: 22]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[tftp_udp]/Exec[access_to_cobbler_udp_port: 69]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[dns_udp]/Exec[access_to_cobbler_udp_port: 53]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[http]/Exec[access_to_cobbler_tcp_port: 80]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[dhcp_68]/Exec[access_to_cobbler_udp_port: 68]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[xmlrpc_api]/Exec[access_to_cobbler_tcp_port: 25151]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[tftp_tcp]/Exec[access_to_cobbler_tcp_port: 69]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[http_3128]/Exec[access_to_cobbler_tcp_port: 3128]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[pxe_4011]/Exec[access_to_cobbler_udp_port: 4011]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[dhcp_67]/Exec[access_to_cobbler_udp_port: 67]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[ntp_udp]/Exec[access_to_cobbler_udp_port: 123]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[syslog_tcp]/Exec[access_to_cobbler_tcp_port: 25150]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[https]/Exec[access_to_cobbler_tcp_port: 443]/returns: executed successfully
Notice: /Stage[main]/Cobbler::Iptables/Cobbler::Iptables::Access_to_cobbler_port[dns_tcp]/Exec[access_to_cobbler_tcp_port: 53]/returns: executed successfully
Info: cobbler_digest_user: user cobbler already exists
Notice: /Stage[main]/Cobbler::Server/Service[httpd]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Cobbler::Server/Service[httpd]: Unscheduling refresh on Service[httpd]
Notice: /Stage[main]/Cobbler::Server/Service[cobblerd]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Cobbler::Server/Service[cobblerd]: Scheduling refresh of Exec[cobbler_sync]
Info: /Stage[main]/Cobbler::Server/Service[cobblerd]: Unscheduling refresh on Service[cobblerd]
Notice: /Stage[main]/Cobbler::Server/Exec[cobbler_sync]/returns: cobblerd does not appear to be running/accessible
Error: /Stage[main]/Cobbler::Server/Exec[cobbler_sync]: Failed to call refresh: cobbler sync returned 155 instead of one of [0]
Error: /Stage[main]/Cobbler::Server/Exec[cobbler_sync]: cobbler sync returned 155 instead of one of [0]
Notice: /Stage[main]/Cobbler::Server/Service[xinetd]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Cobbler::Server/Service[xinetd]: Unscheduling refresh on Service[xinetd]
Info: cobbler_distro: checking if distro exists: bootstrap
Error: /Stage[main]/Nailgun::Cobbler/Cobbler_distro[bootstrap]: Could not evaluate: cobblerd does not appear to be running/accessible
Notice: /Stage[main]/Nailgun::Cobbler/Exec[cp /root/.ssh/id_rsa.pub /etc/cobbler/authorized_keys]/returns: executed successfully
Notice: /Stage[main]/Nailgun::Cobbler/Cobbler_profile[bootstrap]: Dependency Cobbler_distro[bootstrap] has failures: true
Warning: /Stage[main]/Nailgun::Cobbler/Cobbler_profile[bootstrap]: Skipping because of failed dependencies
Notice: /Stage[main]/Nailgun::Cobbler/Exec[cobbler_system_add_default]: Dependency Cobbler_distro[bootstrap] has failures: true
Warning: /Stage[main]/Nailgun::Cobbler/Exec[cobbler_system_add_default]: Skipping because of failed dependencies
Notice: /Stage[main]/Nailgun::Cobbler/Exec[cobbler_system_edit_default]: Dependency Cobbler_distro[bootstrap] has failures: true
Warning: /Stage[main]/Nailgun::Cobbler/Exec[cobbler_system_edit_default]: Skipping because of failed dependencies
Notice: /Stage[main]/Nailgun::Cobbler/Exec[nailgun_cobbler_sync]: Dependency Cobbler_distro[bootstrap] has failures: true
Warning: /Stage[main]/Nailgun::Cobbler/Exec[nailgun_cobbler_sync]: Skipping because of failed dependencies
Info: cobbler_distro: checking if distro exists: ubuntu_1204_x86_64
Error: /Stage[main]/Nailgun::Cobbler/Cobbler_distro[ubuntu_1204_x86_64]: Could not evaluate: cobblerd does not appear to be running/accessible
Notice: /Stage[main]/Nailgun::Cobbler/Cobbler_profile[ubuntu_1204_x86_64]: Dependency Cobbler_distro[ubuntu_1204_x86_64] has failures: true
Warning: /Stage[main]/Nailgun::Cobbler/Cobbler_profile[ubuntu_1204_x86_64]: Skipping because of failed dependencies
Info: cobbler_distro: checking if distro exists: centos-x86_64
Error: /Stage[main]/Nailgun::Cobbler/Cobbler_distro[centos-x86_64]: Could not evaluate: cobblerd does not appear to be running/accessible
Notice: /Stage[main]/Nailgun::Cobbler/Cobbler_profile[centos-x86_64]: Dependency Cobbler_distro[centos-x86_64] has failures: true
Warning: /Stage[main]/Nailgun::Cobbler/Cobbler_profile[centos-x86_64]: Skipping because of failed dependencies
Notice: Finished catalog run in 73.14 seconds
Stopping cobbler daemon: [FAILED]
Starting dnsmasq: [ OK ]
Traceback (most recent call last):
  File "/usr/bin/cobblerd", line 76, in main
    api = cobbler_api.BootAPI(is_cobblerd=True)
  File "/usr/lib/python2.6/site-packages/cobbler/api.py", line 130, in __init__
    self.deserialize()
  File "/usr/lib/python2.6/site-packages/cobbler/api.py", line 898, in deserialize
    return self._config.deserialize()
  File "/usr/lib/python2.6/site-packages/cobbler/config.py", line 266, in deserialize
    raise CX("serializer: error loading collection %s. Check /etc/cobbler/modules.conf" % item.collection_type())
CX: 'serializer: error loading collection distro. Check /etc/cobbler/modules.conf'
Stopping httpd: [FAILED]
Stopping xinetd: [FAILED]
Info: Loading facts in /etc/puppet/modules/nagios/lib/facter/disks.rb
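Based on the traceback above, once the container stays up long enough we plan to look at the serializer configuration inside it. This is only a rough sketch of what we intend to run; the paths assume cobbler 2.x with the serializer_catalog backend, so please correct us if the layout on the Fuel master is different:
dockerctl shell cobbler
grep -A 10 '\[serializers\]' /etc/cobbler/modules.conf   # which backend each collection is configured to use
ls -l /var/lib/cobbler/config/distros.d/                 # with serializer_catalog there should be one JSON file per distro
# flag any JSON file that no longer parses; an empty or truncated file would break loading of the 'distro' collection
for f in /var/lib/cobbler/config/distros.d/*.json; do
    python -m json.tool "$f" > /dev/null || echo "unparseable: $f"
done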
From: Chee Thao (CW)
Sent: Wednesday, August 27, 2014 10:02 AM
To: Mike Chao
Subject: FW: [Fuel-dev] Additional Slave node is not getting DHCP IP assigned
From: Raghunath Mallina (CW)
Sent: Tuesday, August 26, 2014 11:20 AM
To: Chee Thao (CW)
Cc: Gandhirajan Mariappan (CW)
Subject: FW: [Fuel-dev] Additional Slave node is not getting DHCP IP assigned
Some inputs from the Mirantis team.
Thanks
Raghunath
From: Sergii Golovatiuk [mailto:[email protected]]
Sent: Tuesday, August 26, 2014 11:18 AM
To: Gandhirajan Mariappan (CW)
Cc: [email protected]; Nataraj Mylsamy (CW); Raghunath Mallina (CW)
Subject: Re: [Fuel-dev] Additional Slave node is not getting DHCP IP assigned
Hi Gandhi,
The DHCP server on the master node should definitely be running inside the cobbler container. You may check it with:
dockerctl shell cobbler
cat /etc/dnsmasq.conf
Also, the DHCP relay should be running to allow DHCP traffic to pass through the
master node to the cobbler container. Additionally, you may use tcpdump to trace
traffic from node-3 to the cobbler container.
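For example, something along these lines on the master node (eth0 below is only a placeholder for the admin/PXE interface and will differ per setup):
ps aux | grep -iE 'dhc(p)?relay'           # is the DHCP relay process running?
tcpdump -i eth0 -n -e port 67 or port 68   # watch for the slave's DHCPDISCOVER and any replies from the container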
--
Best regards,
Sergii Golovatiuk,
Skype #golserge
IRC #holser
On Tue, Aug 26, 2014 at 9:41 AM, Gandhirajan Mariappan (CW)
<[email protected]> wrote:
Hi Fuel Dev,
In addition to the 2 slave nodes and 1 master node attached to the Brocade VDX
device, we are attaching 1 more slave node (slave node 3). We have enabled the
PXE boot setting on the 3rd slave node and tried the setup below, but the DHCP
server on the master node is not assigning an IP to slave node 3. Kindly let us
know what the problem could be. Is there any way to make the master node's DHCP
server assign an IP to slave node 3?
“We have set up a test DHCP server on a new CentOS server. This CentOS server
was given the same port on the VDX switch as the Mirantis Fuel master node. The
DHCP server on the CentOS machine was able to hand out a DHCP IP to the 3rd
node, which had been unable to get a DHCP IP from the Fuel master node.
What this confirms is that the VDX switch is not causing the issue, and that the
DHCP server on the Fuel master node is having trouble giving out leases. The
DHCP server on the Fuel master node appears to be running, but because it is not
a standard DHCP server, and is instead wrapped in custom software from Mirantis,
we currently do not know what we need to adjust to get it working. It looks like
a very tightly coupled system where each component depends on the others.”
Thanks and Regards,
Gandhi Rajan
--
Mailing list: https://launchpad.net/~fuel-dev
Post to : [email protected]
Unsubscribe : https://launchpad.net/~fuel-dev
More help : https://help.launchpad.net/ListHelp