[Expired for OpenStack Compute (nova) because there has been no activity
for 60 days.]
** Changed in: nova
Status: Incomplete => Expired
--
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1619954
Title:
migration gives "Unexpected API Error"
Status in OpenStack Compute (nova):
Expired
Bug description:
I'm trying to move an instance from one Compute host to another:
----- s n i p -----
bladeA01:~# openstack server list --long -f csv -c ID -c Name -c Host --quote none | grep -i bladeA06
ec3810bd-e201-4243-ab67-f1a0801acc0a,devel-sid-5,bladeA06
e67c0719-3c13-45cc-8733-d0e66548d08e,devel-sid-1,bladeA06
d2424223-1707-4976-ac17-c5b766697541,devel-sid-6,bladeA06
bladeA01:~# openstack server migrate --live bladeA05 --shared-migration --wait e67c0719-3c13-45cc-8733-d0e66548d08e
Unexpected API Error. Please report this at http://bugs.launchpad.net/nova/
and attach the Nova API log if possible.
<class 'oslo_messaging.exceptions.MessagingTimeout'> (HTTP 500) (Request-ID:
req-8c00c15f-8453-4402-97f0-511037334ae4)
----- s n i p -----
The nova-api log on the Control node says:
----- s n i p -----
==> /var/log/nova/nova-api.log <==
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions
[req-8c00c15f-8453-4402-97f0-511037334ae4 4b0e25c70d2b4ad6ba4c50250f2f0b0b
04ee0e71babe4fd7aa16c3f64a8fca89 - - -] Unexpected exception in API method
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions Traceback
(most recent call last):
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/nova/api/openstack/extensions.py", line 478,
in wrapped
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions return
f(*args, **kwargs)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/nova/api/validation/__init__.py", line 73, in
wrapper
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions return
func(*args, **kwargs)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/nova/api/validation/__init__.py", line 73, in
wrapper
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions return
func(*args, **kwargs)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/nova/api/openstack/compute/migrate_server.py",
line 93, in _migrate_live
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions
disk_over_commit, host)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/nova/compute/api.py", line 158, in inner
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions return
function(self, context, instance, *args, **kwargs)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/nova/compute/api.py", line 186, in _wrapped
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions return
fn(self, context, instance, *args, **kwargs)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/nova/compute/api.py", line 139, in inner
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions return
f(self, context, instance, *args, **kw)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/nova/compute/api.py", line 3371, in
live_migrate
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions
request_spec=request_spec)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/nova/conductor/api.py", line 194, in
live_migrate_instance
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions
block_migration, disk_over_commit, None, request_spec=request_spec)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/nova/conductor/rpcapi.py", line 309, in
migrate_server
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions return
cctxt.call(context, 'migrate_server', **kw)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 158, in
call
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions
retry=self.retry)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 90, in
_send
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions
timeout=timeout, retry=retry)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
470, in send
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions
retry=retry)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
459, in _send
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions result =
self._waiter.wait(msg_id, timeout)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
342, in wait
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions message
= self.waiters.get(msg_id, timeout=timeout)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
244, in get
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions 'to
message ID %s' % msg_id)
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions
MessagingTimeout: Timed out waiting for a reply to message ID
3cc1c96eb1c34a9aacdf3d41474f8826
2016-09-03 19:18:10.245 8699 ERROR nova.api.openstack.extensions
2016-09-03 19:18:10.247 8699 INFO nova.api.openstack.wsgi
[req-8c00c15f-8453-4402-97f0-511037334ae4 4b0e25c70d2b4ad6ba4c50250f2f0b0b
04ee0e71babe4fd7aa16c3f64a8fca89 - - -] HTTP exception thrown: Unexpected API
Error. Please report this at http://bugs.launchpad.net/nova/ and attach the
Nova API log if possible.
<class 'oslo_messaging.exceptions.MessagingTimeout'>
2016-09-03 19:18:10.248 8699 DEBUG nova.api.openstack.wsgi
[req-8c00c15f-8453-4402-97f0-511037334ae4 4b0e25c70d2b4ad6ba4c50250f2f0b0b
04ee0e71babe4fd7aa16c3f64a8fca89 - - -] Returning 500 to user: Unexpected API
Error. Please report this at http://bugs.launchpad.net/nova/ and attach the
Nova API log if possible.
<class 'oslo_messaging.exceptions.MessagingTimeout'> __call__
/usr/lib/python2.7/dist-packages/nova/api/openstack/wsgi.py:1070
2016-09-03 19:18:10.250 8699 INFO nova.osapi_compute.wsgi.server
[req-8c00c15f-8453-4402-97f0-511037334ae4 4b0e25c70d2b4ad6ba4c50250f2f0b0b
04ee0e71babe4fd7aa16c3f64a8fca89 - - -] 10.0.4.1 "POST
/v2/04ee0e71babe4fd7aa16c3f64a8fca89/servers/e67c0719-3c13-45cc-8733-d0e66548d08e/action
HTTP/1.1" status: 500 len: 447 time: 60.4883082
----- s n i p -----
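Notably, the API request fails after almost exactly one minute ("time: 60.4883082" in the log above), which matches oslo.messaging's default rpc_response_timeout of 60 seconds. As a diagnostic step or temporary workaround, the timeout could be raised in nova.conf on the affected nodes (the value below is only an illustrative example, not a recommendation):

```ini
[DEFAULT]
# Default is 60 seconds; raise it while debugging so the RPC call has
# more time to get a reply. 180 is an arbitrary example value.
rpc_response_timeout = 180
```

That would only paper over whatever is making the conductor/compute RPC round-trip slow, but it can help confirm that the 500 is a plain timeout rather than a crash.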
The nova-compute log file on the destination host:
----- s n i p -----
==> /var/log/nova/nova-compute.log <==
2016-09-03 20:17:04.766 27260 INFO nova.virt.libvirt.driver
[req-8c00c15f-8453-4402-97f0-511037334ae4 4b0e25c70d2b4ad6ba4c50250f2f0b0b
04ee0e71babe4fd7aa16c3f64a8fca89 - - -] Instance launched has CPU info:
{"vendor": "Intel", "model": "Nehalem", "arch": "x86_64", "features": ["pge",
"clflush", "sep", "syscall", "vme", "dtes64", "tsc", "vmx", "xtpr", "cmov",
"nx", "est", "pat", "monitor", "pbe", "lm", "msr", "fpu", "fxsr", "tm",
"sse4.1", "pae", "sse4.2", "acpi", "mmx", "cx8", "mce", "de", "tm2", "ht",
"pse", "pni", "pdcm", "mca", "apic", "sse", "dca", "ds", "invtsc", "lahf_lm",
"rdtscp", "sse2", "ss", "ds_cpl", "ssse3", "cx16", "pse36", "mtrr", "popcnt"],
"topology": {"cores": 4, "cells": 2, "threads": 2, "sockets": 1}}
2016-09-03 20:17:31.367 27260 INFO nova.compute.resource_tracker
[req-532b1b6d-01e9-456f-bc07-81c74b7e4f2e - - - - -] Auditing locally available
compute resources for node bladeA05.domain.tld
2016-09-03 20:17:31.986 27260 INFO nova.compute.resource_tracker
[req-532b1b6d-01e9-456f-bc07-81c74b7e4f2e - - - - -] Total usable vcpus: 16,
total allocated vcpus: 8
2016-09-03 20:17:31.986 27260 INFO nova.compute.resource_tracker
[req-532b1b6d-01e9-456f-bc07-81c74b7e4f2e - - - - -] Final resource view:
name=bladeA05.domain.tld phys_ram=32230MB used_ram=8704MB phys_disk=128GB
used_disk=70GB total_vcpus=16 used_vcpus=8 pci_stats=[]
2016-09-03 20:17:32.055 27260 INFO nova.compute.resource_tracker
[req-532b1b6d-01e9-456f-bc07-81c74b7e4f2e - - - - -] Compute_service record
updated for bladeA05:bladeA05.domain.tld
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
[req-8c00c15f-8453-4402-97f0-511037334ae4 4b0e25c70d2b4ad6ba4c50250f2f0b0b
04ee0e71babe4fd7aa16c3f64a8fca89 - - -] Exception during message handling:
Timed out waiting for a reply to message ID 02c2c768be1644c4bf76f83317bd376e
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher Traceback
(most recent call last):
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 138,
in _dispatch_and_reply
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
incoming.message))
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 185,
in _dispatch
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher return
self._do_dispatch(endpoint, method, ctxt, args)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 127,
in _do_dispatch
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher result
= func(ctxt, **new_args)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/nova/exception.py", line 110, in wrapped
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher payload)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 221, in __exit__
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
self.force_reraise()
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 197, in
force_reraise
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
six.reraise(self.type_, self.value, self.tb)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/nova/exception.py", line 89, in wrapped
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher return
f(self, context, *args, **kw)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 409, in
decorated_function
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher return
function(self, context, *args, **kwargs)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 387, in
decorated_function
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
kwargs['instance'], e, sys.exc_info())
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 221, in __exit__
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
self.force_reraise()
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 197, in
force_reraise
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
six.reraise(self.type_, self.value, self.tb)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 375, in
decorated_function
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher return
function(self, context, *args, **kwargs)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 5136, in
check_can_live_migrate_destination
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
disk_over_commit)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 5152, in
_do_check_can_live_migrate_destination
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
dest_check_data)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/nova/compute/rpcapi.py", line 475, in
check_can_live_migrate_source
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
dest_check_data=dest_check_data)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 158, in
call
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
retry=self.retry)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 90, in
_send
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
timeout=timeout, retry=retry)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
470, in send
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
retry=retry)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
459, in _send
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher result
= self._waiter.wait(msg_id, timeout)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
342, in wait
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher message
= self.waiters.get(msg_id, timeout=timeout)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
244, in get
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher 'to
message ID %s' % msg_id)
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
MessagingTimeout: Timed out waiting for a reply to message ID
02c2c768be1644c4bf76f83317bd376e
2016-09-03 20:18:04.871 27260 ERROR oslo_messaging.rpc.dispatcher
2016-09-03 20:18:04.874 27260 ERROR oslo_messaging._drivers.common
[req-8c00c15f-8453-4402-97f0-511037334ae4 4b0e25c70d2b4ad6ba4c50250f2f0b0b
04ee0e71babe4fd7aa16c3f64a8fca89 - - -] Returning exception Timed out waiting
for a reply to message ID 02c2c768be1644c4bf76f83317bd376e to caller
2016-09-03 20:18:04.874 27260 ERROR oslo_messaging._drivers.common
[req-8c00c15f-8453-4402-97f0-511037334ae4 4b0e25c70d2b4ad6ba4c50250f2f0b0b
04ee0e71babe4fd7aa16c3f64a8fca89 - - -] ['Traceback (most recent call
last):\n', ' File
"/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 138,
in _dispatch_and_reply\n incoming.message))\n', ' File
"/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 185,
in _dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', '
File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line
127, in _do_dispatch\n result = func(ctxt, **new_args)\n', ' File
"/usr/lib/python2.7/dist-packages/nova/exception.py", line 110, in wrapped\n
payload)\n', ' File "/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py",
line 221, in __exit__\n self.force_reraise()\n', ' File
"/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 197, in
force_reraise\n six.reraise(self.type_, self.value, self.tb)\n', ' File
"/usr/lib/python2.7/dist-packages/nova/exception.py", line 89, in wrapped\n
return f(self, context, *args, **kw)\n', ' File
"/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 409, in
decorated_function\n return function(self, context, *args, **kwargs)\n', '
File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 387, in
decorated_function\n kwargs[\'instance\'], e, sys.exc_info())\n', ' File
"/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 221, in
__exit__\n self.force_reraise()\n', ' File
"/usr/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 197, in
force_reraise\n six.reraise(self.type_, self.value, self.tb)\n', ' File
"/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 375, in
decorated_function\n return function(self, context, *args, **kwargs)\n', '
File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 5136, in
check_can_live_migrate_destination\n disk_over_commit)\n', ' File
"/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 5152, in
_do_check_can_live_migrate_destination\n dest_check_data)\n', ' File
"/usr/lib/python2.7/dist-packages/nova/compute/rpcapi.py", line 475, in
check_can_live_migrate_source\n dest_check_data=dest_check_data)\n', ' File
"/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 158, in
call\n retry=self.retry)\n', ' File
"/usr/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 90, in
_send\n timeout=timeout, retry=retry)\n', ' File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
470, in send\n retry=retry)\n', ' File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
459, in _send\n result = self._waiter.wait(msg_id, timeout)\n', ' File
"/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line
342, in wait\n message = self.waiters.get(msg_id, timeout=timeout)\n', '
File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py",
line 244, in get\n \'to message ID %s\' % msg_id)\n', 'MessagingTimeout:
Timed out waiting for a reply to message ID 02c2c768be1644c4bf76f83317bd376e\n']
2016-09-03 20:18:07.552 27260 INFO oslo_messaging._drivers.amqpdriver [-] No
calling threads waiting for msg_id : 02c2c768be1644c4bf76f83317bd376e
----- s n i p -----
And on the source host:
----- s n i p -----
==> /var/log/nova/nova-compute.log <==
2016-09-03 19:17:47.518 26517 INFO nova.compute.resource_tracker
[req-ec8339f1-fc37-42a4-a576-17e9d3fcabf8 - - - - -] Auditing locally available
compute resources for node bladeA06.domain.tld
2016-09-03 19:17:48.018 26517 INFO nova.compute.resource_tracker
[req-ec8339f1-fc37-42a4-a576-17e9d3fcabf8 - - - - -] Total usable vcpus: 16,
total allocated vcpus: 6
2016-09-03 19:17:48.019 26517 INFO nova.compute.resource_tracker
[req-ec8339f1-fc37-42a4-a576-17e9d3fcabf8 - - - - -] Final resource view:
name=bladeA06.domain.tld phys_ram=32165MB used_ram=12800MB phys_disk=128GB
used_disk=120GB total_vcpus=16 used_vcpus=6 pci_stats=[]
2016-09-03 19:17:48.082 26517 INFO nova.compute.resource_tracker
[req-ec8339f1-fc37-42a4-a576-17e9d3fcabf8 - - - - -] Compute_service record
updated for bladeA06:bladeA06.domain.tld
----- s n i p -----
I suspect that this has something to do with over-provisioning. I
could not create my latest stack (8 instances, each with 2 VCPUs and
4GB), so I had to start up another Compute node. But I don't want to
keep that node running at the moment, which is why I'm trying to
migrate those three instances (6 VCPUs, 13GB in total) to another host.
Looking at bladeA0[234] and how over-provisioned they already are, the
instances _should_ fit on bladeA05!
----- s n i p -----
Hostname  Type  VCPUs/used  VCPUs/total  RAM/used  RAM/total  Local Storage/used  Local Storage/total  Instances
bladeA02  QEMU  20          16           30.0GB    31.5GB     250GB               128GB                15
bladeA03  QEMU  20          16           28.5GB    31.5GB     250GB               128GB                13
bladeA04  QEMU  19          16           27.5GB    31.5GB     255GB               128GB                11
bladeA05  QEMU  8           16           8.5GB     31.5GB     70GB                128GB                5
bladeA06  QEMU  6           16           12.5GB    31.4GB     120GB               128GB                3
----- s n i p -----
I have configured:
nova.conf:DEFAULT/cpu_allocation_ratio=8.0
nova.conf:DEFAULT/ram_allocation_ratio=1.0
nova.conf:DEFAULT/disk_allocation_ratio=3.0
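With these ratios, a back-of-the-envelope check (using the bladeA05 figures from the resource tracker log above, and assuming the three instances' combined disk footprint equals bladeA06's 120GB used_disk) suggests the migration should be schedulable. This only mirrors the scheduler's basic "used + requested <= total * ratio" test as an illustration, not nova's actual filter code:

```python
# Configured allocation ratios (from nova.conf above).
cpu_ratio, ram_ratio, disk_ratio = 8.0, 1.0, 3.0

# bladeA05 resources, from the resource tracker log.
phys_vcpus, used_vcpus = 16, 8
phys_ram_mb, used_ram_mb = 32230, 8704
phys_disk_gb, used_disk_gb = 128, 70

# The three instances to migrate: 6 VCPUs, 13GB RAM; disk footprint
# assumed equal to bladeA06's used_disk (120GB).
req_vcpus, req_ram_mb, req_disk_gb = 6, 13 * 1024, 120

fits_cpu = used_vcpus + req_vcpus <= phys_vcpus * cpu_ratio     # 14 <= 128
fits_ram = used_ram_mb + req_ram_mb <= phys_ram_mb * ram_ratio  # 22016 <= 32230
fits_disk = used_disk_gb + req_disk_gb <= phys_disk_gb * disk_ratio  # 190 <= 384

print(fits_cpu, fits_ram, fits_disk)  # True True True
```

So by these numbers the host has headroom on all three axes, which points away from capacity and back at the RPC timeout itself.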
On my current/old setup (VirtualBox) I've had bad experiences with
over-provisioned memory, hence the 1.0 ram allocation ratio.
Apparently this isn't (much of) a problem with qemu, so I might bump
that later. But even so, VCPU, RAM and disk are all (or should all be)
sufficient!
For disk, I'm using a remote Cinder SAN (ZFS On Linux), sharing the
volume to the host via iSCSI.
Something I've always wondered about: why is "Local Storage/total"
reported as 128GB? The Compute hosts have two local disks, one for the
OS (128GB usable FS) and one for Swift (137GB usable FS).