Reviewed:  https://review.openstack.org/544772
Committed: 
https://git.openstack.org/cgit/openstack/nova/commit/?id=1bd749288c12c83c2c48f5a6bf9a0b5fc9f682c7
Submitter: Zuul
Branch:    master

commit 1bd749288c12c83c2c48f5a6bf9a0b5fc9f682c7
Author: Hironori Shiina <shiina.hiron...@jp.fujitsu.com>
Date:   Thu Feb 15 12:13:23 2018 +0900

    ironic: Clean up resources after unprovision fails
    
    When deleting a bare metal instance, network resources are
    deallocated even if ironic fails to unprovision the node. The VIFs
    are then removed on the nova side even though they were never
    detached from the ironic node.
    
    This patch cleans up resources nova prepares for ironic even if
    unprovision fails.
    
    Change-Id: I01f45f157078a92ccc9f4ff5b4bfeae6cef77c1a
    Closes-Bug: 1749629
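
The gist of the fix is to run the driver's network/VIF cleanup even when
unprovisioning raises. Below is a minimal sketch of that pattern in the nova
ironic driver (simplified; helper names such as _cleanup_deploy and
_UNPROVISION_STATES follow the driver's code of that era, and the actual
patch may differ in detail):

    def destroy(self, context, instance, network_info,
                block_device_info=None, destroy_disks=True):
        # Sketch only, not the verbatim patch.
        node = self._validate_instance_and_node(instance)
        try:
            if node.provision_state in _UNPROVISION_STATES:
                self._unprovision(instance, node)
            else:
                self._remove_instance_info_from_node(node, instance)
        finally:
            # Detach VIFs and free the nova-side resources whether or not
            # the unprovision call above succeeded, so a failed unprovision
            # no longer leaves stale VIF attachments on the node.
            self._cleanup_deploy(node, instance, network_info)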


** Changed in: nova
       Status: In Progress => Fix Released

https://bugs.launchpad.net/bugs/1749629

Title:
  VIFs are not detached from ironic node when unprovision fails

Status in OpenStack Compute (nova):
  Fix Released

Bug description:
  Description
  ===========
  When deleting a bare metal instance, network resources are deallocated
  even if ironic fails to unprovision the node. The VIFs are then removed
  on the nova side even though they are never detached from the ironic
  node. The instance ends up in the ERROR state and can be deleted by
  requesting deletion again, without any further call to ironic. On the
  ironic side, the node enters the 'error' provision state after the
  unprovision failure. Although the node can be recovered by undeploying
  it through the ironic API, the VIF UUIDs still remain attached to the
  node. This causes an error the next time the node is deployed, because
  ironic tries to bind the node to the now-deleted VIFs. VIFs should be
  detached even if unprovisioning fails.

  Steps to reproduce
  ==================
  * Create an instance.
  * Delete the instance; the deletion fails due to an ironic bug and the
    ironic node enters the 'error' provision state.
  * Recover the ironic node with:
      openstack baremetal node undeploy <node>
  * Create an instance again; the scheduler picks the previously used
    node (see the command sequence below).
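
  For reference, the reproduction maps to a command sequence like the
  following (illustrative only; the image, flavor and network names are
  placeholders):

      $ openstack server create --image <image> --flavor <bm-flavor> \
            --network <network> test1
      $ openstack server delete test1
        # deletion fails; the node ends up in provision state 'error'
      $ openstack baremetal node undeploy <node>
      $ openstack server create --image <image> --flavor <bm-flavor> \
            --network <network> test2
        # the scheduler picks the recovered node and the deploy fails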

  Expected result
  ===============
  The second instance is deployed successfully on the recovered node:
  any VIF attachments left over from the failed deletion are cleaned up,
  so ironic can bind the new instance's VIFs.

  Actual result
  =============
  The second deployment fails. The VIF UUIDs of the deleted instance are
  still attached to the ironic node, so ironic tries to set
  binding:host_id on the already-deleted neutron ports and the deploy
  aborts with "Port ... could not be found" (see the log below).
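
  The stale attachments can be confirmed on the node before retrying the
  deploy (VIF operations require a recent baremetal API microversion;
  they were introduced in 1.28):

      $ openstack baremetal node vif list <node>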

  Environment
  ===========
  Ironic deployed with DevStack[1], using the master branch.

  [1] https://docs.openstack.org/ironic/latest/contributor/dev-quickstart.html#deploying-ironic-with-devstack

  Logs & Configs
  ==============

  n-cpu log when undeploy failed
  -------------------------------
  Feb 14 21:24:51 compute nova-compute[17722]: INFO nova.compute.manager [None req-420e3a3a-3af4-4662-877e-d19fd24ea036 service nova] [instance: 651f3a00-6074-4346-a229-97e8874eed36] Successfully reverted task state from deleting on failure for instance.
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server [None req-420e3a3a-3af4-4662-877e-d19fd24ea036 service nova] Exception during message handling: NovaException: Error destroying the instance on node 8a2eea80-d53a-419e-a56c-9981f248cf91. Provision state still 'deleting'.
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 163, in _process_incoming
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 220, in dispatch
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 190, in _do_dispatch
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 76, in wrapped
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     function_name, call_dict, binary)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 67, in wrapped
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 186, in decorated_function
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     "Error: %s", e, instance=instance)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 156, in decorated_function
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/utils.py", line 970, in decorated_function
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 214, in decorated_function
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     kwargs['instance'], e, sys.exc_info())
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 202, in decorated_function
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2540, in terminate_instance
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     do_terminate_instance(instance, bdms)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py", line 274, in inner
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2538, in do_terminate_instance
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     self._set_instance_obj_error_state(context, instance)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2528, in do_terminate_instance
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     self._delete_instance(context, instance, bdms)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/hooks.py", line 154, in inner
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     rv = f(*args, **kwargs)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2463, in _delete_instance
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     self._shutdown_instance(context, instance, bdms)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2368, in _shutdown_instance
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     requested_networks)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     self.force_reraise()
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, self.tb)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2355, in _shutdown_instance
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     block_device_info)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/ironic/driver.py", line 1216, in destroy
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     self._unprovision(instance, node)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/ironic/driver.py", line 1189, in _unprovision
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     timer.start(interval=CONF.ironic.api_retry_interval).wait()
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/eventlet/event.py", line 121, in wait
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     return hubs.get_hub().switch()
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/eventlet/hubs/hub.py", line 294, in switch
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     return self.greenlet.switch()
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py", line 137, in _run_loop
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     result = func(*self.args, **self.kw)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/ironic/driver.py", line 1181, in _wait_for_provision_state
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server     raise exception.NovaException(msg)
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server NovaException: Error destroying the instance on node 8a2eea80-d53a-419e-a56c-9981f248cf91. Provision state still 'deleting'.
  Feb 14 21:24:52 compute nova-compute[17722]: ERROR oslo_messaging.rpc.server

  n-cpu log when re-create failed
  -------------------------------

  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6] Traceback (most recent call last):
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]   File "/opt/stack/nova/nova/compute/manager.py", line 2239, in _build_resources
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]     yield resources
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]   File "/opt/stack/nova/nova/compute/manager.py", line 2022, in _build_and_run_instance
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]     block_device_info=block_device_info)
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]   File "/opt/stack/nova/nova/virt/ironic/driver.py", line 1133, in spawn
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]     'node': node_uuid})
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in __exit__
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]     self.force_reraise()
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]   File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in force_reraise
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]     six.reraise(self.type_, self.value, self.tb)
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]   File "/opt/stack/nova/nova/virt/ironic/driver.py", line 1125, in spawn
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]     timer.start(interval=CONF.ironic.api_retry_interval).wait()
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]   File "/usr/local/lib/python2.7/dist-packages/eventlet/event.py", line 121, in wait
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]     return hubs.get_hub().switch()
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]   File "/usr/local/lib/python2.7/dist-packages/eventlet/hubs/hub.py", line 294, in switch
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]     return self.greenlet.switch()
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]   File "/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py", line 137, in _run_loop
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]     result = func(*self.args, **self.kw)
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]   File "/opt/stack/nova/nova/virt/ironic/driver.py", line 508, in _wait_for_active
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6]     raise exception.InstanceDeployFailure(msg)
  Feb 14 21:48:24 compute nova-compute[17722]: ERROR nova.compute.manager [instance: 8d518628-3235-4e24-a4e9-492159d7fae6] InstanceDeployFailure: Failed to provision instance 8d518628-3235-4e24-a4e9-492159d7fae6: Failed to prepare to deploy: Unable to set binding:host_id for neutron port 90eac274-e2e1-43a9-86eb-d81d784aa652. Error: Port 90eac274-e2e1-43a9-86eb-d81d784aa652 could not be found.

