Public bug reported:
I am observing StaleDataError on stable/pike during live migration,
causing live migration to fail. It occurs when attempting to
live-migrate a handful of VMs (5-6 VMs is all it takes) in rapid
succession from the same source to the same target. This quick and
dirty script is able to make the issue appear reliably:

for i in `openstack server list --all-projects --host <origin> -c ID -f value`; do openstack server migrate $i --live <target>; done
From the neutron server logs:
DB exceeded retry limit.: StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api Traceback (most recent call last):
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_db/api.py", line 138, in wrapper
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     return f(*args, **kwargs)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/neutron/db/api.py", line 128, in wrapped
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     LOG.debug("Retry wrapper got retriable exception: %s", e)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_utils/excutils.py", line 220, in __exit__
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     self.force_reraise()
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_utils/excutils.py", line 196, in force_reraise
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     six.reraise(self.type_, self.value, self.tb)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/neutron/db/api.py", line 124, in wrapped
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     return f(*dup_args, **dup_kwargs)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 1346, in update_port
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     mech_context, attrs)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/neutron/plugins/ml2/plugin.py", line 354, in _process_port_binding
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     db.clear_binding_levels(plugin_context, port_id, original_host)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 979, in wrapper
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     return fn(*args, **kwargs)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/usr/lib64/python2.7/contextlib.py", line 24, in __exit__
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     self.gen.next()
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 1029, in _transaction_scope
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     yield resource
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/usr/lib64/python2.7/contextlib.py", line 24, in __exit__
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     self.gen.next()
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/oslo_db/sqlalchemy/enginefacade.py", line 655, in _session
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     self.session.flush()
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 2171, in flush
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     self._flush(objects)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 2291, in _flush
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     transaction.rollback(_capture_exception=True)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/sqlalchemy/util/langhelpers.py", line 66, in __exit__
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     compat.reraise(exc_type, exc_value, exc_tb)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/sqlalchemy/orm/session.py", line 2255, in _flush
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     flush_context.execute()
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 389, in execute
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     rec.execute(self)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/sqlalchemy/orm/unitofwork.py", line 548, in execute
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     uow
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 177, in save_obj
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     mapper, table, update)
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api   File "/opt/stack/venv/neutron-20181030T130300Z/lib/python2.7/site-packages/sqlalchemy/orm/persistence.py", line 760, in _emit_update_statements
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api     (table.description, len(records), rows))
2019-01-24 09:57:34.959 255478 ERROR oslo_db.api StaleDataError: UPDATE statement on table 'standardattributes' expected to update 1 row(s); 0 were matched.
It appears that somewhere in clear_binding_levels() there is contention
on the update to the standardattributes record for the port. I can
confirm this only on stable/pike; I have not yet had a chance to check
whether master is affected.
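For context on the failure mode: as I understand it, this is SQLAlchemy's
optimistic locking at work. Neutron tracks a revision_number on each
standardattributes row, so the ORM emits UPDATE ... WHERE id = ? AND
revision_number = ?; when a concurrent transaction has already bumped the
revision, zero rows match and StaleDataError is raised. A minimal
stdlib-only sketch of that race (plain sqlite3 and made-up values standing
in for the real schema):

```python
# Sketch of the optimistic-locking race behind StaleDataError, using
# sqlite3 as a stand-in for neutron's versioned standardattributes rows.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE standardattributes (id INTEGER PRIMARY KEY, revision_number INTEGER)"
)
conn.execute("INSERT INTO standardattributes VALUES (1, 5)")

# Writer A (e.g. the port-binding update) reads the row at revision 5.
seen_revision = conn.execute(
    "SELECT revision_number FROM standardattributes WHERE id = 1"
).fetchone()[0]

# Writer B (a concurrent update to the same port) bumps the revision first.
conn.execute(
    "UPDATE standardattributes SET revision_number = revision_number + 1 WHERE id = 1"
)

# Writer A's guarded UPDATE now matches 0 rows; this is the condition the
# ORM reports as "expected to update 1 row(s); 0 were matched".
cur = conn.execute(
    "UPDATE standardattributes SET revision_number = ? "
    "WHERE id = 1 AND revision_number = ?",
    (seen_revision + 1, seen_revision),
)
print(cur.rowcount)  # 0 -> SQLAlchemy would raise StaleDataError here
```

That would also explain why the rapid-succession loop triggers it: several
migrations touching ports on the same pair of hosts maximize the chance of
two transactions racing on the same row, and once the retry wrapper's limit
is exhausted the port update, and with it the migration, fails.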
** Affects: neutron
Importance: Undecided
Status: New
--
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to neutron.
https://bugs.launchpad.net/bugs/1813817
Title:
Max retries exceeded with StaleDataError in standardattributes during
live migration
Status in neutron:
New
To manage notifications about this bug go to:
https://bugs.launchpad.net/neutron/+bug/1813817/+subscriptions
--
Mailing list: https://launchpad.net/~yahoo-eng-team
Post to : [email protected]
Unsubscribe : https://launchpad.net/~yahoo-eng-team
More help : https://help.launchpad.net/ListHelp