We ran into the same error as well:
2014-12-08 11:08:33.315 47268 INFO neutron.wsgi [-] (47268) accepted
('10.23.70.28', 48899)
2014-12-08 11:08:33.315 47268 INFO neutron.wsgi [-] 10.23.70.28 - -
[08/Dec/2014 11:08:33] "GET / HTTP/1.0" 200 262 0.000255
2014-12-08 11:08:33.985 47264 ERROR neutron.api.v2.resource
[req-98012a5a-2a0d-4052-a30f-59d0e99c5e39 None] index failed
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource Traceback (most
recent call last):
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/neutron/api/v2/resource.py",
line 87, in resource
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource result =
method(request=request, **args)
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/neutron/api/v2/base.py",
line 308, in index
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource return
self._items(request, True, parent_id)
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/neutron/api/v2/base.py",
line 242, in _items
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource obj_list =
obj_getter(request.context, **kwargs)
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/neutron/db/db_base_plugin_v2.py",
line 1419, in get_ports
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource items =
[self._make_port_dict(c, fields) for c in query]
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py",
line 2438, in __iter__
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource return
self._execute_and_instances(context)
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py",
line 2451, in _execute_and_instances
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource
close_with_result=True)
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/sqlalchemy/orm/query.py",
line 2442, in _connection_from_session
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource **kw)
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
line 854, in connection
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource
close_with_result=close_with_result)
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/sqlalchemy/orm/session.py",
line 860, in _connection_for_bind
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource return
engine.contextual_connect(**kwargs)
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/sqlalchemy/engine/base.py",
line 1798, in contextual_connect
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource
self.pool.connect(),
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/sqlalchemy/pool.py",
line 338, in connect
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource return
_ConnectionFairy._checkout(self)
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/sqlalchemy/pool.py",
line 644, in _checkout
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource fairy =
_ConnectionRecord.checkout(pool)
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/sqlalchemy/pool.py",
line 440, in checkout
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource rec =
pool._do_get()
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/sqlalchemy/pool.py",
line 959, in _do_get
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource (self.size(),
self.overflow(), self._timeout))
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource TimeoutError:
QueuePool limit of size 10 overflow 20 reached, connection timed out, timeout 10
2014-12-08 11:08:33.985 47264 TRACE neutron.api.v2.resource
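For anyone comparing notes: here is a minimal, standalone sketch (our own illustration, not Neutron code) of how SQLAlchemy's QueuePool ends up raising exactly this TimeoutError. Once every pooled and overflow connection has been checked out and never returned, which is effectively what happens when API workers leak or hold on to connections, the next checkout waits pool_timeout seconds and then fails:

from sqlalchemy import create_engine
from sqlalchemy.pool import QueuePool

# In-memory SQLite is only used so the sketch runs anywhere; Neutron talks
# to MySQL, but the pooling behaviour is the same.
engine = create_engine(
    "sqlite://",
    poolclass=QueuePool,
    pool_size=10,     # "size 10" in the traceback
    max_overflow=20,  # "overflow 20"
    pool_timeout=10,  # "timeout 10"
)

# Hold 10 + 20 = 30 connections without ever closing them.
held = [engine.connect() for _ in range(30)]

# The 31st checkout blocks for pool_timeout seconds and then raises
# sqlalchemy.exc.TimeoutError: QueuePool limit of size 10 overflow 20
# reached, connection timed out, timeout 10
engine.connect()

So the traceback above is only the symptom; the real question is what is holding all 30 connections inside neutron-server.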
While this error is occurring, we cannot create or delete any networks. Once we restart neutron-server, everything works again.
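As a stop-gap only (it does not address whatever is leaking the connections), the pool limits can be raised in the [database] section of neutron.conf. The option names below are the oslo.db ones we believe apply to this Juno-era build, and the values are just examples, so treat the snippet as an assumption to check against your own config reference:

[database]
# Defaults are 10 / 20 / 10, i.e. exactly the limits shown in the traceback.
max_pool_size = 30
max_overflow = 60
pool_timeout = 30

A neutron-server restart is still needed for the new limits to take effect, which matches what we see anyway: restarting clears the condition.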
In our case, there is also another exception in the logs shortly before this one:
2014-12-08 11:05:40.284 47019 DEBUG oslo.messaging._drivers.amqp [-] unpacked
context: {u'tenant': None, u'project_name': None, u'user_id': None, u'roles':
[u'admin'], u'tenant_id': None, u'auth_token': '<SANITIZED>', u'timestamp':
u'2014-11-28 12:28:43.459391', u'is_admin': True, u'user': None, u'request_id':
u'req-925a27e0-9674-4c2d-bf17-f05c4135d46b', u'tenant_name': None,
u'project_id': None, u'user_name': None, u'read_deleted': u'no'} _safe_log
/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/oslo/messaging/_drivers/common.py:177
2014-12-08 11:05:40.285 47019 DEBUG neutron.common.rpc [-] Incoming RPC:
ctxt:{u'tenant': None, u'project_name': None, u'user_id': None, u'roles':
[u'admin'], u'tenant_id': None, u'auth_token': '***', u'timestamp':
u'2014-11-28 12:28:43.459391', u'is_admin': True, u'user': None, u'request_id':
u'req-925a27e0-9674-4c2d-bf17-f05c4135d46b', u'tenant_name': None,
u'project_id': None, u'user_name': None, u'read_deleted': u'no'}
message:{u'args': {u'host':
u'overcloud-ce-novacompute1-novacompute1-xiyzwwruqcch'}, u'version': u'1.0',
u'method': u'get_external_network_id'} __call__
/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/neutron/common/rpc.py:115
2014-12-08 11:05:40.285 47019 DEBUG neutron.context
[req-925a27e0-9674-4c2d-bf17-f05c4135d46b None] Arguments dropped when creating
context: {u'project_name': None, u'tenant': None} __init__
/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/neutron/context.py:83
2014-12-08 11:05:40.309 47019 ERROR oslo.messaging.rpc.dispatcher
[req-925a27e0-9674-4c2d-bf17-f05c4135d46b ] Exception during message handling:
More than one external network exists
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher Traceback
(most recent call last):
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py",
line 134, in _dispatch_and_reply
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher
incoming.message))
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py",
line 177, in _dispatch
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher return
self._do_dispatch(endpoint, method, ctxt, args)
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py",
line 123, in _do_dispatch
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher result =
getattr(endpoint, method)(ctxt, **new_args)
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/neutron/db/l3_rpc_base.py",
line 137, in get_external_network_id
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher net_id =
self.plugin.get_external_network_id(context)
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/neutron/db/external_net_db.py",
line 161, in get_external_network_id
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher raise
n_exc.TooManyExternalNetworks()
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher
TooManyExternalNetworks: More than one external network exists
2014-12-08 11:05:40.309 47019 TRACE oslo.messaging.rpc.dispatcher
2014-12-08 11:05:40.310 47019 ERROR oslo.messaging._drivers.common
[req-925a27e0-9674-4c2d-bf17-f05c4135d46b ] Returning exception More than one
external network exists to caller
2014-12-08 11:05:40.310 47019 ERROR oslo.messaging._drivers.common
[req-925a27e0-9674-4c2d-bf17-f05c4135d46b ] ['Traceback (most recent call
last):\n', ' File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py",
line 134, in _dispatch_and_reply\n incoming.message))\n', ' File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py",
line 177, in _dispatch\n return self._do_dispatch(endpoint, method, ctxt,
args)\n', ' File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py",
line 123, in _do_dispatch\n result = getattr(endpoint, method)(ctxt,
**new_args)\n', ' File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/neutron/db/l3_rpc_base.py",
line 137, in get_external_network_id\n net_id =
self.plugin.get_external_network_id(context)\n', ' File
"/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/neutron/db/external_net_db.
py", line 161, in get_external_network_id\n raise
n_exc.TooManyExternalNetworks()\n', 'TooManyExternalNetworks: More than one
external network exists\n']
2014-12-08 11:05:40.311 47019 DEBUG oslo.messaging._drivers.amqp
[req-925a27e0-9674-4c2d-bf17-f05c4135d46b ] UNIQUE_ID is
57129fb198d54b9187ef23ee7a4a662b. _add_unique_id
/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/oslo/messaging/_drivers/amqp.py:246
2014-12-08 11:05:40.312 47019 DEBUG oslo.messaging._drivers.amqp
[req-925a27e0-9674-4c2d-bf17-f05c4135d46b ] UNIQUE_ID is
966e7537306d47d6b9e13be8b49ad63e. _add_unique_id
/opt/stack/venvs/openstack/local/lib/python2.7/site-packages/oslo/messaging/_drivers/amqp.py:246
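On this second trace: it is the L3 agent's get_external_network_id RPC, which neutron-server answers by looking for networks flagged router:external and refusing when it finds more than one. A standalone sketch of that check (assumed data and simplified logic, not the actual neutron/db/external_net_db.py source) is:

class TooManyExternalNetworks(Exception):
    # Mirrors the Neutron exception returned to the agent in the log above.
    pass

def get_external_network_id(networks):
    # The agent only gets an unambiguous answer when zero or one network
    # is marked as external.
    external = [net for net in networks if net.get("router:external")]
    if len(external) > 1:
        raise TooManyExternalNetworks("More than one external network exists")
    return external[0]["id"] if external else None

# With two external networks defined, the call fails exactly like the trace:
nets = [
    {"id": "net-1", "router:external": True},
    {"id": "net-2", "router:external": True},
]
get_external_network_id(nets)  # raises TooManyExternalNetworks

If having several external networks is intentional in this deployment, our understanding (not verified on this exact build) is that each L3 agent can be pinned to one of them via gateway_external_network_id in l3_agent.ini, so the agent no longer needs to make this RPC at all.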
Bug: https://bugs.launchpad.net/neutron/+bug/1384108
Title: Exception during message handling: QueuePool limit of size 10 overflow 20 reached, connection timed out, timeout 10