** Description changed:

  Starting from Friday, the 14th of October, our CI jobs with the Generic driver
- started failing with traces [1].
+ (Cinder as backend) started failing with traces [1].
  
  Investigation showed that the problem appears when we get a volume from Cinder
with "in-use" status but an empty attachments list.
  Nova considers the volume attached, but Cinder does not, and it is indeed not
attached.
  It is not a deterministic bug; it is a concurrency (race-condition) bug. If
anything can be done on the Manila side, it is to wait some time after each
operation we perform...
  
  [1]
  raw:
  ERROR oslo_messaging.rpc.server [req-7739192f-6bb8-470c-a094-a8f1cfa51d2c 
8ce864aff2194e48a6d0f1f1e1e4d79b 68a2d42bc83542c3b8aedb893461bc2d - - -] 
Exception during message handling
  ERROR oslo_messaging.rpc.server Traceback (most recent call last):
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 
133, in _process_incoming
  ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 
150, in dispatch
  ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, 
method, ctxt, args)
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 
121, in _do_dispatch
  ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/manager.py", line 163, in wrapped
  ERROR oslo_messaging.rpc.server     return f(self, *args, **kwargs)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/utils.py", line 571, in wrapper
  ERROR oslo_messaging.rpc.server     return func(self, *args, **kwargs)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/manager.py", line 1361, in 
create_share_instance
  ERROR oslo_messaging.rpc.server     {'status': constants.STATUS_ERROR}
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in 
__exit__
  ERROR oslo_messaging.rpc.server     self.force_reraise()
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in 
force_reraise
  ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, 
self.tb)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/manager.py", line 1332, in 
create_share_instance
  ERROR oslo_messaging.rpc.server     context, share_instance, 
share_server=share_server)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 117, in wrap
  ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kwargs)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 231, in 
create_share
  ERROR oslo_messaging.rpc.server     self._format_device(server_details, 
volume)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 270, in 
_format_device
  ERROR oslo_messaging.rpc.server     self._ssh_exec(server_details, command)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 169, in _ssh_exec
  ERROR oslo_messaging.rpc.server     check_exit_code=check_exit_code)
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py", line 
513, in ssh_execute
  ERROR oslo_messaging.rpc.server     cmd=sanitized_cmd)
  ERROR oslo_messaging.rpc.server ProcessExecutionError: Unexpected error while 
running command.
  ERROR oslo_messaging.rpc.server Command: sudo mkfs.ext4
  ERROR oslo_messaging.rpc.server Exit code: 1
  ERROR oslo_messaging.rpc.server Stdout: u''
  ERROR oslo_messaging.rpc.server Stderr: u'Usage: mkfs.ext4 [-c|-l filename] 
[-b block-size] [-C cluster-size]\n\t[-i bytes-per-inode] [-I inode-size] [-J 
journal-options]\n\t[-G flex-group-size] [-N number-of-inodes]\n\t[-m 
reserved-blocks-percentage] [-o creator-os]\n\t[-g blocks-per-group] [-L 
volume-label] [-M last-mounted-directory]\n\t[-O feature[,...]] [-r 
fs-revision] [-E extended-option[,...]]\n\t[-t fs-type] [-T usage-type ] [-U 
UUID] [-jnqvDFKSV] device [blocks-count]\n'
  
  logs:
  
http://logs.openstack.org/64/386364/1/check/gate-manila-tempest-dsvm-postgres-generic-singlebackend-ubuntu-xenial-nv/eef11b0/logs/screen-m-shr.txt.gz?level=TRACE#_2016-10-14_15_15_19_898

** Also affects: cinder
   Importance: Undecided
       Status: New

** Also affects: nova
   Importance: Undecided
       Status: New

** Description changed:

  Starting from Friday, the 14th of October, our CI jobs with the Generic driver
  (Cinder as backend) started failing with traces [1].
  
  Investigation showed that the problem appears when we get a volume from Cinder
with "in-use" status but an empty attachments list.
  Nova considers the volume attached, but Cinder does not, and it is indeed not
attached.
+ On top of that, it is impossible to detach an "in-use" volume.
  It is not a deterministic bug; it is a concurrency (race-condition) bug. If
anything can be done on the Manila side, it is to wait some time after each
operation we perform...
  
  [1]
  raw:
  ERROR oslo_messaging.rpc.server [req-7739192f-6bb8-470c-a094-a8f1cfa51d2c 
8ce864aff2194e48a6d0f1f1e1e4d79b 68a2d42bc83542c3b8aedb893461bc2d - - -] 
Exception during message handling
  ERROR oslo_messaging.rpc.server Traceback (most recent call last):
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 
133, in _process_incoming
  ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 
150, in dispatch
  ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, 
method, ctxt, args)
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 
121, in _do_dispatch
  ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/manager.py", line 163, in wrapped
  ERROR oslo_messaging.rpc.server     return f(self, *args, **kwargs)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/utils.py", line 571, in wrapper
  ERROR oslo_messaging.rpc.server     return func(self, *args, **kwargs)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/manager.py", line 1361, in 
create_share_instance
  ERROR oslo_messaging.rpc.server     {'status': constants.STATUS_ERROR}
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in 
__exit__
  ERROR oslo_messaging.rpc.server     self.force_reraise()
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in 
force_reraise
  ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, 
self.tb)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/manager.py", line 1332, in 
create_share_instance
  ERROR oslo_messaging.rpc.server     context, share_instance, 
share_server=share_server)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 117, in wrap
  ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kwargs)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 231, in 
create_share
  ERROR oslo_messaging.rpc.server     self._format_device(server_details, 
volume)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 270, in 
_format_device
  ERROR oslo_messaging.rpc.server     self._ssh_exec(server_details, command)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 169, in _ssh_exec
  ERROR oslo_messaging.rpc.server     check_exit_code=check_exit_code)
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py", line 
513, in ssh_execute
  ERROR oslo_messaging.rpc.server     cmd=sanitized_cmd)
  ERROR oslo_messaging.rpc.server ProcessExecutionError: Unexpected error while 
running command.
  ERROR oslo_messaging.rpc.server Command: sudo mkfs.ext4
  ERROR oslo_messaging.rpc.server Exit code: 1
  ERROR oslo_messaging.rpc.server Stdout: u''
  ERROR oslo_messaging.rpc.server Stderr: u'Usage: mkfs.ext4 [-c|-l filename] 
[-b block-size] [-C cluster-size]\n\t[-i bytes-per-inode] [-I inode-size] [-J 
journal-options]\n\t[-G flex-group-size] [-N number-of-inodes]\n\t[-m 
reserved-blocks-percentage] [-o creator-os]\n\t[-g blocks-per-group] [-L 
volume-label] [-M last-mounted-directory]\n\t[-O feature[,...]] [-r 
fs-revision] [-E extended-option[,...]]\n\t[-t fs-type] [-T usage-type ] [-U 
UUID] [-jnqvDFKSV] device [blocks-count]\n'
  
  logs:
  
http://logs.openstack.org/64/386364/1/check/gate-manila-tempest-dsvm-postgres-generic-singlebackend-ubuntu-xenial-nv/eef11b0/logs/screen-m-shr.txt.gz?level=TRACE#_2016-10-14_15_15_19_898

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1633535

Title:
  Generic driver fails to attach Cinder volume to Nova VM

Status in Cinder:
  New
Status in Manila:
  In Progress
Status in OpenStack Compute (nova):
  New

Bug description:
  Starting from Friday, the 14th of October, our CI jobs with the Generic driver
  (Cinder as backend) started failing with traces [1].

  Investigation showed that the problem appears when we get a volume from Cinder
with "in-use" status but an empty attachments list.
  Nova considers the volume attached, but Cinder does not, and it is indeed not
attached.
  On top of that, it is impossible to detach an "in-use" volume.
  It is not a deterministic bug; it is a concurrency (race-condition) bug. If
anything can be done on the Manila side, it is to wait some time after each
operation we perform...
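
  The race described above suggests a Manila-side mitigation: poll Cinder
until the volume reports both "in-use" status and a non-empty attachments
list before treating it as attached. A minimal sketch (the function name
and the get_volume callable are hypothetical stand-ins, not actual Manila
or cinderclient code):

  ```python
  import time


  def wait_for_attachment(get_volume, volume_id, timeout=60, interval=2):
      """Poll until the volume is 'in-use' AND has attachments.

      get_volume is any callable returning a dict with 'status' and
      'attachments' keys (e.g. a wrapper around a Cinder volume show).
      """
      deadline = time.monotonic() + timeout
      while time.monotonic() < deadline:
          volume = get_volume(volume_id)
          # The race: status may flip to 'in-use' before the attachments
          # list is populated, so require both before proceeding.
          if volume['status'] == 'in-use' and volume['attachments']:
              return volume
          time.sleep(interval)
      raise RuntimeError('timed out waiting for volume %s attachment'
                        % volume_id)
  ```

  This is only a workaround for the symptom; the underlying fix would be
for Nova/Cinder to not expose the "in-use, no attachments" intermediate
state.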

  [1]
  raw:
  ERROR oslo_messaging.rpc.server [req-7739192f-6bb8-470c-a094-a8f1cfa51d2c 
8ce864aff2194e48a6d0f1f1e1e4d79b 68a2d42bc83542c3b8aedb893461bc2d - - -] 
Exception during message handling
  ERROR oslo_messaging.rpc.server Traceback (most recent call last):
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/server.py", line 
133, in _process_incoming
  ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 
150, in dispatch
  ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, 
method, ctxt, args)
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 
121, in _do_dispatch
  ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/manager.py", line 163, in wrapped
  ERROR oslo_messaging.rpc.server     return f(self, *args, **kwargs)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/utils.py", line 571, in wrapper
  ERROR oslo_messaging.rpc.server     return func(self, *args, **kwargs)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/manager.py", line 1361, in 
create_share_instance
  ERROR oslo_messaging.rpc.server     {'status': constants.STATUS_ERROR}
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 220, in 
__exit__
  ERROR oslo_messaging.rpc.server     self.force_reraise()
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 196, in 
force_reraise
  ERROR oslo_messaging.rpc.server     six.reraise(self.type_, self.value, 
self.tb)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/manager.py", line 1332, in 
create_share_instance
  ERROR oslo_messaging.rpc.server     context, share_instance, 
share_server=share_server)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 117, in wrap
  ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kwargs)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 231, in 
create_share
  ERROR oslo_messaging.rpc.server     self._format_device(server_details, 
volume)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 270, in 
_format_device
  ERROR oslo_messaging.rpc.server     self._ssh_exec(server_details, command)
  ERROR oslo_messaging.rpc.server   File 
"/opt/stack/new/manila/manila/share/drivers/generic.py", line 169, in _ssh_exec
  ERROR oslo_messaging.rpc.server     check_exit_code=check_exit_code)
  ERROR oslo_messaging.rpc.server   File 
"/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py", line 
513, in ssh_execute
  ERROR oslo_messaging.rpc.server     cmd=sanitized_cmd)
  ERROR oslo_messaging.rpc.server ProcessExecutionError: Unexpected error while 
running command.
  ERROR oslo_messaging.rpc.server Command: sudo mkfs.ext4
  ERROR oslo_messaging.rpc.server Exit code: 1
  ERROR oslo_messaging.rpc.server Stdout: u''
  ERROR oslo_messaging.rpc.server Stderr: u'Usage: mkfs.ext4 [-c|-l filename] 
[-b block-size] [-C cluster-size]\n\t[-i bytes-per-inode] [-I inode-size] [-J 
journal-options]\n\t[-G flex-group-size] [-N number-of-inodes]\n\t[-m 
reserved-blocks-percentage] [-o creator-os]\n\t[-g blocks-per-group] [-L 
volume-label] [-M last-mounted-directory]\n\t[-O feature[,...]] [-r 
fs-revision] [-E extended-option[,...]]\n\t[-t fs-type] [-T usage-type ] [-U 
UUID] [-jnqvDFKSV] device [blocks-count]\n'

  logs:
  
http://logs.openstack.org/64/386364/1/check/gate-manila-tempest-dsvm-postgres-generic-singlebackend-ubuntu-xenial-nv/eef11b0/logs/screen-m-shr.txt.gz?level=TRACE#_2016-10-14_15_15_19_898
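
  Note the failing command in the trace: "sudo mkfs.ext4" with no device
argument, which is why mkfs prints its usage text and exits with code 1.
Independently of fixing the race, the driver could fail fast when the
device path resolves to nothing instead of shelling out with an empty
argument. A hedged sketch (the function and the run_ssh callable are
illustrative, not the actual generic-driver API):

  ```python
  def safe_format_device(run_ssh, device_path, fs_type='ext4'):
      """Refuse to run mkfs with an empty device argument.

      run_ssh is any callable that executes a command list on the
      service VM (a stand-in for the driver's SSH helper).
      """
      if not device_path:
          # This is exactly the failure in the trace above: mkfs.ext4
          # invoked without a device prints its usage and exits 1.
          raise ValueError('no block device resolved for volume; '
                           'refusing to run mkfs without a device')
      return run_ssh(['sudo', 'mkfs.%s' % fs_type, device_path])
  ```

  Failing with a clear error at this point would make the gate logs show
the real cause (missing attachment/device) rather than an mkfs usage
message.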

To manage notifications about this bug go to:
https://bugs.launchpad.net/cinder/+bug/1633535/+subscriptions

-- 
Mailing list: https://launchpad.net/~yahoo-eng-team
Post to     : yahoo-eng-team@lists.launchpad.net
Unsubscribe : https://launchpad.net/~yahoo-eng-team
More help   : https://help.launchpad.net/ListHelp
