Your message dated Sun, 29 Mar 2026 20:48:54 +0000
with message-id <[email protected]>
and subject line Bug#1073401: fixed in python-digitalocean 1.17.0-1
has caused the Debian Bug report #1073401,
regarding python-digitalocean: FTBFS: dh_auto_test: error: pybuild --test -i python{version} -p "3.12 3.11" returned exit code 13
to be marked as done.

This means that you claim that the problem has been dealt with.
If this is not the case, it is now your responsibility to reopen the
bug report if necessary, and/or to fix the problem forthwith.

(NB: If you are a system administrator and have no idea what this
message is talking about, this may indicate a serious mail system
misconfiguration somewhere. Please contact [email protected]
immediately.)


-- 
1073401: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1073401
Debian Bug Tracking System
Contact [email protected] with problems
--- Begin Message ---
Source: python-digitalocean
Version: 1.16.0-3
Severity: serious
Justification: FTBFS
Tags: trixie sid ftbfs
User: [email protected]
Usertags: ftbfs-20240615 ftbfs-trixie

Hi,

During a rebuild of all packages in sid, your package failed to build
on amd64.


Relevant part (hopefully):
> make[1]: Entering directory '/<<PKGBUILDDIR>>'
> dh_auto_build
> I: pybuild base:311: /usr/bin/python3.12 setup.py build 
> running build
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/FloatingIP.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Volume.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Tag.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Balance.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Region.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Record.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Firewall.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Size.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Domain.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Metadata.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/LoadBalancer.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/baseapi.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Droplet.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Manager.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Action.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Image.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Kernel.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/VPC.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Project.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Account.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Certificate.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/SSHKey.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> copying digitalocean/Snapshot.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build/digitalocean
> I: pybuild base:311: /usr/bin/python3 setup.py build 
> running build
> running build_py
> creating /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/FloatingIP.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/__init__.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Volume.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Tag.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Balance.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Region.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Record.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Firewall.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Size.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Domain.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Metadata.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/LoadBalancer.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/baseapi.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Droplet.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Manager.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Action.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Image.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Kernel.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/VPC.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Project.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Account.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Certificate.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/SSHKey.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> copying digitalocean/Snapshot.py -> /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build/digitalocean
> PYTHONPATH=. python3 -m sphinx -N -bhtml docs/ build/html
> Running Sphinx v7.2.6
> WARNING: Invalid configuration value found: 'language = None'. Update your configuration to a valid language code. Falling back to 'en' (English).
> making output directory... done
> WARNING: The pre-Sphinx 1.0 'intersphinx_mapping' format is deprecated and will be removed in Sphinx 8. Update to the current format as described in the documentation. Hint: "intersphinx_mapping = {'<name>': ('https://docs.python.org/', None)}". See https://www.sphinx-doc.org/en/master/usage/extensions/intersphinx.html#confval-intersphinx_mapping
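Both Sphinx warnings above point at settings in the package's docs/conf.py. A minimal sketch of the modernized values (the "python" key name is illustrative, not taken from the actual conf.py):

```python
# Sketch of the two docs/conf.py lines the warnings above complain about.
language = "en"  # a valid language code instead of the invalid `language = None`

# Named-key intersphinx format, required once the pre-1.0 form is removed in Sphinx 8;
# the key ("python") is an arbitrary inventory name chosen for this example.
intersphinx_mapping = {
    "python": ("https://docs.python.org/3", None),
}
```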
> loading intersphinx inventory from https://docs.python.org/objects.inv...
> WARNING: failed to reach any of the inventories with the following issues:
> intersphinx inventory 'https://docs.python.org/objects.inv' not fetchable due to <class 'requests.exceptions.ProxyError'>: HTTPSConnectionPool(host='docs.python.org', port=443): Max retries exceeded with url: /objects.inv (Caused by ProxyError('Unable to connect to proxy', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f58057dfdd0>: Failed to establish a new connection: [Errno 111] Connection refused')))
> building [mo]: targets for 0 po files that are out of date
> writing output... 
> building [html]: targets for 2 source files that are out of date
> updating environment: [new config] 2 added, 0 changed, 0 removed
> reading sources... [ 50%] digitalocean
> reading sources... [100%] index
> 
> /<<PKGBUILDDIR>>/digitalocean/Domain.py:docstring of digitalocean.Domain.Domain.create_new_domain_record:15: ERROR: Unexpected indentation.
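Sphinx's "Unexpected indentation" usually means an indented block in a docstring follows ordinary text without the `::` marker and blank line that reST requires. A hypothetical before/after illustrating the rule (these are not the actual create_new_domain_record docstrings):

```python
# Hypothetical docstrings showing the reST rule docutils enforces here;
# the function names and body are invented for illustration only.

def create_record_broken():
    """Create a new record.

    Example:
        data = {"type": "A", "name": "www"}
    """
    # The indented line follows plain text with no `::` marker,
    # so docutils reports "ERROR: Unexpected indentation."


def create_record_fixed():
    """Create a new record.

    Example::

        data = {"type": "A", "name": "www"}
    """
    # `::` plus a blank line marks the indented text as a literal block.
```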
> looking for now-outdated files... none found
> pickling environment... done
> checking consistency... done
> preparing documents... done
> copying assets... copying static files... done
> copying extra files... done
> done
> writing output... [ 50%] digitalocean
> writing output... [100%] index
> 
> generating indices... genindex py-modindex done
> highlighting module code... [  6%] digitalocean.Account
> highlighting module code... [ 12%] digitalocean.Action
> highlighting module code... [ 19%] digitalocean.Domain
> highlighting module code... [ 25%] digitalocean.Droplet
> highlighting module code... [ 31%] digitalocean.FloatingIP
> highlighting module code... [ 38%] digitalocean.Image
> highlighting module code... [ 44%] digitalocean.Kernel
> highlighting module code... [ 50%] digitalocean.LoadBalancer
> highlighting module code... [ 56%] digitalocean.Manager
> highlighting module code... [ 62%] digitalocean.Metadata
> highlighting module code... [ 69%] digitalocean.Record
> highlighting module code... [ 75%] digitalocean.Region
> highlighting module code... [ 81%] digitalocean.SSHKey
> highlighting module code... [ 88%] digitalocean.Size
> highlighting module code... [ 94%] digitalocean.Tag
> highlighting module code... [100%] digitalocean.Volume
> 
> writing additional pages... search done
> dumping search index in English (code: en)... done
> dumping object inventory... done
> build succeeded, 4 warnings.
> 
> The HTML pages are in build/html.
> make[1]: Leaving directory '/<<PKGBUILDDIR>>'
>    dh_auto_test -O--buildsystem=pybuild
> I: pybuild pybuild:308: cp -r /<<PKGBUILDDIR>>/digitalocean/tests /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build
> I: pybuild base:311: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build; python3.12 -m pytest
> ============================= test session starts ==============================
> platform linux -- Python 3.12.4, pytest-8.2.2, pluggy-1.5.0
> rootdir: /<<PKGBUILDDIR>>
> collected 152 items
> 
> tests/test_action.py ..                                                  [  1%]
> tests/test_baseapi.py ....                                               [  3%]
> tests/test_certificate.py ....                                           [  6%]
> tests/test_domain.py .......                                             [ 11%]
> tests/test_droplet.py ............................................       [ 40%]
> tests/test_firewall.py FF.FF                                             [ 43%]
> tests/test_floatingip.py ......                                          [ 47%]
> tests/test_image.py .......                                              [ 51%]
> tests/test_load_balancer.py ..........                                   [ 58%]
> tests/test_manager.py ...............................                    [ 78%]
> tests/test_project.py .........                                          [ 84%]
> tests/test_snapshot.py ..                                                [ 86%]
> tests/test_tag.py .......                                                [ 90%]
> tests/test_volume.py ..........                                          [ 97%]
> tests/test_vpc.py ....                                                   [100%]
> 
> =================================== FAILURES ===================================
> ________________________ TestFirewall.test_add_droplets ________________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff02e0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff02e0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(44 bytes read, -44 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff02e0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_firewall.TestFirewall testMethod=test_add_droplets>
> 
>     @responses.activate
>     def test_add_droplets(self):
>         data = self.load_from_file('firewalls/droplets.json')
>     
>         url = self.base_url + "firewalls/12345/droplets"
>         responses.add(responses.POST, url,
>                       body=data,
>                       status=204,
>                       content_type='application/json')
>     
>         droplet_id = json.loads(data)["droplet_ids"][0]
> >       self.firewall.add_droplets([droplet_id])
> 
> tests/test_firewall.py:73: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:202: in add_droplets
>     return self.get_data(
> digitalocean/baseapi.py:216: in get_data
>     req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
>     return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:115: in post
>     return request("post", url, data=data, json=json, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:59: in request
>     return session.request(method=method, url=url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
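All four firewall failures share the same shape: the test mocks a POST or DELETE with `status=204` *and* a JSON body, and the urllib3 in sid enforces content length. A 204 response has no body, so urllib3 expects 0 bytes, still reads the 44 (or 41) mocked bytes, and the remaining count goes negative. A sketch of that bookkeeping (the helper name is invented, not urllib3's API):

```python
# Why urllib3 reports "IncompleteRead(44 bytes read, -44 more expected)":
# for a 204 status it sets the expected body length to 0 (a 204 carries
# no body), then subtracts whatever the mocked response actually shipped.
def remaining_after_read(expected_length: int, bytes_read: int) -> int:
    """Hypothetical helper mirroring urllib3's length_remaining arithmetic."""
    return expected_length - bytes_read

# 44-byte firewalls/droplets.json body mocked onto a 204
assert remaining_after_read(0, 44) == -44
# 41-byte firewalls/tags.json body fails the same way in test_add_tags
assert remaining_after_read(0, 41) == -41
```

A plausible test-side fix is therefore to mock these endpoints with `status=204` and no body, or `status=200` with the body; the log does not show which change 1.17.0-1 actually made.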
> __________________________ TestFirewall.test_add_tags __________________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff2110>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff2110>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(41 bytes read, -41 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff2110>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_firewall.TestFirewall testMethod=test_add_tags>
> 
>     @responses.activate
>     def test_add_tags(self):
>         data = self.load_from_file('firewalls/tags.json')
>     
>         url = self.base_url + "firewalls/12345/tags"
>         responses.add(responses.POST, url,
>                       body=data,
>                       status=204,
>                       content_type='application/json')
>     
>         tag = json.loads(data)["tags"][0]
> >       self.firewall.add_tags([tag])
> 
> tests/test_firewall.py:104: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:222: in add_tags
>     return self.get_data(
> digitalocean/baseapi.py:216: in get_data
>     req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
>     return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:115: in post
>     return request("post", url, data=data, json=json, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:59: in request
>     return session.request(method=method, url=url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ______________________ TestFirewall.test_remove_droplets _______________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3fbfd60>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3fbfd60>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(44 bytes read, -44 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3fbfd60>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_firewall.TestFirewall testMethod=test_remove_droplets>
> 
>     @responses.activate
>     def test_remove_droplets(self):
>         data = self.load_from_file('firewalls/droplets.json')
>     
>         url = self.base_url + "firewalls/12345/droplets"
>         responses.add(responses.DELETE,
>                       url,
>                       body=data,
>                       status=204,
>                       content_type='application/json')
>     
>         droplet_id = json.loads(data)["droplet_ids"][0]
> >       self.firewall.remove_droplets([droplet_id])
> 
> tests/test_firewall.py:89: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:212: in remove_droplets
>     return self.get_data(
> digitalocean/baseapi.py:216: in get_data
>     req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
>     return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:671: in delete
>     return self.request("DELETE", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ________________________ TestFirewall.test_remove_tags _________________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff0100>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff0100>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(41 bytes read, -41 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.12/contextlib.py:158: in __exit__
>     self.gen.throw(value)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f6ac3ff0100>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_firewall.TestFirewall testMethod=test_remove_tags>
> 
>     @responses.activate
>     def test_remove_tags(self):
>         data = self.load_from_file('firewalls/tags.json')
>     
>         url = self.base_url + "firewalls/12345/tags"
>         responses.add(responses.DELETE, url,
>                       body=data,
>                       status=204,
>                       content_type='application/json')
>     
>         tag = json.loads(data)["tags"][0]
> >       self.firewall.remove_tags([tag])
> 
> tests/test_firewall.py:119: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:232: in remove_tags
>     return self.get_data(
> digitalocean/baseapi.py:216: in get_data
>     req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
>     return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:671: in delete
>     return self.request("DELETE", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> =============================== warnings summary ===============================
> .pybuild/cpython3_3.12_digitalocean/build/tests/test_droplet.py::TestDroplet::test_get_kernel_available_with_pages
> .pybuild/cpython3_3.12_digitalocean/build/tests/test_manager.py::TestManager::test_get_droplet_snapshots
> .pybuild/cpython3_3.12_digitalocean/build/tests/test_manager.py::TestManager::test_get_per_region_volumes
> .pybuild/cpython3_3.12_digitalocean/build/tests/test_manager.py::TestManager::test_get_volume_snapshots
>   /usr/lib/python3/dist-packages/responses/__init__.py:436: DeprecationWarning: Argument 'match_querystring' is deprecated. Use 'responses.matchers.query_param_matcher' or 'responses.matchers.query_string_matcher'
>     warn(
> 
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info ============================
> FAILED tests/test_firewall.py::TestFirewall::test_add_droplets - requests.exc...
> FAILED tests/test_firewall.py::TestFirewall::test_add_tags - requests.excepti...
> FAILED tests/test_firewall.py::TestFirewall::test_remove_droplets - requests....
> FAILED tests/test_firewall.py::TestFirewall::test_remove_tags - requests.exce...
> ================== 4 failed, 148 passed, 4 warnings in 1.06s ===================
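[Editorial aside: the four tracebacks above share one root cause. Each failing test registers a mocked POST/DELETE reply with `status=204` *and* a JSON body; urllib3 >= 2.0 enforces Content-Length, and draining the body of a 204 No Content reply (which must not carry one) raises `IncompleteRead`. A minimal sketch of the idea behind a fix, assuming the tests simply stop attaching a body to 204 replies; `mock_response_kwargs` is an illustrative helper, not part of python-digitalocean or the `responses` library:]

```python
# Illustrative helper (hypothetical, not from python-digitalocean): build the
# keyword arguments for a mocked reply, dropping the body for status codes
# that must not carry content (204 No Content, 304 Not Modified).
def mock_response_kwargs(method, url, status, body=None,
                         content_type='application/json'):
    kwargs = {'method': method, 'url': url, 'status': status}
    if body is not None and status not in (204, 304):
        # Only attach a payload when the status code permits one.
        kwargs['body'] = body
        kwargs['content_type'] = content_type
    return kwargs
```

[Under this sketch a test would call `responses.add(**mock_response_kwargs(...))`; the actual change shipped in 1.17.0 may differ.]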
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.12_digitalocean/build; python3.12 -m pytest 
> I: pybuild pybuild:308: cp -r /<<PKGBUILDDIR>>/digitalocean/tests /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build
> I: pybuild base:311: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build; python3.11 -m pytest 
> ============================= test session starts ==============================
> platform linux -- Python 3.11.9, pytest-8.2.2, pluggy-1.5.0
> rootdir: /<<PKGBUILDDIR>>
> collected 152 items
> 
> tests/test_action.py ..                                                  [  1%]
> tests/test_baseapi.py ....                                               [  3%]
> tests/test_certificate.py ....                                           [  6%]
> tests/test_domain.py .......                                             [ 11%]
> tests/test_droplet.py ............................................       [ 40%]
> tests/test_firewall.py FF.FF                                             [ 43%]
> tests/test_floatingip.py ......                                          [ 47%]
> tests/test_image.py .......                                              [ 51%]
> tests/test_load_balancer.py ..........                                   [ 58%]
> tests/test_manager.py ...............................                    [ 78%]
> tests/test_project.py .........                                          [ 84%]
> tests/test_snapshot.py ..                                                [ 86%]
> tests/test_tag.py .......                                                [ 90%]
> tests/test_volume.py ..........                                          [ 97%]
> tests/test_vpc.py ....                                                   [100%]
> 
> =================================== FAILURES ===================================
> ________________________ TestFirewall.test_add_droplets ________________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d228def20>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d228def20>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(44 bytes read, -44 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d228def20>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_firewall.TestFirewall testMethod=test_add_droplets>
> 
>     @responses.activate
>     def test_add_droplets(self):
>         data = self.load_from_file('firewalls/droplets.json')
>     
>         url = self.base_url + "firewalls/12345/droplets"
>         responses.add(responses.POST, url,
>                       body=data,
>                       status=204,
>                       content_type='application/json')
>     
>         droplet_id = json.loads(data)["droplet_ids"][0]
> >       self.firewall.add_droplets([droplet_id])
> 
> tests/test_firewall.py:73: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:202: in add_droplets
>     return self.get_data(
> digitalocean/baseapi.py:216: in get_data
>     req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
>     return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:115: in post
>     return request("post", url, data=data, json=json, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:59: in request
>     return session.request(method=method, url=url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> __________________________ TestFirewall.test_add_tags __________________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d22819270>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d22819270>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(41 bytes read, -41 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d22819270>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_firewall.TestFirewall testMethod=test_add_tags>
> 
>     @responses.activate
>     def test_add_tags(self):
>         data = self.load_from_file('firewalls/tags.json')
>     
>         url = self.base_url + "firewalls/12345/tags"
>         responses.add(responses.POST, url,
>                       body=data,
>                       status=204,
>                       content_type='application/json')
>     
>         tag = json.loads(data)["tags"][0]
> >       self.firewall.add_tags([tag])
> 
> tests/test_firewall.py:104: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:222: in add_tags
>     return self.get_data(
> digitalocean/baseapi.py:216: in get_data
>     req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
>     return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:115: in post
>     return request("post", url, data=data, json=json, **kwargs)
> /usr/lib/python3/dist-packages/requests/api.py:59: in request
>     return session.request(method=method, url=url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ______________________ TestFirewall.test_remove_droplets _______________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d22798cd0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d22798cd0>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(44 bytes read, -44 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d22798cd0>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_firewall.TestFirewall testMethod=test_remove_droplets>
> 
>     @responses.activate
>     def test_remove_droplets(self):
>         data = self.load_from_file('firewalls/droplets.json')
>     
>         url = self.base_url + "firewalls/12345/droplets"
>         responses.add(responses.DELETE,
>                       url,
>                       body=data,
>                       status=204,
>                       content_type='application/json')
>     
>         droplet_id = json.loads(data)["droplet_ids"][0]
> >       self.firewall.remove_droplets([droplet_id])
> 
> tests/test_firewall.py:89: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:212: in remove_droplets
>     return self.get_data(
> digitalocean/baseapi.py:216: in get_data
>     req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
>     return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:671: in delete
>     return self.request("DELETE", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(44 bytes read, -44 more expected)', IncompleteRead(44 bytes read, -44 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> ________________________ TestFirewall.test_remove_tags _________________________
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d2379c160>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
> >               yield
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:710: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d2379c160>, amt = 10240
> 
>     def _raw_read(
>         self,
>         amt: int | None = None,
>     ) -> bytes:
>         """
>         Reads `amt` of bytes from the socket.
>         """
>         if self._fp is None:
>             return None  # type: ignore[return-value]
>     
>         fp_closed = getattr(self._fp, "closed", False)
>     
>         with self._error_catcher():
>             data = self._fp_read(amt) if not fp_closed else b""
>             if amt is not None and amt != 0 and not data:
>                 # Platform-specific: Buggy versions of Python.
>                 # Close the connection when no data is returned
>                 #
>                 # This is redundant to what httplib/http.client _should_
>                 # already do.  However, versions of python released before
>                 # December 15, 2012 (http://bugs.python.org/issue16298) do
>                 # not properly close the connection in all cases. There is
>                 # no harm in redundantly calling close.
>                 self._fp.close()
>                 if (
>                     self.enforce_content_length
>                     and self.length_remaining is not None
>                     and self.length_remaining != 0
>                 ):
>                     # This is an edge case that httplib failed to cover due
>                     # to concerns of backward compatibility. We're
>                     # addressing it here to make sure IncompleteRead is
>                     # raised during streaming, so all calls with incorrect
>                     # Content-Length are caught.
> >                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
> E                   urllib3.exceptions.IncompleteRead: IncompleteRead(41 bytes read, -41 more expected)
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:835: IncompleteRead
> 
> The above exception was the direct cause of the following exception:
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
> >               yield from self.raw.stream(chunk_size, decode_content=True)
> 
> /usr/lib/python3/dist-packages/requests/models.py:820: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> /usr/lib/python3/dist-packages/urllib3/response.py:936: in stream
>     data = self.read(amt=amt, decode_content=decode_content)
> /usr/lib/python3/dist-packages/urllib3/response.py:907: in read
>     data = self._raw_read(amt)
> /usr/lib/python3/dist-packages/urllib3/response.py:813: in _raw_read
>     with self._error_catcher():
> /usr/lib/python3.11/contextlib.py:158: in __exit__
>     self.gen.throw(typ, value, traceback)
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
> self = <urllib3.response.HTTPResponse object at 0x7f1d2379c160>
> 
>     @contextmanager
>     def _error_catcher(self) -> typing.Generator[None, None, None]:
>         """
>         Catch low-level python exceptions, instead re-raising urllib3
>         variants, so that low-level exceptions are not leaked in the
>         high-level api.
>     
>         On exit, release the connection back to the pool.
>         """
>         clean_exit = False
>     
>         try:
>             try:
>                 yield
>     
>             except SocketTimeout as e:
>                 # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
>                 # there is yet no clean way to get at it from this context.
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except BaseSSLError as e:
>                 # FIXME: Is there a better way to differentiate between SSLErrors?
>                 if "read operation timed out" not in str(e):
>                     # SSL errors related to framing/MAC get wrapped and reraised here
>                     raise SSLError(e) from e
>     
>                 raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]
>     
>             except (HTTPException, OSError) as e:
>                 # This includes IncompleteRead.
> >               raise ProtocolError(f"Connection broken: {e!r}", e) from e
> E               urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
> 
> /usr/lib/python3/dist-packages/urllib3/response.py:727: ProtocolError
> 
> During handling of the above exception, another exception occurred:
> 
> self = <tests.test_firewall.TestFirewall testMethod=test_remove_tags>
> 
>     @responses.activate
>     def test_remove_tags(self):
>         data = self.load_from_file('firewalls/tags.json')
>     
>         url = self.base_url + "firewalls/12345/tags"
>         responses.add(responses.DELETE, url,
>                       body=data,
>                       status=204,
>                       content_type='application/json')
>     
>         tag = json.loads(data)["tags"][0]
> >       self.firewall.remove_tags([tag])
> 
> tests/test_firewall.py:119: 
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> digitalocean/Firewall.py:232: in remove_tags
>     return self.get_data(
> digitalocean/baseapi.py:216: in get_data
>     req = self.__perform_request(url, type, params)
> digitalocean/baseapi.py:133: in __perform_request
>     return requests_method(url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:671: in delete
>     return self.request("DELETE", url, **kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:589: in request
>     resp = self.send(prep, **send_kwargs)
> /usr/lib/python3/dist-packages/requests/sessions.py:746: in send
>     r.content
> /usr/lib/python3/dist-packages/requests/models.py:902: in content
>     self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
> 
>     def generate():
>         # Special case for urllib3.
>         if hasattr(self.raw, "stream"):
>             try:
>                 yield from self.raw.stream(chunk_size, decode_content=True)
>             except ProtocolError as e:
> >               raise ChunkedEncodingError(e)
> E               requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(41 bytes read, -41 more expected)', IncompleteRead(41 bytes read, -41 more expected))
> 
> /usr/lib/python3/dist-packages/requests/models.py:822: ChunkedEncodingError
> =============================== warnings summary 
> ===============================
> .pybuild/cpython3_3.11_digitalocean/build/tests/test_droplet.py::TestDroplet::test_get_kernel_available_with_pages
> .pybuild/cpython3_3.11_digitalocean/build/tests/test_manager.py::TestManager::test_get_droplet_snapshots
> .pybuild/cpython3_3.11_digitalocean/build/tests/test_manager.py::TestManager::test_get_per_region_volumes
> .pybuild/cpython3_3.11_digitalocean/build/tests/test_manager.py::TestManager::test_get_volume_snapshots
>   /usr/lib/python3/dist-packages/responses/__init__.py:436: DeprecationWarning: Argument 'match_querystring' is deprecated. Use 'responses.matchers.query_param_matcher' or 'responses.matchers.query_string_matcher'
>     warn(
> 
> -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
> =========================== short test summary info 
> ============================
> FAILED tests/test_firewall.py::TestFirewall::test_add_droplets - 
> requests.exc...
> FAILED tests/test_firewall.py::TestFirewall::test_add_tags - 
> requests.excepti...
> FAILED tests/test_firewall.py::TestFirewall::test_remove_droplets - 
> requests....
> FAILED tests/test_firewall.py::TestFirewall::test_remove_tags - 
> requests.exce...
> ================== 4 failed, 148 passed, 4 warnings in 1.39s 
> ===================
> E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /<<PKGBUILDDIR>>/.pybuild/cpython3_3.11_digitalocean/build; python3.11 -m pytest
> dh_auto_test: error: pybuild --test -i python{version} -p "3.12 3.11" returned exit code 13
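
For context on the failures above: each failing test registers a DELETE stub with status=204 (No Content) while also supplying a JSON body, and urllib3 2.x enforces the declared content length, so consuming the 44-byte body of a response that declares no content yields IncompleteRead(44 bytes read, -44 more expected). The arithmetic behind that check can be sketched as follows (an illustration only, with a hypothetical helper `check_content_length` — this is not urllib3's actual code or API):

```python
# Rough illustration of urllib3 2.x's enforce_content_length behaviour.
# `check_content_length` is a hypothetical helper, not urllib3's real API.
def check_content_length(bytes_read: int, declared_length: int) -> None:
    length_remaining = declared_length - bytes_read
    if length_remaining != 0:
        raise ValueError(
            f"IncompleteRead({bytes_read} bytes read, "
            f"{length_remaining} more expected)"
        )

# A 204 No Content response declares a zero-length body, but the test
# stub attaches a 44-byte JSON payload; the mismatch is exactly what
# the traceback reports.
try:
    check_content_length(bytes_read=44, declared_length=0)
except ValueError as exc:
    print(exc)  # IncompleteRead(44 bytes read, -44 more expected)
```

Under this reading, registering the stubs without a body (or with a status that permits one) is the kind of change that makes the tests pass again.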


The full build log is available from:
http://qa-logs.debian.net/2024/06/15/python-digitalocean_1.16.0-3_unstable.log

All bugs filed during this archive rebuild are listed at:
https://bugs.debian.org/cgi-bin/pkgreport.cgi?tag=ftbfs-20240615;[email protected]
or:
https://udd.debian.org/bugs/?release=na&merged=ign&fnewerval=7&flastmodval=7&fusertag=only&fusertagtag=ftbfs-20240615&[email protected]&allbugs=1&cseverity=1&ctags=1&caffected=1#results

A list of current common problems and possible solutions is available at
http://wiki.debian.org/qa.debian.org/FTBFS . You're welcome to contribute!

If you reassign this bug to another package, please mark it as 'affects'-ing
this package. See https://www.debian.org/Bugs/server-control#affects

If you fail to reproduce this, please provide a build log and diff it with mine
so that we can identify if something relevant changed in the meantime.

--- End Message ---
--- Begin Message ---
Source: python-digitalocean
Source-Version: 1.17.0-1
Done: Harlan Lieberman-Berg <[email protected]>

We believe that the bug you reported is fixed in the latest version of
python-digitalocean, which is due to be installed in the Debian FTP archive.

A summary of the changes between this version and the previous one is
attached.

Thank you for reporting the bug, which will now be closed.  If you
have further comments please address them to [email protected],
and the maintainer will reopen the bug report if appropriate.

Debian distribution maintenance software
pp.
Harlan Lieberman-Berg <[email protected]> (supplier of updated python-digitalocean package)

(This message was generated automatically at their request; if you
believe that there is a problem with it please contact the archive
administrators by mailing [email protected])


-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

Format: 1.8
Date: Sun, 29 Mar 2026 16:23:38 -0400
Source: python-digitalocean
Architecture: source
Version: 1.17.0-1
Distribution: unstable
Urgency: medium
Maintainer: Debian Python Team <[email protected]>
Changed-By: Harlan Lieberman-Berg <[email protected]>
Closes: 1073401
Changes:
 python-digitalocean (1.17.0-1) unstable; urgency=medium
 .
   * Team upload.
   * New upstream version 1.17.0
   * Add patches to fix FTBFS (Closes: #1073401)
Checksums-Sha1:
 ad52cd8b96fb078ea008541b9488a03404513faa 2437 python-digitalocean_1.17.0-1.dsc
 f5c65fa876b75aea46e77c07f4374afeadd70111 60261 python-digitalocean_1.17.0.orig.tar.gz
 49c40871f14cbf48a917170342f8b1bf616cd986 4188 python-digitalocean_1.17.0-1.debian.tar.xz
 9a68a18adca9487528b418e8f3900101faf1c660 8306 python-digitalocean_1.17.0-1_amd64.buildinfo
Checksums-Sha256:
 191e284d7804d3c8406d04f28f6a8a97dd1aebeba5725943ce86a7468204e40e 2437 python-digitalocean_1.17.0-1.dsc
 9c9c788ae03a088d0c03a9a59ff7ac6c492caadd4942d4fc58795ee859fc228f 60261 python-digitalocean_1.17.0.orig.tar.gz
 c0ba45647e4615878d303b8149e56f2ecaab5dbb4e81def3e601707b2477b412 4188 python-digitalocean_1.17.0-1.debian.tar.xz
 3be9fe8a41f0787e4266d264d3b96f615e3cb175652ada7fb33ac86f181bbc93 8306 python-digitalocean_1.17.0-1_amd64.buildinfo
Files:
 c3ec2dda818729286947e9c5feb7c98b 2437 python optional python-digitalocean_1.17.0-1.dsc
 70ebcb5e0e67abe6aa302d84c3d6f5fb 60261 python optional python-digitalocean_1.17.0.orig.tar.gz
 ae5ebc878cd75f20d2ecabf1e987f2f5 4188 python optional python-digitalocean_1.17.0-1.debian.tar.xz
 aa4a41d4ced2dd1cbce5f84519839850 8306 python optional python-digitalocean_1.17.0-1_amd64.buildinfo

-----BEGIN PGP SIGNATURE-----

iQKqBAEBCgCUFiEEoQidPpMhm/pf9hDqpdsxOyUajKMFAmnJiytfFIAAAAAALgAo
aXNzdWVyLWZwckBub3RhdGlvbnMub3BlbnBncC5maWZ0aGhvcnNlbWFuLm5ldEEx
MDg5RDNFOTMyMTlCRkE1RkY2MTBFQUE1REIzMTNCMjUxQThDQTMWHGhsaWViZXJt
YW5AZGViaWFuLm9yZwAKCRCl2zE7JRqMo8AgD/9Fhl4H7oPB8EgjxEHeufBeK7qg
U2Wwztab2xlgpk/YKtMwCqoyZf//00Mo9ZoYs03Nskw1FuS2v8B/QGAINTrqNXxg
cGDD8vM8H9TlTJRCVBZXOmnDoUZgMG1/UTW6oY3tb3pvGf2bJ9mhARxTyLZlSta2
yPMoT8Pt2lwa++lEghOx40w41DCfbDIX6p/LsOSTlaae5f6zM7Ufcfno6djI8I3m
ve8U9HFmN9+fd/UyqqDlcEgRLobgyKh5HW3bBZlpTlBXrcmJeC80EgojUbdh21xu
em+m4s99mB14EftnPLwqJt1qKGs0MBog94EyghBbtk7GKl+/2SsQae23+dIcGDdr
DXansKXhpfpSxNR2sPPYX/Ff95d1/arYdo25ueagWxVU36evmCJ6bwUtcNNKEBHo
hqFX5yyBQmLFskcWTemVkdtU9bMcCTYlNaJDCtfs0lplirlQAbibpm/gawDYE4Hi
0M13RQYO9Rub7ATCwT2v6l7NwpLbjSPfi6PmwcSfE8BV+xIaMgaLrUrvzx0ZZRLZ
Sli9rzRUqSjQzjDnPUEFAK12dOJfOlFRiu6lD29G0V7YDNY2XcgBdud+fTHK9zmc
VNI3ZXKsYOdsEFQJ2vyg72kjlSBy8YFiFjROeh82LK1sQVyVvtZWolBwhn9Bwr4H
uCxGDUGwPqqWfNpJZg==
=aSGb
-----END PGP SIGNATURE-----



--- End Message ---
