Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package python-httpx for openSUSE:Factory 
checked in at 2021-07-12 21:40:16
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-httpx (Old)
 and      /work/SRC/openSUSE:Factory/.python-httpx.new.2625 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-httpx"

Mon Jul 12 21:40:16 2021 rev:2 rq:905775 version:0.18.2

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-httpx/python-httpx.changes        
2021-07-12 01:25:14.769134638 +0200
+++ /work/SRC/openSUSE:Factory/.python-httpx.new.2625/python-httpx.changes      
2021-07-12 21:40:35.459941280 +0200
@@ -1,0 +2,30 @@
+Fri Jun 18 07:56:56 UTC 2021 - Antonio Larrosa <[email protected]>
+
+- Update to 0.18.2
+  * Added
+    - Support for Python 3.10. (Pull #1687)
+    - Expose httpx.USE_CLIENT_DEFAULT, used as the default to auth
+      and timeout parameters in request methods.
+    - Support HTTP/2 "prior knowledge", using
+      httpx.Client(http1=False, http2=True).
+  * Fixed
+    - Clean up some cases where warnings were being issued.
+    - Prefer Content-Length over Transfer-Encoding: chunked for
+      content= cases.
+
+- Update to 0.18.1
+  * Changed
+    - Update brotli support to use the brotlicffi package
+    - Ensure that Request(..., stream=...) does not auto-generate
+      any headers on the request instance.
+  * Fixed
+    - Pass through timeout=... in top-level httpx.stream()
+      function.
+    - Map httpcore transport close exceptions to httpx exceptions.
+
+- Add patch (submitted to upstream at gh#encode/httpx#1669) to add
+  a pytest marker so we can disable the tests that use the network
+  in %check:
+  * 0001-Add-a-network-pytest-mark-for-tests-that-use-the-network.patch
+
+-------------------------------------------------------------------
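The `USE_CLIENT_DEFAULT` constant mentioned in the 0.18.2 entry can be illustrated with a minimal sketch of the sentinel pattern it implements; this is an illustration, not httpx's full implementation:

```python
# Minimal sketch of the "use client default" sentinel that 0.18.2 exposes as
# httpx.USE_CLIENT_DEFAULT: it distinguishes "parameter omitted" (fall back
# to the client-wide default) from an explicit None (disable the feature).
class UseClientDefault:
    pass

USE_CLIENT_DEFAULT = UseClientDefault()

class Client:
    def __init__(self, timeout=5.0):
        self.timeout = timeout

    def request(self, timeout=USE_CLIENT_DEFAULT):
        # Omitted -> client default; timeout=None -> explicitly no timeout.
        if isinstance(timeout, UseClientDefault):
            return self.timeout
        return timeout

client = Client(timeout=5.0)
assert client.request() == 5.0               # falls back to the client default
assert client.request(timeout=None) is None  # explicitly disabled
```

For the HTTP/2 "prior knowledge" item, the real client is constructed as `httpx.Client(http1=False, http2=True)`, which skips HTTP/1.1 negotiation entirely.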

Old:
----
  httpx-0.18.0.tar.gz

New:
----
  0001-Add-a-network-pytest-mark-for-tests-that-use-the-network.patch
  httpx-0.18.2.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-httpx.spec ++++++
--- /var/tmp/diff_new_pack.YhXrv3/_old  2021-07-12 21:40:36.119936078 +0200
+++ /var/tmp/diff_new_pack.YhXrv3/_new  2021-07-12 21:40:36.119936078 +0200
@@ -20,17 +20,19 @@
 %define skip_python2 1
 %define skip_python36 1
 Name:           python-httpx
-Version:        0.18.0
+Version:        0.18.2
 Release:        0
 Summary:        Python HTTP client with async support
 License:        BSD-3-Clause
 Group:          Development/Languages/Python
 URL:            https://github.com/encode/httpx
 Source:         
https://github.com/encode/httpx/archive/%{version}.tar.gz#/httpx-%{version}.tar.gz
+# PATCH-FIX-UPSTREAM [email protected] (gh#encode/httpx#1669)
+Patch0:         
0001-Add-a-network-pytest-mark-for-tests-that-use-the-network.patch
 BuildRequires:  %{python_module setuptools}
 BuildRequires:  fdupes
 BuildRequires:  python-rpm-macros
-Requires:       python-Brotli >= 0.7.0
+Requires:       python-brotlicffi
 Requires:       python-certifi
 Requires:       python-chardet >= 3.0
 Requires:       python-h11 >= 0.8.0
@@ -42,8 +44,9 @@
 Requires:       python-sniffio
 BuildArch:      noarch
 # SECTION test requirements
-BuildRequires:  %{python_module Brotli >= 0.7.0}
+BuildRequires:  %{python_module anyio}
 BuildRequires:  %{python_module async_generator}
+BuildRequires:  %{python_module brotlicffi}
 BuildRequires:  %{python_module certifi}
 BuildRequires:  %{python_module chardet >= 3.0}
 BuildRequires:  %{python_module h11 >= 0.8.0}
@@ -68,6 +71,7 @@
 
 %prep
 %setup -q -n httpx-%{version}
+%patch0 -p1
 rm setup.cfg
 
 %build
@@ -78,8 +82,7 @@
 %python_expand %fdupes %{buildroot}%{$python_sitelib}
 
 %check
-# test_start_tls_on_*_socket_stream and test_connect_timeout require network
-%pytest -k 'not (test_start_tls_on_tcp_socket_stream or 
test_start_tls_on_uds_socket_stream or test_connect_timeout or 
test_async_proxy_close or test_sync_proxy_close)'
+%pytest -k 'not network'
 
 %files %{python_files}
 %doc CHANGELOG.md README.md

++++++ 0001-Add-a-network-pytest-mark-for-tests-that-use-the-network.patch 
++++++
From 55b8ee87b0e57f3dda924e3d01ee8eaa39c4aa81 Mon Sep 17 00:00:00 2001
From: Antonio Larrosa <[email protected]>
Date: Mon, 7 Jun 2021 18:32:29 +0200
Subject: [PATCH] Add a network pytest mark for tests that use the network

Sometimes it's useful to have the tests that use the network
marked so they can be skipped easily when we know the network
is not available.

This is useful for example on SUSE and openSUSE's build servers.
When building the httpx packages (actually, any package in the
distribution) the network is disabled so we can assure
reproducible builds (among other benefits). With this mark, it's
easier to skip tests that can not succeed.
---
 setup.cfg                    | 1 +
 tests/client/test_proxies.py | 2 ++
 tests/test_timeouts.py       | 1 +
 3 files changed, 4 insertions(+)

diff --git a/setup.cfg b/setup.cfg
index c860d819c..eb5451d96 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -18,6 +18,7 @@ combine_as_imports = True
   default:::uvicorn
 markers =
   copied_from(source, changes=None): mark test as copied from somewhere else, 
along with a description of changes made to accodomate e.g. our test setup
+  network: marks tests which require network connection
 
 [coverage:run]
 omit = venv/*, httpx/_compat.py
diff --git a/tests/client/test_proxies.py b/tests/client/test_proxies.py
index 6ea4cbe40..2817d202b 100644
--- a/tests/client/test_proxies.py
+++ b/tests/client/test_proxies.py
@@ -122,6 +122,7 @@ def test_transport_for_request(url, proxies, expected):
 
 
 @pytest.mark.asyncio
[email protected]
 async def test_async_proxy_close():
     try:
         client = httpx.AsyncClient(proxies={"https://": PROXY_URL})
@@ -130,6 +131,7 @@ async def test_async_proxy_close():
         await client.aclose()
 
 
[email protected]
 def test_sync_proxy_close():
     try:
         client = httpx.Client(proxies={"https://": PROXY_URL})
diff --git a/tests/test_timeouts.py b/tests/test_timeouts.py
index 46a8bee8f..c7a665c3b 100644
--- a/tests/test_timeouts.py
+++ b/tests/test_timeouts.py
@@ -23,6 +23,7 @@ async def test_write_timeout(server):
 
 
 @pytest.mark.usefixtures("async_environment")
[email protected]
 async def test_connect_timeout(server):
     timeout = httpx.Timeout(None, connect=1e-6)
 
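The patch above registers a `network` marker and tags three tests with it; a short sketch of how such a marker is applied and then deselected during a no-network build (assuming pytest is available):

```python
# Tests that need the network get the marker added by the patch...
import pytest

@pytest.mark.network
def test_uses_the_network():
    ...

# ...and the spec's %check deselects them with a keyword expression:
#   %pytest -k 'not network'
# Marker names also match -k keyword expressions; -m 'not network' is the
# marker-expression equivalent.
```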
++++++ httpx-0.18.0.tar.gz -> httpx-0.18.2.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/.github/PULL_REQUEST_TEMPLATE.md 
new/httpx-0.18.2/.github/PULL_REQUEST_TEMPLATE.md
--- old/httpx-0.18.0/.github/PULL_REQUEST_TEMPLATE.md   2021-04-27 
16:20:22.000000000 +0200
+++ new/httpx-0.18.2/.github/PULL_REQUEST_TEMPLATE.md   2021-06-17 
12:29:22.000000000 +0200
@@ -1,7 +1,7 @@
 The starting point for contributions should usually be [a 
discussion](https://github.com/encode/httpx/discussions)
 
 Simple documentation typos may be raised as stand-alone pull requests, but 
otherwise
-please ensure you've discussed the your proposal prior to issuing a pull 
request.
+please ensure you've discussed your proposal prior to issuing a pull request.
 
 This will help us direct work appropriately, and ensure that any suggested 
changes
 have been okayed by the maintainers.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/.github/workflows/test-suite.yml 
new/httpx-0.18.2/.github/workflows/test-suite.yml
--- old/httpx-0.18.0/.github/workflows/test-suite.yml   2021-04-27 
16:20:22.000000000 +0200
+++ new/httpx-0.18.2/.github/workflows/test-suite.yml   2021-06-17 
12:29:22.000000000 +0200
@@ -14,7 +14,7 @@
 
     strategy:
       matrix:
-        python-version: ["3.6", "3.7", "3.8", "3.9"]
+        python-version: ["3.6", "3.7", "3.8", "3.9", "3.10.0-beta.2"]
 
     steps:
       - uses: "actions/checkout@v2"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/CHANGELOG.md 
new/httpx-0.18.2/CHANGELOG.md
--- old/httpx-0.18.0/CHANGELOG.md       2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/CHANGELOG.md       2021-06-17 12:29:22.000000000 +0200
@@ -4,6 +4,31 @@
 
 The format is based on [Keep a 
Changelog](https://keepachangelog.com/en/1.0.0/).
 
+## 0.18.2 (17th June, 2021)
+
+### Added
+
+* Support for Python 3.10. (Pull #1687)
+* Expose `httpx.USE_CLIENT_DEFAULT`, used as the default to `auth` and 
`timeout` parameters in request methods. (Pull #1634)
+* Support [HTTP/2 "prior 
knowledge"](https://python-hyper.org/projects/hyper-h2/en/v2.3.1/negotiating-http2.html#prior-knowledge),
 using `httpx.Client(http1=False, http2=True)`. (Pull #1624)
+
+### Fixed
+
+* Clean up some cases where warnings were being issued. (Pull #1687)
+* Prefer Content-Length over Transfer-Encoding: chunked for 
content=<file-like> cases. (Pull #1619)
+
+## 0.18.1 (29th April, 2021)
+
+### Changed
+
+* Update brotli support to use the `brotlicffi` package (Pull #1605)
+* Ensure that `Request(..., stream=...)` does not auto-generate any headers on 
the request instance. (Pull #1607)
+
+### Fixed
+
+* Pass through `timeout=...` in top-level httpx.stream() function. (Pull #1613)
+* Map httpcore transport close exceptions to httpx exceptions. (Pull #1606)
+
 ## 0.18.0 (27th April, 2021)
 
 The 0.18.x release series formalises our low-level Transport API, introducing 
the base classes `httpx.BaseTransport` and `httpx.AsyncBaseTransport`.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/README.md new/httpx-0.18.2/README.md
--- old/httpx-0.18.0/README.md  2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/README.md  2021-06-17 12:29:22.000000000 +0200
@@ -123,7 +123,7 @@
   * `idna` - Internationalized domain name support.
 * `sniffio` - Async library autodetection.
 * `async_generator` - Backport support for `contextlib.asynccontextmanager`. 
*(Only required for Python 3.6)*
-* `brotlipy` - Decoding for "brotli" compressed responses. *(Optional)*
+* `brotlicffi` - Decoding for "brotli" compressed responses. *(Optional)*
 
 A huge amount of credit is due to `requests` for the API layout that
 much of this work follows, as well as to `urllib3` for plenty of design
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/docs/advanced.md 
new/httpx-0.18.2/docs/advanced.md
--- old/httpx-0.18.0/docs/advanced.md   2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/docs/advanced.md   2021-06-17 12:29:22.000000000 +0200
@@ -228,10 +228,10 @@
 
 There are currently two event hooks:
 
-* `request` - Called once a request is about to be sent. Passed the `request` 
instance.
-* `response` - Called once the response has been returned. Passed the 
`response` instance.
+* `request` - Called after a request is fully prepared, but before it is sent 
to the network. Passed the `request` instance.
+* `response` - Called after the response has been fetched from the network, 
but before it is returned to the caller. Passed the `response` instance.
 
-These allow you to install client-wide functionality such as logging and 
monitoring.
+These allow you to install client-wide functionality such as logging, 
monitoring or tracing.
 
 ```python
 def log_request(request):
@@ -255,6 +255,15 @@
 client = httpx.Client(event_hooks={'response': [raise_on_4xx_5xx]})
 ```
 
+The hooks are also allowed to modify `request` and `response` objects.
+
+```python
+def add_timestamp(request):
+    request.headers['x-request-timestamp'] = datetime.now(tz=timezone.utc).isoformat()
+    
+client = httpx.Client(event_hooks={'request': [add_timestamp]})
+```
+
 Event hooks must always be set as a **list of callables**, and you may register
 multiple event hooks for each type of event.
 
@@ -361,7 +370,7 @@
 ...
 ```
 
-When using `Client` instances, `trust_env` should be set on the client itself, 
rather that on the request methods:
+When using `Client` instances, `trust_env` should be set on the client itself, 
rather than on the request methods:
 
 ```python
 client = httpx.Client(trust_env=False)
@@ -1100,7 +1109,7 @@
 
 
 # Switch to a mock transport, if the TESTING environment variable is set.
-if os.environ['TESTING'].upper() == "TRUE":
+if os.environ.get('TESTING', '').upper() == "TRUE":
     transport = httpx.MockTransport(handler)
 else:
     transport = httpx.HTTPTransport()
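The docs fix in the hunk above replaces a `KeyError`-prone lookup with `os.environ.get`; the difference can be exercised directly:

```python
import os

# Before the fix: os.environ['TESTING'] raises KeyError when the variable
# is unset, so the example crashed outside of test runs.
os.environ.pop('TESTING', None)
try:
    os.environ['TESTING'].upper()
    raised = False
except KeyError:
    raised = True

# After the fix: a missing variable simply selects the real transport.
use_mock = os.environ.get('TESTING', '').upper() == "TRUE"

assert raised is True
assert use_mock is False

# And the mock transport is still chosen when the variable is set.
os.environ['TESTING'] = "true"
assert os.environ.get('TESTING', '').upper() == "TRUE"
```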
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/docs/async.md 
new/httpx-0.18.2/docs/async.md
--- old/httpx-0.18.0/docs/async.md      2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/docs/async.md      2021-06-17 12:29:22.000000000 +0200
@@ -190,6 +190,29 @@
 !!! important
     The `curio` package must be installed to use the Curio backend.
 
+
+### [AnyIO](https://github.com/agronholm/anyio)
+
+AnyIO is an [asynchronous networking and concurrency 
library](https://anyio.readthedocs.io/) that works on top of either `asyncio` 
or `trio`. It blends in with native libraries of your chosen backend (defaults 
to `asyncio`).
+
+```python
+import httpx
+import anyio
+
+async def main():
+    async with httpx.AsyncClient() as client:
+        response = await client.get('https://www.example.com/')
+        print(response)
+
+anyio.run(main, backend='trio')
+```
+
+When instantiating a transport instance explicitly, for maximum consistency, 
the same `backend` parameter can be passed to `AsyncHTTPTransport`.
+```python
+transport = httpx.AsyncHTTPTransport(backend="trio")
+```
+
+
 ## Calling into Python Web Apps
 
 Just as `httpx.Client` allows you to call directly into WSGI web applications,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/docs/environment_variables.md 
new/httpx-0.18.2/docs/environment_variables.md
--- old/httpx-0.18.0/docs/environment_variables.md      2021-04-27 
16:20:22.000000000 +0200
+++ new/httpx-0.18.2/docs/environment_variables.md      2021-06-17 
12:29:22.000000000 +0200
@@ -132,6 +132,18 @@
 SSL_CERT_DIR=/path/to/ca-certs/ python -c "import httpx; 
httpx.get('https://example.com')"
 ```
 
+## `NETRC`
+
+Valid values: a filename
+
+If this environment variable is set but auth parameter is not defined, HTTPX 
will add auth information stored in the .netrc file into the request's header. 
If you do not provide NETRC environment either, HTTPX will use default files. 
(~/.netrc, ~/_netrc)
+
+Example:
+
+```console
+NETRC=/path/to/netrcfile/.my_netrc python -c "import httpx; 
httpx.get('https://example.com')"
+```
+
 ## Proxies
 
 The environment variables documented below are used as a convention by various 
HTTP tooling, including:
@@ -175,3 +187,4 @@
 python -c "import httpx; httpx.get('http://127.0.0.1:5000/my-api')"
 python -c "import httpx; httpx.get('https://www.python-httpx.org')"
 ```
+
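The `NETRC` lookup documented above can be sketched with the standard library's `netrc` module; this assumes a netrc-to-Basic-auth mechanism, and HTTPX's exact parsing may differ:

```python
import base64
import netrc
import os
import tempfile

# Hypothetical netrc file of the kind NETRC=/path/to/netrcfile points at.
with tempfile.NamedTemporaryFile('w', suffix='.my_netrc', delete=False) as f:
    f.write("machine example.com login alice password s3cret\n")
    path = f.name

# Look up the credentials for the request host, as a netrc-based auth would,
# and build the corresponding Basic Authorization header value.
login, _account, password = netrc.netrc(path).authenticators("example.com")
token = base64.b64encode(f"{login}:{password}".encode()).decode()
auth_header = f"Basic {token}"

os.unlink(path)
```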
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/docs/http2.md 
new/httpx-0.18.2/docs/http2.md
--- old/httpx-0.18.0/docs/http2.md      2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/docs/http2.md      2021-06-17 12:29:22.000000000 +0200
@@ -5,7 +5,7 @@
 the core semantics of the request or response, but alters the way that data is
 sent to and from the server.
 
-Rather that the text format that HTTP/1.1 uses, HTTP/2 is a binary format.
+Rather than the text format that HTTP/1.1 uses, HTTP/2 is a binary format.
 The binary format provides full request and response multiplexing, and 
efficient
 compression of HTTP headers. The stream multiplexing means that where HTTP/1.1
 requires one TCP stream for each concurrent request, HTTP/2 allows a single TCP
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/docs/index.md 
new/httpx-0.18.2/docs/index.md
--- old/httpx-0.18.0/docs/index.md      2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/docs/index.md      2021-06-17 12:29:22.000000000 +0200
@@ -115,7 +115,7 @@
   * `idna` - Internationalized domain name support.
 * `sniffio` - Async library autodetection.
 * `async_generator` - Backport support for `contextlib.asynccontextmanager`. 
*(Only required for Python 3.6)*
-* `brotlipy` - Decoding for "brotli" compressed responses. *(Optional)*
+* `brotlicffi` - Decoding for "brotli" compressed responses. *(Optional)*
 
 A huge amount of credit is due to `requests` for the API layout that
 much of this work follows, as well as to `urllib3` for plenty of design
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/httpx/__init__.py 
new/httpx-0.18.2/httpx/__init__.py
--- old/httpx-0.18.0/httpx/__init__.py  2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/__init__.py  2021-06-17 12:29:22.000000000 +0200
@@ -1,7 +1,7 @@
 from .__version__ import __description__, __title__, __version__
 from ._api import delete, get, head, options, patch, post, put, request, stream
 from ._auth import Auth, BasicAuth, DigestAuth
-from ._client import AsyncClient, Client
+from ._client import USE_CLIENT_DEFAULT, AsyncClient, Client
 from ._config import Limits, Proxy, Timeout, create_ssl_context
 from ._content import ByteStream
 from ._exceptions import (
@@ -111,6 +111,7 @@
     "TransportError",
     "UnsupportedProtocol",
     "URL",
+    "USE_CLIENT_DEFAULT",
     "WriteError",
     "WriteTimeout",
     "WSGITransport",
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/httpx/__version__.py 
new/httpx-0.18.2/httpx/__version__.py
--- old/httpx-0.18.0/httpx/__version__.py       2021-04-27 16:20:22.000000000 
+0200
+++ new/httpx-0.18.2/httpx/__version__.py       2021-06-17 12:29:22.000000000 
+0200
@@ -1,3 +1,3 @@
 __title__ = "httpx"
 __description__ = "A next generation HTTP client, for Python 3."
-__version__ = "0.18.0"
+__version__ = "0.18.2"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/httpx/_api.py 
new/httpx-0.18.2/httpx/_api.py
--- old/httpx-0.18.0/httpx/_api.py      2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/_api.py      2021-06-17 12:29:22.000000000 +0200
@@ -142,7 +142,12 @@
     [0]: /quickstart#streaming-responses
     """
     with Client(
-        cookies=cookies, proxies=proxies, cert=cert, verify=verify, 
trust_env=trust_env
+        cookies=cookies,
+        proxies=proxies,
+        cert=cert,
+        verify=verify,
+        timeout=timeout,
+        trust_env=trust_env,
     ) as client:
         with client.stream(
             method=method,
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/httpx-0.18.0/httpx/_client.py 
new/httpx-0.18.2/httpx/_client.py
--- old/httpx-0.18.0/httpx/_client.py   2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/_client.py   2021-06-17 12:29:22.000000000 +0200
@@ -12,11 +12,9 @@
     DEFAULT_LIMITS,
     DEFAULT_MAX_REDIRECTS,
     DEFAULT_TIMEOUT_CONFIG,
-    UNSET,
     Limits,
     Proxy,
     Timeout,
-    UnsetType,
 )
 from ._decoders import SUPPORTED_DECODERS
 from ._exceptions import (
@@ -65,6 +63,31 @@
 U = typing.TypeVar("U", bound="AsyncClient")
 
 
+class UseClientDefault:
+    """
+    For some parameters such as `auth=...` and `timeout=...` we need to be able
+    to indicate the default "unset" state, in a way that is distinctly 
different
+    to using `None`.
+
+    The default "unset" state indicates that whatever default is set on the
+    client should be used. This is different to setting `None`, which
+    explicitly disables the parameter, possibly overriding a client default.
+
+    For example we use `timeout=USE_CLIENT_DEFAULT` in the `request()` 
signature.
+    Omitting the `timeout` parameter will send a request using whatever default
+    timeout has been configured on the client. Including `timeout=None` will
+    ensure no timeout is used.
+
+    Note that user code shouldn't need to use the `USE_CLIENT_DEFAULT` 
constant,
+    but it is used internally when a parameter is not included.
+    """
+
+    pass  # pragma: nocover
+
+
+USE_CLIENT_DEFAULT = UseClientDefault()
+
+
 logger = get_logger(__name__)
 
 USER_AGENT = f"python-httpx/{__version__}"
@@ -402,9 +425,13 @@
             raise TypeError(f'Invalid "auth" argument: {auth!r}')
 
     def _build_request_auth(
-        self, request: Request, auth: typing.Union[AuthTypes, UnsetType] = 
UNSET
+        self,
+        request: Request,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> Auth:
-        auth = self._auth if isinstance(auth, UnsetType) else 
self._build_auth(auth)
+        auth = (
+            self._auth if isinstance(auth, UseClientDefault) else 
self._build_auth(auth)
+        )
 
         if auth is not None:
             return auth
@@ -497,9 +524,8 @@
             # the origin.
             headers.pop("Authorization", None)
 
-            # Remove the Host header, so that a new one will be auto-populated 
on
-            # the request instance.
-            headers.pop("Host", None)
+            # Update the Host header.
+            headers["Host"] = url.netloc.decode("ascii")
 
         if method != request.method and method == "GET":
             # If we've switch to a 'GET' request, then strip any headers which
@@ -580,6 +606,7 @@
         cookies: CookieTypes = None,
         verify: VerifyTypes = True,
         cert: CertTypes = None,
+        http1: bool = True,
         http2: bool = False,
         proxies: ProxiesTypes = None,
         mounts: typing.Mapping[str, BaseTransport] = None,
@@ -619,6 +646,7 @@
         self._transport = self._init_transport(
             verify=verify,
             cert=cert,
+            http1=http1,
             http2=http2,
             limits=limits,
             transport=transport,
@@ -632,6 +660,7 @@
                 proxy,
                 verify=verify,
                 cert=cert,
+                http1=http1,
                 http2=http2,
                 limits=limits,
                 trust_env=trust_env,
@@ -649,6 +678,7 @@
         self,
         verify: VerifyTypes = True,
         cert: CertTypes = None,
+        http1: bool = True,
         http2: bool = False,
         limits: Limits = DEFAULT_LIMITS,
         transport: BaseTransport = None,
@@ -662,7 +692,12 @@
             return WSGITransport(app=app)
 
         return HTTPTransport(
-            verify=verify, cert=cert, http2=http2, limits=limits, 
trust_env=trust_env
+            verify=verify,
+            cert=cert,
+            http1=http1,
+            http2=http2,
+            limits=limits,
+            trust_env=trust_env,
         )
 
     def _init_proxy_transport(
@@ -670,6 +705,7 @@
         proxy: Proxy,
         verify: VerifyTypes = True,
         cert: CertTypes = None,
+        http1: bool = True,
         http2: bool = False,
         limits: Limits = DEFAULT_LIMITS,
         trust_env: bool = True,
@@ -677,6 +713,7 @@
         return HTTPTransport(
             verify=verify,
             cert=cert,
+            http1=http1,
             http2=http2,
             limits=limits,
             trust_env=trust_env,
@@ -706,9 +743,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = 
USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Build and send a request.
@@ -762,9 +799,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = 
USE_CLIENT_DEFAULT,
     ) -> typing.Iterator[Response]:
         """
         Alternative to `httpx.request()` that streams the response body
@@ -804,9 +841,9 @@
         request: Request,
         *,
         stream: bool = False,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = 
USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a request.
@@ -825,7 +862,9 @@
             raise RuntimeError("Cannot send a request, as the client has been 
closed.")
 
         self._state = ClientState.OPENED
-        timeout = self.timeout if isinstance(timeout, UnsetType) else 
Timeout(timeout)
+        timeout = (
+            self.timeout if isinstance(timeout, UseClientDefault) else 
Timeout(timeout)
+        )
 
         auth = self._build_request_auth(request, auth)
 
@@ -858,32 +897,35 @@
         history: typing.List[Response],
     ) -> Response:
         auth_flow = auth.sync_auth_flow(request)
-        request = next(auth_flow)
+        try:
+            request = next(auth_flow)
 
-        for hook in self._event_hooks["request"]:
-            hook(request)
+            for hook in self._event_hooks["request"]:
+                hook(request)
 
-        while True:
-            response = self._send_handling_redirects(
-                request,
-                timeout=timeout,
-                allow_redirects=allow_redirects,
-                history=history,
-            )
-            try:
+            while True:
+                response = self._send_handling_redirects(
+                    request,
+                    timeout=timeout,
+                    allow_redirects=allow_redirects,
+                    history=history,
+                )
                 try:
-                    next_request = auth_flow.send(response)
-                except StopIteration:
-                    return response
+                    try:
+                        next_request = auth_flow.send(response)
+                    except StopIteration:
+                        return response
 
-                response.history = list(history)
-                response.read()
-                request = next_request
-                history.append(response)
+                    response.history = list(history)
+                    response.read()
+                    request = next_request
+                    history.append(response)
 
-            except Exception as exc:
-                response.close()
-                raise exc
+                except Exception as exc:
+                    response.close()
+                    raise exc
+        finally:
+            auth_flow.close()
 
     def _send_handling_redirects(
         self,
@@ -964,9 +1006,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = 
USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `GET` request.
@@ -991,9 +1033,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = 
USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send an `OPTIONS` request.
@@ -1018,9 +1060,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = 
USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `HEAD` request.
@@ -1049,9 +1091,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = 
USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `POST` request.
@@ -1084,9 +1126,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = 
USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `PUT` request.
@@ -1119,9 +1161,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = 
USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `PATCH` request.
@@ -1150,9 +1192,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = 
USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `DELETE` request.
@@ -1205,7 +1247,10 @@
                 transport.__exit__(exc_type, exc_value, traceback)
 
     def __del__(self) -> None:
-        if self._state == ClientState.OPENED:
+        # We use 'getattr' here, to manage the case where '__del__()' is called
+        # on a partically initiallized instance that raised an exception during
+        # the call to '__init__()'.
+        if getattr(self, "_state", None) == ClientState.OPENED:  # noqa: B009
             self.close()
 
 
@@ -1266,6 +1311,7 @@
         cookies: CookieTypes = None,
         verify: VerifyTypes = True,
         cert: CertTypes = None,
+        http1: bool = True,
         http2: bool = False,
         proxies: ProxiesTypes = None,
         mounts: typing.Mapping[str, AsyncBaseTransport] = None,
@@ -1305,6 +1351,7 @@
         self._transport = self._init_transport(
             verify=verify,
             cert=cert,
+            http1=http1,
             http2=http2,
             limits=limits,
             transport=transport,
@@ -1319,6 +1366,7 @@
                 proxy,
                 verify=verify,
                 cert=cert,
+                http1=http1,
                 http2=http2,
                 limits=limits,
                 trust_env=trust_env,
@@ -1335,6 +1383,7 @@
         self,
         verify: VerifyTypes = True,
         cert: CertTypes = None,
+        http1: bool = True,
         http2: bool = False,
         limits: Limits = DEFAULT_LIMITS,
         transport: AsyncBaseTransport = None,
@@ -1348,7 +1397,12 @@
             return ASGITransport(app=app)
 
         return AsyncHTTPTransport(
-            verify=verify, cert=cert, http2=http2, limits=limits, trust_env=trust_env
+            verify=verify,
+            cert=cert,
+            http1=http1,
+            http2=http2,
+            limits=limits,
+            trust_env=trust_env,
         )
 
     def _init_proxy_transport(
@@ -1356,6 +1410,7 @@
         proxy: Proxy,
         verify: VerifyTypes = True,
         cert: CertTypes = None,
+        http1: bool = True,
         http2: bool = False,
         limits: Limits = DEFAULT_LIMITS,
         trust_env: bool = True,
@@ -1392,9 +1447,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Build and send a request.
@@ -1441,9 +1496,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> typing.AsyncIterator[Response]:
         """
         Alternative to `httpx.request()` that streams the response body
@@ -1483,9 +1538,9 @@
         request: Request,
         *,
         stream: bool = False,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a request.
@@ -1504,7 +1559,9 @@
            raise RuntimeError("Cannot send a request, as the client has been closed.")
 
         self._state = ClientState.OPENED
-        timeout = self.timeout if isinstance(timeout, UnsetType) else Timeout(timeout)
+        timeout = (
+            self.timeout if isinstance(timeout, UseClientDefault) else Timeout(timeout)
+        )
 
         auth = self._build_request_auth(request, auth)
 
@@ -1537,32 +1594,35 @@
         history: typing.List[Response],
     ) -> Response:
         auth_flow = auth.async_auth_flow(request)
-        request = await auth_flow.__anext__()
+        try:
+            request = await auth_flow.__anext__()
 
-        for hook in self._event_hooks["request"]:
-            await hook(request)
+            for hook in self._event_hooks["request"]:
+                await hook(request)
 
-        while True:
-            response = await self._send_handling_redirects(
-                request,
-                timeout=timeout,
-                allow_redirects=allow_redirects,
-                history=history,
-            )
-            try:
+            while True:
+                response = await self._send_handling_redirects(
+                    request,
+                    timeout=timeout,
+                    allow_redirects=allow_redirects,
+                    history=history,
+                )
                 try:
-                    next_request = await auth_flow.asend(response)
-                except StopAsyncIteration:
-                    return response
+                    try:
+                        next_request = await auth_flow.asend(response)
+                    except StopAsyncIteration:
+                        return response
 
-                response.history = list(history)
-                await response.aread()
-                request = next_request
-                history.append(response)
+                    response.history = list(history)
+                    await response.aread()
+                    request = next_request
+                    history.append(response)
 
-            except Exception as exc:
-                await response.aclose()
-                raise exc
+                except Exception as exc:
+                    await response.aclose()
+                    raise exc
+        finally:
+            await auth_flow.aclose()
 
     async def _send_handling_redirects(
         self,
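The try/finally restructuring above guarantees that the async auth-flow generator is always closed, so any cleanup code inside the flow runs even when the request loop returns early or raises. The mechanism can be sketched with a generic async generator (hypothetical names, not httpx's actual API):

```python
import asyncio

cleanup_log = []

async def auth_flow(request: str):
    # An auth flow modelled as an async generator: yield the (possibly
    # modified) request, and clean up when the generator is closed.
    try:
        yield request + " [signed]"
    finally:
        cleanup_log.append("flow closed")

async def send(request: str) -> str:
    flow = auth_flow(request)
    try:
        signed = await flow.__anext__()
        return f"200 OK for {signed}"
    finally:
        # As in the patch: always close the generator so its
        # 'finally' block runs, even if sending raised.
        await flow.aclose()

result = asyncio.run(send("GET /"))
```

Without the explicit `aclose()`, the generator's cleanup would be deferred to garbage collection, which is unreliable for async generators.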
@@ -1650,9 +1710,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `GET` request.
@@ -1677,9 +1737,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send an `OPTIONS` request.
@@ -1704,9 +1764,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `HEAD` request.
@@ -1735,9 +1795,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `POST` request.
@@ -1770,9 +1830,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `PUT` request.
@@ -1805,9 +1865,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `PATCH` request.
@@ -1836,9 +1896,9 @@
         params: QueryParamTypes = None,
         headers: HeaderTypes = None,
         cookies: CookieTypes = None,
-        auth: typing.Union[AuthTypes, UnsetType] = UNSET,
+        auth: typing.Union[AuthTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
         allow_redirects: bool = True,
-        timeout: typing.Union[TimeoutTypes, UnsetType] = UNSET,
+        timeout: typing.Union[TimeoutTypes, UseClientDefault] = USE_CLIENT_DEFAULT,
     ) -> Response:
         """
         Send a `DELETE` request.
@@ -1891,7 +1951,10 @@
                 await proxy.__aexit__(exc_type, exc_value, traceback)
 
     def __del__(self) -> None:
-        if self._state == ClientState.OPENED:
+        # We use 'getattr' here, to manage the case where '__del__()' is called
+        # on a partially initialized instance that raised an exception during
+        # the call to '__init__()'.
+        if getattr(self, "_state", None) == ClientState.OPENED:  # noqa: B009
             # Unlike the sync case, we cannot silently close the client when
             # it is garbage collected, because `.aclose()` is an async operation,
             # but `__del__` is not.
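The `getattr(self, "_state", None)` guard added to both `__del__` implementations matters because Python still invokes `__del__` on an instance whose `__init__` raised before `_state` was ever assigned. A minimal sketch of the pattern with a hypothetical `Client` class (not httpx's actual one):

```python
class Client:
    def __init__(self) -> None:
        self._state = "OPENED"

    def close(self) -> None:
        self._state = "CLOSED"

    def __del__(self) -> None:
        # 'self._state' alone would raise AttributeError if __init__
        # never ran to completion; getattr() degrades to None instead.
        if getattr(self, "_state", None) == "OPENED":
            self.close()

# Simulate a partially initialized instance: allocated, __init__ skipped.
partial = Client.__new__(Client)
partial.__del__()  # no AttributeError, thanks to the getattr() guard

complete = Client()
complete.__del__()  # state is OPENED, so close() runs
```

`Client.__new__(Client)` stands in for the real failure mode, where an exception inside `__init__` leaves the object allocated but unconfigured.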
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/httpx/_compat.py new/httpx-0.18.2/httpx/_compat.py
--- old/httpx-0.18.0/httpx/_compat.py   2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/_compat.py   2021-06-17 12:29:22.000000000 +0200
@@ -1,6 +1,25 @@
+"""
+The _compat module is used for code which requires branching between different
+Python environments. It is excluded from the code coverage checks.
+"""
+import ssl
+import sys
+
 # `contextlib.asynccontextmanager` exists from Python 3.7 onwards.
 # For 3.6 we require the `async_generator` package for a backported version.
 try:
     from contextlib import asynccontextmanager  # type: ignore
-except ImportError:  # pragma: no cover
+except ImportError:
     from async_generator import asynccontextmanager  # type: ignore # noqa
+
+
+def set_minimum_tls_version_1_2(context: ssl.SSLContext) -> None:
+    if sys.version_info >= (3, 10):
+        context.minimum_version = ssl.TLSVersion.TLSv1_2
+    else:
+        # These become deprecated in favor of 'context.minimum_version'
+        # from Python 3.10 onwards.
+        context.options |= ssl.OP_NO_SSLv2
+        context.options |= ssl.OP_NO_SSLv3
+        context.options |= ssl.OP_NO_TLSv1
+        context.options |= ssl.OP_NO_TLSv1_1
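On Python 3.10+ the `OP_NO_*` options are deprecated in favour of `SSLContext.minimum_version`, which is exactly the branch this helper takes. A standalone copy of the same logic, runnable outside httpx:

```python
import ssl
import sys

def set_minimum_tls_version_1_2(context: ssl.SSLContext) -> None:
    # Same branching as the _compat helper above: the new API on 3.10+,
    # the deprecated per-protocol opt-out flags on older interpreters.
    if sys.version_info >= (3, 10):
        context.minimum_version = ssl.TLSVersion.TLSv1_2
    else:
        context.options |= ssl.OP_NO_SSLv2
        context.options |= ssl.OP_NO_SSLv3
        context.options |= ssl.OP_NO_TLSv1
        context.options |= ssl.OP_NO_TLSv1_1

context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
set_minimum_tls_version_1_2(context)
```

Keeping the version check in one `_compat` function lets the rest of the codebase stay free of `sys.version_info` branches.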
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/httpx/_config.py new/httpx-0.18.2/httpx/_config.py
--- old/httpx-0.18.0/httpx/_config.py   2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/_config.py   2021-06-17 12:29:22.000000000 +0200
@@ -6,6 +6,7 @@
 
 import certifi
 
+from ._compat import set_minimum_tls_version_1_2
 from ._models import URL, Headers
 from ._types import CertTypes, HeaderTypes, TimeoutTypes, URLTypes, VerifyTypes
 from ._utils import get_ca_bundle_from_env, get_logger
@@ -90,8 +91,8 @@
         Return an SSL context for unverified connections.
         """
         context = self._create_default_ssl_context()
-        context.verify_mode = ssl.CERT_NONE
         context.check_hostname = False
+        context.verify_mode = ssl.CERT_NONE
         self._load_client_certs(context)
         return context
 
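Swapping those two assignments is not cosmetic: once the context is created with `ssl.PROTOCOL_TLS_CLIENT` (as done later in this patch), CPython rejects setting `verify_mode = ssl.CERT_NONE` while `check_hostname` is still enabled, so hostname checking has to be switched off first. Demonstrating both orders:

```python
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)

# PROTOCOL_TLS_CLIENT enables hostname checking by default, and
# disabling certificate verification while it is on raises ValueError.
try:
    context.verify_mode = ssl.CERT_NONE
    order_error = None
except ValueError as exc:
    order_error = str(exc)

# The patched order works: check_hostname first, then verify_mode.
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE
```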
@@ -153,11 +154,8 @@
         Creates the default SSLContext object that's used for both verified
         and unverified connections.
         """
-        context = ssl.SSLContext(ssl.PROTOCOL_TLS)
-        context.options |= ssl.OP_NO_SSLv2
-        context.options |= ssl.OP_NO_SSLv3
-        context.options |= ssl.OP_NO_TLSv1
-        context.options |= ssl.OP_NO_TLSv1_1
+        context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
+        set_minimum_tls_version_1_2(context)
         context.options |= ssl.OP_NO_COMPRESSION
         context.set_ciphers(DEFAULT_CIPHERS)
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/httpx/_content.py new/httpx-0.18.2/httpx/_content.py
--- old/httpx-0.18.0/httpx/_content.py  2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/_content.py  2021-06-17 12:29:22.000000000 +0200
@@ -17,7 +17,7 @@
 from ._multipart import MultipartStream
 from ._transports.base import AsyncByteStream, SyncByteStream
 from ._types import RequestContent, RequestData, RequestFiles, ResponseContent
-from ._utils import primitive_value_to_str
+from ._utils import peek_filelike_length, primitive_value_to_str
 
 
 class ByteStream(AsyncByteStream, SyncByteStream):
@@ -82,12 +82,17 @@
 
     if isinstance(content, (bytes, str)):
         body = content.encode("utf-8") if isinstance(content, str) else content
-        content_length = str(len(body))
-        headers = {"Content-Length": content_length} if body else {}
+        content_length = len(body)
+        headers = {"Content-Length": str(content_length)} if body else {}
         return headers, ByteStream(body)
 
     elif isinstance(content, Iterable):
-        headers = {"Transfer-Encoding": "chunked"}
+        content_length_or_none = peek_filelike_length(content)
+
+        if content_length_or_none is None:
+            headers = {"Transfer-Encoding": "chunked"}
+        else:
+            headers = {"Content-Length": str(content_length_or_none)}
         return headers, IteratorByteStream(content)  # type: ignore
 
     elif isinstance(content, AsyncIterable):
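With this change, a body whose size can be peeked (e.g. `io.BytesIO`) is announced with `Content-Length`, while a plain iterator still falls back to `Transfer-Encoding: chunked`. A simplified sketch of that decision (a seekable-only `peek_filelike_length`, not httpx's full helper):

```python
import io
import os
import typing

def peek_filelike_length(stream: typing.Any) -> typing.Optional[int]:
    # Simplified: measure seekable objects without consuming them.
    try:
        offset = stream.tell()
        length = stream.seek(0, os.SEEK_END)
        stream.seek(offset)
        return length
    except (AttributeError, OSError):
        return None

def content_headers(content: typing.Iterable) -> typing.Dict[str, str]:
    # Prefer Content-Length when the body length is known up front.
    length = peek_filelike_length(content)
    if length is None:
        return {"Transfer-Encoding": "chunked"}
    return {"Content-Length": str(length)}

file_headers = content_headers(io.BytesIO(b"Hello, world!"))
stream_headers = content_headers(iter([b"Hello, ", b"world!"]))
```

A known `Content-Length` lets servers and proxies handle the request without chunked framing, which is why the patch prefers it.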
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/httpx/_decoders.py new/httpx-0.18.2/httpx/_decoders.py
--- old/httpx-0.18.0/httpx/_decoders.py 2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/_decoders.py 2021-06-17 12:29:22.000000000 +0200
@@ -11,9 +11,9 @@
 from ._exceptions import DecodingError
 
 try:
-    import brotli
+    import brotlicffi
 except ImportError:  # pragma: nocover
-    brotli = None
+    brotlicffi = None
 
 
 class ContentDecoder:
@@ -99,14 +99,14 @@
     """
 
     def __init__(self) -> None:
-        if brotli is None:  # pragma: nocover
+        if brotlicffi is None:  # pragma: nocover
             raise ImportError(
-                "Using 'BrotliDecoder', but the 'brotlipy' or 'brotli' library "
+                "Using 'BrotliDecoder', but the 'brotlicffi' library "
                 "is not installed."
                 "Make sure to install httpx using `pip install httpx[brotli]`."
             ) from None
 
-        self.decompressor = brotli.Decompressor()
+        self.decompressor = brotlicffi.Decompressor()
         self.seen_data = False
         if hasattr(self.decompressor, "decompress"):
             self._decompress = self.decompressor.decompress
@@ -118,8 +118,8 @@
             return b""
         self.seen_data = True
         try:
-            return self._decompress(data)
-        except brotli.error as exc:
+            return self.decompressor.decompress(data)
+        except brotlicffi.Error as exc:
             raise DecodingError(str(exc)) from exc
 
     def flush(self) -> bytes:
@@ -129,7 +129,7 @@
             if hasattr(self.decompressor, "finish"):
                 self.decompressor.finish()
             return b""
-        except brotli.error as exc:  # pragma: nocover
+        except brotlicffi.Error as exc:  # pragma: nocover
             raise DecodingError(str(exc)) from exc
 
 
@@ -365,5 +365,5 @@
 }
 
 
-if brotli is None:
+if brotlicffi is None:
     SUPPORTED_DECODERS.pop("br")  # pragma: nocover
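The decoder module's optional-import pattern, which this hunk switches from `brotli` to `brotlicffi`: bind the module name to `None` when the package is missing, and prune the decoder table at import time so only usable codings are advertised. A generic sketch of the same pattern (simplified, hypothetical table):

```python
import zlib

try:
    import brotlicffi  # optional dependency, as in httpx's [brotli] extra
except ImportError:
    brotlicffi = None

# Hypothetical simplified decoder table, keyed by Content-Encoding value.
SUPPORTED_DECODERS = {
    "deflate": zlib.decompressobj,
}
if brotlicffi is not None:
    SUPPORTED_DECODERS["br"] = brotlicffi.Decompressor

# Only advertise codings we can actually decode.
accept_encoding = ", ".join(SUPPORTED_DECODERS)
```

Because the table is pruned at import time, no per-request check for the optional package is needed.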
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/httpx/_models.py new/httpx-0.18.2/httpx/_models.py
--- old/httpx-0.18.0/httpx/_models.py   2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/_models.py   2021-06-17 12:29:22.000000000 +0200
@@ -1089,14 +1089,21 @@
         if cookies:
             Cookies(cookies).set_cookie_header(self)
 
-        if stream is not None:
+        if stream is None:
+            headers, stream = encode_request(content, data, files, json)
+            self._prepare(headers)
+            self.stream = stream
+            # Load the request body, except for streaming content.
+            if isinstance(stream, ByteStream):
+                self.read()
+        else:
             # There's an important distinction between `Request(content=...)`,
             # and `Request(stream=...)`.
             #
-            # Using `content=...` implies automatically populated content headers,
-            # of either `Content-Length: ...` or `Transfer-Encoding: chunked`.
+            # Using `content=...` implies automatically populated `Host` and content
+            # headers, of either `Content-Length: ...` or `Transfer-Encoding: chunked`.
             #
-            # Using `stream=...` will not automatically include any content headers.
+            # Using `stream=...` will not automatically include *any* auto-populated headers.
             #
             # As an end-user you don't really need `stream=...`. It's only
             # useful when:
@@ -1104,14 +1111,6 @@
             # * Preserving the request stream when copying requests, eg for redirects.
             # * Creating request instances on the *server-side* of the transport API.
             self.stream = stream
-            self._prepare({})
-        else:
-            headers, stream = encode_request(content, data, files, json)
-            self._prepare(headers)
-            self.stream = stream
-            # Load the request body, except for streaming content.
-            if isinstance(stream, ByteStream):
-                self.read()
 
     def _prepare(self, default_headers: typing.Dict[str, str]) -> None:
         for key, value in default_headers.items():
@@ -1216,7 +1215,14 @@
         self.is_closed = False
         self.is_stream_consumed = False
 
-        if stream is not None:
+        if stream is None:
+            headers, stream = encode_response(content, text, html, json)
+            self._prepare(headers)
+            self.stream = stream
+            if isinstance(stream, ByteStream):
+                # Load the response body, except for streaming content.
+                self.read()
+        else:
             # There's an important distinction between `Response(content=...)`,
             # and `Response(stream=...)`.
             #
@@ -1229,13 +1235,6 @@
             # useful when creating response instances having received a stream
             # from the transport API.
             self.stream = stream
-        else:
-            headers, stream = encode_response(content, text, html, json)
-            self._prepare(headers)
-            self.stream = stream
-            if isinstance(stream, ByteStream):
-                # Load the response body, except for streaming content.
-                self.read()
 
         self._num_bytes_downloaded = 0
 
@@ -1679,10 +1678,10 @@
         """
         Loads any cookies based on the response `Set-Cookie` headers.
         """
-        urlib_response = self._CookieCompatResponse(response)
+        urllib_response = self._CookieCompatResponse(response)
         urllib_request = self._CookieCompatRequest(response.request)
 
-        self.jar.extract_cookies(urlib_response, urllib_request)  # type: ignore
+        self.jar.extract_cookies(urllib_response, urllib_request)  # type: ignore
 
     def set_cookie_header(self, request: Request) -> None:
         """
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/httpx/_multipart.py new/httpx-0.18.2/httpx/_multipart.py
--- old/httpx-0.18.0/httpx/_multipart.py        2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/_multipart.py        2021-06-17 12:29:22.000000000 +0200
@@ -93,9 +93,8 @@
             return len(headers) + len(to_bytes(self.file))
 
         # Let's do our best not to read `file` into memory.
-        try:
-            file_length = peek_filelike_length(self.file)
-        except OSError:
+        file_length = peek_filelike_length(self.file)
+        if file_length is None:
             # As a last resort, read file and cache contents for later.
             assert not hasattr(self, "_data")
             self._data = to_bytes(self.file.read())
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/httpx/_transports/default.py new/httpx-0.18.2/httpx/_transports/default.py
--- old/httpx-0.18.0/httpx/_transports/default.py       2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/_transports/default.py       2021-06-17 12:29:22.000000000 +0200
@@ -116,6 +116,7 @@
         self,
         verify: VerifyTypes = True,
         cert: CertTypes = None,
+        http1: bool = True,
         http2: bool = False,
         limits: Limits = DEFAULT_LIMITS,
         trust_env: bool = True,
@@ -133,6 +134,7 @@
                 max_connections=limits.max_connections,
                 max_keepalive_connections=limits.max_keepalive_connections,
                 keepalive_expiry=limits.keepalive_expiry,
+                http1=http1,
                 http2=http2,
                 uds=uds,
                 local_address=local_address,
@@ -162,7 +164,8 @@
         exc_value: BaseException = None,
         traceback: TracebackType = None,
     ) -> None:
-        self._pool.__exit__(exc_type, exc_value, traceback)
+        with map_httpcore_exceptions():
+            self._pool.__exit__(exc_type, exc_value, traceback)
 
     def handle_request(
         self,
@@ -210,6 +213,7 @@
         self,
         verify: VerifyTypes = True,
         cert: CertTypes = None,
+        http1: bool = True,
         http2: bool = False,
         limits: Limits = DEFAULT_LIMITS,
         trust_env: bool = True,
@@ -227,6 +231,7 @@
                 max_connections=limits.max_connections,
                 max_keepalive_connections=limits.max_keepalive_connections,
                 keepalive_expiry=limits.keepalive_expiry,
+                http1=http1,
                 http2=http2,
                 uds=uds,
                 local_address=local_address,
@@ -256,7 +261,8 @@
         exc_value: BaseException = None,
         traceback: TracebackType = None,
     ) -> None:
-        await self._pool.__aexit__(exc_type, exc_value, traceback)
+        with map_httpcore_exceptions():
+            await self._pool.__aexit__(exc_type, exc_value, traceback)
 
     async def handle_async_request(
         self,
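These hunks wrap pool shutdown in `map_httpcore_exceptions()` so that close-time errors surface as httpx exceptions too. The underlying pattern is a context manager that re-raises mapped exception types; a generic sketch with hypothetical `PoolClosedError`/`TransportError` names (not the real httpcore/httpx classes):

```python
import contextlib
import typing

class PoolClosedError(Exception):
    """Stand-in for a low-level httpcore exception."""

class TransportError(Exception):
    """Stand-in for the user-facing httpx exception."""

EXC_MAP: typing.Dict[type, type] = {PoolClosedError: TransportError}

@contextlib.contextmanager
def map_exceptions(mapping: typing.Dict[type, type]):
    try:
        yield
    except tuple(mapping) as exc:
        # Re-raise as the mapped user-facing type, chaining the original.
        raise mapping[type(exc)](str(exc)) from exc

def close_pool() -> None:
    raise PoolClosedError("pool already closed")

try:
    with map_exceptions(EXC_MAP):
        close_pool()
except TransportError as exc:
    caught = exc
```

Chaining with `from exc` preserves the low-level traceback while callers only need to catch the public exception type.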
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/httpx/_utils.py new/httpx-0.18.2/httpx/_utils.py
--- old/httpx-0.18.0/httpx/_utils.py    2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/httpx/_utils.py    2021-06-17 12:29:22.000000000 +0200
@@ -342,7 +342,7 @@
     return None
 
 
-def peek_filelike_length(stream: typing.IO) -> int:
+def peek_filelike_length(stream: typing.Any) -> typing.Optional[int]:
     """
     Given a file-like stream object, return its length in number of bytes
     without reading it into memory.
@@ -350,7 +350,9 @@
     try:
         # Is it an actual file?
         fd = stream.fileno()
-    except OSError:
+        # Yup, seems to be an actual file.
+        length = os.fstat(fd).st_size
+    except (AttributeError, OSError):
        # No... Maybe it's something that supports random access, like `io.BytesIO`?
         try:
             # Assuming so, go to end of stream to figure out its length,
@@ -358,14 +360,11 @@
             offset = stream.tell()
             length = stream.seek(0, os.SEEK_END)
             stream.seek(offset)
-        except OSError:
+        except (AttributeError, OSError):
             # Not even that? Sorry, we're doomed...
-            raise
-        else:
-            return length
-    else:
-        # Yup, seems to be an actual file.
-        return os.fstat(fd).st_size
+            return None
+
+    return length
 
 
 class Timer:
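After this rewrite the helper reports `None` instead of raising when a length cannot be determined, and real files are measured via `os.fstat` on the descriptor rather than seeking. A self-contained copy of the patched logic:

```python
import io
import os
import typing

def peek_filelike_length(stream: typing.Any) -> typing.Optional[int]:
    try:
        # Is it an actual file? Then stat the descriptor.
        fd = stream.fileno()
        length = os.fstat(fd).st_size
    except (AttributeError, OSError):
        # Maybe it supports random access, like io.BytesIO?
        try:
            offset = stream.tell()
            length = stream.seek(0, os.SEEK_END)
            stream.seek(offset)
        except (AttributeError, OSError):
            # Previously this path re-raised; now callers just get None.
            return None
    return length

bytesio_length = peek_filelike_length(io.BytesIO(b"abc"))
iterator_length = peek_filelike_length(iter([b"chunk"]))
```

`io.BytesIO.fileno()` raises `io.UnsupportedOperation`, a subclass of `OSError`, so it falls through to the seek-based branch; plain iterators lack both methods and yield `None`.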
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/requirements.txt new/httpx-0.18.2/requirements.txt
--- old/httpx-0.18.0/requirements.txt   2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/requirements.txt   2021-06-17 12:29:22.000000000 +0200
@@ -1,7 +1,4 @@
--e .[http2]
-
-# Optional
-brotlipy==0.7.*
+-e .[http2,brotli]
 
 # Documentation
 mkdocs
@@ -22,7 +19,8 @@
 flake8-pie==0.5.*
 isort==5.*
 mypy
-pytest==5.*
+types-certifi
+pytest==6.*
 pytest-asyncio
 pytest-trio
 trio
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/setup.cfg new/httpx-0.18.2/setup.cfg
--- old/httpx-0.18.0/setup.cfg  2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/setup.cfg  2021-06-17 12:29:22.000000000 +0200
@@ -16,9 +16,12 @@
 
 [tool:pytest]
 addopts = -rxXs
+filterwarnings =
+  error
+  default:::uvicorn
 markers =
  copied_from(source, changes=None): mark test as copied from somewhere else, along with a description of changes made to accommodate e.g. our test setup
 
 [coverage:run]
-omit = venv/*
+omit = venv/*, httpx/_compat.py
 include = httpx/*, tests/*
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/setup.py new/httpx-0.18.2/setup.py
--- old/httpx-0.18.0/setup.py   2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/setup.py   2021-06-17 12:29:22.000000000 +0200
@@ -59,12 +59,12 @@
         "certifi",
         "sniffio",
         "rfc3986[idna2008]>=1.3,<2",
-        "httpcore>=0.13.0,<0.14.0",
+        "httpcore>=0.13.3,<0.14.0",
         "async_generator; python_version < '3.7'"
     ],
     extras_require={
         "http2": "h2==3.*",
-        "brotli": "brotlipy==0.7.*",
+        "brotli": "brotlicffi==1.*",
     },
     classifiers=[
         "Development Status :: 4 - Beta",
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/tests/conftest.py new/httpx-0.18.2/tests/conftest.py
--- old/httpx-0.18.0/tests/conftest.py  2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/tests/conftest.py  2021-06-17 12:29:22.000000000 +0200
@@ -14,6 +14,7 @@
     PrivateFormat,
     load_pem_private_key,
 )
+from typing_extensions import Literal
 from uvicorn.config import Config
 from uvicorn.main import Server
 
@@ -164,7 +165,7 @@
     await send({"type": "http.response.body"})
 
 
-SERVER_SCOPE = "session"
+SERVER_SCOPE: Literal["session"] = "session"
 
 
 @pytest.fixture(scope=SERVER_SCOPE)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/tests/models/test_responses.py new/httpx-0.18.2/tests/models/test_responses.py
--- old/httpx-0.18.0/tests/models/test_responses.py     2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/tests/models/test_responses.py     2021-06-17 12:29:22.000000000 +0200
@@ -2,7 +2,7 @@
 import pickle
 from unittest import mock
 
-import brotli
+import brotlicffi
 import pytest
 
 import httpx
@@ -788,7 +788,7 @@
 def test_decode_error_with_request(header_value):
     headers = [(b"Content-Encoding", header_value)]
     body = b"test 123"
-    compressed_body = brotli.compress(body)[3:]
+    compressed_body = brotlicffi.compress(body)[3:]
     with pytest.raises(httpx.DecodingError):
         httpx.Response(
             200,
@@ -809,7 +809,7 @@
 def test_value_error_without_request(header_value):
     headers = [(b"Content-Encoding", header_value)]
     body = b"test 123"
-    compressed_body = brotli.compress(body)[3:]
+    compressed_body = brotlicffi.compress(body)[3:]
     with pytest.raises(httpx.DecodingError):
         httpx.Response(200, headers=headers, content=compressed_body)
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/tests/test_content.py new/httpx-0.18.2/tests/test_content.py
--- old/httpx-0.18.0/tests/test_content.py      2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/tests/test_content.py      2021-06-17 12:29:22.000000000 +0200
@@ -49,6 +49,18 @@
 
 
 @pytest.mark.asyncio
+async def test_bytesio_content():
+    headers, stream = encode_request(content=io.BytesIO(b"Hello, world!"))
+    assert isinstance(stream, typing.Iterable)
+    assert not isinstance(stream, typing.AsyncIterable)
+
+    content = b"".join([part for part in stream])
+
+    assert headers == {"Content-Length": "13"}
+    assert content == b"Hello, world!"
+
+
[email protected]
 async def test_iterator_content():
     def hello_world():
         yield b"Hello, "
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/tests/test_decoders.py new/httpx-0.18.2/tests/test_decoders.py
--- old/httpx-0.18.0/tests/test_decoders.py     2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/tests/test_decoders.py     2021-06-17 12:29:22.000000000 +0200
@@ -1,6 +1,6 @@
 import zlib
 
-import brotli
+import brotlicffi
 import pytest
 
 import httpx
@@ -69,7 +69,7 @@
 
 def test_brotli():
     body = b"test 123"
-    compressed_body = brotli.compress(body)
+    compressed_body = brotlicffi.compress(body)
 
     headers = [(b"Content-Encoding", b"br")]
     response = httpx.Response(
@@ -102,7 +102,7 @@
 
 def test_multi_with_identity():
     body = b"test 123"
-    compressed_body = brotli.compress(body)
+    compressed_body = brotlicffi.compress(body)
 
     headers = [(b"Content-Encoding", b"br, identity")]
     response = httpx.Response(
@@ -165,7 +165,7 @@
 def test_decoding_errors(header_value):
     headers = [(b"Content-Encoding", header_value)]
     body = b"test 123"
-    compressed_body = brotli.compress(body)[3:]
+    compressed_body = brotlicffi.compress(body)[3:]
     with pytest.raises(httpx.DecodingError):
         request = httpx.Request("GET", "https://example.org";)
        httpx.Response(200, headers=headers, content=compressed_body, request=request)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.18.0/tests/test_multipart.py new/httpx-0.18.2/tests/test_multipart.py
--- old/httpx-0.18.0/tests/test_multipart.py    2021-04-27 16:20:22.000000000 +0200
+++ new/httpx-0.18.2/tests/test_multipart.py    2021-06-17 12:29:22.000000000 +0200
@@ -107,34 +107,35 @@
         "e": True,
         "f": "",
     }
-    files = {"file": ("name.txt", open(path, "rb"))}
+    with open(path, "rb") as input_file:
+        files = {"file": ("name.txt", input_file)}
 
-    with mock.patch("os.urandom", return_value=os.urandom(16)):
-        boundary = os.urandom(16).hex()
+        with mock.patch("os.urandom", return_value=os.urandom(16)):
+            boundary = os.urandom(16).hex()
 
-        headers, stream = encode_request(data=data, files=files)
-        assert isinstance(stream, typing.Iterable)
+            headers, stream = encode_request(data=data, files=files)
+            assert isinstance(stream, typing.Iterable)
 
-        content = (
-            '--{0}\r\nContent-Disposition: form-data; name="a"\r\n\r\n1\r\n'
-            '--{0}\r\nContent-Disposition: form-data; name="b"\r\n\r\nC\r\n'
-            '--{0}\r\nContent-Disposition: form-data; name="c"\r\n\r\n11\r\n'
-            '--{0}\r\nContent-Disposition: form-data; name="c"\r\n\r\n22\r\n'
-            '--{0}\r\nContent-Disposition: form-data; name="c"\r\n\r\n33\r\n'
-            '--{0}\r\nContent-Disposition: form-data; name="d"\r\n\r\n\r\n'
-            '--{0}\r\nContent-Disposition: form-data; name="e"\r\n\r\ntrue\r\n'
-            '--{0}\r\nContent-Disposition: form-data; name="f"\r\n\r\n\r\n'
-            '--{0}\r\nContent-Disposition: form-data; name="file";'
-            ' filename="name.txt"\r\n'
-            "Content-Type: text/plain\r\n\r\n<file content>\r\n"
-            "--{0}--\r\n"
-            "".format(boundary).encode("ascii")
-        )
-        assert headers == {
-            "Content-Type": f"multipart/form-data; boundary={boundary}",
-            "Content-Length": str(len(content)),
-        }
-        assert content == b"".join(stream)
+            content = (
+                '--{0}\r\nContent-Disposition: form-data; name="a"\r\n\r\n1\r\n'
+                '--{0}\r\nContent-Disposition: form-data; name="b"\r\n\r\nC\r\n'
+                '--{0}\r\nContent-Disposition: form-data; name="c"\r\n\r\n11\r\n'
+                '--{0}\r\nContent-Disposition: form-data; name="c"\r\n\r\n22\r\n'
+                '--{0}\r\nContent-Disposition: form-data; name="c"\r\n\r\n33\r\n'
+                '--{0}\r\nContent-Disposition: form-data; name="d"\r\n\r\n\r\n'
+                '--{0}\r\nContent-Disposition: form-data; name="e"\r\n\r\ntrue\r\n'
+                '--{0}\r\nContent-Disposition: form-data; name="f"\r\n\r\n\r\n'
+                '--{0}\r\nContent-Disposition: form-data; name="file";'
+                ' filename="name.txt"\r\n'
+                "Content-Type: text/plain\r\n\r\n<file content>\r\n"
+                "--{0}--\r\n"
+                "".format(boundary).encode("ascii")
+            )
+            assert headers == {
+                "Content-Type": f"multipart/form-data; boundary={boundary}",
+                "Content-Length": str(len(content)),
+            }
+            assert content == b"".join(stream)
 
 
 def test_multipart_encode_unicode_file_contents() -> None:

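For readers skimming the patch: the multipart change above keeps the file handle open inside a `with` block for the whole encode/assert sequence (avoiding an unclosed-file warning) and pins `os.urandom` so the multipart boundary is deterministic. A minimal standalone sketch of that pattern (not code from httpx; the temp file stands in for the test fixture):

```python
# Standalone sketch of the test's pattern: hold the file open in a
# `with` block and patch os.urandom for a predictable boundary.
import os
import tempfile
from unittest import mock

# Throwaway file standing in for the test's fixture path.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"<file content>")

with open(path, "rb") as input_file:
    files = {"file": ("name.txt", input_file)}
    # Patching os.urandom makes the boundary deterministic, so the
    # expected multipart body can be computed up front.
    with mock.patch("os.urandom", return_value=b"\x00" * 16):
        boundary = os.urandom(16).hex()

os.remove(path)
```

With the patched `os.urandom`, `boundary` is simply `"00" * 16`, which is what lets the test build the exact expected body string before encoding.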