Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package python-httpx for openSUSE:Factory 
checked in at 2023-09-12 21:02:17
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python-httpx (Old)
 and      /work/SRC/openSUSE:Factory/.python-httpx.new.1766 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python-httpx"

Tue Sep 12 21:02:17 2023 rev:10 rq:1110267 version:0.24.1

Changes:
--------
--- /work/SRC/openSUSE:Factory/python-httpx/python-httpx.changes       2023-05-09 13:06:22.768678731 +0200
+++ /work/SRC/openSUSE:Factory/.python-httpx.new.1766/python-httpx.changes     2023-09-12 21:02:37.139609768 +0200
@@ -1,0 +2,14 @@
+Sat Sep  9 23:10:43 UTC 2023 - Torsten Gruner <simmpho...@opensuse.org>
+
+- update to 0.24.1
+  * Provide additional context in some InvalidURL exceptions. (#2675)
+  * Fix optional percent-encoding behaviour. (#2671)
+  * More robust checking for opening upload files in binary mode. (#2630)
+  * Properly support IP addresses in NO_PROXY environment variable. (#2659)
+  * Set default file for NetRCAuth() to None to use the stdlib default. (#2667)
+  * Set logging request lines to INFO level for async requests, in line
+    with sync requests. (#2656)
+  * Fix which gen-delims need to be escaped for path/query/fragment
+    components in URL. (#2701)
+
+-------------------------------------------------------------------
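
To put the first entry in concrete terms: the extra context simply means the offending value is now included in the exception message. A quick illustration, matching the updated expectations in tests/test_urlparse.py later in this diff:

    import httpx

    try:
        httpx.URL("https://999.999.999.999/")
    except httpx.InvalidURL as exc:
        print(exc)  # 0.24.1: "Invalid IPv4 address: '999.999.999.999'" (0.24.0 omitted the value)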

Old:
----
  httpx-0.24.0.tar.gz

New:
----
  httpx-0.24.1.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python-httpx.spec ++++++
--- /var/tmp/diff_new_pack.LabJIt/_old  2023-09-12 21:02:38.635663131 +0200
+++ /var/tmp/diff_new_pack.LabJIt/_new  2023-09-12 21:02:38.639663274 +0200
@@ -27,7 +27,7 @@
 
 %{?sle15_python_module_pythons}
 Name:           python-httpx%{psuffix}
-Version:        0.24.0
+Version:        0.24.1
 Release:        0
 Summary:        Python HTTP client with async support
 License:        BSD-3-Clause
@@ -41,9 +41,9 @@
 BuildRequires:  fdupes
 BuildRequires:  python-rpm-macros
 Requires:       python-certifi
-Requires:       python-httpcore >= 0.15.0
 Requires:       python-idna >= 2.0
 Requires:       python-sniffio
+Requires:       (python-httpcore >= 0.17.2 with python-httpcore < 0.18.0)
 Recommends:     python-Brotli
 Recommends:     python-Pygments >= 2
 Recommends:     python-click >= 8
@@ -78,7 +78,7 @@
 %prep
 %autosetup -p1 -n httpx-%{version}
 # remove turning pytest warnings into error
-sed -i '/tool.pytest/,$ {/error/d}' setup.cfg
+#sed -i '/tool.pytest/,$ {/error/d}' setup.cfg
 
 %build
 %pyproject_wheel
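
The Requires change above replaces the open-ended httpcore dependency with an RPM rich (boolean) dependency that keeps python-httpcore in the 0.17 series. For completeness, a hedged Python sketch of the same version boundary, e.g. for sanity-checking a build root (assumes httpcore exposes __version__ as a dotted string, which current releases do):

    import httpcore

    # Mirror the spec boundary: >= 0.17.2 and < 0.18.0
    parts = tuple(int(p) for p in httpcore.__version__.split(".")[:3])
    assert (0, 17, 2) <= parts < (0, 18, 0), httpcore.__version__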

++++++ httpx-0.24.0.tar.gz -> httpx-0.24.1.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/.github/PULL_REQUEST_TEMPLATE.md new/httpx-0.24.1/.github/PULL_REQUEST_TEMPLATE.md
--- old/httpx-0.24.0/.github/PULL_REQUEST_TEMPLATE.md   2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/.github/PULL_REQUEST_TEMPLATE.md   2023-05-18 13:03:21.000000000 +0200
@@ -1,9 +1,12 @@
-The starting point for contributions should usually be [a discussion](https://github.com/encode/httpx/discussions)
+<!-- Thanks for contributing to HTTPX! 💚
+Given this is a project maintained by volunteers, please read this template to not waste your time, or ours! 😁 -->
 
-Simple documentation typos may be raised as stand-alone pull requests, but otherwise
-please ensure you've discussed your proposal prior to issuing a pull request.
+# Summary
 
-This will help us direct work appropriately, and ensure that any suggested changes
-have been okayed by the maintainers.
+<!-- Write a small summary about what is happening here. -->
 
-- [ ] Initially raised as discussion #...
+# Checklist
+
+- [ ] I understand that this PR may be closed in case there was no previous discussion. (This doesn't apply to typos!)
+- [ ] I've added a test for each change that was introduced, and I tried as much as possible to make a single atomic change.
+- [ ] I've updated the documentation accordingly.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/CHANGELOG.md new/httpx-0.24.1/CHANGELOG.md
--- old/httpx-0.24.0/CHANGELOG.md       2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/CHANGELOG.md       2023-05-18 13:03:21.000000000 +0200
@@ -4,6 +4,21 @@
 
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 
+## 0.24.1 (17th May, 2023)
+
+### Added
+
+* Provide additional context in some `InvalidURL` exceptions. (#2675)
+
+### Fixed
+
+* Fix optional percent-encoding behaviour. (#2671)
+* More robust checking for opening upload files in binary mode. (#2630)
+* Properly support IP addresses in `NO_PROXY` environment variable. (#2659)
+* Set default file for `NetRCAuth()` to `None` to use the stdlib default. (#2667)
+* Set logging request lines to INFO level for async requests, in line with sync requests. (#2656)
+* Fix which gen-delims need to be escaped for path/query/fragment components in URL. (#2701)
+
 ## 0.24.0 (6th April, 2023)
 
 ### Changed
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/docs/third_party_packages.md new/httpx-0.24.1/docs/third_party_packages.md
--- old/httpx-0.24.0/docs/third_party_packages.md       2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/docs/third_party_packages.md       2023-05-18 13:03:21.000000000 +0200
@@ -54,6 +54,12 @@
 
 This package adds caching functionality to HTTPX
 
+### httpx-sse
+
+[GitHub](https://github.com/florimondmanca/httpx-sse)
+
+Allows consuming Server-Sent Events (SSE) with HTTPX.
+
 ### robox
 
 [Github](https://github.com/danclaudiupop/robox)
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/httpx/__version__.py new/httpx-0.24.1/httpx/__version__.py
--- old/httpx-0.24.0/httpx/__version__.py       2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/httpx/__version__.py       2023-05-18 13:03:21.000000000 +0200
@@ -1,3 +1,3 @@
 __title__ = "httpx"
 __description__ = "A next generation HTTP client, for Python 3."
-__version__ = "0.24.0"
+__version__ = "0.24.1"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/httpx/_auth.py new/httpx-0.24.1/httpx/_auth.py
--- old/httpx-0.24.0/httpx/_auth.py     2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/httpx/_auth.py     2023-05-18 13:03:21.000000000 +0200
@@ -147,7 +147,7 @@
     Use a 'netrc' file to lookup basic auth credentials based on the url host.
     """
 
-    def __init__(self, file: typing.Optional[str]):
+    def __init__(self, file: typing.Optional[str] = None):
         self._netrc_info = netrc.netrc(file)
 
     def auth_flow(self, request: Request) -> typing.Generator[Request, Response, None]:
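
With the new default argument, NetRCAuth() can be constructed without naming a file and the stdlib netrc module falls back to ~/.netrc on its own. A minimal usage sketch (assumes a readable ~/.netrc exists; NetRCAuth has been part of the public httpx namespace in recent releases):

    import httpx

    auth = httpx.NetRCAuth()          # file=None is now the default
    client = httpx.Client(auth=auth)  # requests pick up credentials for hosts listed in ~/.netrc
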
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/httpx/_client.py new/httpx-0.24.1/httpx/_client.py
--- old/httpx-0.24.0/httpx/_client.py   2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/httpx/_client.py   2023-05-18 13:03:21.000000000 +0200
@@ -1726,10 +1726,13 @@
         self.cookies.extract_cookies(response)
         response.default_encoding = self._default_encoding
 
-        status = f"{response.status_code} {response.reason_phrase}"
-        response_line = f"{response.http_version} {status}"
-        logger.debug(
-            'HTTP Request: %s %s "%s"', request.method, request.url, response_line
+        logger.info(
+            'HTTP Request: %s %s "%s %d %s"',
+            request.method,
+            request.url,
+            response.http_version,
+            response.status_code,
+            response.reason_phrase,
         )
 
         return response
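
The _client.py hunk above replaces the DEBUG-level request line with a single INFO-level call, so async requests now log the same "HTTP Request" line that sync requests already did. A minimal way to observe it (output is approximate, derived from the format string above; the URL is illustrative and needs network access):

    import asyncio
    import logging

    import httpx

    logging.basicConfig(level=logging.INFO)

    async def main() -> None:
        async with httpx.AsyncClient() as client:
            await client.get("https://www.example.org/")
            # Approximate log line:
            # INFO:httpx:HTTP Request: GET https://www.example.org/ "HTTP/1.1 200 OK"

    asyncio.run(main())
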
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/httpx/_multipart.py new/httpx-0.24.1/httpx/_multipart.py
--- old/httpx-0.24.0/httpx/_multipart.py        2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/httpx/_multipart.py        2023-05-18 13:03:21.000000000 +0200
@@ -122,14 +122,14 @@
             # requests does the opposite (it overwrites the header with the 3rd tuple element)
             headers["Content-Type"] = content_type
 
-        if "b" not in getattr(fileobj, "mode", "b"):
-            raise TypeError(
-                "Multipart file uploads must be opened in binary mode, not 
text mode."
-            )
         if isinstance(fileobj, io.StringIO):
             raise TypeError(
                 "Multipart file uploads require 'io.BytesIO', not 
'io.StringIO'."
             )
+        if isinstance(fileobj, io.TextIOBase):
+            raise TypeError(
+                "Multipart file uploads must be opened in binary mode, not 
text mode."
+            )
 
         self.filename = filename
         self.file = fileobj
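
The reordered checks above decide binary-vs-text by the file object's type (io.StringIO, io.TextIOBase) rather than by a "mode" attribute, which also covers in-memory objects that have no mode at all. A short sketch of what passes and what raises (the upload URL is hypothetical):

    import io

    import httpx

    # Accepted: bytes-oriented objects, e.g. io.BytesIO or open(..., "rb").
    files = {"upload": ("data.txt", io.BytesIO(b"hello"), "text/plain")}
    # httpx.post("https://example.org/upload", files=files)   # would send a multipart body

    # Rejected with TypeError: text-mode objects such as io.StringIO("hello")
    # or open("data.txt", "r").
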
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/httpx/_urlparse.py new/httpx-0.24.1/httpx/_urlparse.py
--- old/httpx-0.24.0/httpx/_urlparse.py 2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/httpx/_urlparse.py 2023-05-18 13:03:21.000000000 +0200
@@ -253,12 +253,19 @@
     if has_authority:
         path = normalize_path(path)
 
-    parsed_path: str = quote(path, safe=SUB_DELIMS + ":@/")
+    # The GEN_DELIMS set is... : / ? # [ ] @
+    # These do not need to be percent-quoted unless they serve as delimiters for the
+    # specific component.
+
+    # For 'path' we need to drop ? and # from the GEN_DELIMS set.
+    parsed_path: str = quote(path, safe=SUB_DELIMS + ":/[]@")
+    # For 'query' we need to drop '#' from the GEN_DELIMS set.
     parsed_query: typing.Optional[str] = (
-        None if query is None else quote(query, safe=SUB_DELIMS + "/?")
+        None if query is None else quote(query, safe=SUB_DELIMS + ":/?[]@")
     )
+    # For 'fragment' we can include all of the GEN_DELIMS set.
     parsed_fragment: typing.Optional[str] = (
-        None if fragment is None else quote(fragment, safe=SUB_DELIMS + "/?")
+        None if fragment is None else quote(fragment, safe=SUB_DELIMS + ":/?#[]@")
     )
 
     # The parsed ASCII bytestrings are our canonical form.
@@ -287,7 +294,7 @@
         try:
             ipaddress.IPv4Address(host)
         except ipaddress.AddressValueError:
-            raise InvalidURL("Invalid IPv4 address")
+            raise InvalidURL(f"Invalid IPv4 address: {host!r}")
         return host
 
     elif IPv6_STYLE_HOSTNAME.match(host):
@@ -302,7 +309,7 @@
         try:
             ipaddress.IPv6Address(host[1:-1])
         except ipaddress.AddressValueError:
-            raise InvalidURL("Invalid IPv6 address")
+            raise InvalidURL(f"Invalid IPv6 address: {host!r}")
         return host[1:-1]
 
     elif host.isascii():
@@ -317,7 +324,7 @@
     try:
         return idna.encode(host.lower()).decode("ascii")
     except idna.IDNAError:
-        raise InvalidURL("Invalid IDNA hostname")
+        raise InvalidURL(f"Invalid IDNA hostname: {host!r}")
 
 
 def normalize_port(
@@ -338,7 +345,7 @@
     try:
         port_as_int = int(port)
     except ValueError:
-        raise InvalidURL("Invalid port")
+        raise InvalidURL(f"Invalid port: {port!r}")
 
     # See https://url.spec.whatwg.org/#url-miscellaneous
     default_port = {"ftp": 21, "http": 80, "https": 443, "ws": 80, "wss": 
443}.get(
@@ -399,7 +406,7 @@
 
 def percent_encode(char: str) -> str:
     """
-    Replace every character in a string with the percent-encoded representation.
+    Replace a single character with the percent-encoded representation.
 
     Characters outside the ASCII range are represented with their a percent-encoded
     representation of their UTF-8 byte sequence.
@@ -411,13 +418,29 @@
     return "".join([f"%{byte:02x}" for byte in char.encode("utf-8")]).upper()
 
 
+def is_safe(string: str, safe: str = "/") -> bool:
+    """
+    Determine if a given string is already quote-safe.
+    """
+    NON_ESCAPED_CHARS = UNRESERVED_CHARACTERS + safe + "%"
+
+    # All characters must already be non-escaping or '%'
+    for char in string:
+        if char not in NON_ESCAPED_CHARS:
+            return False
+
+    # Any '%' characters must be valid '%xx' escape sequences.
+    return string.count("%") == len(PERCENT_ENCODED_REGEX.findall(string))
+
+
 def quote(string: str, safe: str = "/") -> str:
-    NON_ESCAPED_CHARS = UNRESERVED_CHARACTERS + safe
-    if string.count("%") == len(PERCENT_ENCODED_REGEX.findall(string)):
-        # If all occurances of '%' are valid '%xx' escapes, then treat
-        # percent as a non-escaping character.
-        NON_ESCAPED_CHARS += "%"
+    """
+    Use percent-encoding to quote a string if required.
+    """
+    if is_safe(string, safe=safe):
+        return string
 
+    NON_ESCAPED_CHARS = UNRESERVED_CHARACTERS + safe
     return "".join(
         [char if char in NON_ESCAPED_CHARS else percent_encode(char) for char in string]
     )
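
The refactor above factors the "already encoded?" test into is_safe() and makes quote() return such strings untouched, so percent-encoding is applied at most once. A quick check against the private helper shown in this diff (internal API, so subject to change):

    from httpx._urlparse import quote

    assert quote("with spaces") == "with%20spaces"      # raw input is encoded
    assert quote("with%20spaces") == "with%20spaces"    # valid escapes are left alone
    assert quote("100%interest") == "100%25interest"    # a bare '%' is still escaped
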
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/httpx/_utils.py new/httpx-0.24.1/httpx/_utils.py
--- old/httpx-0.24.0/httpx/_utils.py    2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/httpx/_utils.py    2023-05-18 13:03:21.000000000 +0200
@@ -1,5 +1,6 @@
 import codecs
 import email.message
+import ipaddress
 import mimetypes
 import os
 import re
@@ -259,7 +260,16 @@
             # NO_PROXY=google.com is marked as "all://*google.com,
             #   which disables "www.google.com" and "google.com".
             #   (But not "wwwgoogle.com")
-            mounts[f"all://*{hostname}"] = None
+            # NO_PROXY can include domains, IPv6, IPv4 addresses and "localhost"
+            #   NO_PROXY=example.com,::1,localhost,192.168.0.0/16
+            if is_ipv4_hostname(hostname):
+                mounts[f"all://{hostname}"] = None
+            elif is_ipv6_hostname(hostname):
+                mounts[f"all://[{hostname}]"] = None
+            elif hostname.lower() == "localhost":
+                mounts[f"all://{hostname}"] = None
+            else:
+                mounts[f"all://*{hostname}"] = None
 
     return mounts
 
@@ -449,3 +459,19 @@
 
     def __eq__(self, other: typing.Any) -> bool:
         return isinstance(other, URLPattern) and self.pattern == other.pattern
+
+
+def is_ipv4_hostname(hostname: str) -> bool:
+    try:
+        ipaddress.IPv4Address(hostname.split("/")[0])
+    except Exception:
+        return False
+    return True
+
+
+def is_ipv6_hostname(hostname: str) -> bool:
+    try:
+        ipaddress.IPv6Address(hostname.split("/")[0])
+    except Exception:
+        return False
+    return True
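
The new branches in get_environment_proxies() give IP literals, CIDR ranges and "localhost" exact mount keys, while ordinary domains keep the wildcard form. A standalone sketch of that key selection, mirroring the diff above and the new cases in tests/test_utils.py at the end of this patch (the helper name is invented for illustration):

    import ipaddress

    def no_proxy_mount_key(hostname: str) -> str:
        """Choose the proxy-mount key the way the 0.24.1 _utils.py change does."""

        def is_ip(value: str, cls: type) -> bool:
            try:
                cls(value.split("/")[0])   # tolerate a CIDR suffix such as /16
            except Exception:
                return False
            return True

        if is_ip(hostname, ipaddress.IPv4Address):
            return f"all://{hostname}"      # e.g. 127.0.0.1, 192.168.0.0/16
        if is_ip(hostname, ipaddress.IPv6Address):
            return f"all://[{hostname}]"    # e.g. ::1
        if hostname.lower() == "localhost":
            return f"all://{hostname}"
        return f"all://*{hostname}"         # e.g. *github.com

    assert no_proxy_mount_key("192.168.0.0/16") == "all://192.168.0.0/16"
    assert no_proxy_mount_key("::1") == "all://[::1]"
    assert no_proxy_mount_key("github.com") == "all://*github.com"
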
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/pyproject.toml new/httpx-0.24.1/pyproject.toml
--- old/httpx-0.24.0/pyproject.toml     2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/pyproject.toml     2023-05-18 13:03:21.000000000 +0200
@@ -99,3 +99,29 @@
 
 [tool.ruff.isort]
 combine-as-imports = true
+
+[tool.mypy]
+ignore_missing_imports = true
+strict = true
+
+[[tool.mypy.overrides]]
+module = "tests.*"
+disallow_untyped_defs = false
+check_untyped_defs = true
+
+[tool.pytest.ini_options]
+addopts = "-rxXs"
+filterwarnings = [
+  "error",
+  "ignore: You seem to already have a custom sys.excepthook handler installed. 
I'll skip installing Trio's custom handler, but this means MultiErrors will not 
show full tracebacks.:RuntimeWarning",
+  # See: https://github.com/agronholm/anyio/issues/508
+  "ignore: trio.MultiError is deprecated since Trio 
0.22.0:trio.TrioDeprecationWarning"
+]
+markers = [
+  "copied_from(source, changes=None): mark test as copied from somewhere else, 
along with a description of changes made to accodomate e.g. our test setup",
+  "network: marks tests which require network connection. Used in 3rd-party 
build environments that have network disabled."
+]
+
+[tool.coverage.run]
+omit = ["venv/*", "httpx/_compat.py"]
+include = ["httpx/*", "tests/*"]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/requirements.txt new/httpx-0.24.1/requirements.txt
--- old/httpx-0.24.0/requirements.txt   2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/requirements.txt   2023-05-18 13:03:21.000000000 +0200
@@ -7,12 +7,12 @@
 # Optional charset auto-detection
 # Used in our test cases
 chardet==5.1.0
-types-chardet==5.0.4.2
+types-chardet==5.0.4.5
 
 # Documentation
 mkdocs==1.4.2
 mkautodoc==0.2.0
-mkdocs-material==9.0.15
+mkdocs-material==9.1.5
 
 # Packaging
 build==0.10.0
@@ -20,13 +20,13 @@
 
 # Tests & Linting
 black==23.3.0
-coverage==7.2.2
-cryptography==39.0.1
+coverage[toml]==7.2.2
+cryptography==40.0.2
 mypy==1.0.1
 types-certifi==2021.10.8.2
 pytest==7.2.2
 ruff==0.0.260
 trio==0.22.0
-trio-typing==0.7.0
-trustme==0.9.0
-uvicorn==0.20.0
+trio-typing==0.8.0
+trustme==1.0.0
+uvicorn==0.22.0
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/scripts/check new/httpx-0.24.1/scripts/check
--- old/httpx-0.24.0/scripts/check      2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/scripts/check      2023-05-18 13:03:21.000000000 +0200
@@ -11,4 +11,4 @@
 ./scripts/sync-version
 ${PREFIX}black --check --diff --target-version=py37 $SOURCE_FILES
 ${PREFIX}mypy $SOURCE_FILES
-${PREFIX}ruff check --diff $SOURCE_FILES
+${PREFIX}ruff check $SOURCE_FILES
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/setup.cfg new/httpx-0.24.1/setup.cfg
--- old/httpx-0.24.0/setup.cfg  2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/setup.cfg  1970-01-01 01:00:00.000000000 +0100
@@ -1,22 +0,0 @@
-[mypy]
-ignore_missing_imports = True
-strict = True
-
-[mypy-tests.*]
-disallow_untyped_defs = False
-check_untyped_defs = True
-
-[tool:pytest]
-addopts = -rxXs
-filterwarnings =
-  error
-  ignore: You seem to already have a custom sys.excepthook handler installed. I'll skip installing Trio's custom handler, but this means MultiErrors will not show full tracebacks.:RuntimeWarning
-  # See: https://github.com/agronholm/anyio/issues/508
-  ignore: trio\.MultiError is deprecated since Trio 0\.22\.0:trio.TrioDeprecationWarning
-markers =
-  copied_from(source, changes=None): mark test as copied from somewhere else, along with a description of changes made to accodomate e.g. our test setup
-  network: marks tests which require network connection. Used in 3rd-party build environments that have network disabled.
-
-[coverage:run]
-omit = venv/*, httpx/_compat.py
-include = httpx/*, tests/*
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/tests/models/test_responses.py new/httpx-0.24.1/tests/models/test_responses.py
--- old/httpx-0.24.0/tests/models/test_responses.py     2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/tests/models/test_responses.py     2023-05-18 13:03:21.000000000 +0200
@@ -380,17 +380,18 @@
 
 def test_iter_raw_with_chunksize():
     response = httpx.Response(200, content=streaming_body())
-
     parts = [part for part in response.iter_raw(chunk_size=5)]
     assert parts == [b"Hello", b", wor", b"ld!"]
 
     response = httpx.Response(200, content=streaming_body())
+    parts = [part for part in response.iter_raw(chunk_size=7)]
+    assert parts == [b"Hello, ", b"world!"]
 
+    response = httpx.Response(200, content=streaming_body())
     parts = [part for part in response.iter_raw(chunk_size=13)]
     assert parts == [b"Hello, world!"]
 
     response = httpx.Response(200, content=streaming_body())
-
     parts = [part for part in response.iter_raw(chunk_size=20)]
     assert parts == [b"Hello, world!"]
 
@@ -596,6 +597,14 @@
     parts = [part for part in response.iter_text(chunk_size=5)]
     assert parts == ["Hello", ", wor", "ld!"]
 
+    response = httpx.Response(200, content=b"Hello, world!!")
+    parts = [part for part in response.iter_text(chunk_size=7)]
+    assert parts == ["Hello, ", "world!!"]
+
+    response = httpx.Response(200, content=b"Hello, world!")
+    parts = [part for part in response.iter_text(chunk_size=7)]
+    assert parts == ["Hello, ", "world!"]
+
     response = httpx.Response(200, content=b"Hello, world!")
     parts = [part for part in response.iter_text(chunk_size=13)]
     assert parts == ["Hello, world!"]
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/tests/test_decoders.py new/httpx-0.24.1/tests/test_decoders.py
--- old/httpx-0.24.0/tests/test_decoders.py     2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/tests/test_decoders.py     2023-05-18 13:03:21.000000000 +0200
@@ -1,12 +1,11 @@
 import typing
 import zlib
 
+import brotli
 import chardet
 import pytest
 
 import httpx
-from httpx._compat import brotli
-from httpx._decoders import ByteChunker, LineDecoder, TextChunker, TextDecoder
 
 
 def test_deflate():
@@ -214,125 +213,55 @@
 
 
 def test_text_decoder_empty_cases():
-    decoder = TextDecoder()
-    assert decoder.flush() == ""
+    response = httpx.Response(200, content=b"")
+    assert response.text == ""
 
-    decoder = TextDecoder()
-    assert decoder.decode(b"") == ""
-    assert decoder.flush() == ""
+    response = httpx.Response(200, content=[b""])
+    response.read()
+    assert response.text == ""
 
 
 def test_line_decoder_nl():
-    decoder = LineDecoder()
-    assert decoder.decode("") == []
-    assert decoder.decode("a\n\nb\nc") == ["a", "", "b"]
-    assert decoder.flush() == ["c"]
-
-    decoder = LineDecoder()
-    assert decoder.decode("") == []
-    assert decoder.decode("a\n\nb\nc\n") == ["a", "", "b", "c"]
-    assert decoder.flush() == []
+    response = httpx.Response(200, content=[b""])
+    assert list(response.iter_lines()) == []
+
+    response = httpx.Response(200, content=[b"", b"a\n\nb\nc"])
+    assert list(response.iter_lines()) == ["a", "", "b", "c"]
 
     # Issue #1033
-    decoder = LineDecoder()
-    assert decoder.decode("") == []
-    assert decoder.decode("12345\n") == ["12345"]
-    assert decoder.decode("foo ") == []
-    assert decoder.decode("bar ") == []
-    assert decoder.decode("baz\n") == ["foo bar baz"]
-    assert decoder.flush() == []
+    response = httpx.Response(
+        200, content=[b"", b"12345\n", b"foo ", b"bar ", b"baz\n"]
+    )
+    assert list(response.iter_lines()) == ["12345", "foo bar baz"]
 
 
 def test_line_decoder_cr():
-    decoder = LineDecoder()
-    assert decoder.decode("") == []
-    assert decoder.decode("a\r\rb\rc") == ["a", "", "b"]
-    assert decoder.flush() == ["c"]
-
-    decoder = LineDecoder()
-    assert decoder.decode("") == []
-    assert decoder.decode("a\r\rb\rc\r") == ["a", "", "b"]
-    assert decoder.flush() == ["c"]
+    response = httpx.Response(200, content=[b"", b"a\r\rb\rc"])
+    assert list(response.iter_lines()) == ["a", "", "b", "c"]
+
+    response = httpx.Response(200, content=[b"", b"a\r\rb\rc\r"])
+    assert list(response.iter_lines()) == ["a", "", "b", "c"]
 
     # Issue #1033
-    decoder = LineDecoder()
-    assert decoder.decode("") == []
-    assert decoder.decode("12345\r") == []
-    assert decoder.decode("foo ") == ["12345"]
-    assert decoder.decode("bar ") == []
-    assert decoder.decode("baz\r") == []
-    assert decoder.flush() == ["foo bar baz"]
+    response = httpx.Response(
+        200, content=[b"", b"12345\r", b"foo ", b"bar ", b"baz\r"]
+    )
+    assert list(response.iter_lines()) == ["12345", "foo bar baz"]
 
 
 def test_line_decoder_crnl():
-    decoder = LineDecoder()
-    assert decoder.decode("") == []
-    assert decoder.decode("a\r\n\r\nb\r\nc") == ["a", "", "b"]
-    assert decoder.flush() == ["c"]
-
-    decoder = LineDecoder()
-    assert decoder.decode("") == []
-    assert decoder.decode("a\r\n\r\nb\r\nc\r\n") == ["a", "", "b", "c"]
-    assert decoder.flush() == []
-
-    decoder = LineDecoder()
-    assert decoder.decode("") == []
-    assert decoder.decode("a\r") == []
-    assert decoder.decode("\n\r\nb\r\nc") == ["a", "", "b"]
-    assert decoder.flush() == ["c"]
+    response = httpx.Response(200, content=[b"", b"a\r\n\r\nb\r\nc"])
+    assert list(response.iter_lines()) == ["a", "", "b", "c"]
+
+    response = httpx.Response(200, content=[b"", b"a\r\n\r\nb\r\nc\r\n"])
+    assert list(response.iter_lines()) == ["a", "", "b", "c"]
+
+    response = httpx.Response(200, content=[b"", b"a\r", b"\n\r\nb\r\nc"])
+    assert list(response.iter_lines()) == ["a", "", "b", "c"]
 
     # Issue #1033
-    decoder = LineDecoder()
-    assert decoder.decode("") == []
-    assert decoder.decode("12345\r\n") == ["12345"]
-    assert decoder.decode("foo ") == []
-    assert decoder.decode("bar ") == []
-    assert decoder.decode("baz\r\n") == ["foo bar baz"]
-    assert decoder.flush() == []
-
-
-def test_byte_chunker():
-    decoder = ByteChunker()
-    assert decoder.decode(b"1234567") == [b"1234567"]
-    assert decoder.decode(b"89") == [b"89"]
-    assert decoder.flush() == []
-
-    decoder = ByteChunker(chunk_size=3)
-    assert decoder.decode(b"1234567") == [b"123", b"456"]
-    assert decoder.decode(b"89") == [b"789"]
-    assert decoder.flush() == []
-
-    decoder = ByteChunker(chunk_size=3)
-    assert decoder.decode(b"123456") == [b"123", b"456"]
-    assert decoder.decode(b"789") == [b"789"]
-    assert decoder.flush() == []
-
-    decoder = ByteChunker(chunk_size=3)
-    assert decoder.decode(b"123456") == [b"123", b"456"]
-    assert decoder.decode(b"78") == []
-    assert decoder.flush() == [b"78"]
-
-
-def test_text_chunker():
-    decoder = TextChunker()
-    assert decoder.decode("1234567") == ["1234567"]
-    assert decoder.decode("89") == ["89"]
-    assert decoder.flush() == []
-
-    decoder = TextChunker(chunk_size=3)
-    assert decoder.decode("1234567") == ["123", "456"]
-    assert decoder.decode("89") == ["789"]
-    assert decoder.flush() == []
-
-    decoder = TextChunker(chunk_size=3)
-    assert decoder.decode("123456") == ["123", "456"]
-    assert decoder.decode("789") == ["789"]
-    assert decoder.flush() == []
-
-    decoder = TextChunker(chunk_size=3)
-    assert decoder.decode("123456") == ["123", "456"]
-    assert decoder.decode("78") == []
-    assert decoder.flush() == ["78"]
+    response = httpx.Response(200, content=[b"", b"12345\r\n", b"foo bar 
baz\r\n"])
+    assert list(response.iter_lines()) == ["12345", "foo bar baz"]
 
 
 def test_invalid_content_encoding_header():
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/tests/test_urlparse.py new/httpx-0.24.1/tests/test_urlparse.py
--- old/httpx-0.24.0/tests/test_urlparse.py     2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/tests/test_urlparse.py     2023-05-18 13:03:21.000000000 +0200
@@ -53,7 +53,7 @@
 def test_urlparse_invalid_ipv4():
     with pytest.raises(httpx.InvalidURL) as exc:
         httpx.URL("https://999.999.999.999/";)
-    assert str(exc.value) == "Invalid IPv4 address"
+    assert str(exc.value) == "Invalid IPv4 address: '999.999.999.999'"
 
 
 def test_urlparse_valid_ipv6():
@@ -64,7 +64,7 @@
 def test_urlparse_invalid_ipv6():
     with pytest.raises(httpx.InvalidURL) as exc:
         httpx.URL("https://[2001]/";)
-    assert str(exc.value) == "Invalid IPv6 address"
+    assert str(exc.value) == "Invalid IPv6 address: '[2001]'"
 
 
 def test_urlparse_unescaped_idna_host():
@@ -80,7 +80,7 @@
 def test_urlparse_invalid_idna_host():
     with pytest.raises(httpx.InvalidURL) as exc:
         httpx.URL("https://☃.com/";)
-    assert str(exc.value) == "Invalid IDNA hostname"
+    assert str(exc.value) == "Invalid IDNA hostname: '☃.com'"
 
 
 # Tests for different port types
@@ -100,7 +100,7 @@
 def test_urlparse_invalid_port():
     with pytest.raises(httpx.InvalidURL) as exc:
         httpx.URL("https://example.com:abc/";)
-    assert str(exc.value) == "Invalid port"
+    assert str(exc.value) == "Invalid port: 'abc'"
 
 
 # Tests for path handling
@@ -126,6 +126,24 @@
     assert url.path == "../abc"
 
 
+# Tests for optional percent encoding
+
+
+def test_param_requires_encoding():
+    url = httpx.URL("http://webservice";, params={"u": "with spaces"})
+    assert str(url) == "http://webservice?u=with%20spaces";
+
+
+def test_param_does_not_require_encoding():
+    url = httpx.URL("http://webservice";, params={"u": "with%20spaces"})
+    assert str(url) == "http://webservice?u=with%20spaces";
+
+
+def test_param_with_existing_escape_requires_encoding():
+    url = httpx.URL("http://webservice";, params={"u": 
"http://example.com?q=foo%2Fa"})
+    assert str(url) == 
"http://webservice?u=http%3A//example.com%3Fq%3Dfoo%252Fa";
+
+
 # Tests for invalid URLs
 
 
@@ -229,3 +247,33 @@
 
     url = url.copy_with(path="/abc")
     assert str(url) == "http://example.com/abc";
+
+
+# Tests for percent encoding across path, query, and fragement...
+
+
+def test_path_percent_encoding():
+    # Test percent encoding for SUB_DELIMS ALPHA NUM and allowable GEN_DELIMS
+    url = httpx.URL("https://example.com/!$&;'()*+,;= abc ABC 123 :/[]@")
+    assert url.raw_path == b"/!$&'()*+,;=%20abc%20ABC%20123%20:/[]@"
+    assert url.path == "/!$&'()*+,;= abc ABC 123 :/[]@"
+    assert url.query == b""
+    assert url.fragment == ""
+
+
+def test_query_percent_encoding():
+    # Test percent encoding for SUB_DELIMS ALPHA NUM and allowable GEN_DELIMS
+    url = httpx.URL("https://example.com/?!$&;'()*+,;= abc ABC 123 :/[]@" + "?")
+    assert url.raw_path == b"/?!$&'()*+,;=%20abc%20ABC%20123%20:/[]@?"
+    assert url.path == "/"
+    assert url.query == b"!$&'()*+,;=%20abc%20ABC%20123%20:/[]@?"
+    assert url.fragment == ""
+
+
+def test_fragment_percent_encoding():
+    # Test percent encoding for SUB_DELIMS ALPHA NUM and allowable GEN_DELIMS
+    url = httpx.URL("https://example.com/#!$&;'()*+,;= abc ABC 123 :/[]@" + 
"?#")
+    assert url.raw_path == b"/"
+    assert url.path == "/"
+    assert url.query == b""
+    assert url.fragment == "!$&'()*+,;= abc ABC 123 :/[]@?#"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/httpx-0.24.0/tests/test_utils.py new/httpx-0.24.1/tests/test_utils.py
--- old/httpx-0.24.0/tests/test_utils.py        2023-04-11 12:00:02.000000000 +0200
+++ new/httpx-0.24.1/tests/test_utils.py        2023-05-18 13:03:21.000000000 +0200
@@ -185,6 +185,12 @@
         ),
         ({"all_proxy": "http://127.0.0.1"}, {"all://": "http://127.0.0.1"}),
         ({"TRAVIS_APT_PROXY": "http://127.0.0.1"}, {}),
+        ({"no_proxy": "127.0.0.1"}, {"all://127.0.0.1": None}),
+        ({"no_proxy": "192.168.0.0/16"}, {"all://192.168.0.0/16": None}),
+        ({"no_proxy": "::1"}, {"all://[::1]": None}),
+        ({"no_proxy": "localhost"}, {"all://localhost": None}),
+        ({"no_proxy": "github.com"}, {"all://*github.com": None}),
+        ({"no_proxy": ".github.com"}, {"all://*.github.com": None}),
     ],
 )
 def test_get_environment_proxies(environment, proxies):
