This is an automated email from the ASF dual-hosted git repository.
lidavidm pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/arrow-adbc.git
The following commit(s) were added to refs/heads/main by this push:
new 08ced8d1a docs(python): use public API to consume stream handle (#2105)
08ced8d1a is described below
commit 08ced8d1a30e4b2dd7a6aef9f6acd634b6147ac6
Author: Joris Van den Bossche <[email protected]>
AuthorDate: Tue Aug 27 16:06:08 2024 +0200
docs(python): use public API to consume stream handle (#2105)
Small follow-up on https://github.com/apache/arrow-adbc/pull/2097
This method is available starting from pyarrow 15.0, only one release
after the introduction of the general protocol, so I think it is fine
to just use it.
---
docs/source/python/recipe/driver_manager_lowlevel.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/source/python/recipe/driver_manager_lowlevel.py b/docs/source/python/recipe/driver_manager_lowlevel.py
index ed34650b1..e8137184d 100644
--- a/docs/source/python/recipe/driver_manager_lowlevel.py
+++ b/docs/source/python/recipe/driver_manager_lowlevel.py
@@ -58,7 +58,7 @@ handle, rowcount = stmt.execute_query()
#: (other drivers, like the PostgreSQL driver, may know).
assert rowcount == -1
#: We can use the PyArrow APIs to read the result.
-reader = pyarrow.RecordBatchReader._import_from_c_capsule(handle.__arrow_c_stream__())
+reader = pyarrow.RecordBatchReader.from_stream(handle)
assert reader.schema == pyarrow.schema([("THEANSWER", "int64")])
#: Finally, we have to clean up all the objects. (They also support the
#: context manager protocol.)