This is an automated email from the ASF dual-hosted git repository.
wesm pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/arrow.git
The following commit(s) were added to refs/heads/master by this push:
new d64a231 ARROW-2181: [PYTHON][DOC] Add doc on usage of concat_tables
d64a231 is described below
commit d64a2318d8a7113c2aa6362b3476c68cc429679d
Author: Bryan Cutler <[email protected]>
AuthorDate: Sat Mar 10 13:34:22 2018 -0500
ARROW-2181: [PYTHON][DOC] Add doc on usage of concat_tables
Adding Python API doc on usage of pa.concat_tables.
Author: Bryan Cutler <[email protected]>
Closes #1733 from BryanCutler/doc-python-concat_table-ARROW-2181 and
squashes the following commits:
2f8d9f80 <Bryan Cutler> use full pyarrow in api ref
95087090 <Bryan Cutler> added doc on usage for concat_tables
---
python/doc/source/data.rst | 16 ++++++++++++++++
1 file changed, 16 insertions(+)
diff --git a/python/doc/source/data.rst b/python/doc/source/data.rst
index 3a602d5..0717260 100644
--- a/python/doc/source/data.rst
+++ b/python/doc/source/data.rst
@@ -393,6 +393,22 @@ objects to contiguous NumPy arrays for use in pandas:
c.to_pandas()
+Multiple tables can also be concatenated to form a single table using
+``pyarrow.concat_tables``, provided their schemas are equal:
+
+.. ipython:: python
+
+ tables = [table] * 2
+ table_all = pa.concat_tables(tables)
+ table_all.num_rows
+ c = table_all[0]
+ c.data.num_chunks
+
+This is similar to ``Table.from_batches``, but takes tables as input instead
+of record batches. Record batches can be converted into tables, but not the
+other way around, so if your data is already in table form, use
+``pyarrow.concat_tables``.
+
Custom Schema and Field Metadata
--------------------------------