danepitkin opened a new issue, #37217:
URL: https://github.com/apache/arrow/issues/37217

   ### Describe the bug, including details regarding any error messages, version, and platform.
   
   The Cython 3.0.0 upgrade (https://github.com/apache/arrow/pull/37097) is failing the numpydoc docstring checks. Many functions have undocumented parameters that went unreported before the upgrade. Let's fix them in a separate PR due to the volume.
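   
   To reproduce a single entry locally without re-running the CI job, numpydoc's validator can be called directly. A minimal sketch, assuming numpydoc is installed and pyarrow is importable; the lint output below reports numpydoc error codes such as PR01 ("Parameters not documented"):
   
   ```
   # Minimal sketch: validate one docstring with numpydoc directly.
   # Assumes `pip install numpydoc` and an importable pyarrow build.
   from numpydoc.validate import validate

   result = validate("pyarrow._compute.Expression.equals")
   for code, message in result["errors"]:
       print(code, message)  # e.g. PR01 Parameters {'other'} not documented
   ```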
   
   Example here https://github.com/apache/arrow/actions/runs/5868446229/job/15911310413?pr=37097:
   ```
   INFO:archery:Running Python docstring linters
   
   pyarrow._compute.Expression.equals
   -> pyarrow._compute.Expression.equals(self, Expression other)
   PR01: Parameters {'other'} not documented
   
   pyarrow.gandiva.TreeExprBuilder.make_or
   -> pyarrow.gandiva.TreeExprBuilder.make_or(self, children)
   PR01: Parameters {'children'} not documented
   
   pyarrow.gandiva.TreeExprBuilder.make_literal
   -> pyarrow.gandiva.TreeExprBuilder.make_literal(self, value, dtype)
   PR01: Parameters {'dtype', 'value'} not documented
   
   pyarrow.gandiva.TreeExprBuilder.make_in_expression
   -> pyarrow.gandiva.TreeExprBuilder.make_in_expression(self, Node node, values, dtype)
   PR01: Parameters {'node', 'dtype', 'values'} not documented
   
   pyarrow.gandiva.TreeExprBuilder.make_if
   -> pyarrow.gandiva.TreeExprBuilder.make_if(self, Node condition, Node this_node, Node else_node, DataType return_type)
   PR01: Parameters {'return_type', 'condition', 'this_node', 'else_node'} not documented
   
   pyarrow.gandiva.TreeExprBuilder.make_function
   -> pyarrow.gandiva.TreeExprBuilder.make_function(self, name, children, DataType return_type)
   PR01: Parameters {'return_type', 'children', 'name'} not documented
   
   pyarrow.gandiva.TreeExprBuilder.make_field
   -> pyarrow.gandiva.TreeExprBuilder.make_field(self, Field field)
   PR01: Parameters {'field'} not documented
   
   pyarrow.gandiva.TreeExprBuilder.make_expression
   -> pyarrow.gandiva.TreeExprBuilder.make_expression(self, Node root_node, Field return_field)
   PR01: Parameters {'return_field', 'root_node'} not documented
   
   pyarrow.gandiva.TreeExprBuilder.make_condition
   -> pyarrow.gandiva.TreeExprBuilder.make_condition(self, Node condition)
   PR01: Parameters {'condition'} not documented
   
   pyarrow.gandiva.TreeExprBuilder.make_and
   -> pyarrow.gandiva.TreeExprBuilder.make_and(self, children)
   PR01: Parameters {'children'} not documented
   
   pyarrow.gandiva.Projector.evaluate
   -> pyarrow.gandiva.Projector.evaluate(self, RecordBatch batch, SelectionVector selection=None)
   PR01: Parameters {'selection', 'batch'} not documented
   
   pyarrow.gandiva.Filter.evaluate
   -> pyarrow.gandiva.Filter.evaluate(self, RecordBatch batch, MemoryPool pool, dtype=u'int32')
   PR01: Parameters {'dtype', 'batch', 'pool'} not documented
   
   pyarrow._dataset.Partitioning.parse
   -> pyarrow._dataset.Partitioning.parse(self, path)
   PR01: Parameters {'path'} not documented
   
   pyarrow._dataset_parquet.ParquetReadOptions.equals
   -> pyarrow._dataset_parquet.ParquetReadOptions.equals(self, ParquetReadOptions other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._dataset_parquet.ParquetFragmentScanOptions.equals
   -> pyarrow._dataset_parquet.ParquetFragmentScanOptions.equals(self, ParquetFragmentScanOptions other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._dataset_parquet.ParquetFileWriteOptions.update
   -> pyarrow._dataset_parquet.ParquetFileWriteOptions.update(self, **kwargs)
   PR01: Parameters {'**kwargs'} not documented
   
   pyarrow._dataset_parquet.ParquetFileFormat.make_write_options
   -> pyarrow._dataset_parquet.ParquetFileFormat.make_write_options(self, **kwargs)
   PR01: Parameters {'**kwargs'} not documented
   
   pyarrow._dataset_parquet.ParquetFileFormat.equals
   -> pyarrow._dataset_parquet.ParquetFileFormat.equals(self, ParquetFileFormat other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._dataset_orc.OrcFileFormat.equals
   -> pyarrow._dataset_orc.OrcFileFormat.equals(self, OrcFileFormat other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._dataset.JsonFragmentScanOptions.equals
   -> pyarrow._dataset.JsonFragmentScanOptions.equals(self, JsonFragmentScanOptions other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._dataset.JsonFileFormat.equals
   -> pyarrow._dataset.JsonFileFormat.equals(self, JsonFileFormat other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._dataset.IpcFileFormat.make_write_options
   -> pyarrow._dataset.IpcFileFormat.make_write_options(self, **kwargs)
   PR01: Parameters {'**kwargs'} not documented
   
   pyarrow._dataset.IpcFileFormat.equals
   -> pyarrow._dataset.IpcFileFormat.equals(self, IpcFileFormat other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._dataset.type.FileSystemDataset.from_paths
   -> pyarrow._dataset.FileSystemDataset.from_paths(cls, paths, schema=None, format=None, filesystem=None, partitions=None, root_partition=None)
   PR01: Parameters {'format', 'filesystem', 'root_partition', 'schema', 'partitions'} not documented
   
   pyarrow._compute.Expression.equals
   -> pyarrow._compute.Expression.equals(self, Expression other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._dataset.CsvFragmentScanOptions.equals
   -> pyarrow._dataset.CsvFragmentScanOptions.equals(self, CsvFragmentScanOptions other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._dataset.CsvFileFormat.make_write_options
   -> pyarrow._dataset.CsvFileFormat.make_write_options(self, **kwargs)
   PR01: Parameters {'**kwargs'} not documented
   
   pyarrow._dataset.CsvFileFormat.equals
   -> pyarrow._dataset.CsvFileFormat.equals(self, CsvFileFormat other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._json.ReadOptions.equals
   -> pyarrow._json.ReadOptions.equals(self, ReadOptions other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._json.ParseOptions.equals
   -> pyarrow._json.ParseOptions.equals(self, ParseOptions other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._fs.FileSystem.equals
   -> pyarrow._fs.FileSystem.equals(self, FileSystem other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._s3fs.S3LogLevel
   -> pyarrow._s3fs.An enumeration.
   PR01: Parameters {'value', 'names', 'module', 'qualname', 'start', 'type'} not documented
   
   pyarrow._fs.FileType
   -> pyarrow._fs.An enumeration.
   PR01: Parameters {'value', 'names', 'module', 'qualname', 'start', 'type'} not documented
   
   pyarrow.lib.MetadataVersion
   -> pyarrow.lib.An enumeration.
   PR01: Parameters {'value', 'names', 'module', 'qualname', 'start', 'type'} not documented
   
   pyarrow.lib.DataType.field
   -> pyarrow.lib.DataType.field(self, i) -> Field
   PR01: Parameters {'i'} not documented
   
   pyarrow.lib.Scalar.equals
   -> pyarrow.lib.Scalar.equals(self, Scalar other)
   PR01: Parameters {'other'} not documented
   
   pyarrow.lib.Array.format
   -> pyarrow.lib.Array.format(self, **kwargs)
   PR01: Parameters {'**kwargs'} not documented
   
   pyarrow.lib.Array.equals
   -> pyarrow.lib.Array.equals(self, Array other)
   PR01: Parameters {'other'} not documented
   
   pyarrow.lib.UnionArray.child
   -> pyarrow.lib.UnionArray.child(self, int pos)
   PR01: Parameters {'pos'} not documented
   
   pyarrow.lib.NativeFile.writelines
   -> pyarrow.lib.NativeFile.writelines(self, lines)
   PR01: Parameters {'lines'} not documented
   
   pyarrow.lib.NativeFile.read_buffer
   -> pyarrow.lib.NativeFile.read_buffer(self, nbytes=None)
   PR01: Parameters {'nbytes'} not documented
   
   pyarrow.lib.SparseCSRMatrix.dim_name
   -> pyarrow.lib.SparseCSRMatrix.dim_name(self, i)
   PR01: Parameters {'i'} not documented
   
   pyarrow.lib.type.SparseCSFTensor.from_dense_numpy
   -> pyarrow.lib.SparseCSFTensor.from_dense_numpy(cls, obj, dim_names=None)
   PR01: Parameters {'dim_names', 'obj'} not documented
   
   pyarrow.lib.SparseCSFTensor.dim_name
   -> pyarrow.lib.SparseCSFTensor.dim_name(self, i)
   PR01: Parameters {'i'} not documented
   
   pyarrow.lib.type.SparseCSCMatrix.from_dense_numpy
   -> pyarrow.lib.SparseCSCMatrix.from_dense_numpy(cls, obj, dim_names=None)
   PR01: Parameters {'dim_names', 'obj'} not documented
   
   pyarrow.lib.SparseCSCMatrix.dim_name
   -> pyarrow.lib.SparseCSCMatrix.dim_name(self, i)
   PR01: Parameters {'i'} not documented
   
   pyarrow.lib.type.SparseCOOTensor.from_dense_numpy
   -> pyarrow.lib.SparseCOOTensor.from_dense_numpy(cls, obj, dim_names=None)
   PR01: Parameters {'dim_names', 'obj'} not documented
   
   pyarrow.lib.SparseCOOTensor.dim_name
   -> pyarrow.lib.SparseCOOTensor.dim_name(self, i)
   PR01: Parameters {'i'} not documented
   
   pyarrow.lib.PythonFile.truncate
   -> pyarrow.lib.PythonFile.truncate(self, pos=None)
   PR01: Parameters {'pos'} not documented
   
   pyarrow.lib.PythonFile.readlines
   -> pyarrow.lib.PythonFile.readlines(self, hint=None)
   PR01: Parameters {'hint'} not documented
   
   pyarrow.lib.PythonFile.readline
   -> pyarrow.lib.PythonFile.readline(self, size=None)
   PR01: Parameters {'size'} not documented
   
   pyarrow.lib.MetadataVersion
   -> pyarrow.lib.An enumeration.
   PR01: Parameters {'value', 'names', 'module', 'qualname', 'start', 'type'} not documented
   
   pyarrow.lib.KeyValueMetadata.value
   -> pyarrow.lib.KeyValueMetadata.value(self, i)
   PR01: Parameters {'i'} not documented
   
   pyarrow.lib.KeyValueMetadata.key
   -> pyarrow.lib.KeyValueMetadata.key(self, i)
   PR01: Parameters {'i'} not documented
   
   pyarrow.lib.KeyValueMetadata.get_all
   -> pyarrow.lib.KeyValueMetadata.get_all(self, key)
   PR01: Parameters {'key'} not documented
   
   pyarrow.lib.KeyValueMetadata.equals
   -> pyarrow.lib.KeyValueMetadata.equals(self, KeyValueMetadata other)
   PR01: Parameters {'other'} not documented
   
   pyarrow.lib.FixedSizeBufferWriter.set_memcopy_threshold
   -> pyarrow.lib.FixedSizeBufferWriter.set_memcopy_threshold(self, int64_t threshold)
   PR01: Parameters {'threshold'} not documented
   
   pyarrow.lib.FixedSizeBufferWriter.set_memcopy_threads
   -> pyarrow.lib.FixedSizeBufferWriter.set_memcopy_threads(self, int num_threads)
   PR01: Parameters {'num_threads'} not documented
   
   pyarrow.lib.FixedSizeBufferWriter.set_memcopy_blocksize
   -> pyarrow.lib.FixedSizeBufferWriter.set_memcopy_blocksize(self, int64_t blocksize)
   PR01: Parameters {'blocksize'} not documented
   
   pyarrow.lib.ChunkedArray.format
   -> pyarrow.lib.ChunkedArray.format(self, **kwargs)
   PR01: Parameters {'**kwargs'} not documented
   
   pyarrow._parquet.ParquetReader.set_use_threads
   -> pyarrow._parquet.ParquetReader.set_use_threads(self, bool use_threads)
   PR01: Parameters {'use_threads'} not documented
   
   pyarrow._parquet.ParquetReader.set_batch_size
   -> pyarrow._parquet.ParquetReader.set_batch_size(self, int64_t batch_size)
   PR01: Parameters {'batch_size'} not documented
   
   pyarrow._parquet.ParquetReader.scan_contents
   -> pyarrow._parquet.ParquetReader.scan_contents(self, column_indices=None, batch_size=65536)
   PR01: Parameters {'batch_size', 'column_indices'} not documented
   
   pyarrow._parquet.ParquetReader.read_row_groups
   -> pyarrow._parquet.ParquetReader.read_row_groups(self, row_groups, column_indices=None, bool use_threads=True)
   PR01: Parameters {'column_indices', 'row_groups', 'use_threads'} not documented
   
   pyarrow._parquet.ParquetReader.read_row_group
   -> pyarrow._parquet.ParquetReader.read_row_group(self, int i, column_indices=None, bool use_threads=True)
   PR01: Parameters {'i', 'column_indices', 'use_threads'} not documented
   
   pyarrow._parquet.ParquetReader.read_column
   -> pyarrow._parquet.ParquetReader.read_column(self, int column_index)
   PR01: Parameters {'column_index'} not documented
   
   pyarrow._parquet.ParquetReader.read_all
   -> pyarrow._parquet.ParquetReader.read_all(self, column_indices=None, bool use_threads=True)
   PR01: Parameters {'column_indices', 'use_threads'} not documented
   
   pyarrow._parquet.ParquetReader.open
   -> pyarrow._parquet.ParquetReader.open(self, source, *, bool use_memory_map=False, read_dictionary=None, FileMetaData metadata=None, int buffer_size=0, bool pre_buffer=False, coerce_int96_timestamp_unit=None, FileDecryptionProperties decryption_properties=None, thrift_string_size_limit=None, thrift_container_size_limit=None)
   PR01: Parameters {'thrift_string_size_limit', 'use_memory_map', 'thrift_container_size_limit', 'pre_buffer', 'source', 'read_dictionary', 'metadata', 'decryption_properties', 'buffer_size', 'coerce_int96_timestamp_unit'} not documented
   
   pyarrow._parquet.ParquetReader.iter_batches
   -> pyarrow._parquet.ParquetReader.iter_batches(self, int64_t batch_size, row_groups, column_indices=None, bool use_threads=True)
   PR01: Parameters {'row_groups', 'use_threads', 'batch_size', 'column_indices'} not documented
   
   pyarrow._csv.ReadOptions.equals
   -> pyarrow._csv.ReadOptions.equals(self, ReadOptions other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._csv.ParseOptions.equals
   -> pyarrow._csv.ParseOptions.equals(self, ParseOptions other)
   PR01: Parameters {'other'} not documented
   
   pyarrow._csv.ConvertOptions.equals
   -> pyarrow._csv.ConvertOptions.equals(self, ConvertOptions other)
   PR01: Parameters {'other'} not documented
   
   Total number of docstring violations: 72
   ```
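   
   Most of these are PR01 ("Parameters not documented") violations, so the fix is adding a numpydoc-style `Parameters` section to the docstring of each reported method in the corresponding `.pyx` sources. A minimal sketch for `Expression.equals`, with illustrative wording only (shown as plain Python; the real method has a Cython-typed signature):
   
   ```
   # Illustrative only: the kind of "Parameters" section that clears a
   # PR01 violation. The description text is a placeholder, not the
   # final docstring.
   def equals(self, other):
       """
       Return whether this expression equals `other`.

       Parameters
       ----------
       other : Expression
           The expression to compare against.

       Returns
       -------
       bool
       """
   ```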
   
   ### Component(s)
   
   Python

