pitrou commented on code in PR #42174:
URL: https://github.com/apache/arrow/pull/42174#discussion_r1683081544
##########
cpp/tools/parquet/CMakeLists.txt:
##########
@@ -32,5 +32,16 @@ if(PARQUET_BUILD_EXECUTABLES)
RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR})
endforeach(TOOL)
+ # Only build parquet-dump-footer when statically linking.
Review Comment:
> So we are committing to `Scrub` (and other APIs) becoming a public API of `libparquet`.
Not really, my proposal is that they be exposed as internal APIs. We already have other examples of this, for example the fuzzing entrypoint:
https://github.com/apache/arrow/blob/0bae073d1abe439d113cc12e06e7ada886a2f2fd/cpp/src/parquet/arrow/reader.h#L372-L377
For example, you could instantiate a `FileMetaData` object from the serialized metadata and implement a `FileMetaData::ScrubSensitiveData` method that does what you want. `FileMetaData` already has a `SerializeToString` method that re-encodes to Thrift, and you could add another `SerializeToJson` method.
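A rough sketch of how the tool could then consume that, assuming the existing `FileMetaData::Make` and `SerializeToString`; `ScrubSensitiveData` and `SerializeToJson` are the hypothetical additions proposed above, not existing API:
```cpp
// Hypothetical sketch only. FileMetaData::Make and SerializeToString exist in
// parquet/metadata.h today; ScrubSensitiveData and SerializeToJson are the
// proposed (not yet existing) internal methods.
#include <cstdint>
#include <memory>
#include <string>

#include "parquet/metadata.h"

std::string DumpScrubbedFooter(const void* footer, uint32_t footer_len, bool as_json) {
  // Deserialize the Thrift-encoded footer into a FileMetaData object.
  std::shared_ptr<parquet::FileMetaData> md =
      parquet::FileMetaData::Make(footer, &footer_len);
  md->ScrubSensitiveData();             // hypothetical: redact sensitive values in place
  return as_json ? md->SerializeToJson()      // hypothetical JSON re-encoding
                 : md->SerializeToString();   // existing Thrift re-encoding
}
```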
Or, if `FileMetaData` does not work, you could just move the current helper functions into libparquet, taking care to replace `format::FileMetaData` with either `std::any` or a dedicated opaque wrapper. Not particularly pretty, but it would certainly work.
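A minimal sketch of that opaque-wrapper variant (every name below is hypothetical); the point is only that the Thrift-generated type stays forward-declared and never leaks into a public header:
```cpp
// Hypothetical internal header in libparquet. parquet::format::FileMetaData is
// only forward-declared, so nothing Thrift-generated appears in the public API.
#include <memory>
#include <string>

namespace parquet {
namespace format {
class FileMetaData;  // Thrift-generated, defined in parquet_types.h
}  // namespace format

/// EXPERIMENTAL: opaque handle around the raw Thrift footer, for internal tools only.
class ScrubbableFooter {
 public:
  static ScrubbableFooter Deserialize(const std::string& thrift_bytes);
  void Scrub(bool scrub_debug);           // the tool's current helper, moved here (signature is a guess)
  std::string SerializeToThrift() const;
  std::string SerializeToJson() const;

 private:
  std::shared_ptr<format::FileMetaData> impl_;
};

}  // namespace parquet
```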
> For my education, what are we getting by shipping dynamically linked tools to justify the above commitment? Is there a requirement for them in some platform? Perhaps it is best to always ship statically linked tools instead?
It's not even about shipping the tools; it's about getting working tools on developers' workstations without forcing them into static linking. I'm also not sure static linking works properly on Windows... and it tends to blow up file size a lot.