Hi, I was wondering whether Arrow makes any backward-compatibility guarantees, and what the behaviour would be in the case of a backwards-incompatible format change (I believe there haven't been any since 0.8.0). Do the readers include checks to detect that the format being read is incompatible, or is there potential for silent incorrectness?
For example, in Spark there is only a check for the minimum version of pyarrow [0], and none to verify that the version of the Arrow jar shipped matches the version of pyarrow being used.

Thanks!
Mihir

[0] https://github.com/apache/spark/blob/834b8609793525a5a486013732d8c98e1c6e6504/python/pyspark/sql/utils.py#L139
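To illustrate the gap I mean, here is a rough sketch in Python of the two kinds of check: the minimum-version check Spark actually has, versus a cross-boundary match check it lacks. All names and the matching rule are illustrative, not Spark's actual API:

```python
def parse_version(v: str) -> tuple:
    """Parse a dotted version string like '0.8.0' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def require_minimum_version(installed: str, minimum: str = "0.8.0") -> None:
    """The kind of check Spark does: fail if pyarrow is older than a minimum."""
    if parse_version(installed) < parse_version(minimum):
        raise ImportError(
            f"pyarrow >= {minimum} must be installed; found {installed}"
        )

def require_matching_version(py_version: str, jvm_version: str) -> None:
    """Hypothetical check Spark does NOT do: verify the pyarrow version
    matches the Arrow jar on the JVM side (here, same major.minor)."""
    if parse_version(py_version)[:2] != parse_version(jvm_version)[:2]:
        raise RuntimeError(
            f"pyarrow {py_version} does not match Arrow jar {jvm_version}"
        )
```

Only the first kind of check exists today, so a mismatched jar/wheel pair would pass silently as long as pyarrow is new enough.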
