jorisvandenbossche commented on issue #34681:
URL: https://github.com/apache/arrow/issues/34681#issuecomment-1480737829

   I think we might actually also be writing into that memory (so it's not just reading garbage, which is why the test passes consistently for this example), but that's of course just as bad, and indeed with larger data it gives a segfault at clean-up.
   
   It's true that this is a slippery slope, and that we can't start validating the arrays on every operation; at some point you need to assume you have valid data.
   I suppose a more appropriate solution here would be better testing utilities, like an `assert_equals`-style helper, where such a function could first call `validate()` on the arrays and only then `equals`, to avoid issues like this (see the sketch below).
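
   For illustration, a minimal sketch of what such a helper could look like on the Python side, assuming PyArrow. The name `assert_arrays_equal` and the choice of `full=True` are my assumptions, not an agreed API:

   ```python
   import pyarrow as pa

   def assert_arrays_equal(left: pa.Array, right: pa.Array) -> None:
       # Hypothetical test helper: validate both arrays first so that
       # comparing corrupt arrays raises ArrowInvalid instead of reading
       # (or writing) out-of-bounds memory during the comparison.
       # full=True also checks buffer contents, not just the metadata.
       left.validate(full=True)
       right.validate(full=True)
       assert left.equals(right), "arrays are not equal"
   ```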

