WangGuangxin commented on pull request #34846:
URL: https://github.com/apache/spark/pull/34846#issuecomment-1041227864


   > I have a question to those who are more familiar with Spark internals though: are there any places inside of Spark that implicitly depend on the page size being a power of two? Do we run any risk that such an assumption is not checked at runtime, which could lead to out-of-bounds access in the boundary cases?
   
   As far as I know there is no such constraint. Spark also allows users to set a custom page size via the conf `spark.buffer.pageSize`, which is not restricted to powers of two.




