Github user kevinpetersavage commented on the pull request:

    https://github.com/apache/spark/pull/4957#issuecomment-95219959
  
    Sorry @tdas, crossed wires, didn't see your comment before I'd already 
looked at this. 
    
    I see your point, but I think the underlying problem with the test was 
that batches were not bounded above. Without fixing that, I don't see how 
you can fix the test. I also don't think you can fix it with better timing, 
because we're on the JVM. I think the change I just made is a reasonable 
hybrid that avoids copying through data structures. 
    
    Would it be better to file a bug about block sizes growing too large and 
submit this change as the fix for that bug, so it can be reviewed properly?


