Github user watermen commented on the issue:
    @jackylk Agreed with @xuchuanyin. Spark's storage levels are meant to 
provide different trade-offs between memory usage and CPU efficiency, so 
different environments call for different storage levels. We'd better add a 
conf named 'storage_level' whose default value is MEMORY_ONLY (the same as 
Spark's default). More information about storage levels is available in the 
Spark documentation.
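A minimal sketch of what reading such a conf could look like. The class, the key name `carbon.storage.level`, and the helper `resolve` are all hypothetical illustrations, not the actual CarbonData API; the set of valid names matches what Spark's `StorageLevel.fromString` accepts, and real code would delegate to that method:

```java
import java.util.Map;
import java.util.Set;

public class StorageLevelConf {
    // Names accepted by Spark's StorageLevel.fromString (Spark 2.x).
    static final Set<String> VALID_LEVELS = Set.of(
        "NONE", "DISK_ONLY", "DISK_ONLY_2",
        "MEMORY_ONLY", "MEMORY_ONLY_2", "MEMORY_ONLY_SER", "MEMORY_ONLY_SER_2",
        "MEMORY_AND_DISK", "MEMORY_AND_DISK_2",
        "MEMORY_AND_DISK_SER", "MEMORY_AND_DISK_SER_2", "OFF_HEAP");

    // Look up the configured level, defaulting to MEMORY_ONLY as Spark does,
    // and reject names Spark would not recognize. The key name here is a
    // placeholder for whatever conf key the project settles on.
    public static String resolve(Map<String, String> conf, String key) {
        String level = conf.getOrDefault(key, "MEMORY_ONLY").toUpperCase();
        if (!VALID_LEVELS.contains(level)) {
            throw new IllegalArgumentException("Invalid storage level: " + level);
        }
        return level;
    }
}
```

With this shape, an unset conf falls back to MEMORY_ONLY, and a misspelled value fails fast instead of silently degrading caching behavior.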
