Currently in Spark cubing, the StorageLevel is hard-coded to StorageLevel.MEMORY_AND_DISK_SER, which can consume a lot of memory when a layer's RDD is large. Can we make the StorageLevel configurable? Then, for a large cube, the user could set it to a disk-only level to avoid OOM errors.
- Make StorageLevel in Spark Cubing configurable
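A minimal sketch of how such a configuration lookup could work. This is not Kylin's actual implementation; the property name `kylin.engine.spark.storage-level` is a hypothetical example, and the returned string would be passed to Spark's `StorageLevel.fromString` when persisting each layer's RDD.

```python
# Storage level names accepted by Spark's StorageLevel.fromString (Spark 2.x).
VALID_STORAGE_LEVELS = {
    "NONE", "DISK_ONLY", "DISK_ONLY_2",
    "MEMORY_ONLY", "MEMORY_ONLY_2",
    "MEMORY_ONLY_SER", "MEMORY_ONLY_SER_2",
    "MEMORY_AND_DISK", "MEMORY_AND_DISK_2",
    "MEMORY_AND_DISK_SER", "MEMORY_AND_DISK_SER_2",
    "OFF_HEAP",
}

# The current hard-coded default in Spark cubing.
DEFAULT_LEVEL = "MEMORY_AND_DISK_SER"

def resolve_storage_level(conf: dict) -> str:
    """Return a validated storage level name from the job configuration,
    falling back to the current default when the property is unset.
    The property key below is a hypothetical name for illustration."""
    level = conf.get("kylin.engine.spark.storage-level", DEFAULT_LEVEL).upper()
    if level not in VALID_STORAGE_LEVELS:
        raise ValueError("Unknown storage level: " + level)
    return level
```

With this in place, a user building a large cube could set the property to `DISK_ONLY` so intermediate RDDs are persisted to disk only, trading speed for memory safety.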
