RE: releasing memory without stopping the spark context ?

2016-08-31 Thread Rajani, Arpan
Removing Data: Spark automatically monitors cache usage on each node and drops out old data partitions in a least-recently-used (LRU) fashion. If you would like to manually remove an RDD instead of waiting for it to fall out of the cache, use the RDD.unpersist() method. (Copied from
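
For illustration, a minimal sketch of manual eviction without stopping the context (a local master, the app name, and the RDD contents here are made up for the example):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.storage.StorageLevel

    object UnpersistExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("UnpersistExample").setMaster("local[*]"))

        // Cache an RDD; its blocks now occupy executor memory.
        val rdd = sc.parallelize(1 to 1000000).persist(StorageLevel.MEMORY_ONLY)
        rdd.count() // force an action so the cache is actually materialized

        // Release the cached blocks now instead of waiting for LRU eviction.
        // blocking = true waits until the blocks are removed before returning.
        rdd.unpersist(blocking = true)

        // The SparkContext stays alive; the freed memory is available
        // for subsequent jobs.
        sc.stop()
      }
    }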

Controlling access to hive/db-tables while using SparkSQL

2016-08-30 Thread Rajani, Arpan
Hi All, In our YARN cluster we have set up Spark 1.6.1, and we plan to give access to all the end users/developers/BI users, etc. But we have learnt that any valid user, after obtaining their own Kerberos TGT, can get hold of a sqlContext (in a program or in the shell) and can run any query against any secure
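
For illustration, a minimal sketch of the concern from spark-shell (the database and table names below are hypothetical):

    // After kinit with the user's own Kerberos TGT, launch spark-shell;
    // in Spark 1.6.x the shell provides sqlContext automatically.
    sqlContext.sql("SELECT * FROM secure_db.sensitive_table LIMIT 10").show()
    // Spark SQL talks to the Hive metastore and reads HDFS directly,
    // so the query is gated only by HDFS file permissions, not by any
    // authorization layer enforced at HiveServer2.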