Quick answers: 1. You can set pretty much any Spark configuration from Hive using the SET command. 2. No. You have to make that call yourself.
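For example, from a Hive CLI or Beeline session you can pass Spark settings like the ones below. This is only a rough sketch; the property values are illustrative, and there is no single SET property that directly controls the RDD storage level, so tune the memory-related settings instead:

    set hive.execution.engine=spark;
    set spark.executor.memory=4g;
    set spark.executor.cores=2;
    set spark.serializer=org.apache.spark.serializer.KryoSerializer;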
On Thu, Oct 22, 2015 at 10:32 PM, Jone Zhang <joyoungzh...@gmail.com> wrote:
> 1. How can I set the Storage Level when I use Hive on Spark?
> 2. Does Spark have any plan to dynamically determine whether to use Hive on MapReduce or Hive on Spark, based on SQL features?
>
> Thanks in advance
> Best regards
>