Hi Kevin,

I saw #235; it changes too much to review at a glance. Is it the kind of
solution that would let me create a new interpreter with my own settings?
I'll try that, but first of all, I only run a Spark 1.2 cluster. Could you
tell me your schedule for merging the spark_1.2 branch into master?

And the purpose of setting spark.* is tuning Spark as well. When I use a
Mesos cluster, spark.executor.uri is needed to point at a specific Spark
executor, and Kryo serialization is also needed, along with compression,
event logging, and so on. I think it's a good idea to support
spark-defaults.conf and spark-env.sh. How about you?
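
For illustration, the kind of spark-defaults.conf I have in mind would look
something like this (the executor URI below is a hypothetical placeholder,
not a real path):

    # Hypothetical example values - adjust for your own cluster
    spark.executor.uri      hdfs://namenode/path/to/spark-1.2.0-bin.tgz
    spark.serializer        org.apache.spark.serializer.KryoSerializer
    spark.rdd.compress      true
    spark.eventLog.enabled  true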

Regards,
JL

On Fri, Jan 30, 2015 at 12:41 AM, Kevin Kim (Sangwoo) <[email protected]>
wrote:

> Great to hear you found the way!
> Maybe we can find out a better way than the workaround in a near future.
>
> And what is the purpose of setting spark.*?
> Can you be more specific?
>
> [I guess you can set spark.* via  #235]
>



-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net
