Hi

- Is there a way to reload the SparkContext in the GUI?
- I've also realised that sc.hadoopConfiguration.set() only works
once: after the first call, even if I modify the parameters with the
same function again, they don't seem to change...
- By the way, is there a way to print a parameter I've set? I
tried sc.hadoopConfiguration.get("myparam"), but it doesn't work...
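For reference, what I'm running in the spark-shell looks roughly like this (the key name "myparam" and the values are just placeholders):

```scala
// sc is the SparkContext the spark-shell provides.
// Set a Hadoop configuration property:
sc.hadoopConfiguration.set("myparam", "myvalue")

// Read it back; get(key) returns null if the key is unset,
// so the two-argument form with a default is easier to check:
val v = sc.hadoopConfiguration.get("myparam")
val w = sc.hadoopConfiguration.get("myparam", "not-set")
println(s"myparam = $w")
```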

Best,
Sw
