Hello, I just wanted to use sc._jsc.hadoopConfiguration().set('key', 'value') in PySpark 1.5.2, but I got a "set method does not exist" error.
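For reference, here is a minimal sketch of what I am running; the property name and value are only examples (dfs.blocksize at 128 MB) and the app name is made up. In the pyspark shell, sc would of course already exist:

    from pyspark import SparkConf, SparkContext

    # Arbitrary app name, just for this repro
    conf = SparkConf().setAppName("hdfs-conf-test")
    sc = SparkContext(conf=conf)

    # This is the call that fails for me with the "set method not exists" error;
    # the value is just an example (128 MB block size).
    sc._jsc.hadoopConfiguration().set("dfs.blocksize", "134217728")

    # One alternative I am wondering about, but have not verified on 1.5.2:
    # prefixing the property with "spark.hadoop." (e.g. spark.hadoop.dfs.blocksize),
    # since Spark is supposed to copy spark.hadoop.* entries into the Hadoop
    # Configuration it hands to HDFS.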
Does anyone know a workaround for setting HDFS-related properties such as dfs.blocksize? Thanks in advance! Tamas