Hi

The first conf is used by Hadoop to determine the locality distribution of the 
HDFS file. The second conf is used by Spark. Although they share the same 
variable name, they are two different classes.
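To make the distinction concrete, here is a minimal sketch of the same call with the two configurations given distinct names (the variable names and app name are my own; the API calls follow the snippet quoted below, so check them against your Spark version before use):

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.mapred.TextInputFormat
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.scheduler.InputFormatInfo

// Hadoop configuration: lets InputFormatInfo query HDFS for block locations
val hadoopConf = new Configuration()

// Spark configuration: configures the SparkContext itself
val sparkConf = new SparkConf().setAppName("locality-example")

// Compute preferred locations from the Hadoop side...
val locData = InputFormatInfo.computePreferredLocations(
  Seq(new InputFormatInfo(hadoopConf, classOf[TextInputFormat],
    new Path("myfile.txt"))))

// ...and pass them, together with the SparkConf, to the SparkContext
val sc = new SparkContext(sparkConf, locData)
```

So each class plays its own role: `org.apache.hadoop.conf.Configuration` for reading HDFS metadata, `SparkConf` for constructing the context.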

Thanks
Jerry

From: qinwei [mailto:wei....@dewmobile.net]
Sent: Sunday, September 28, 2014 2:05 PM
To: user
Subject: problem with data locality api

Hi, everyone
    I have come across a problem with data locality. I found this example 
code in "Spark-on-YARN-A-Deep-Dive-Sandy-Ryza.pdf":
        val locData = InputFormatInfo.computePreferredLocations(Seq(new 
InputFormatInfo(conf, classOf[TextInputFormat], new Path("myfile.txt"))))
        val sc = new SparkContext(conf, locData)
    But the two confs above are of different types: the conf in the first 
line is of type org.apache.hadoop.conf.Configuration, and the conf in the 
second line is of type SparkConf. Can anyone explain that to me or give me 
some example code?

________________________________
qinwei
