In SparkHadoopUtil.scala, under /core/src/main/scala/org/apache/spark/deploy/, there is a method:

    def newConfiguration(): Configuration = new Configuration()

At the top of the file there is an import for Configuration:

    import org.apache.hadoop.conf.Configuration

But I can't find the definition of Configuration under /core/src/main/scala/org/apache/hadoop/; the only subdirectories there are mapred and mapreduce. Does anybody know where Configuration is defined?
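For context, here is a minimal sketch of how I read the relevant part of SparkHadoopUtil.scala (the surrounding class structure is my own approximation; only the import line and the newConfiguration method are quoted from the actual file):

    import org.apache.hadoop.conf.Configuration

    class SparkHadoopUtil {
      // Creates a fresh Hadoop Configuration. The Configuration type
      // resolves to org.apache.hadoop.conf.Configuration, which is the
      // class I'm trying to locate in the source tree.
      def newConfiguration(): Configuration = new Configuration()
    }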