Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/2814#issuecomment-61233478
  
    Hm, that's weird, since I thought I ran the test against the default Hadoop, and that's 1.0.4. Which test failed? (I'll go look around Jenkins too.)
    
    I can't find the 1.0.4 source at the moment, but it looks like the `JobContextImpl` constructor was package-private in ancient times:
    
    
http://grepcode.com/file/repo1.maven.org/maven2/org.apache.hadoop/hadoop-mapred/0.22.0/org/apache/hadoop/mapred/JobContextImpl.java
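    
    For illustration, the shape of the problem is roughly this. It's a sketch from memory of the old layout, not the actual Hadoop source:
    
    ```scala
    // Illustrative sketch only, not the real Hadoop source: an interface plus
    // an implementation whose constructor is package-private. Scala's
    // private[mapred] models Java's package-private access here.
    package org.apache.hadoop.mapred
    
    trait JobContext
    
    class JobContextImpl private[mapred] (conf: Any, jobId: Any) extends JobContext
    
    // From any other package, `new JobContextImpl(conf, jobId)` fails to
    // compile: the constructor cannot be accessed.
    ```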
    
    So `JobContext` is an interface and `JobContextImpl` has a non-public constructor. That's probably why this mechanism was created in the first place. Another solution is to change the constructor's accessibility at runtime with `setAccessible`. I can try that and verify the test again.
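    
    Roughly what I have in mind, as a sketch; the class name and the `(Configuration, JobID)` constructor signature are assumptions here and may not match every Hadoop version:
    
    ```scala
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.mapreduce.{JobContext, JobID}
    
    // Sketch of the runtime-accessibility idea: look up the non-public
    // constructor reflectively and call setAccessible(true) before invoking
    // it. Class name and (Configuration, JobID) signature are assumptions.
    def newJobContext(conf: Configuration, jobId: JobID): JobContext = {
      val klass = Class.forName("org.apache.hadoop.mapreduce.task.JobContextImpl")
      val ctor = klass.getDeclaredConstructor(classOf[Configuration], classOf[JobID])
      ctor.setAccessible(true) // lifts the package-private restriction at runtime
      ctor.newInstance(conf, jobId).asInstanceOf[JobContext]
    }
    ```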
    
    All of this would go away once Hadoop 1.x is no longer supported, along with some more acrobatics inside `SparkHadoopWriter`.

