[ https://issues.apache.org/jira/browse/SPARK-12921?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15138507#comment-15138507 ]
Apache Spark commented on SPARK-12921:
--------------------------------------

User 'JoshRosen' has created a pull request for this issue:
https://github.com/apache/spark/pull/11131

> Use SparkHadoopUtil reflection to access TaskAttemptContext in
> SpecificParquetRecordReaderBase
> ----------------------------------------------------------------------------------------------
>
>          Key: SPARK-12921
>          URL: https://issues.apache.org/jira/browse/SPARK-12921
>      Project: Spark
>   Issue Type: Bug
>   Components: Spark Core, SQL
>     Reporter: Josh Rosen
>     Assignee: Josh Rosen
>      Fix For: 1.6.1
>
>
> It looks like there's one place left in the codebase,
> SpecificParquetRecordReaderBase, where we didn't use SparkHadoopUtil's
> reflective accesses of TaskAttemptContext methods, creating problems when
> using a single Spark artifact with both Hadoop 1.x and 2.x.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
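For context on the technique the issue refers to: TaskAttemptContext is a class in Hadoop 1.x but an interface in Hadoop 2.x, so bytecode compiled against one variant fails at runtime against the other (invokevirtual vs. invokeinterface). Below is a minimal, hedged sketch of the reflective-access pattern; the TaskAttemptContext stub and the ReflectiveAccess class are hypothetical stand-ins for illustration, not the actual SparkHadoopUtil code.

```java
import java.lang.reflect.Method;

// Hypothetical stand-in for Hadoop's TaskAttemptContext. In real Hadoop,
// this type is a class in 1.x and an interface in 2.x, which is exactly
// why direct method calls break binary compatibility across versions.
interface TaskAttemptContext {
    String getConfiguration();
}

public class ReflectiveAccess {
    // Sketch of the pattern SparkHadoopUtil uses: look the method up by
    // name at runtime and invoke it reflectively, so a single compiled
    // Spark artifact works against both Hadoop 1.x and 2.x.
    static String getConfiguration(Object context) {
        try {
            Method m = context.getClass().getMethod("getConfiguration");
            return (String) m.invoke(context);
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException(
                "Cannot access getConfiguration on " + context.getClass(), e);
        }
    }

    public static void main(String[] args) {
        // The lambda stands in for a concrete Hadoop context object.
        TaskAttemptContext ctx = () -> "mapreduce.task.attempt.id=attempt_001";
        System.out.println(getConfiguration(ctx));
    }
}
```

The fix described in the issue applies this same indirection in SpecificParquetRecordReaderBase, the one remaining call site that still invoked TaskAttemptContext methods directly.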