GitHub user xwu0226 opened a pull request:
https://github.com/apache/spark/pull/9542
[Spark-11522][SQL] input_file_name() returns "" for external tables
When computing partitions for a non-Parquet relation, `HadoopRDD.compute` is
used, but unlike `NewSqlHadoopRDD.compute` it does not set the thread-local
variable `inputFileName` in `NewSqlHadoopRDD`. When `input_file_name()` later
reads `NewSqlHadoopRDD.inputFileName`, it therefore gets the empty string.
Setting `inputFileName` in `HadoopRDD.compute` as well resolves this issue.
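The fix described above can be sketched with the same thread-local pattern,
independent of Spark. The object and method names below are illustrative,
not Spark's exact internals: a per-thread variable is set by each task's
`compute` before reading a split, so a later call (like `input_file_name()`)
on the same thread can retrieve it. The bug was one code path skipping the
"set" step, leaving the default empty string.

```scala
// Illustrative sketch (not Spark's actual code) of the thread-local
// inputFileName pattern: compute() sets the current split's file name
// before processing, and clears it afterwards.
object InputFileName {
  private val inputFileName = new ThreadLocal[String] {
    override def initialValue(): String = "" // default when never set
  }
  def get(): String = inputFileName.get()
  def set(file: String): Unit = inputFileName.set(file)
  def unset(): Unit = inputFileName.remove()
}

// A compute() that sets the name before reading the split (the fixed
// behavior); a buggy compute() would skip InputFileName.set entirely.
def computeFixed(splitPath: String): String = {
  InputFileName.set(splitPath)
  try InputFileName.get() // what input_file_name() would observe mid-task
  finally InputFileName.unset()
}
```

After `computeFixed` returns, `InputFileName.get()` is back to `""`, which
is exactly what callers saw for every split when the "set" step was missing.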
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/xwu0226/spark SPARK-11522
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/9542.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #9542
----
commit fd7533d3171bb7e90af21e216463829b7fdbd33f
Author: xin Wu <[email protected]>
Date: 2015-11-06T23:47:14Z
SPARK-11522 input_file_name() returns empty string for external tables
commit b67a0143be47fb14264bb71b787ed6eb351c0c81
Author: xin Wu <[email protected]>
Date: 2015-11-07T02:56:06Z
SPARK-11522 updating testcases
commit a2d83db953470c2cff130362b462afb6ad2470d6
Author: xin Wu <[email protected]>
Date: 2015-11-07T05:06:14Z
SPARK-11522 update testcase
commit a88260ff90c25dcc18dd797119fdbcfc6503f991
Author: xin Wu <[email protected]>
Date: 2015-11-07T17:09:24Z
SPARK-11522 update testcase
commit 2658f2808ede6512c625f3bb33823bc6492823d5
Author: xin Wu <[email protected]>
Date: 2015-11-07T17:12:34Z
SPARK-11522 update testcase
----