GitHub user yhuai commented on the pull request:
https://github.com/apache/spark/pull/4729#issuecomment-76313868
OK. Now I understand what's going on. For SPARK-5950, we cannot do the insert
because `InsertIntoTable` will not be resolved, and you saw an
`org.apache.spark.sql.AnalysisException`, right? For SPARK-5508, the problem is
that data is inserted through `InsertIntoHive` and we cannot read it through our
data source API write path. Are you trying to resolve both issues in this PR?
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes it to be, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]