[
https://issues.apache.org/jira/browse/SPARK-14250?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15216740#comment-15216740
]
Georg Heiler commented on SPARK-14250:
--------------------------------------
The local file seems to have been corrupted when it was read from inside Docker.
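One way to sanity-check the mounted directory before blaming SparkR is to verify that it still contains what Spark writes for Parquet output: part-* data files and, optionally, _metadata / _common_metadata summary files. The sketch below is a hypothetical diagnostic in plain Python (not part of Spark or this report); the directory layout it assumes is the standard Spark Parquet output layout, where a directory holding only _SUCCESS would produce exactly the "no Parquet data files or summary files found" error.

```python
import os
import tempfile

def diagnose_parquet_dir(path):
    """Report whether a directory contains the files Spark expects when
    reading Parquet: part-* data files and/or _metadata /
    _common_metadata summary files. A directory that lost its part
    files (for example, an incomplete copy into a Docker volume)
    triggers the 'No predefined schema found' error on read."""
    entries = os.listdir(path)
    data_files = [f for f in entries if f.startswith("part-")]
    summaries = [f for f in entries if f in ("_metadata", "_common_metadata")]
    return {
        "data_files": sorted(data_files),
        "summary_files": sorted(summaries),
        "looks_readable": bool(data_files or summaries),
    }

# Simulate a Parquet output directory whose data files went missing:
# only the _SUCCESS marker survived the copy into the container.
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "_SUCCESS"), "w").close()
    report = diagnose_parquet_dir(d)
```

Running this against the path mounted into the container would show whether the failure is a missing/corrupted copy rather than a SparkR bug.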
> Parquet import failure: No predefined schema found
> --------------------------------------------------
>
> Key: SPARK-14250
> URL: https://issues.apache.org/jira/browse/SPARK-14250
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, SparkR
> Affects Versions: 1.6.1
> Environment: Osx using docker toolbox, as well as linux / ubuntu
> Reporter: Georg Heiler
>
> No predefined schema found, and no Parquet data files or summary files found.
> This exception occurs when trying to read a Parquet file with SparkR in
> Docker. Reading the same file works without any problems on a local Spark
> installation.
> http://stackoverflow.com/questions/36283703/spark-in-docker-parquet-error-no-predefined-schema-found?noredirect=1#comment60211289_36283703
--