[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-29 Thread Apache Spark (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15399754#comment-15399754 ] Apache Spark commented on SPARK-16664: -- User 'breakdawn' has created a pull request for this issue:

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-29 Thread Thomas Graves (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15399168#comment-15399168 ] Thomas Graves commented on SPARK-16664: --- I am out of the office until 8/8. Please contact my

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-28 Thread Julien Champ (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15397259#comment-15397259 ] Julien Champ commented on SPARK-16664: -- We are experiencing the same problem here with Spark 1.6.2

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-22 Thread Dongjoon Hyun (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15390562#comment-15390562 ] Dongjoon Hyun commented on SPARK-16664: --- Oh, I missed the 201 cases. Sorry.

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-22 Thread Apache Spark (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15390534#comment-15390534 ] Apache Spark commented on SPARK-16664: -- User 'breakdawn' has created a pull request for this issue:

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-22 Thread Liwei Lin (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15390525#comment-15390525 ] Liwei Lin commented on SPARK-16664: --- I've found the root cause; will submit a patch shortly.

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-22 Thread Satish Kolli (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15389397#comment-15389397 ] Satish Kolli commented on SPARK-16664: -- [~dongjoon] Problem exists in master also. I tried a nightly

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-22 Thread Wesley Tang (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15389367#comment-15389367 ] Wesley Tang commented on SPARK-16664: - According to the original post, the size should be 201 to
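Tang's point above is that the failure boundary in the original report sits at 201 columns rather than 200. A minimal sketch of how one might probe that boundary from a Spark 1.6.x spark-shell is below; it is not code from this thread, and the column names, the single test row, and the before/after comparison are illustrative assumptions.
{code}
// Hypothetical sketch (not from this thread): probe the 200 vs. 201 column
// boundary in a Spark 1.6.x spark-shell, where `sc` and `sqlContext` are predefined.
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

def makeDf(width: Int) = {
  val schema = StructType(Seq.tabulate(width)(i => StructField(s"c$i", IntegerType)))
  val rows = sc.parallelize(Seq(Row.fromSeq(Seq.tabulate(width)(identity))))
  sqlContext.createDataFrame(rows, schema)
}

for (width <- Seq(200, 201)) {
  val df = makeDf(width)
  val before = df.first.toSeq      // values computed straight from the source RDD
  df.persist().count()             // materialize the in-memory columnar cache
  val after = df.first.toSeq       // values read back from the cache
  println(s"width=$width, row survives persist: ${before == after}")
}
{code}
If the reported behavior applies, only the 200-column case would print true.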

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-21 Thread Dongjoon Hyun (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15388219#comment-15388219 ] Dongjoon Hyun commented on SPARK-16664: --- Also, FYI, RC5 is the same as the current master.

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-21 Thread Dongjoon Hyun (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15388209#comment-15388209 ] Dongjoon Hyun commented on SPARK-16664: --- It looks like that. FYI, here is the result of current

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-21 Thread Satish Kolli (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15387858#comment-15387858 ] Satish Kolli commented on SPARK-16664: -- Here is a demonstration from the spark shell: {code}
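The {code} block in this preview is cut off by the archive. As a stand-in, here is a hedged sketch of the kind of spark-shell reproduction being described, assuming a 201-column DataFrame of 100 long-valued rows and a simple count query; the column names, row count, and query are illustrative and not Kolli's original snippet.
{code}
// Hypothetical reconstruction (the original snippet is truncated above): one way
// a Spark 1.6.2 spark-shell session could show the symptom on a wide DataFrame.
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{LongType, StructField, StructType}

val numCols = 201
val schema = StructType((1 to numCols).map(i => StructField(s"c$i", LongType)))
val rows = sc.parallelize(1L to 100L).map(i => Row.fromSeq(Seq.fill(numCols)(i)))
val df = sqlContext.createDataFrame(rows, schema)

println(df.filter(df("c1") > 0L).count())   // 100 before caching
df.persist().count()                        // materialize the in-memory cache
println(df.filter(df("c1") > 0L).count())   // per the report, this drops once rows are read from the cache
{code}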

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-21 Thread Ying Zhou (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15387745#comment-15387745 ] Ying Zhou commented on SPARK-16664: --- Hi there, The first test case creates a dataframe with a 1 row

[jira] [Commented] (SPARK-16664) Spark 1.6.2 - Persist call on Data frames with more than 200 columns is wiping out the data.

2016-07-21 Thread Sean Owen (JIRA)
[ https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15387717#comment-15387717 ] Sean Owen commented on SPARK-16664: --- What does "wipe out data" mean here? Your query is for 100