[
https://issues.apache.org/jira/browse/SPARK-16703?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16703:
Assignee: Apache Spark (was: Cheng Lian)
> Extra space in WindowSpecDefinition SQL
[
https://issues.apache.org/jira/browse/SPARK-16703?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391370#comment-15391370
]
Apache Spark commented on SPARK-16703:
--
User 'liancheng' has created a pull request for this issue:
[
https://issues.apache.org/jira/browse/SPARK-16703?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16703:
Assignee: Cheng Lian (was: Apache Spark)
> Extra space in WindowSpecDefinition SQL
[
https://issues.apache.org/jira/browse/SPARK-16703?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Cheng Lian updated SPARK-16703:
---
Description:
For a {{WindowSpecDefinition}} whose {{partitionSpec}} is empty, there's an
extra
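The fix direction for the extra space can be illustrated with a small sketch (plain Python; the helper name and inputs are hypothetical, not Spark's actual {{WindowSpecDefinition}} code): build each clause string, drop the empty ones, and join with single spaces, so an empty {{partitionSpec}} never produces a doubled space.

```python
def window_spec_sql(partition_spec, order_spec, frame_sql):
    """Join only the non-empty clauses with single spaces.

    Illustrative only: a toy of the general technique, not Spark's
    actual SQL generation for WindowSpecDefinition.
    """
    partition_sql = f"PARTITION BY {', '.join(partition_spec)}" if partition_spec else ""
    order_sql = f"ORDER BY {', '.join(order_spec)}" if order_spec else ""
    # Filtering out empty fragments before joining avoids the extra
    # space that naive concatenation leaves when partition_spec is empty.
    return "(" + " ".join(s for s in (partition_sql, order_sql, frame_sql) if s) + ")"
```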
Cheng Lian created SPARK-16703:
--
Summary: Extra space in WindowSpecDefinition SQL representation
Key: SPARK-16703
URL: https://issues.apache.org/jira/browse/SPARK-16703
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-16685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16685:
Assignee: Apache Spark
> audit release docs are ambiguous
>
[
https://issues.apache.org/jira/browse/SPARK-16685?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391361#comment-15391361
]
Apache Spark commented on SPARK-16685:
--
User 'rxin' has created a pull request for this issue:
[
https://issues.apache.org/jira/browse/SPARK-16685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16685:
Assignee: (was: Apache Spark)
> audit release docs are ambiguous
>
[
https://issues.apache.org/jira/browse/SPARK-16699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Reynold Xin updated SPARK-16699:
Description:
In the following code in `VectorizedHashMapGenerator.scala`:
{code}
def
[
https://issues.apache.org/jira/browse/SPARK-16699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Reynold Xin resolved SPARK-16699.
-
Resolution: Fixed
Assignee: Qifan Pu
Fix Version/s: (was: 2.0.0)
[
https://issues.apache.org/jira/browse/SPARK-16702?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Angus Gerry updated SPARK-16702:
Attachment: SparkThreadsBlocked.txt
> Driver hangs after executors are lost
>
Angus Gerry created SPARK-16702:
---
Summary: Driver hangs after executors are lost
Key: SPARK-16702
URL: https://issues.apache.org/jira/browse/SPARK-16702
Project: Spark
Issue Type: Bug
[
https://issues.apache.org/jira/browse/SPARK-16534?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391274#comment-15391274
]
Apache Spark commented on SPARK-16534:
--
User 'jerryshao' has created a pull request for this issue:
[
https://issues.apache.org/jira/browse/SPARK-16534?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16534:
Assignee: (was: Apache Spark)
> Kafka 0.10 Python support
> -
[
https://issues.apache.org/jira/browse/SPARK-16534?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16534:
Assignee: Apache Spark
> Kafka 0.10 Python support
> -
>
>
[
https://issues.apache.org/jira/browse/SPARK-5581?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Josh Rosen updated SPARK-5581:
--
Assignee: Josh Rosen
> When writing sorted map output file, avoid open / close between each partition
>
[
https://issues.apache.org/jira/browse/SPARK-5581?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Josh Rosen updated SPARK-5581:
--
Assignee: Brian Cho (was: Josh Rosen)
> When writing sorted map output file, avoid open / close
[
https://issues.apache.org/jira/browse/SPARK-16603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391268#comment-15391268
]
keliang edited comment on SPARK-16603 at 7/25/16 2:46 AM:
--
Hi, I tested this
[
https://issues.apache.org/jira/browse/SPARK-16603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391268#comment-15391268
]
keliang commented on SPARK-16603:
-
Hi, I tested this feature with spark-2.0.1-snapshot:
first. create
[
https://issues.apache.org/jira/browse/SPARK-5581?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Josh Rosen resolved SPARK-5581.
---
Resolution: Fixed
Fix Version/s: 2.1.0
Issue resolved by pull request 13382
[
https://issues.apache.org/jira/browse/SPARK-16698?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16698:
Assignee: Apache Spark
> json parsing regression - "." in keys
>
[
https://issues.apache.org/jira/browse/SPARK-16698?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16698:
Assignee: (was: Apache Spark)
> json parsing regression - "." in keys
>
[
https://issues.apache.org/jira/browse/SPARK-16698?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391264#comment-15391264
]
Apache Spark commented on SPARK-16698:
--
User 'HyukjinKwon' has created a pull request for this
[
https://issues.apache.org/jira/browse/SPARK-16701?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391247#comment-15391247
]
Apache Spark commented on SPARK-16701:
--
User 'lovexi' has created a pull request for this issue:
[
https://issues.apache.org/jira/browse/SPARK-16701?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16701:
Assignee: (was: Apache Spark)
> Make parameters configurable in BlockManager
>
[
https://issues.apache.org/jira/browse/SPARK-16701?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16701:
Assignee: Apache Spark
> Make parameters configurable in BlockManager
>
YangyangLiu created SPARK-16701:
---
Summary: Make parameters configurable in BlockManager
Key: SPARK-16701
URL: https://issues.apache.org/jira/browse/SPARK-16701
Project: Spark
Issue Type:
[
https://issues.apache.org/jira/browse/SPARK-16700?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sylvain Zimmer updated SPARK-16700:
---
Component/s: (was: Spark Core)
> StructType doesn't accept Python dicts anymore
>
[
https://issues.apache.org/jira/browse/SPARK-16700?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sylvain Zimmer updated SPARK-16700:
---
Description:
Hello,
I found this issue while testing my codebase with 2.0.0-rc5
StructType
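The regression concerns schema verification rejecting dict rows. A toy verifier (plain Python; the function is hypothetical and not PySpark's actual {{StructType}} verification) sketches the expected behavior of accepting both dicts (matched by field name) and tuples (matched by position):

```python
def verify_struct(row, field_names):
    """Toy row-vs-schema check: accept dicts by field name and
    tuples/lists by position. Illustrative only, not PySpark's
    types module."""
    if isinstance(row, dict):
        # A dict row matches if every schema field is present by name.
        return all(name in row for name in field_names)
    if isinstance(row, (list, tuple)):
        # A positional row matches if its arity equals the schema's.
        return len(row) == len(field_names)
    return False
```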
[
https://issues.apache.org/jira/browse/SPARK-16700?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sylvain Zimmer updated SPARK-16700:
---
Component/s: PySpark
> StructType doesn't accept Python dicts anymore
>
Sylvain Zimmer created SPARK-16700:
--
Summary: StructType doesn't accept Python dicts anymore
Key: SPARK-16700
URL: https://issues.apache.org/jira/browse/SPARK-16700
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-16699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391239#comment-15391239
]
Dongjoon Hyun commented on SPARK-16699:
---
Hi, [~qifan].
Nice catch! By the way, usually, only
[
https://issues.apache.org/jira/browse/SPARK-16645?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Wenchen Fan resolved SPARK-16645.
-
Resolution: Fixed
Fix Version/s: 2.1.0
Issue resolved by pull request 14283
[
https://issues.apache.org/jira/browse/SPARK-16698?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391231#comment-15391231
]
Hyukjin Kwon commented on SPARK-16698:
--
It seems it does not work for all `FileFormat` data sources.
[
https://issues.apache.org/jira/browse/SPARK-16698?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391217#comment-15391217
]
Hyukjin Kwon commented on SPARK-16698:
--
FYI, this does not happen when it is read from json RDD. Let
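The comments above concern JSON keys containing a literal "." clashing with Spark's nested-field access syntax. A toy resolver (plain Python; not Spark's analyzer) shows why backtick-quoting distinguishes a literal dotted name from nested access:

```python
def resolve(path: str):
    """Split a column path on dots, treating backtick-quoted
    segments as literal names. A toy resolver only, illustrating
    why a key like "a.b" is ambiguous with nested access a.b.
    """
    parts, buf, quoted = [], "", False
    for ch in path:
        if ch == "`":
            quoted = not quoted      # toggle literal mode
        elif ch == "." and not quoted:
            parts.append(buf)        # unquoted dot ends a segment
            buf = ""
        else:
            buf += ch
    parts.append(buf)
    return parts
```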
[
https://issues.apache.org/jira/browse/SPARK-16699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16699:
Assignee: Apache Spark
> Fix performance bug in hash aggregate on long string keys
>
[
https://issues.apache.org/jira/browse/SPARK-16699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391191#comment-15391191
]
Apache Spark commented on SPARK-16699:
--
User 'ooq' has created a pull request for this issue:
[
https://issues.apache.org/jira/browse/SPARK-16699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16699:
Assignee: (was: Apache Spark)
> Fix performance bug in hash aggregate on long string
[
https://issues.apache.org/jira/browse/SPARK-16699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Qifan Pu updated SPARK-16699:
-
Description:
In the following code in `VectorizedHashMapGenerator.scala`:
```
def hashBytes(b:
Qifan Pu created SPARK-16699:
Summary: Fix performance bug in hash aggregate on long string keys
Key: SPARK-16699
URL: https://issues.apache.org/jira/browse/SPARK-16699
Project: Spark
Issue
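The description quotes {{hashBytes}} from `VectorizedHashMapGenerator.scala`; the reported performance issue is about hashing long string keys one byte per loop iteration. A plain-Python sketch of the general idea (not Spark's actual generated code; both functions are illustrative) contrasts a per-byte loop with consuming eight bytes at a time:

```python
def hash_bytes_one_at_a_time(data: bytes, seed: int = 42) -> int:
    # Simple polynomial rolling hash, one byte per iteration:
    # for long keys the loop dominates the cost.
    h = seed
    for b in data:
        h = (h * 31 + b) & 0xFFFFFFFF
    return h

def hash_words(data: bytes, seed: int = 42) -> int:
    # Same style of mixing, but consuming 8-byte words per iteration,
    # the general direction of such a fix (fewer iterations for long
    # keys). The hash values intentionally differ from the above.
    h = seed
    n = len(data) - len(data) % 8
    for i in range(0, n, 8):
        word = int.from_bytes(data[i:i + 8], "little")
        h = (h * 31 + word) & 0xFFFFFFFFFFFFFFFF
    for b in data[n:]:  # mix in any trailing bytes
        h = (h * 31 + b) & 0xFFFFFFFFFFFFFFFF
    return h
```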
[
https://issues.apache.org/jira/browse/SPARK-16685?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391168#comment-15391168
]
Patrick Wendell commented on SPARK-16685:
-
These scripts are pretty old and I'm not sure if
TobiasP created SPARK-16698:
---
Summary: json parsing regression - "." in keys
Key: SPARK-16698
URL: https://issues.apache.org/jira/browse/SPARK-16698
Project: Spark
Issue Type: Bug
[
https://issues.apache.org/jira/browse/SPARK-16589?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391108#comment-15391108
]
Maciej Szymkiewicz commented on SPARK-16589:
[~holdenk] Makes sense. I was thinking more
[
https://issues.apache.org/jira/browse/SPARK-12378?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15387889#comment-15387889
]
Chandana Sapparapu edited comment on SPARK-12378 at 7/24/16 3:46 PM:
-
[
https://issues.apache.org/jira/browse/SPARK-16695?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391088#comment-15391088
]
Sean Owen commented on SPARK-16695:
---
`sbt -Dscala-2.11 package` works for 1.6.2 on my Mac. So does
[
https://issues.apache.org/jira/browse/SPARK-3246?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391082#comment-15391082
]
Mohamed Baddar commented on SPARK-3246:
---
[~sheridanrawlins] Working on it soon, most probably on 1st
[
https://issues.apache.org/jira/browse/SPARK-16696?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Weichen Xu updated SPARK-16696:
---
Issue Type: Improvement (was: Bug)
> unused broadcast variables should call destroy instead of
[
https://issues.apache.org/jira/browse/SPARK-16697?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16697:
Assignee: Apache Spark
> redundant RDD computation in LDAOptimizer
>
[
https://issues.apache.org/jira/browse/SPARK-16697?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391067#comment-15391067
]
Apache Spark commented on SPARK-16697:
--
User 'WeichenXu123' has created a pull request for this
[
https://issues.apache.org/jira/browse/SPARK-16697?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16697:
Assignee: (was: Apache Spark)
> redundant RDD computation in LDAOptimizer
>
[
https://issues.apache.org/jira/browse/SPARK-16697?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Weichen Xu updated SPARK-16697:
---
Description:
In the submitMiniBatch method of mllib.clustering.LDAOptimizer,
the {{stats}} RDD does not
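The issue is the usual cost of an uncached RDD feeding more than one action: the lineage is re-evaluated each time. A toy model (plain Python; names are hypothetical and this is not Spark's RDD API) shows the recomputation and how caching avoids it:

```python
counter = {"runs": 0}

def expensive():
    # Stand-in for the mini-batch statistics computation; counts
    # how many times the work actually runs.
    counter["runs"] += 1
    return [1, 2, 3]

class ToyRDD:
    """Toy model of lazy re-evaluation: every action re-runs the
    compute function unless the RDD was cached first."""
    def __init__(self, compute):
        self._compute = compute
        self._cache = None

    def cache(self):
        if self._cache is None:
            self._cache = self._compute()
        return self

    def collect(self):
        return self._cache if self._cache is not None else self._compute()

uncached = ToyRDD(expensive)
uncached.collect()
uncached.collect()   # without cache(), the work runs a second time
```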
Weichen Xu created SPARK-16697:
--
Summary: redundant RDD computation in LDAOptimizer
Key: SPARK-16697
URL: https://issues.apache.org/jira/browse/SPARK-16697
Project: Spark
Issue Type:
[
https://issues.apache.org/jira/browse/SPARK-16696?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Weichen Xu updated SPARK-16696:
---
Description:
Unused broadcast variables should call destroy() instead of unpersist() so that
the
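The distinction between the two calls can be modeled with a toy class (illustrative only, not PySpark's Broadcast): unpersist() drops the executor-side copies but keeps the driver's value so the variable can be re-sent, while destroy() releases everything and makes further use an error.

```python
class ToyBroadcast:
    # A toy model of the unpersist()/destroy() distinction,
    # not PySpark's actual Broadcast class.
    def __init__(self, value):
        self._value = value           # driver-side copy
        self._executor_copies = True  # pretend executors hold blocks
        self._destroyed = False

    def value(self):
        if self._destroyed:
            raise RuntimeError("broadcast was destroyed")
        self._executor_copies = True  # using it re-sends to executors
        return self._value

    def unpersist(self):
        # Frees executor blocks only; the variable stays usable.
        self._executor_copies = False

    def destroy(self):
        # Frees everything, driver copy included; unusable afterwards.
        self.unpersist()
        self._value = None
        self._destroyed = True
```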
[
https://issues.apache.org/jira/browse/SPARK-16696?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15391050#comment-15391050
]
Apache Spark commented on SPARK-16696:
--
User 'WeichenXu123' has created a pull request for this
[
https://issues.apache.org/jira/browse/SPARK-16696?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16696:
Assignee: (was: Apache Spark)
> unused broadcast variables should call destroy
[
https://issues.apache.org/jira/browse/SPARK-16696?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16696:
Assignee: Apache Spark
> unused broadcast variables should call destroy instead of
Weichen Xu created SPARK-16696:
--
Summary: unused broadcast variables should call destroy instead of
unpersist
Key: SPARK-16696
URL: https://issues.apache.org/jira/browse/SPARK-16696
Project: Spark
Eran Mizrahi created SPARK-16695:
Summary: compile spark with scala 2.11 is not working (with Sbt)
Key: SPARK-16695
URL: https://issues.apache.org/jira/browse/SPARK-16695
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-16416?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen updated SPARK-16416:
--
Assignee: Mikael Ståldal
> Logging in shutdown hook does not work properly with Log4j 2.x
>
[
https://issues.apache.org/jira/browse/SPARK-16416?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-16416.
---
Resolution: Fixed
Fix Version/s: 2.1.0
Issue resolved by pull request 14320
[
https://issues.apache.org/jira/browse/SPARK-16667?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen updated SPARK-16667:
--
Target Version/s: (was: 1.6.0)
> Spark driver executor dont release unused memory
>
[
https://issues.apache.org/jira/browse/SPARK-16692?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen updated SPARK-16692:
--
Target Version/s: (was: 1.6.0)
> multilabel classification to DataFrame, ML
>
[
https://issues.apache.org/jira/browse/SPARK-16463?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen updated SPARK-16463:
--
Assignee: Dongjoon Hyun
> Support `truncate` option in Overwrite mode for JDBC DataFrameWriter
>
[
https://issues.apache.org/jira/browse/SPARK-16410?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-16410.
---
Resolution: Duplicate
> DataFrameWriter's jdbc method drops table in overwrite mode
>
[
https://issues.apache.org/jira/browse/SPARK-16463?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-16463.
---
Resolution: Fixed
Fix Version/s: 2.1.0
Issue resolved by pull request 14086
[
https://issues.apache.org/jira/browse/SPARK-16685?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15390990#comment-15390990
]
Sean Owen commented on SPARK-16685:
---
[~pwendell] I think you put this in place; do you know if
[
https://issues.apache.org/jira/browse/SPARK-16541?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-16541.
---
Resolution: Cannot Reproduce
> SparkTC application could not shutdown successfully
>
[
https://issues.apache.org/jira/browse/SPARK-16664?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen updated SPARK-16664:
--
Target Version/s: 2.0.1
> Spark 1.6.2 - Persist call on Data frames with more than 200 columns is
>
[
https://issues.apache.org/jira/browse/SPARK-16676?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-16676.
---
Resolution: Not A Problem
If your executors aren't starting, then you have another earlier problem.
[
https://issues.apache.org/jira/browse/SPARK-16573?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-16573.
---
Resolution: Won't Fix
> executor stderr processing tools
> -
>
>
[
https://issues.apache.org/jira/browse/SPARK-16574?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-16574.
---
Resolution: Not A Problem
> Distribute computing to each node based on certain hints
>
[
https://issues.apache.org/jira/browse/SPARK-16601?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-16601.
---
Resolution: Not A Problem
> Spark2.0 fail in creating table using sql statement "create table
>
[
https://issues.apache.org/jira/browse/SPARK-16694?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16694:
Assignee: Apache Spark (was: Sean Owen)
> Use for/foreach rather than map for Unit
[
https://issues.apache.org/jira/browse/SPARK-16694?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-16694:
Assignee: Sean Owen (was: Apache Spark)
> Use for/foreach rather than map for Unit
[
https://issues.apache.org/jira/browse/SPARK-16694?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15390982#comment-15390982
]
Apache Spark commented on SPARK-16694:
--
User 'srowen' has created a pull request for this issue:
Sean Owen created SPARK-16694:
-
Summary: Use for/foreach rather than map for Unit expressions
whose side effects are required
Key: SPARK-16694
URL: https://issues.apache.org/jira/browse/SPARK-16694
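In Scala, calling map with a Unit-returning function allocates a collection of unit values that is immediately discarded; foreach runs the side effects without that allocation. The same anti-pattern exists in Python (sketch below; variable names are illustrative) as a list comprehension executed only for its side effects versus a plain loop:

```python
items = [1, 2, 3]

# Anti-pattern (the analogue of .map over a Unit function):
# builds and discards a list of None just for the side effect.
seen = []
discarded = [seen.append(x) for x in items]

# Preferred (the analogue of .foreach): a plain loop, no throwaway list.
seen2 = []
for x in items:
    seen2.append(x)
```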