[jira] [Commented] (SPARK-25776) The disk write buffer size must be greater than 12.

2018-11-04 Thread Kazuaki Ishizaki (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-25776?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16674442#comment-16674442 ]

Kazuaki Ishizaki commented on SPARK-25776:
--

Issue resolved by pull request 22754
https://github.com/apache/spark/pull/22754

> The disk write buffer size must be greater than 12.
> ---
>
> Key: SPARK-25776
> URL: https://issues.apache.org/jira/browse/SPARK-25776
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: liuxian
>Assignee: liuxian
>Priority: Minor
> Fix For: 3.0.0
>
>
> In {{UnsafeSorterSpillWriter.java}}, when we write a record to a spill file 
> with {{void write(Object baseObject, long baseOffset, int recordLength, long keyPrefix)}}, 
> {{recordLength}} and {{keyPrefix}} are written to the disk write buffer 
> first. Together they take 12 bytes, so the disk write buffer size must be 
> greater than 12.
> If {{diskWriteBufferSize}} is 10, it throws this exception:
> _java.lang.ArrayIndexOutOfBoundsException: 10_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.writeLongToBuffer(UnsafeSorterSpillWriter.java:91)_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.write(UnsafeSorterSpillWriter.java:123)_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spillIterator(UnsafeExternalSorter.java:498)_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spill(UnsafeExternalSorter.java:222)_
>  _at org.apache.spark.memory.MemoryConsumer.spill(MemoryConsumer.java:65)_
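
For illustration, here is a minimal standalone sketch (class and method names are assumed; this is not the actual Spark implementation) of why 12 bytes is the lower bound: an {{int}} record length (4 bytes) and a {{long}} key prefix (8 bytes) are staged in the write buffer before the record payload, so a buffer smaller than 12 bytes overflows on the header alone.

{code:java}
// Hypothetical demo class, not part of Spark. It only mimics the header
// writes that precede the record payload in the spill writer.
public class DiskWriteBufferDemo {

    private static int pos = 0;

    // Writes a 4-byte int into the buffer (big-endian), like the
    // record-length header.
    private static void writeIntToBuffer(byte[] buf, int v) {
        buf[pos++] = (byte) (v >>> 24);
        buf[pos++] = (byte) (v >>> 16);
        buf[pos++] = (byte) (v >>> 8);
        buf[pos++] = (byte) v;
    }

    // Writes an 8-byte long into the buffer, like the key-prefix header.
    private static void writeLongToBuffer(byte[] buf, long v) {
        writeIntToBuffer(buf, (int) (v >>> 32));
        writeIntToBuffer(buf, (int) v);
    }

    public static void main(String[] args) {
        int diskWriteBufferSize = 10;          // any value < 12 fails
        byte[] writeBuffer = new byte[diskWriteBufferSize];

        writeIntToBuffer(writeBuffer, 42);     // recordLength: 4 bytes
        writeLongToBuffer(writeBuffer, 7L);    // keyPrefix: 8 more bytes
        // The 11th byte (index 10) does not fit, so this throws
        // java.lang.ArrayIndexOutOfBoundsException before any payload is copied.
    }
}
{code}

Running this with {{diskWriteBufferSize = 10}} reproduces the {{ArrayIndexOutOfBoundsException: 10}} shown in the quoted stack trace (the exact message text varies by JDK version).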



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-25776) The disk write buffer size must be greater than 12.

2018-10-18 Thread Apache Spark (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-25776?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16656180#comment-16656180 ]

Apache Spark commented on SPARK-25776:
--

User '10110346' has created a pull request for this issue:
https://github.com/apache/spark/pull/22754

> The disk write buffer size must be greater than 12.
> ---
>
> Key: SPARK-25776
> URL: https://issues.apache.org/jira/browse/SPARK-25776
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: liuxian
>Priority: Minor
>
> In {{UnsafeSorterSpillWriter.java}}, when we write a record to a spill file 
> with {{void write(Object baseObject, long baseOffset, int recordLength, long keyPrefix)}}, 
> {{recordLength}} and {{keyPrefix}} are written to the disk write buffer 
> first. Together they take 12 bytes, so the disk write buffer size must be 
> greater than 12.
> If {{diskWriteBufferSize}} is 10, it throws this exception:
> _java.lang.ArrayIndexOutOfBoundsException: 10_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.writeLongToBuffer(UnsafeSorterSpillWriter.java:91)_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.write(UnsafeSorterSpillWriter.java:123)_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spillIterator(UnsafeExternalSorter.java:498)_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spill(UnsafeExternalSorter.java:222)_
>  _at org.apache.spark.memory.MemoryConsumer.spill(MemoryConsumer.java:65)_






[jira] [Commented] (SPARK-25776) The disk write buffer size must be greater than 12.

2018-10-18 Thread Apache Spark (JIRA)


[ https://issues.apache.org/jira/browse/SPARK-25776?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16656177#comment-16656177 ]

Apache Spark commented on SPARK-25776:
--

User '10110346' has created a pull request for this issue:
https://github.com/apache/spark/pull/22754

> The disk write buffer size must be greater than 12.
> ---
>
> Key: SPARK-25776
> URL: https://issues.apache.org/jira/browse/SPARK-25776
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 3.0.0
>Reporter: liuxian
>Priority: Minor
>
> In {{UnsafeSorterSpillWriter.java}}, when we write a record to a spill file 
> with {{void write(Object baseObject, long baseOffset, int recordLength, long keyPrefix)}}, 
> {{recordLength}} and {{keyPrefix}} are written to the disk write buffer 
> first. Together they take 12 bytes, so the disk write buffer size must be 
> greater than 12.
> If {{diskWriteBufferSize}} is 10, it throws this exception:
> _java.lang.ArrayIndexOutOfBoundsException: 10_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.writeLongToBuffer(UnsafeSorterSpillWriter.java:91)_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.write(UnsafeSorterSpillWriter.java:123)_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spillIterator(UnsafeExternalSorter.java:498)_
>  _at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spill(UnsafeExternalSorter.java:222)_
>  _at org.apache.spark.memory.MemoryConsumer.spill(MemoryConsumer.java:65)_


