Hi,

In which version of Spark will this fix be available?
The deployment is on EMR.

Regards,
Snehasish

On Fri, Feb 9, 2018 at 8:51 PM, Wenchen Fan <cloud0...@gmail.com> wrote:

> It should be fixed by https://github.com/apache/spark/pull/20561 soon.
>
> On Fri, Feb 9, 2018 at 6:16 PM, Wenchen Fan <cloud0...@gmail.com> wrote:
>
>> This has been reported before: http://apache-spark-developers-list.1001551.n3.nabble.com/java-lang-IllegalStateException-There-is-no-space-for-new-record-tc20108.html
>>
>> I think we may have a real bug here, but we need a reproduction. Can you
>> provide one? Thanks!
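
For reference, a self-contained reproduction would look roughly like the sketch below: a hash aggregation with enough distinct keys to push the in-memory aggregation map into spilling through UnsafeKVExternalSorter. The row count, key cardinality, and output path are placeholders, and this exact job is not confirmed to trigger the error.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical reproduction sketch: a wide hash aggregation over many
# distinct keys, meant to force UnsafeFixedWidthAggregationMap to spill
# via destructAndCreateExternalSorter. All sizes are placeholders.
spark = SparkSession.builder.appName("no-space-for-new-record-repro").getOrCreate()

df = spark.range(0, 50000000)  # raise until memory pressure appears
agg = (df.withColumn("key", F.col("id") % 10000000)
         .groupBy("key")
         .agg(F.count(F.lit(1)).alias("cnt"), F.sum("id").alias("total")))
agg.write.mode("overwrite").parquet("/tmp/no_space_repro")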
>>
>> On Fri, Feb 9, 2018 at 5:59 PM, SNEHASISH DUTTA <info.snehas...@gmail.com
>> > wrote:
>>
>>> Hi,
>>>
>>> I am facing the following error when running on EMR:
>>>
>>> Caused by: java.lang.IllegalStateException: There is no space for new record
>>>         at org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.insertRecord(UnsafeInMemorySorter.java:226)
>>>         at org.apache.spark.sql.execution.UnsafeKVExternalSorter.<init>(UnsafeKVExternalSorter.java:132)
>>>         at org.apache.spark.sql.execution.UnsafeFixedWidthAggregationMap.destructAndCreateExternalSorter(UnsafeFixedWidthAggregationMap.java:250)
>>>
>>> I am using PySpark 2.2. What Spark configuration should be changed or
>>> modified to get this resolved?
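
In the meantime, the configuration knobs usually tried for memory pressure in this aggregation/sort path are the executor heap size, the execution-memory fraction, and the shuffle partition count. The sketch below shows where they would be set; the values are illustrative placeholders, not a confirmed fix, since the root cause is the bug tracked in the pull request above.

from pyspark.sql import SparkSession

# Workaround sketch only: common knobs for memory pressure in the
# aggregation/sort path. Values are placeholders, not recommendations.
# On EMR/YARN these are often passed via spark-submit --conf instead.
spark = (SparkSession.builder
         .appName("memory-tuning-sketch")
         .config("spark.executor.memory", "8g")          # larger executor heap
         .config("spark.memory.fraction", "0.7")         # share of heap for execution + storage
         .config("spark.sql.shuffle.partitions", "400")  # more, smaller output partitions
         .getOrCreate())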
>>>
>>>
>>> Regards,
>>> Snehasish
>>>
>>
>
