Zamil Majdy created SPARK-44718:
-----------------------------------

             Summary: High On-heap memory usage is detected while doing 
parquet-file reading with Off-Heap memory mode enabled on spark
                 Key: SPARK-44718
                 URL: https://issues.apache.org/jira/browse/SPARK-44718
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core, SQL
    Affects Versions: 3.4.1
            Reporter: Zamil Majdy


I see high on-heap memory usage during parquet file reading when off-heap memory 
mode is enabled. This happens because the memory mode for the column vectors used 
by the vectorized reader is controlled by a separate flag, whose default is 
always On-Heap.

Conf to reproduce the issue:

{{spark.memory.offHeap.size 1000000}}
{{spark.memory.offHeap.enabled true}}

Enabling these configurations alone does not switch the vectorized Parquet 
reader's column vectors to Off-Heap memory.
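As a hedged sketch of a working configuration: the column vector memory mode appears to be governed by its own flag, {{spark.sql.columnVector.offheap.enabled}} (the exact flag name and its default may vary by Spark version), which would need to be set in addition to the two settings above:

{{spark.memory.offHeap.enabled true}}
{{spark.memory.offHeap.size 1000000}}
{{spark.sql.columnVector.offheap.enabled true}}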


Proposed PR: https://github.com/apache/spark/pull/42394



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
