[ https://issues.apache.org/jira/browse/SPARK-13352?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15234488#comment-15234488 ]

Zhang, Liye edited comment on SPARK-13352 at 4/11/16 5:02 AM:
--------------------------------------------------------------

Hi [~davies], I think this JIRA is related to 
[SPARK-14242|https://issues.apache.org/jira/browse/SPARK-14242] and 
[SPARK-14290|https://issues.apache.org/jira/browse/SPARK-14290]. Can you test 
with the Spark master branch again to see if this issue still exists?


was (Author: liyezhang556520):
Hi [~davies], I think this JIRA is related to 
[SPARK-14242|https://issues.apache.org/jira/browse/SPARK-14242] and 
[SPARK-14290|https://issues.apache.org/jira/browse/SPARK-14290]. Can you test 
with Spark master again to see if this issue still exists?

> BlockFetch does not scale well on large block
> ---------------------------------------------
>
>                 Key: SPARK-13352
>                 URL: https://issues.apache.org/jira/browse/SPARK-13352
>             Project: Spark
>          Issue Type: Bug
>          Components: Block Manager, Spark Core
>            Reporter: Davies Liu
>            Priority: Critical
>
> BlockManager.getRemoteBytes() performs poorly on large blocks:
> {code}
>   test("block manager") {
>     import java.nio.ByteBuffer
>     import org.apache.spark.storage.{StorageLevel, TaskResultBlockId}
>     val N = 500 << 20  // 500 MB
>     val bm = sc.env.blockManager
>     val blockId = TaskResultBlockId(0)
>     // Store a 500 MB serialized block, then fetch it back via getRemoteBytes().
>     val buffer = ByteBuffer.allocate(N)
>     buffer.limit(N)
>     bm.putBytes(blockId, buffer, StorageLevel.MEMORY_AND_DISK_SER)
>     val result = bm.getRemoteBytes(blockId)
>     assert(result.isDefined)
>     assert(result.get.limit() === (N))
>   }
> {code}
> Here are the runtimes for different block sizes:
> {code}
> 50M     3 seconds
> 100M    7 seconds
> 250M   33 seconds
> 500M  120 seconds (2 min)
> {code}
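
For reference, here is a minimal sketch of how the timings above could be collected for the different sizes in one run. It assumes the same setting as the test in the description (code living under org.apache.spark so that sc.env is reachable, and the ByteBuffer-based 1.x BlockManager API); the loop and sizes are illustrative only, not part of the issue.
{code}
// Sketch: time BlockManager.getRemoteBytes() for each block size in the table.
import java.nio.ByteBuffer
import org.apache.spark.storage.{StorageLevel, TaskResultBlockId}

val bm = sc.env.blockManager
for ((sizeMB, i) <- Seq(50, 100, 250, 500).zipWithIndex) {
  val n = sizeMB << 20
  val blockId = TaskResultBlockId(i)
  bm.putBytes(blockId, ByteBuffer.allocate(n), StorageLevel.MEMORY_AND_DISK_SER)

  val start = System.nanoTime()
  val result = bm.getRemoteBytes(blockId)
  val elapsedSec = (System.nanoTime() - start) / 1e9

  assert(result.isDefined && result.get.limit() == n)
  println(s"${sizeMB}M fetched in $elapsedSec seconds")
  bm.removeBlock(blockId)   // free memory before the next, larger block
}
{code}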


