For those interested in this question: after some off-list discussion, it ended up getting an expanded version and an answer over on Stack Overflow:
https://stackoverflow.com/questions/45172614/use-chapel-to-handle-massive-matrix

-Brad

On Tue, 18 Jul 2017, buddha via Chapel-users wrote:
I have a matrix that is too large for my current machines to load directly into memory. Is there a preferred pattern in the Chapel community for something like incremental loads + distribution? Spark uses the RDD notion to persist to disk strategically. Does Chapel support a pattern like that?

Thanks,
b

~~~~~
May All Your Sequences Converge

------------------------------------------------------------------------------
Check out the vibrant tech community on one of the world's most engaging
tech sites, Slashdot.org! http://sdm.link/slashdot
_______________________________________________
Chapel-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/chapel-users
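[Editor's note: the Stack Overflow answer linked above has the authoritative discussion. As a rough illustration of the "incremental load + distribution" idea the question asks about, one common Chapel pattern is to declare a Block-distributed array and fill it a band of rows at a time from a reader on locale 0, so only one band's worth of I/O is buffered at once. The sketch below is not from the thread; the filename "matrix.txt" and the band size are made up, and I/O identifiers such as ioMode/iomode have varied across Chapel releases, so treat this as pseudocode-adjacent rather than a drop-in program.]

use BlockDist, IO;

config const n = 8,          // matrix dimension (small for illustration)
             blockRows = 2;  // rows to read per increment

// Block-distribute the matrix across all locales.
const Space = {1..n, 1..n};
const D = Space dmapped Block(boundingBox=Space);
var A: [D] real;

// Hypothetical input: whitespace-separated reals, row-major order.
var f = open("matrix.txt", ioMode.r);
var r = f.reader();

// Read a band of rows at a time; each assignment lands on the
// locale that owns A[i, j] under the Block distribution.
for lo in 1..n by blockRows {
  const band = lo..min(lo + blockRows - 1, n);
  for (i, j) in {band, 1..n} do
    r.read(A[i, j]);
}

[For true Spark-RDD-style spill-to-disk persistence there is no built-in equivalent; the usual approach is to manage the out-of-core staging yourself, as the Stack Overflow answer discusses.]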
