[
https://issues.apache.org/jira/browse/SYSTEMML-2476?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
LI Guobao updated SYSTEMML-2476:
--------------------------------
Description:
When using scalar casting to get an element from a list, unexpected MapReduce
tasks are launched instead of running the operation in CP mode. To reproduce,
replace *C = 1* with *C = as.scalar(hyperparams["C"])* inside the
{{_gradient function_}} found in
{{_src/test/scripts/functions/paramserv/mnist_lenet_paramserv.dml_}}
(a minimal sketch of the change is shown below), then run the test method
{{_testParamservBSPBatchDisjointContiguous_}} of the class
{{org.apache.sysml.test.integration.functions.paramserv.ParamservLocalNNTest}}.
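For reference, here is a minimal sketch of the DML change; only the two
assignments are taken from this issue, the surrounding code of the gradient
function is omitted:
{code}
# inside the gradient function of mnist_lenet_paramserv.dml
# before (stays in CP as expected):
C = 1
# after (getting the list element via as.scalar triggers the MR job below):
C = as.scalar(hyperparams["C"])
{code}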
Here is the relevant log output:
{code:java}
18/07/31 22:10:27 INFO mapred.MapTask: numReduceTasks: 1
18/07/31 22:10:27 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
18/07/31 22:10:27 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
18/07/31 22:10:27 INFO mapred.MapTask: soft limit at 83886080
18/07/31 22:10:27 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
18/07/31 22:10:27 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
18/07/31 22:10:27 INFO mapreduce.Job: The url to track the job:
http://localhost:8080/
18/07/31 22:10:27 INFO mapreduce.Job: Running job: job_local792652629_0008
{code}
[~mboehm7], could you take a look at this if possible? I have also double-checked
the creation of the execution context in {{ParamservBuiltinCPInstruction}}: it is
an instance of {{ExecutionContext}}, not {{SparkExecutionContext}}.
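For completeness, a rough sketch of the kind of check used to verify the context
type; the class and method names here ({{EcTypeCheck}}, {{logContextType}}) are
hypothetical, and the imports assume the usual SystemML context package:
{code:java}
// Debugging sketch only (hypothetical helper, not part of ParamservBuiltinCPInstruction):
// logs whether a given execution context is Spark-backed or a plain CP context.
import org.apache.sysml.runtime.controlprogram.context.ExecutionContext;
import org.apache.sysml.runtime.controlprogram.context.SparkExecutionContext;

public class EcTypeCheck {
  public static void logContextType(ExecutionContext ec) {
    if (ec instanceof SparkExecutionContext)
      System.out.println("SparkExecutionContext (Spark-backed)");
    else
      System.out.println("plain " + ec.getClass().getSimpleName());
  }
}
{code}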
> Unexpected mapreduce task
> -------------------------
>
> Key: SYSTEMML-2476
> URL: https://issues.apache.org/jira/browse/SYSTEMML-2476
> Project: SystemML
> Issue Type: Bug
> Reporter: LI Guobao
> Priority: Major
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)