Hi Christopher:

Thank you for the suggestions.  I think I understand.  

Is the fix to create a "super" array, super_array, that contains all the 
inputs PF_outer wants to pass to PF_inner, i.e., output = pmap(PF_inner, 
super_array)?
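
For concreteness, something like this is what I have in mind (just a rough 
sketch with dummy values; super_array, the tuple layout, and the 
single-argument method for PF_inner are my own guesses, not your actual 
code):

yyy    = randn(3, 100)      # nvar x obs data (dummy values)
params = [0.3, 2.0, 0.33]   # placeholder parameters
coeffs = [0.9, 0.1]         # placeholder coefficients
mprt   = 500                # number of particles
j      = 1                  # current observation

# one entry of super_array per particle; each entry bundles everything
# PF_inner needs for that particle
super_array = Any[(yyy[:, j], params, coeffs, m) for m = 1:mprt]

# PF_inner would then need a method that accepts the single bundled
# argument and unpacks it; @everywhere makes it visible on the workers
@everywhere function PF_inner(args)
    yj, params, coeffs, m = args
    # ... particle filter computations for particle m would go here ...
    return sum(yj)          # stand-in return value
end

output = pmap(PF_inner, super_array)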

I will give your suggestions a go later today or tomorrow and let you know 
the outcome.

Best wishes,

Jim

On Saturday, October 10, 2015 at 8:38:51 AM UTC-4, Christopher Fisher wrote:
>
> Another thing to check is how you are mapping your inputs to the available 
> workers. Just to give you a simple example, suppose you have a function 
> called MyModel that accepts an array of parameters and each worker receives 
> the same parameters. The basic steps would be:
>
> Nworkers = 3
>
> parms = [.3 2.0 .33]
>
> #Array of parameter arrays, one for each worker
> ParmArray = Any[parms for i = 1:Nworkers]
>
> output = pmap(MyModel,ParmArray)
>
> So you would have to adapt your more complex input structure to the 
> example above. 
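>
> Note that pmap calls the function once per element of the collection, so 
> the function must have a method that accepts that single element. A 
> minimal sketch of what MyModel might look like (the body is just a 
> stand-in):
>
> @everywhere function MyModel(parms)
>     # parms is one element of ParmArray, i.e. the array [.3 2.0 .33]
>     return sum(parms)   # stand-in for the real computation
> end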
>
>
> On Wednesday, October 7, 2015 at 7:46:13 PM UTC-4, [email protected] 
> wrote:
>>
>> Hi All:
>>
>> Julia is started in a terminal session with julia -p 4.
>>
>> A program, test_SI_AR1.jl, is run from the Julia command line and returns 
>> the following:
>>
>>
>> exception on exception on exception on 4: exception on 3: 2: 5: ERROR: 
>> `PF_SI_AR1_inner` has no method matching PF_SI_AR1_inner(::Array{Any,1})
>>  in anonymous at multi.jl:855
>>  in run_work_thunk at multi.jl:621
>>  in anonymous at task.jl:855
>> ERROR: `PF_SI_AR1_inner` has no method matching 
>> PF_SI_AR1_inner(::Array{Any,1})
>>  in anonymous at multi.jl:855
>>  in run_work_thunk at multi.jl:621
>>  in anonymous at task.jl:855
>> ERROR: `PF_SI_AR1_inner` has no method matching 
>> PF_SI_AR1_inner(::Array{Any,1})
>>  in anonymous at multi.jl:855
>>  in run_work_thunk at multi.jl:621
>>  in anonymous at task.jl:855
>> ERROR: `PF_SI_AR1_inner` has no method matching 
>> PF_SI_AR1_inner(::Array{Any,1})
>>  in anonymous at multi.jl:855
>>  in run_work_thunk at multi.jl:621
>>  in anonymous at task.jl:855
>> ERROR: `exp` has no method matching exp(::Array{MethodError,1})
>>  in PF_SI_AR1_outer at 
>> /home/jim_nason/jmn_work/smith/NS4/jl_code_Summer2015/SI_rho/MH_PF_test/PF_outer_example.jl:123
>>  in include at ./boot.jl:245
>>  in include_from_node1 at ./loading.jl:128
>>  in reload_path at loading.jl:152
>>  in _require at loading.jl:67
>>  in require at loading.jl:51
>> while loading 
>> /home/jim_nason/jmn_work/smith/NS4/jl_code_Summer2015/SI_rho/MH_PF_test/test_SI_AR1.jl,
>>  
>> in expression starting on line 88
>>
>>
>> Note that these error statements keep repeating until Julia is forced to 
>> shut down.
>>
>> The code for PF_outer_example.jl is attached.  PF_outer_example.jl calls 
>> PF_inner_example.jl, which is also attached.
>>
>> This code implements a particle filter Markov chain Monte Carlo estimator 
>> of a state space model.  The particle filter is run in 
>> PF_inner_example.jl.  
>>
>> test_SI_AR1.jl loads the parameters and coefficients of the model to be 
>> estimated, along with the data, yyy, which is an nvar x obs array, and 
>> passes these to PF_outer_example.jl.
>>
>> PF_outer_example.jl runs the particle filter on all the observations of 
>> yyy, j = 1, 2, ..., obs.  At each j, PF_outer_example.jl does several 
>> computations and passes these results, along with yyy[:,j] and the 
>> parameters and coefficients, to PF_inner_example.jl.
>>
>> PF_inner_example.jl runs the particle filter on observation yyy[:,j] for 
>> m = 1, 2, ..., mprt replications.  mprt is the number of particles, which 
>> can be anywhere from 500 to 10,000.
>>
>> I am trying to implement a pmap command in PF_outer_example.jl to run the 
>> mprt particles in PF_inner_example.jl in parallel, so far with no success.
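>>
>> In other words, at each j the goal is a call of roughly this shape 
>> (schematic only; inputs_j is a placeholder name I am using here, and the 
>> actual construction is in PF_outer_example.jl):
>>
>> # inputs_j: one entry per particle m = 1, ..., mprt, each holding
>> # yyy[:,j] plus the parameters and coefficients
>> output = pmap(PF_SI_AR1_inner, inputs_j)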
>>
>> I have tried a couple of variations of the pmap command, but the same 
>> error message above is always returned by Julia.  The variations of the 
>> pmap command that I have tried are listed in PF_outer_example.jl.
>>
>> Obviously, I do not understand something that is fundamental to pmap.  
>> Any advice/suggestions are welcome.
>>
>> Best,
>>
>> Jim
>>
>>
>>
