For anyone who encounters this issue, I figured out a solution. One problem 
was that map sent each individual element of an array to the LogLikelihood 
function, rather than the appropriate row/column vectors. I created a 
one-dimensional Any array of tuples; each element of the array held the 
appropriate vectors for one parameter set, and the function extracted them.
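
In case it helps, the packing step looked roughly like this (the names are 
just for illustration, and it assumes LogLikelihood was reworked to score a 
single parameter set, one range row and one density column, per call, with 
data available as a global on each worker as described below):

# Pack the ith row of RangeVar and the ith column of DensityVar into one
# tuple, so that map hands each call a whole parameter set rather than a
# single scalar element
TupleVar = Array(Any, size(RangeVar, 1))
for i = 1:size(RangeVar, 1)
    TupleVar[i] = (vec(RangeVar[i, :]), DensityVar[:, i])
end
TupleVarD = distribute(TupleVar)
LL = map(t -> LogLikelihood(t[1], t[2]), TupleVarD)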

The other issue was passing data to the function so that the log likelihood 
could be evaluated. I did not see a way to do this through the function or 
through map. Strangely enough, @everywhere only worked immediately after I 
initially added the procs; the problem was that I could not change the data 
with @everywhere afterwards (although it would change locally). The 
following link shows how to accomplish this goal with two sendto functions 
(both must be loaded into your session):

http://stackoverflow.com/questions/27677399/julia-how-to-copy-data-to-another-processor-in-julia
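
For reference, the two functions from that answer look essentially like the 
following (reproduced from memory, so check the link for the original):

# Assign the keyword arguments as globals on worker p
function sendto(p::Int; args...)
    for (nm, val) in args
        @spawnat(p, eval(Main, Expr(:(=), nm, val)))
    end
end

# Same, but for a list of workers
function sendto(ps::Vector{Int}; args...)
    for p in ps
        sendto(p; args...)
    end
end

# For example, this assigns the local value of data to the global data
# on every worker:
# sendto(workers(), data = data)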

There may be more elegant solutions, but this appears to work. 

On Tuesday, April 28, 2015 at 2:25:15 PM UTC-4, Christopher Fisher wrote:
>
>
> I'm fitting a complex cognitive model to data. Because the model does not 
> have a closed-form solution for the likelihood, computationally intensive 
> simulation is required to generate the model predictions for fitting. My 
> fitting routine involves two steps: (1) a brute force search of the 
> plausible parameter space to find a good starting point for (2) fine-tuning 
> with Optim. I have been able to parallelize the second step, but I am 
> having trouble with the first. 
>  
> Here is what I did: I precomputed kernel density functions from 100,000 
> parameter combinations, resulting in three files: (1) ParmList, a list of 
> the 100,000 parameter combinations, (2) RangeVar, the range inputs to the 
> kernel density function, UnivariateKDE(), and (3) DensityVar, the density 
> inputs, also for UnivariateKDE(). I would like to use a distributed array 
> to compute the log likelihood of various sets of data across the 100,000 
> parameter combinations. I'm having trouble using map with a function that 
> loops over the kernel densities. I'm also having trouble getting the data 
> to the function. The basic code I have is provided below. Any help would 
> be greatly appreciated.
>  
> addprocs(16)
>
> @everywhere using KernelDensity
>
> # RangeVar and DensityVar are precomputed inputs for the kernel density
> # function UnivariateKDE()
>
> # 100,000 by 4 array
> RangeVar = readcsv("RangeVar.csv")
>
> # 2048 by 100,000 array
> DensityVar = readcsv("DensityVar.csv")
>
> # List of parameters corresponding to each kernel density function
> # 100,000 by 4 array
> ParmList = readcsv("ParmList.csv")
>
> # Convert to distributed arrays
> RangeVarD = distribute(RangeVar)
> DensityVarD = distribute(DensityVar)
>
> # Example data
> data = [.34 .27 .32 .35 .34 .33]
>
> @everywhere function LogLikelihood(DensityVar, RangeVar, data)
>     KDLL = zeros(size(RangeVar, 1))
>     # Loop over the parameter combinations, grabbing the appropriate inputs
>     # to reconstruct the kernel density functions
>     for i = 1:size(RangeVar, 1)
>         range = FloatRange(RangeVar[i,1], RangeVar[i,2], RangeVar[i,3], RangeVar[i,4])
>         Dens = DensityVar[:, i]
>         # Reconstruct the kernel density function corresponding to the ith
>         # parameter set
>         f = UnivariateKDE(range, Dens)
>         # Estimate the likelihood of the data given the parameters
>         L = pdf(f, data)
>         # Compute the summed log likelihood
>         KDLL[i] = sum(log(L))
>     end
>     return KDLL
> end