Hi Ken,

Interesting code you have there; I will have to take a closer look at it. Yes, I would be happy to collaborate, but let me first try my problem out in Julia. I am new to Julia, and I am currently debating whether the code I want to run will be faster in Python using mpi4py or in Julia in parallel. I am definitely more familiar with Python. Keep in touch.
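[Editor's note: the parallel-map multistart approach suggested in the quoted thread below can be sketched in Julia roughly as follows. This is only a sketch, not the MinFinder code from Ken's pull request: the Rosenbrock objective, the worker count, and the 20 uniform starting points are illustrative stand-ins for the actual MLE problem. It is written against recent Julia, where the parallel primitives live in the standard-library Distributed module; in the Julia of 2014 they were available by default.]

```julia
# Sketch: multistart local minimization via a parallel map.
using Distributed
addprocs(4)                      # one worker per core; adjust as needed

@everywhere using Optim, Random

# Illustrative objective; replace with the (negative log-)likelihood.
@everywhere rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# 20 random starting points generated on the master process,
# so a single seed controls reproducibility.
Random.seed!(42)
starts = [rand(2) .* 4 .- 2 for _ in 1:20]

# pmap hands each starting point to the next free worker, so slow
# optimizations don't stall the fast ones.
results = pmap(x0 -> optimize(rosenbrock, x0, NelderMead()), starts)

# Pick the best local solve.
best = results[argmin(Optim.minimum.(results))]
Optim.minimizer(best)
```

Because `pmap` schedules dynamically, the load-balancing concern raised below (some optimizations taking much longer than others) is handled automatically.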
Charles

On Saturday, July 26, 2014 11:26:31 PM UTC-7, Ken B wrote:
>
> Hi Charles,
>
> You can have a look at the MinFinder algorithm, for which I've just created a pull request to Optim.jl (talk about a coincidence!):
> https://github.com/JuliaOpt/Optim.jl/pull/72
>
> I'd like to add the possibility to run each optimization in parallel, but I have no experience with these things, although I have time to learn :). Would you like to collaborate on this?
>
> Does anyone know of some parallel sample code to have a look at? Basically it's sending each optimization problem to a separate worker and getting the results, taking into account that some optimizations might take much longer than others.
>
> Cheers,
> Ken
>
> On Saturday, 26 July 2014 23:13:28 UTC-5, Charles Martineau wrote:
>>
>> Yes, I could do that, but it is simpler (I think) to execute the code in parallel instead of sending 20 jobs to be executed on the cluster.
>>
>> On Saturday, July 26, 2014 10:08:20 AM UTC-7, Michael Prentiss wrote:
>>>
>>> What you are doing makes sense. Starting from multiple starting points is important.
>>>
>>> I am curious why you don't just run 20 different 1-processor jobs instead of bothering with the parallelism?
>>>
>>> On Saturday, July 26, 2014 11:22:07 AM UTC-5, Iain Dunning wrote:
>>>>
>>>> The idea is to call the optimize function multiple times in parallel, not to call it once and let it do parallel multistart.
>>>>
>>>> Check out the "parallel map and loops" section of the parallel programming chapter in the Julia manual; I think it'll be clearer there.
>>>>
>>>> On Friday, July 25, 2014 8:00:40 PM UTC-4, Charles Martineau wrote:
>>>>>
>>>>> Thank you for your answer. So I would have to loop over, say, 20 random sets of starting points, where in my loop I would use the Optim package to minimize my MLE function for each random set.
>>>>> Where online is the documentation that shows how to specify that we want the command
>>>>>
>>>>> Optim.optimize(my function, etc.) to be parallelized? Sorry for my ignorance, I am new to Julia!
>>>>>
>>>>> On Friday, July 25, 2014 2:04:08 PM UTC-7, Iain Dunning wrote:
>>>>>>
>>>>>> I'm not familiar with that particular package, but the Julia way to do it could be to use the Optim.jl package, create a random set of starting points, and do a parallel map over that set of starting points. It should work quite well. Trickier (maybe) would be to just give each processor a different random seed and generate starting points on each processor.
>>>>>>
>>>>>> On Friday, July 25, 2014 3:05:05 PM UTC-4, Charles Martineau wrote:
>>>>>>>
>>>>>>> Dear Julia developers and users,
>>>>>>>
>>>>>>> I am currently using in Matlab the MultiStart algorithm to find multiple local minima of an MLE function:
>>>>>>> http://www.mathworks.com/help/gads/multistart-class.html
>>>>>>> I use MultiStart in a parallel setup as well.
>>>>>>>
>>>>>>> Can I do something similar in Julia using parallel programming?
>>>>>>>
>>>>>>> Thank you,
>>>>>>>
>>>>>>> Charles
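[Editor's note: Iain's "trickier (maybe)" alternative, giving each task its own random seed so starting points are generated on the workers rather than shipped from the master, might look like the sketch below. The objective function, seed scheme, and task count are all illustrative assumptions, not code from the thread.]

```julia
using Distributed
addprocs(4)

@everywhere using Optim, Random

# Illustrative objective; stand-in for the MLE function.
@everywhere f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Each task seeds its own RNG from its index, so the starting point is
# generated on the worker itself, yet the run remains reproducible.
results = pmap(1:20) do i
    Random.seed!(1000 + i)       # distinct, deterministic seed per task
    x0 = rand(2) .* 4 .- 2       # starting point drawn on the worker
    optimize(f, x0, NelderMead())
end

# Best objective value found across all starts.
minimum(Optim.minimum.(results))
```

The trade-off versus generating all starting points on the master is avoiding the serialization of the points, at the cost of a slightly more involved seeding scheme.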
