Short update: the merging of this PR into Optim.jl has been postponed. 
Meanwhile it is available as a separate package: 
`Pkg.clone("https://github.com/Ken-B/MinFinder.jl.git")`. Feel free to try it 
out and, as usual, all feedback is welcome :)

Ken

On Sunday, 27 July 2014 23:06:34 UTC-5, Ken B wrote:
>
> Hi Hans,
>
> 1) You're welcome :) The code is in the PR: 
> https://github.com/JuliaOpt/Optim.jl/pull/73 and, if all goes well, it could 
> be merged soon. Do try it out and share your experience.
>
> 2) MinFinder 2.0 introduces two new stopping rules and an extra validation 
> rule for sample points. If there is interest, I could add these as options 
> to the current code.
>
> 3) The package test uses the rosenbrock, camel, rastrigin and 
> shekel(5,7,10) examples from the paper. They can be found in the folder 
> `problems`. Some more are described at 
> http://www2.compute.dtu.dk/~kajm/Test_ex_forms/test_ex.html. I would be 
> happy to add those you have already implemented; let me know where the 
> code is. 
>
> 4) Good question. As far as I know, these optimization procedures only 
> look at specific function values (and sometimes their derivatives) and then 
> hop around, so it is impossible to guarantee finding all minima in finite 
> time :) I guess a program could solve the equations from automatic 
> differentiation if analytical solutions exist. However, if not, it would 
> seem to me that you can never be sure.
> The minfinder algorithm has a parameter EXHAUSTIVE (`p` in the paper) that 
> controls how exhaustively you explore the search space. You can tune that 
> parameter to your problem.
>
> Cheers,
> Ken
>
> On Sunday, 27 July 2014 08:25:28 UTC-5, Hans W Borchers wrote:
>>
>> Ken:
>>
>> (1) Thanks for pointing out this approach and for implementing it.
>> Unfortunately, I was not able to locate your code on GitHub. I would 
>> certainly try it out on some of my examples in global optimization.
>>
>> (2) Did you include (or do you plan to include) the improvements of 
>> MinFinder,
>> as discussed in "MinFinder 2.0: An improved version of MinFinder" by 
>> Tsoulos and Lagaris?
>>
>> (3) This article also contains examples of functions with many local 
>> minima. Most of these are test functions for global optimization 
>> procedures. Did you test your function on these examples?
>>
>> I have implemented some of these functions for my own purposes.
>> I wonder whether it would be useful to have a dedicated Julia package 
>> for collecting optimization test functions.
>>
>> (4) Are you sure/Is it guaranteed that MinFinder will *reliably* find 
>> *all* local minima?
>> This is a difficult problem; for example, Chapter 4 of "The SIAM 100-Digit 
>> Challenge", by Stan Wagon, contains a long discussion of all the preventive 
>> measures that must be taken to guarantee finding all local minima -- and 
>> thus also the one global minimum.
>>
>>
>> On Sunday, July 27, 2014 8:26:31 AM UTC+2, Ken B wrote:
>>>
>>> Hi Charles,
>>>
>>> You can have a look at the MinFinder algorithm for which I've just 
>>> created a pull request to Optim.jl (talk about a coincidence!):
>>> https://github.com/JuliaOpt/Optim.jl/pull/72
>>>
>>> I'd like to add the ability to run each optimization in parallel, 
>>> but I have no experience with these things, although I have time to learn 
>>> :). Would you like to collaborate on this? 
>>>
>>> Does anyone know of some parallel sample code to have a look at? 
>>> Basically it means sending each optimization problem to a separate worker 
>>> and collecting the results, taking into account that some optimizations 
>>> might take much longer than others.
>>>
>>> Cheers,
>>> Ken
>>>
>>> On Saturday, 26 July 2014 23:13:28 UTC-5, Charles Martineau wrote:
>>>>
>>>> Yes, I could do that, but it is simpler (I think) to execute the code in 
>>>> parallel instead of sending 20 separate jobs to be executed on the 
>>>> cluster.
>>>>
>>>> On Saturday, July 26, 2014 10:08:20 AM UTC-7, Michael Prentiss wrote:
>>>>>
>>>>> What you are doing makes sense.  Starting from multiple starting 
>>>>> points is important.
>>>>>
>>>>> I am curious why you don't just run 20 different 1-processor jobs 
>>>>> instead of bothering with the parallelism?
>>>>>
>>>>>
>>>>> On Saturday, July 26, 2014 11:22:07 AM UTC-5, Iain Dunning wrote:
>>>>>>
>>>>>> The idea is to call the optimize function multiple times in parallel, 
>>>>>> not to call it once and let it do parallel multistart.
>>>>>>
>>>>>> Check out the "parallel map and loops" section of the parallel 
>>>>>> programming chapter in the Julia manual; I think it'll be clearer there.
>>>>>>
>>>>>> On Friday, July 25, 2014 8:00:40 PM UTC-4, Charles Martineau wrote:
>>>>>>>
>>>>>>> Thank you for your answer. So I would have to loop over, say, 20 
>>>>>>> random sets of starting points, and inside the loop use the Optim 
>>>>>>> package to minimize my MLE function for each set. Where online is the 
>>>>>>> documentation that shows how to specify that we want the call 
>>>>>>> Optim.optimize(my function, etc.) to be parallelized? Sorry for my 
>>>>>>> ignorance, I am new to Julia!
>>>>>>>
>>>>>>>
>>>>>>> On Friday, July 25, 2014 2:04:08 PM UTC-7, Iain Dunning wrote:
>>>>>>>>
>>>>>>>> I'm not familiar with that particular package, but the Julia way to 
>>>>>>>> do it could be to use the Optim.jl package, create a random set of 
>>>>>>>> starting points, and do a parallel map over that set. That should 
>>>>>>>> work quite well. Trickier (maybe) would be to give each processor a 
>>>>>>>> different random seed and generate the starting points on each 
>>>>>>>> processor.
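>>>>>>>> Something like this sketch, where a toy objective and a trivial 
>>>>>>>> gradient-descent loop stand in for Optim.optimize just to keep the 
>>>>>>>> example self-contained (with Optim installed you would call 
>>>>>>>> optimize(f, x0) inside the pmap instead):

```julia
# Sketch: multistart via a parallel map over random starting points.
# local_search is a toy stand-in for Optim.optimize, to keep this runnable
# without any external packages.
using Distributed   # call addprocs(n) first to actually parallelize;
                    # with no workers added, pmap simply runs serially

@everywhere f(x) = (x[1] - 1.0)^2 + (x[2] + 2.0)^2   # toy objective, minimum at (1, -2)

@everywhere function local_search(x0; steps = 2000, eta = 0.01)
    x = copy(x0)
    for _ in 1:steps
        g = [2 * (x[1] - 1.0), 2 * (x[2] + 2.0)]   # analytic gradient of f
        x .-= eta .* g
    end
    return (x, f(x))
end

starts = [4 .* rand(2) .- 2 for _ in 1:20]   # 20 random starts in [-2, 2]^2
results = pmap(local_search, starts)          # one local search per start
best_x, best_f = results[argmin(last.(results))]
```

>>>>>>>> With addprocs(n) called first, the same pmap line spreads the 20 
>>>>>>>> starts over n workers.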
>>>>>>>>
>>>>>>>> On Friday, July 25, 2014 3:05:05 PM UTC-4, Charles Martineau wrote:
>>>>>>>>>
>>>>>>>>> Dear Julia developers and users,
>>>>>>>>>
>>>>>>>>> I am currently using the MultiStart algorithm in Matlab to find 
>>>>>>>>> multiple local minima of an MLE function: 
>>>>>>>>> http://www.mathworks.com/help/gads/multistart-class.html
>>>>>>>>> I also use MultiStart in a parallel setup.
>>>>>>>>>
>>>>>>>>> Can I do something similar in Julia using parallel programming?
>>>>>>>>>
>>>>>>>>> Thank you
>>>>>>>>>
>>>>>>>>> Charles
>>>>>>>>>
>>>>>>>>>
