Nope.

One could write a SIF parser from scratch, but it would take some time.

--Tim

On Sunday, July 27, 2014 08:51:51 AM John Myles White wrote:
> Is CUTEst.jl easier to get working these days? The issue I opened in March
> seems to still be open.
> 
>  — John
> 
> On Jul 27, 2014, at 6:40 AM, Tim Holy <[email protected]> wrote:
> > A package of test functions sounds worthwhile. There's also CUTEst.jl:
> > https://github.com/lpoo/CUTEst.jl
> > 
> > --Tim
> > 
> > On Sunday, July 27, 2014 06:25:28 AM Hans W Borchers wrote:
> >> Ken:
> >> 
> >> (1) Thanks for pointing out this approach and for implementing it.
> >> Unfortunately, I was not able to locate your code on GitHub. I would
> >> certainly try it out on some of my examples in global optimization.
> >> 
> >> (2) Did you include (or do you plan to include) the improvements of
> >> MinFinder, as discussed in "MinFinder 2.0: An improved version of
> >> MinFinder" by Tsoulos and Lagaris?
> >> 
> >> (3) This article also contains examples of functions with many local
> >> minima, most of them test functions for global optimization
> >> procedures. Did you test your implementation on these examples?
> >> 
> >> I have implemented some of these functions for my own purposes.
> >> I wonder whether it would be useful to have a dedicated Julia package
> >> for collecting optimization test functions.
> >> 
> >> (4) Are you sure/Is it guaranteed that MinFinder will *reliably* find
> >> *all* local minima?
> >> This is a difficult problem. For example, Chapter 4 of "The SIAM 100
> >> Digit Challenge", by Stan Wagon, contains a long discussion of the
> >> precautions that must be taken to guarantee finding all local minima
> >> -- and thus also the one global minimum.
> >> 
> >> On Sunday, July 27, 2014 8:26:31 AM UTC+2, Ken B wrote:
> >>> Hi Charles,
> >>> 
> >>> You can have a look at the MinFinder algorithm, for which I've just
> >>> created a pull request to Optim.jl (talk about a coincidence!):
> >>> https://github.com/JuliaOpt/Optim.jl/pull/72
> >>> 
> >>> I'd like to add the possibility to run each optimization in parallel,
> >>> but I have no experience with these things, although I have time to
> >>> learn :). Would you like to collaborate on this?
> >>> 
> >>> Does anyone know of some parallel sample code to have a look at?
> >>> Basically it's sending each optimization problem to a separate worker
> >>> and getting the results, taking into account that some optimizations
> >>> might take much longer than others.
> >>> 
> >>> Cheers,
> >>> Ken
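
[Editor's note: a minimal sketch of the pattern Ken asks about, using `pmap` from Julia's `Distributed` standard library. The function name `solve_one` and the toy objective are illustrative placeholders; a real run would call `Optim.optimize(f, x0)` inside it. `pmap` handles exactly the concern raised above, since it hands each worker a new task as soon as the previous one finishes.]

```julia
using Distributed  # provides pmap; add workers with addprocs(n) beforehand

# Hypothetical stand-in for one expensive local optimization. In practice
# this would be something like Optim.optimize(f, x0); the toy quadratic
# (x - 3)^2 is used here only so the minimizer is known analytically.
function solve_one(x0)
    xmin = 3.0
    return (start = x0, minimizer = xmin, minimum = 0.0)
end

starts = [-5.0, 0.0, 2.5, 10.0]   # stand-in for the 20 random starts

# pmap schedules dynamically: each free worker receives the next starting
# point as soon as it finishes, so runs of very different lengths balance.
results = pmap(solve_one, starts)

# Pick the best local minimum found across all starts.
best = results[argmin([r.minimum for r in results])]
```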
> >>> 
> >>> On Saturday, 26 July 2014 23:13:28 UTC-5, Charles Martineau wrote:
> >>>> Yes, I could do that, but it is simpler (I think) to execute the
> >>>> code in parallel instead of submitting 20 separate jobs to the
> >>>> cluster.
> >>>> 
> >>>> On Saturday, July 26, 2014 10:08:20 AM UTC-7, Michael Prentiss wrote:
> >>>>> What you are doing makes sense. Starting from multiple starting
> >>>>> points is important.
> >>>>> 
> >>>>> I am curious why you don't just run 20 different 1-processor jobs
> >>>>> instead of bothering with the parallelism?
> >>>>> 
> >>>>> On Saturday, July 26, 2014 11:22:07 AM UTC-5, Iain Dunning wrote:
> >>>>>> The idea is to call the optimize function multiple times in
> >>>>>> parallel, not to call it once and let it do parallel multistart.
> >>>>>> 
> >>>>>> Check out the "parallel map and loops" section of the parallel
> >>>>>> programming chapter in the Julia manual; I think it'll be clearer
> >>>>>> there.
> >>>>>> 
> >>>>>> On Friday, July 25, 2014 8:00:40 PM UTC-4, Charles Martineau wrote:
> >>>>>>> Thank you for your answer. So I would have to loop over, say, 20
> >>>>>>> random sets of starting points, and in each iteration use the
> >>>>>>> Optim package to minimize my MLE function for that set. Where
> >>>>>>> online is the documentation that shows how to specify that we
> >>>>>>> want the call
> >>>>>>> 
> >>>>>>> Optim.optimize(my function, etc.) to be parallelized? Sorry for
> >>>>>>> my ignorance, I am new to Julia!
> >>>>>>> 
> >>>>>>> On Friday, July 25, 2014 2:04:08 PM UTC-7, Iain Dunning wrote:
> >>>>>>>> I'm not familiar with that particular package, but the Julia way
> >>>>>>>> to do it could be to use the Optim.jl package, create a random
> >>>>>>>> set of starting points, and do a parallel map over that set of
> >>>>>>>> starting points. That should work quite well. Trickier (maybe)
> >>>>>>>> would be to give each processor a different random seed and
> >>>>>>>> generate starting points on each processor.
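
[Editor's note: a sketch of Iain's second idea, using deterministic per-trial seeds so each worker can generate its own starting points reproducibly. The seed scheme, the function name `run_trial`, and the toy objective are all assumptions; a real run would call `Optim.optimize` inside `run_trial` instead of reading off the answer.]

```julia
using Distributed, Random

# Make the Random machinery and the trial function available on every
# process (run this after any addprocs(n) call).
@everywhere using Random

@everywhere function run_trial(trial::Int)
    rng = MersenneTwister(1000 + trial)   # deterministic seed per trial
    x0 = 10 .* rand(rng, 2) .- 5          # random start in [-5, 5)^2
    # Toy objective with known minimizer at (1, 2); in practice this would
    # be a call to Optim.optimize(f, x0).
    xmin = [1.0, 2.0]
    return (start = x0, minimizer = xmin, minimum = 0.0)
end

# 20 independent multistart trials, load-balanced across workers.
results = pmap(run_trial, 1:20)
```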
> >>>>>>>> 
> >>>>>>>> On Friday, July 25, 2014 3:05:05 PM UTC-4, Charles Martineau wrote:
> >>>>>>>>> Dear Julia developers and users,
> >>>>>>>>> 
> >>>>>>>>> I am currently using the MultiStart algorithm in Matlab to find
> >>>>>>>>> multiple local minima of an MLE function:
> >>>>>>>>> http://www.mathworks.com/help/gads/multistart-class.html
> >>>>>>>>> I use this MultiStart in a parallel setup as well.
> >>>>>>>>> 
> >>>>>>>>> Can I do something similar in Julia using parallel programming?
> >>>>>>>>> 
> >>>>>>>>> Thank you
> >>>>>>>>> 
> >>>>>>>>> Charles
