Thanks Rob, I have notified the maintainer about the suggestion.

Oyvind
________________________________
From: Robert Lowe <ra...@cam.ac.uk>
To: oyvfos <oyv...@yahoo.no>
Cc: r-devel@r-project.org
Sent: Wed, May 18, 2011 2:27:15 PM
Subject: Re: [Rd] Max likelihood using GPU

Hi Oyvind,

I believe this is possible to implement. There is already some ongoing work on using the GPU from R, and it uses the CUDA toolkit, as does the reference you supplied:

http://brainarray.mbni.med.umich.edu/Brainarray/rgpgpu/

Thanks,
Rob

On 18 May 2011, at 10:07, oyvfos wrote:

> Dear all,
> Probably many of you experience long computation times when estimating a large
> number of parameters using maximum likelihood with functions that require
> numerical methods such as integration or root-finding. Maximum likelihood is
> an example of parallelization that could successfully utilize a GPU. The
> general algorithm is described here:
> http://openlab-mu-internal.web.cern.ch/openlab-mu-internal/03_Documents/4_Presentations/Slides/2010-list/CHEP-Maximum-likelihood-fits-on-GPUs.pdf
>
> Is it possible to implement this algorithm in R?
>
> Kind regards,
> Oyvind Foshaug
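The slides linked above rest on the observation that a log-likelihood is a sum of independent per-event terms: each GPU thread can evaluate the pdf for one observation, a parallel reduction produces the sum, and the minimizer itself stays on the host. Below is a minimal, CPU-only sketch in base R of that structure; the Gaussian model and the simulated data are illustrative and not from the thread. A GPU port would replace the vectorised dnorm() call inside negloglik() with a per-event kernel plus reduction, leaving the optim() driver unchanged.

## simulated observations (illustrative only)
set.seed(1)
x <- rnorm(1e6, mean = 2, sd = 3)

negloglik <- function(par, data) {
  ## per-observation log-density terms: each dnorm() evaluation is
  ## independent, which is what maps onto one GPU thread per event;
  ## sum() is the reduction step
  -sum(dnorm(data, mean = par[1], sd = exp(par[2]), log = TRUE))
}

## the optimizer runs on the host; only the likelihood evaluation
## would be offloaded to the device
fit <- optim(c(0, 0), negloglik, data = x, method = "BFGS")
c(mean = fit$par[1], sd = exp(fit$par[2]))

The sd is parameterized as exp(par[2]) so the optimizer can search over an unconstrained scale; that is a common trick, not something prescribed by the slides.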