No, I don't have any particular application in mind right now, but in 
general I've always found that mixed-effects models take a long time to 
run on large data sets.
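
For example, a crossed random-intercepts fit like the sketch below (using 
the current MixedModels.jl fit/@formula API and its bundled insteval data 
set, roughly 73k rows; the particular model is only an illustration) is the 
kind of thing I mean:

using MixedModels

# Crossed random intercepts for students (s) and instructors (d) on the
# bundled course-evaluation data. Fits like this slow down quickly as the
# number of rows and grouping levels grows.
df = MixedModels.dataset(:insteval)
fm = fit(MixedModel, @formula(y ~ 1 + service + (1 | s) + (1 | d)), df)
println(fm)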

On Saturday, August 27, 2016 at 11:13:20 AM UTC-4, Douglas Bates wrote:
>
> On Friday, August 26, 2016 at 6:08:13 PM UTC-5, Min-Woong Sohn wrote:
>>
>> Does anybody know of any plan to support ArrayFire in GLM or MixedModels 
>> any time soon?
>>
>
> Do you have a particular application in mind, or is this a general 
> question?  For MixedModels I would say that, depending upon the 
> configuration of the random-effects terms in a model, there could be a 
> great advantage or almost no advantage in using a GPU, so details are 
> important.
>
> We're always looking for challenging GLM or mixed-effects problems that 
> can be used to tune up these packages.  If you have cases that seem to be 
> taking a long time and would be suitable for parallel or GPU computing, we 
> would love to hear about them. 
>  
>
>> On Friday, June 10, 2016 at 1:08:42 AM UTC-4, [email protected] 
>> wrote:
>>>
>>> Hello, 
>>>
>>> We are pleased to announce ArrayFire.jl, a library for GPU and 
>>> heterogeneous computing in Julia: 
>>> https://github.com/JuliaComputing/ArrayFire.jl. We look forward to 
>>> your feedback and your contributions as well! 
>>>
>>> For more information, check out Julia Computing's latest blog post: 
>>> http://juliacomputing.com/blog/2016/06/09/julia-gpu.html
>>>
>>> Thanks,
>>> Ranjan
>>> Julia Computing, Inc. 
>>>
>>
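
For reference, basic array work in ArrayFire.jl looks roughly like the 
sketch below (patterned on the package README; it assumes a working 
ArrayFire backend, and the array sizes are arbitrary):

using ArrayFire

a = rand(AFArray{Float32}, 2000, 2000)  # random matrix generated on the device
b = AFArray(rand(Float32, 2000, 2000))  # host array transferred to the device
c = a * b                               # matrix multiply runs on the GPU
s = sum(c)                              # reduction on the device
result = Array(c)                       # copy the result back to the host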
