We are adding exact first-class derivative-calculation operators (Automatic Differentiation, or AD) to the lambda calculus, and embodying the combination in a fast, production-quality system suitable for numeric computing in general, and compositional machine learning methods in particular. To the programming language community, we seek to contribute a way to make numeric software faster, more robust, and easier to write.
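To make "first-class derivative operator" concrete, here is a minimal sketch (illustrative only, not the announced system) in the familiar dual-number, forward-mode style; the names D, diff, and lift are ours for the example:

    -- Forward-mode AD via dual numbers; the derivative operator is an
    -- ordinary higher-order function, i.e. a first-class value.
    data D a = D a a   -- primal and tangent components

    instance Num a => Num (D a) where
      D x x' + D y y' = D (x + y) (x' + y')
      D x x' - D y y' = D (x - y) (x' - y')
      D x x' * D y y' = D (x * y) (x' * y + x * y')
      abs (D x x')    = D (abs x) (x' * signum x)
      signum (D x _)  = D (signum x) 0
      fromInteger n   = D (fromInteger n) 0

    -- diff maps a function to (an evaluation of) its derivative.
    diff :: Num a => (D a -> D a) -> a -> a
    diff f x = let D _ x' = f (D x 1) in x'

    -- Inject an outer-level value as a constant of an inner derivative.
    lift :: Num a => a -> D a
    lift c = D c 0

    -- d/dx (x^2 + 3x) at x = 2  ==>  7
    simple :: Double
    simple = diff (\x -> x * x + 3 * x) 2

    -- Because diff is first class it nests:
    -- d/dx [ d/dy (x * y^2) at y = 3 ] at x = 2  ==>  6
    nested :: Double
    nested = diff (\x -> diff (\y -> lift x * y * y) 3) 2

The nested example has the shape of the compositionality discussed next: the function being differentiated can itself invoke the derivative operator, for instance around an inner optimisation loop.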
To the machine learning community, in addition to the above practical benefits, we seek to contribute a system that embodies *compositionality*: gradient optimisation can be performed automatically and efficiently on systems that themselves consist of many components, even when those components internally perform optimisation. (Examples include optimising the rules of a multi-player game so that the players' actions satisfy some desiderata, where the players themselves optimise their own strategies using a simple model of the opponent, which they in turn tune according to the opponent's behaviour; or multi-agent learning in which one agent learns an internal model of another agent, where that internal model itself performs learning.)

To this end, we are seeking two postdoctoral researchers and one research programmer with interest and experience in a coherent subset of: programming language theory, numerics, automatic differentiation, and machine learning.

Inquiries to: Barak A. Pearlmutter <ba...@cs.nuim.ie>

Informal announcement with more details: http://www.bcl.hamilton.ie/~barak/ad-fp-positions.html, which will have a reference to the formal announcement when it becomes available.

--
Barak A. Pearlmutter
Hamilton Institute & Dept Computer Science
NUI Maynooth