Hi Ralf, Peter,

sklearn's implementation of GPML already supports custom correlation
functions, as long as they are callables with a signature matching
corr(theta, dx).

I can't see anything unusual in the arguments of the sklearn correlation
function. GPML theory (or at least what the GaussianProcess estimator
implements) is restricted to stationary correlation functions, so there is
no need to take x and x' as arguments: the componentwise (Manhattan-like)
distances d between x and x' are enough, aren't they?
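
Concretely, a custom model only has to look something like this (just a
sketch; my_corr and the exponential form are placeholders for your own
kernel):

import numpy as np

def my_corr(theta, dx):
    # dx has shape (n_eval, n_features): the componentwise distances x - x'
    # precomputed by the estimator; return an array of shape (n_eval,).
    theta = np.asarray(theta, dtype=float)
    dx = np.asarray(dx, dtype=float)
    return np.exp(-np.sum(theta * np.abs(dx), axis=1))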

Regarding the spherical Bessel functions, did you have a look at
scipy.special? (http://docs.scipy.org/doc/scipy-0.7.x/reference/special.html)
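
It has modified spherical Bessel routines as well, e.g. (from memory, so
please double-check the exact names and signatures):

from scipy.special import sph_in, sph_kn  # modified spherical Bessel functions
i_values, i_derivs = sph_in(2, 2.5)       # i_0..i_2 at z = 2.5, plus derivatives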

If you're trying to implement the Matern kernel, you can import the matern
function from the attached Python file.
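
For example, something along these lines (untested, with toy data; assuming
you save the attached file as correlation.py next to your script):

import numpy as np
from sklearn.gaussian_process import GaussianProcess
from correlation import matern  # the attached file

X = np.atleast_2d(np.linspace(0., 10., 20)).T
y = np.sin(X).ravel()

# Any callable with signature corr(theta, dx) can be passed as corr=.
# For matern, theta0 is (length-scale(s), nu), hence of size n_features + 1.
gp = GaussianProcess(corr=matern, theta0=np.array([1.0, 1.5]))
gp.fit(X, y)
y_pred, mse = gp.predict(np.atleast_2d([2.5]), eval_MSE=True)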

Cheers,
Vincent


2013/12/10 Mathieu Blondel <[email protected]>

> Hi Ralf,
>
> You might find the following gist useful:
> https://gist.github.com/mblondel/6230778
>
> It should support the usual kernel="precomputed". However, my
> implementation doesn't support the usual features that one would expect
> from a GP implementation such as kernel parameter tuning or ARD. In fact,
> this is essentially a kernel ridge implementation.
>
> HTH,
> Mathieu
>
>
>
> On Tue, Dec 10, 2013 at 5:53 PM, Peter Prettenhofer <
> [email protected]> wrote:
>
>> Hi Ralf,
>>
>> unfortunately, I cannot answer your question, but it would indeed be very
>> valuable to allow custom correlation functions.
>>
>> best,
>>  Peter
>>
>>
>> 2013/12/9 Ralf Gunter <[email protected]>
>>
>>> Hi all,
>>>
>>> We're trying to use a custom correlation kernel with GP in the usual
>>> form K(x, x'). However, by looking at the built-in correlation models
>>> (and how they're used by gaussian_process.py) it seems sklearn only
>>> takes models in the form K(theta, dx). There may very well be a
>>> reformulation of our K that depends only on (x-x'), but if so it would
>>> probably be highly non-trivial as it depends on e.g. modified
>>> spherical Bessel functions evaluated at a scaled product of the xs. Is
>>> there any way to have the GP module take our kernel without modifying
>>> the GP code?
>>>
>>> I apologize if this has been asked/answered before -- some searching
>>> on Google only led me to models that also depend only on (x-x').
>>>
>>> Thanks!
>>>
>>
>>
>>
>> --
>> Peter Prettenhofer
>>
>>
>
>
#!/usr/bin/python
# -*- coding: utf-8 -*-

# Author: Vincent Dubourg <[email protected]>
#         (mostly translation, see implementation details)
# License: BSD style

"""
The built-in correlation models submodule for the gaussian_process module.
"""


import numpy as np
from scipy.special import kv  # the modified Bessel function of the second kind
from scipy.special import gamma  # the gamma function


def absolute_exponential(theta, d):
    """
    Absolute exponential autocorrelation model.
    (Ornstein-Uhlenbeck stochastic process)

                                        n
    theta, dx --> r(theta, dx) = exp(  sum  - theta_i * |dx_i| )
                                      i = 1

    Parameters
    ----------
    theta : array_like
        An array with shape 1 (isotropic) or n (anisotropic) giving the
        autocorrelation parameter(s).

    dx : array_like
        An array with shape (n_eval, n_features) giving the componentwise
        distances between locations x and x' at which the correlation model
        should be evaluated.

    Returns
    -------
    r : array_like
        An array with shape (n_eval, ) containing the values of the
        autocorrelation model.
    """
    theta = np.asanyarray(theta, dtype=float)
    d = np.abs(np.asanyarray(d, dtype=float))

    if d.ndim > 1:
        n_features = d.shape[1]
    else:
        n_features = 1

    if theta.size == 1:
        return np.exp(- theta[0] * np.sum(d, axis=1))
    elif theta.size != n_features:
        raise ValueError("Length of theta must be 1 or %s" % n_features)
    else:
        return np.exp(- np.sum(theta.reshape(1, n_features) * d, axis=1))


def squared_exponential(theta, d):
    """
    Squared exponential correlation model (Radial Basis Function).
    (Infinitely differentiable stochastic process, very smooth)

                                        n
    theta, dx --> r(theta, dx) = exp(  sum  - theta_i * (dx_i)^2 )
                                      i = 1

    Parameters
    ----------
    theta : array_like
        An array with shape 1 (isotropic) or n (anisotropic) giving the
        autocorrelation parameter(s).

    dx : array_like
        An array with shape (n_eval, n_features) giving the componentwise
        distances between locations x and x' at which the correlation model
        should be evaluated.

    Returns
    -------
    r : array_like
        An array with shape (n_eval, ) containing the values of the
        autocorrelation model.
    """

    theta = np.asanyarray(theta, dtype=float)
    d = np.asanyarray(d, dtype=float)

    if d.ndim > 1:
        n_features = d.shape[1]
    else:
        n_features = 1

    if theta.size == 1:
        return np.exp(- theta[0] * np.sum(d**2, axis=1))
    elif theta.size != n_features:
        raise ValueError("Length of theta must be 1 or %s" % n_features)
    else:
        return np.exp(- np.sum(theta.reshape(1, n_features) * d**2, axis=1))


def generalized_exponential(theta, d):
    """
    Generalized exponential correlation model.
    (Useful when one does not know the smoothness of the function to be
    predicted.)

                                        n
    theta, dx --> r(theta, dx) = exp(  sum  - theta_i * |dx_i|^p )
                                      i = 1

    Parameters
    ----------
    theta : array_like
        An array with shape 1+1 (isotropic) or n+1 (anisotropic) giving the
        autocorrelation parameter(s) (theta, p).

    dx : array_like
        An array with shape (n_eval, n_features) giving the componentwise
        distances between locations x and x' at which the correlation model
        should be evaluated.

    Returns
    -------
    r : array_like
        An array with shape (n_eval, ) with the values of the autocorrelation
        model.
    """

    theta = np.asanyarray(theta, dtype=float)
    d = np.asanyarray(d, dtype=float)

    if d.ndim > 1:
        n_features = d.shape[1]
    else:
        n_features = 1

    lth = theta.size
    if n_features > 1 and lth == 2:
        # Isotropic case: broadcast the single length-scale and keep theta
        # two-dimensional so that the theta[:, 0:-1] indexing below works.
        theta = np.hstack([np.repeat(theta[0], n_features),
                           theta[1]]).reshape(1, n_features + 1)
    elif lth != n_features + 1:
        raise Exception("Length of theta must be 2 or %s" % (n_features + 1))
    else:
        theta = theta.reshape(1, lth)

    td = theta[:, 0:-1].reshape(1, n_features) * np.abs(d) ** theta[:, -1]
    r = np.exp(- np.sum(td, 1))

    return r


def pure_nugget(theta, d):
    """
    Spatial independence correlation model (pure nugget).
    (Useful when one wants to solve an ordinary least squares problem!)

                                         n
    theta, dx --> r(theta, dx) = 1 if   sum |dx_i| == 0
                                       i = 1
                                 0 otherwise

    Parameters
    ----------
    theta : array_like
        None.

    dx : array_like
        An array with shape (n_eval, n_features) giving the componentwise
        distances between locations x and x' at which the correlation model
        should be evaluated.

    Returns
    -------
    r : array_like
        An array with shape (n_eval, ) with the values of the autocorrelation
        model.
    """

    theta = np.asanyarray(theta, dtype=float)
    d = np.asanyarray(d, dtype=float)

    n_eval = d.shape[0]
    r = np.zeros(n_eval)
    r[np.all(d == 0., axis=1)] = 1.

    return r


def cubic(theta, d):
    """
    Cubic correlation model.

    theta, dx --> r(theta, dx) =
          n
        prod max(0, 1 - 3(theta_j*d_ij)^2 + 2(theta_j*d_ij)^3) ,  i = 1,...,m
        j = 1

    Parameters
    ----------
    theta : array_like
        An array with shape 1 (isotropic) or n (anisotropic) giving the
        autocorrelation parameter(s).

    dx : array_like
        An array with shape (n_eval, n_features) giving the componentwise
        distances between locations x and x' at which the correlation model
        should be evaluated.

    Returns
    -------
    r : array_like
        An array with shape (n_eval, ) with the values of the autocorrelation
        model.
    """

    theta = np.asanyarray(theta, dtype=float)
    d = np.asanyarray(d, dtype=float)

    if d.ndim > 1:
        n_features = d.shape[1]
    else:
        n_features = 1

    lth = theta.size
    if lth == 1:
        td = np.abs(d) * theta
    elif lth != n_features:
        raise Exception("Length of theta must be 1 or " + str(n_features))
    else:
        td = np.abs(d) * theta.reshape(1, n_features)

    td[td > 1.] = 1.
    ss = 1. - td ** 2. * (3. - 2. * td)
    r = np.prod(ss, 1)

    return r


def linear(theta, d):
    """
    Linear correlation model.

    theta, dx --> r(theta, dx) =
          n
        prod max(0, 1 - theta_j*d_ij) ,  i = 1,...,m
        j = 1

    Parameters
    ----------
    theta : array_like
        An array with shape 1 (isotropic) or n (anisotropic) giving the
        autocorrelation parameter(s).

    dx : array_like
        An array with shape (n_eval, n_features) giving the componentwise
        distances between locations x and x' at which the correlation model
        should be evaluated.

    Returns
    -------
    r : array_like
        An array with shape (n_eval, ) with the values of the autocorrelation
        model.
    """

    theta = np.asanyarray(theta, dtype=float)
    d = np.asanyarray(d, dtype=float)

    if d.ndim > 1:
        n_features = d.shape[1]
    else:
        n_features = 1

    lth = theta.size
    if lth == 1:
        td = np.abs(d) * theta
    elif lth != n_features:
        raise Exception("Length of theta must be 1 or %s" % n_features)
    else:
        td = np.abs(d) * theta.reshape(1, n_features)

    td[td > 1.] = 1.
    ss = 1. - td
    r = np.prod(ss, 1)

    return r


def matern(theta, d):
    """
    Matern correlation model.
    (Explicitly controls the degree of differentiability of the prediction.)

                                   n    / 1 / (2 ^ (nu - 1) * gamma(nu))    \
    theta, dx --> r(theta, dx) =  prod  | * abs(theta_i * d_i) ^ nu         |
                                 i = 1  \ * kv(nu, abs(theta_i * d_i))      /

    where kv is the modified Bessel function of the second kind
    (from scipy.special).

    Parameters
    ----------
    theta : array_like
        An array with shape 1+1 (isotropic) or n+1 (anisotropic) giving the
        autocorrelation parameter(s) (theta, nu).

    dx : array_like
        An array with shape (n_eval, n_features) giving the componentwise
        distances between locations x and x' at which the correlation model
        should be evaluated.

    Returns
    -------
    r : array_like
        An array with shape (n_eval, ) with the values of the autocorrelation
        model.

    See also
    --------
    matern_once_differentiable, matern_twice_differentiable
    """

    theta = np.asanyarray(theta, dtype=float)
    d = np.asanyarray(d, dtype=float)

    if d.ndim > 1:
        n_features = d.shape[1]
    else:
        n_features = 1

    lth = theta.size
    if n_features > 1 and lth == 2:
        # Isotropic case: broadcast the single length-scale and keep theta
        # two-dimensional so that the theta[:, 0:-1] indexing below works.
        theta = np.hstack([np.repeat(theta[0], n_features),
                           theta[1]]).reshape(1, n_features + 1)
    elif lth != n_features + 1:
        raise Exception("Length of theta must be 2 or %s" % (n_features + 1))
    else:
        theta = theta.reshape(1, lth)

    nu = theta[:, -1]
    td = 2. * np.sqrt(nu) \
            * theta[:, 0:-1].reshape(1, n_features) * np.abs(d)
    r = 1. / (2. ** (nu - 1.) * gamma(nu)) * td ** nu * kv(nu, td)
    r[td <= 1e-5] = 1. # The modified Bessel function of the second kind
                       # is badly defined around 0.
    r = np.prod(r, 1)

    return r


def matern_once_differentiable(theta, d):
    """
    Once differentiable Matern correlation model.
    (Explicitly controls the degree of differentiability of the prediction.)

                                   n    / 1 / (2 ^ (3 / 2 - 1) * gamma(nu)) \
    theta, dx --> r(theta, dx) =  prod  | * abs(theta_i * d_i) ^ (3 / 2)    |
                                 i = 1  \ * kv(3 / 2, abs(theta_i * d_i))   /

    where kv is the modified Bessel function of the second kind
    (from scipy.special).

    Parameters
    ----------
    theta : array_like
        An array with shape 1 (isotropic) or n (anisotropic) giving the
        autocorrelation length(s).

    dx : array_like
        An array with shape (n_eval, n_features) giving the componentwise
        distances between locations x and x' at which the correlation model
        should be evaluated.

    Returns
    -------
    r : array_like
        An array with shape (n_eval, ) with the values of the autocorrelation
        model.

    See also
    --------
    matern, matern_twice_differentiable
    """

    return matern(np.concatenate([theta.ravel(), [3. / 2.]]), d)


def matern_twice_differentiable(theta, d):
    """
    Twice differentiable Matern correlation model.
    (Explicitly controls the degree of differentiability of the prediction.)

                                   n    / 1 / (2 ^ (5 / 2 - 1) * gamma(nu)) \
    theta, dx --> r(theta, dx) =  prod  | * abs(theta_i * d_i) ^ (5 / 2)    |
                                 i = 1  \ * kv(5 / 2, abs(theta_i * d_i))   /

    where kv is the modified Bessel function of the second kind
    (from scipy.special).

    Parameters
    ----------
    theta : array_like
        An array with shape 1 (isotropic) or n (anisotropic) giving the
        autocorrelation length(s).

    dx : array_like
        An array with shape (n_eval, n_features) giving the componentwise
        distances between locations x and x' at which the correlation model
        should be evaluated.

    Returns
    -------
    r : array_like
        An array with shape (n_eval, ) with the values of the autocorrelation
        model.

    See also
    --------
    matern, matern_once_differentiable
    """

    return matern(np.concatenate([theta.ravel(), [5. / 2.]]), d)
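

# ---------------------------------------------------------------------------
# Illustrative self-test, not part of the original sklearn module: evaluate a
# few of the models above on a small grid of componentwise distances and show
# the correlation decaying from 1 as |dx| grows.
if __name__ == "__main__":
    dx = np.linspace(0.0, 2.0, 5).reshape(-1, 1)

    print("absolute_exponential:", absolute_exponential(np.array([1.0]), dx))
    print("squared_exponential: ", squared_exponential(np.array([1.0]), dx))
    # matern takes (length-scale(s), nu); its td <= 1e-5 guard handles dx == 0.
    print("matern, nu = 3/2:    ", matern(np.array([1.0, 1.5]), dx))
    print("matern, nu = 5/2:    ", matern_twice_differentiable(np.array([1.0]), dx))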