I recommend a matrix approach involving eigenvalues and eigenvectors,
and I assume you have a way to find the eigenvalues and eigenvectors of an
n by n matrix that has n linearly independent eigenvectors.  I confine my
attention to linear differential equations with constant coefficients and
right-hand side 0, for example

u''' - u'' - 4 u' + 4 u = 0  with initial conditions  u(0) = 2 , u'(0) = _1
, and u''(0) = 5

This has the solution  u(t) =  ( ^ _2 * t ) + ( ^ t )  where I use a mix of
conventional and J notation.
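
In conventional notation that is  u(t) = e^(-2t) + e^t .  As a quick sketch
(the verb name  u  is my own choice, not part of the session below), you can
write the solution in J and check the first initial condition:

   u =: [: +/ [: ^ _2 1 * ]   NB. u(t) = e^(-2t) + e^t
   u 0                        NB. should equal u(0) = 2
2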

We convert the equation into a system of first order equations by letting

u0 = u , u1 = u' , and u2 = u''

getting the system

u0'  = u1
u1'  = u2
u2'  = -4 u0 + 4 u1 + u2

or in matrix form

(u0 u1 u2)' = (3 3 $ 0 1 0 0 0 1 _4 4 1) o (u0 u1 u2)

where I use o (lower case oh) for matrix multiplication +/ . *
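
For example (a sketch; AA is the same matrix that reappears in the session
below):

   o =: +/ . *                        NB. matrix product
   AA =: 3 3 $ 0 1 0  0 0 1  _4 4 1
   AA o 2 _1 5                        NB. U'(0) for the initial state U(0) = 2 _1 5
_1 5 _7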

This last has the form  U' = A o U , which with initial condition  U(0) =
U0  is known to have the solution  U(t) = P o (exp L*t) o (%. P) o U0  when
A is n by n and has n linearly independent eigenvectors, which are the
columns of the matrix P , corresponding to eigenvalues on the diagonal of
the n by n diagonal matrix L .  The meaning of  exp L*t  is shown in the
example J session below.  (I do not show how the eigenvalues and
eigenvectors were found.)

   names''
AA     LL     PP     diag   diagof expm   ff     soln

   AA  NB. matrix for problem u''' - u'' - 4 u' + 4 u = 0
 0 1 0
 0 0 1
_4 4 1

   LL  NB. eigenvalues of AA on diagonal
2  0 0
0 _2 0
0  0 1

   PP  NB. columns are corresponding eigenvectors
1r4  1r4 1
1r2 _1r2 1
  1    1 1
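
   NB. a quick check that AA o PP matches PP o LL , i.e. each column of PP is an eigenvector of AA
   (AA o PP) -: PP o LL
1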

   soln  NB. evaluates the solution of X' = AA X with initial condition X(0) = x at time y
(3 3$1r4 1r4 1 1r2 _1r2 1 1 1 1) o ([: expm (3 3$2 0 0 0 _2 0 0 0 1) * ]) o (3 3$_2 1 1 2r3 _1 1r3 4r3 0 _1r3) o [

   NB. The last matrix shown above is the inverse of the first.
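
   NB. this, and the decomposition A = P L P^(-1), can be checked exactly in rational arithmetic:
   PP o (3 3$_2 1 1 2r3 _1 1r3 4r3 0 _1r3)        NB. P times its inverse
1 0 0
0 1 0
0 0 1
   AA -: PP o LL o (3 3$_2 1 1 2r3 _1 1r3 4r3 0 _1r3)
1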

   expm  NB. matrix exponential, valid for diagonal square matrices
[: diag [: ^ diagof
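
   NB. for example, the meaning of  exp L*t  at t = 0.5 (display rounded):
   expm LL * 0.5
2.71828        0       0
      0 0.367879       0
      0        0 1.64872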

   diagof  NB. finds the diagonal of a square matrix
(<0 1)&|:

   diag  NB. creates a diagonal square matrix with diagonal y
* ([: =@i. #)
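
   NB. for example:
   diagof 3 3 $ i. 9
0 4 8
   diag 2 _2 1
2  0 0
0 _2 0
0  0 1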

   ff  NB. solution of problem above with u(0) = 2 , u'(0) = _1 , and u''(0) = 5
2 _1 5 soln ]

   NB. You can check that this  ff  agrees with the solution  u(t)  given in the introduction.
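
   NB. for instance (values as I would expect them at J's default print precision):
   ff 0           NB. (u, u', u'') at t = 0
2 _1 5
   {. ff 1        NB. u(1)
2.85362
   (^_2) + ^1     NB. e^(-2t) + e^t at t = 1
2.85362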


--Kip Murray

On Friday, March 13, 2015, Jon Hough <[email protected]> wrote:

> Thanks for the reply. I am thinking more along the lines of analytic
> solutions, rather than numerical solutions. e.g. for simple ODEs just
> solving the equivalent polynomial. For example if the ODE is
> y'' -3y' +2y=0
> then I would like to return the verb (function)
> y = A*exp(-2x) + B*exp(-x)
> (A and B arbitrary; some initial/boundary values are needed to recover
> them). Similar to how Maple and (I presume) Mathematica can analytically
> solve such equations.
>
>
> > Date: Fri, 13 Mar 2015 11:32:08 -0400
> > From: [email protected]
> > To: [email protected]
> > Subject: Re: [Jprogramming] Solving Differential Eqns with J
> >
> > Jon,
> >
> >    If you are looking for a standard numerical solution, here
> > is the standard Runge-Kutta approach, translated to J from Fortran:
> >      http://www.astro.umd.edu/~jph/runge_kutta.ijs
> > and for stiff diff-eq's:
> >      http://www.astro.umd.edu/~jph/stiff.ijs
> > As applied to the classic astrophysics problem of a polytropic star:
> >      http://www.astro.umd.edu/~jph/polytrope.ijs
> >      http://www.astro.umd.edu/~jph/iso_sphere.ijs
> >
> >                                          Patrick
> >
> > On Fri, 13 Mar 2015, Jon Hough wrote:
> > > I have looked through the Jsoftware pages but cannot find an explicit
> approach for solving linear ODEs in J.
> > > This page  http://www.jsoftware.com/papers/MSLDE.htm  has some
> overall information, but not specific J code.
> > > My attempt at a solution follows: e.g. solve  7y'' - 3y' - 2y = 0
> > > Solution (of sorts):
> > > S =: p. 7 _3 _2 NB. consider the ODE problem as a polynomial
> > > S =: >1{S NB. Get and unbox the two solutions.
> > > F =: 2 : 'u@:(n&*)' NB. conjunction for exponentiation
> > > Solve =:  +/@: (^ F S) NB. returns a verb that is a (not the) solution
> to the original ODE.
> > >
> > > So Solve is the solution, or at least a solution. In the above case
> Solve is
> > >
> > > +/@:(^@:(_2.76556443707463728 1.26556443707463728&*))
> > >
> > >
> > > Solve 0
> > >
> > >
> > > returns 2.
> > >
> > >
> > > This is as far as I have got, and I already notice several issues.
> > >
> > >
> > > 1. In Solve, there is no way to show general solutions of the form
> A*exp(ax)+B*exp(bx) where A and B are arbitrary constants.
> > > 2. My approach needs to be modified slightly for repeated roots of
> the polynomial.
> > >
> > >
> > >
> > >
> > > Is there any work already done on this? I can't imagine I'm the first
> to think of attempting this.
> > >
> > >
> > > Regards,
> > > Jon
> > >
> > >


----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
