Hello,
PETSC_SCALAR is not a symbol either. I skimmed through the names in the shared library, and it doesn't look like any data type information is there.
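
One way to check this from Julia (a minimal sketch; assumes libpetsc is on the library search path):

    using Libdl   # on Julia 0.3, dlopen/dlsym_e live in Base instead

    handle = Libdl.dlopen("libpetsc")
    # dlsym_e returns C_NULL instead of throwing when a symbol is absent
    for sym in (:VecCreate, :PETSC_USE_COMPLEX, :PETSC_SCALAR)
        println(sym, " exported: ", Libdl.dlsym_e(handle, sym) != C_NULL)
    end

Only real functions like VecCreate resolve; the configure-time macros and enum values never make it into the symbol table.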

        Jared Crean

On 7/14/2015 9:22 PM, Matthew Knepley wrote:
On Tue, Jul 14, 2015 at 8:03 PM, Jared Crean <[email protected]> wrote:


        Hello,
            PETSC_USE_COMPLEX isn't a symbol in the shared library
    when Petsc is built with complex scalars, so I don't see a way to
    access it at runtime. I'll have to write a simple C program that
    uses sizeof() and write the value to a file.
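
    Roughly like this (a sketch; assumes PETSC_DIR and PETSC_ARCH are
    set, a C compiler named cc is on the PATH, and the MPI include
    path may also be needed depending on the build):

        # write, compile, and run a tiny C probe for sizeof(PetscScalar)
        petsc_dir  = ENV["PETSC_DIR"]
        petsc_arch = ENV["PETSC_ARCH"]
        open("probe.c", "w") do f
            write(f, """
                #include <stdio.h>
                #include <petscsys.h>
                int main(void) { printf("%d\\n", (int)sizeof(PetscScalar)); return 0; }
                """)
        end
        run(`cc probe.c -o probe -I$petsc_dir/include -I$petsc_dir/$petsc_arch/include`)
        scalar_size = parse(Int, readchomp(`./probe`))
        # 8 => real double; 16 => complex double (or a __float128 build,
        # so the size alone is only a heuristic)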


That is crazy. How about

  isComplex = PETSC_COMPLEX == PETSC_SCALAR
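
(PETSC_SCALAR is a PetscDataType value that the headers #define to
either PETSC_COMPLEX or PETSC_REAL depending on the configuration,
which is why the comparison works in any C code that includes
petscsys.h but never shows up in the symbol table.) On the Julia side,
the probe result could fix the scalar type once at load time; a
sketch, with a hypothetical name and scalar_size from the probe above
(Complex128 is the 2015-era spelling of Complex{Float64}):

  const PetscScalarType = scalar_size == 16 ? Complex128 : Float64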

   Matt

            As for the MPI communicator, the Julia MPI package uses a
    C int to store it, so I will typealias to that to ensure
    consistency.  If an MPI implementation uses an 8-byte pointer,
    MPI.jl will have to change too.
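
    Concretely, something like this (a sketch; assumes an MPICH-style
    integer MPI_Comm, and that comm comes from MPI.jl, e.g. its
    COMM_WORLD handle):

        typealias CommType Cint      # `const CommType = Cint` in later Julia
        vec = Ref{Ptr{Void}}(C_NULL)
        # C signature: PetscErrorCode VecCreate(MPI_Comm, Vec*)
        err = ccall((:VecCreate, "libpetsc"), Cint,
                    (CommType, Ptr{Ptr{Void}}), comm, vec)
        # vec[] now holds the Vec handle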

        Jared Crean


    On 7/14/2015 1:04 PM, Matthew Knepley wrote:
    On Tue, Jul 14, 2015 at 10:56 AM, Jared Crean <[email protected]> wrote:

            Hello everyone,
            I got the package into a reasonably working state and
        set up Travis testing, so I am putting the package up on GitHub.

        https://github.com/JaredCrean2/PETSc.jl

                There is still a lot more work to do, but it's a start.

                A couple questions:
                When looking through the code, I noticed the MPI
        communicator is being passed as a 64-bit integer. mpi.h
        typedefs it as an int, so shouldn't it be a 32-bit integer?


    Some MPI implementations store the communicator as a pointer,
    which may be 64 bits. I think the only thing the standard says is
    that MPI_Comm should be defined.
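
    A build-time sizeof(MPI_Comm) probe, along the same lines as the
    PetscScalar one, would let the wrapper pick the representation
    instead of hard-coding it (a sketch, with comm_size coming from
    such a probe):

        const CommType = comm_size == sizeof(Cint) ? Cint : Ptr{Void}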

                Also, is there a way to find out at runtime what
        datatype a PetscScalar is?  It appears PetscDataTypeGetSize
        does not accept PetscScalar as an argument.


    If PETSC_USE_COMPLEX is defined it's PETSC_COMPLEX, otherwise it's
    PETSC_REAL. You can also just use sizeof(PetscScalar). What do you
    want to do?

      Thanks,

         Matt


            Jared Crean



        On 07/06/2015 09:02 AM, Matthew Knepley wrote:
        On Mon, Jul 6, 2015 at 4:59 AM, Patrick Sanan <[email protected]> wrote:

            I had a couple of brief discussions about this at
            Juliacon as well. I think it would be useful, but there
            are a couple of things to think about from the start of
            any new attempt to do this:
            1. As Jack pointed out, one issue is that the PETSc
            library must be compiled for a particular precision.
            This raises some questions - should several versions of
            the library be built to allow for flexibility?
            2. An issue with wrapping PETSc is always that the
            flexibility of using the PETSc options paradigm is
            reduced - how can this be addressed? Could/should an
            expert user be able to access the options database
            directly, or would this be too much violence to the
            wrapper abstraction?


        I have never understood why this is an issue. Can't you just
        wrap our interface level, and use the options just as we do?
        That is essentially what petsc4py does. What is limiting in
        this methodology? On the other hand, requiring specific types,
        à la FEniCS, is very limiting.
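
        Exposing the options database directly is only a ccall away;
        a sketch (assumes the pre-3.7 C signature
        PetscOptionsSetValue(const char*, const char*) and that
        libpetsc resolves on the library path):

            function set_option(name, value)
                err = ccall((:PetscOptionsSetValue, "libpetsc"), Cint,
                            (Cstring, Cstring), name, value)
                err == 0 || error("PetscOptionsSetValue returned $err")
            end

            set_option("-ksp_type", "gmres")  # later picked up by KSPSetFromOptions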

           Matt

            On Sat, Jul 4, 2015 at 11:00 PM, Jared Crean <[email protected]> wrote:

                Hello,
                     I am a graduate student working on a CFD code
                written in Julia, and I am interested in using Petsc
                for the code's linear solves (and possibly the
                non-linear solves as well).  I discovered the
                Julia wrapper file Petsc.jl in Petsc and have
                updated it to work with the current version of Julia
                and the MPI.jl package, using only MPI for
                communication (I don't think Julia's internal
                parallelism will scale well enough, at least not in
                the near future).

                     I read the discussion on GitHub
                [https://github.com/JuliaLang/julia/issues/2645],
                and it looks like
                there currently is not a complete package to access
                Petsc from Julia. With your permission, I would like
                to use the Petsc.jl file as the basis for developing
                a package.  My plan is to create a lower-level
                interface that exactly wraps Petsc functions, and
                then construct a higher-level interface, probably an
                object that is a subtype of Julia's AbstractArray,
                that allows users to store values into Petsc vectors
                and matrices.  I am less interested in integrating
                tightly with Julia's existing linear algebra
                capabilities than ensuring good scalability. The
                purpose of the high-level interface is simply to make
                it easy to populate the vector or matrix.
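
                As a skeleton, that higher-level type might look like
                this (hypothetical names; assumes a real-scalar build,
                32-bit PetscInt, and INSERT_VALUES == 1 in the
                InsertMode enum, which is worth double-checking;
                VecAssemblyBegin/End must still be called before the
                vector is used):

                    type PetscVec <: AbstractVector{Float64}  # `mutable struct` in later Julia
                        ptr::Ptr{Void}   # the underlying PETSc Vec
                        n::Int
                    end
                    Base.size(v::PetscVec) = (v.n,)
                    function Base.setindex!(v::PetscVec, val::Real, i::Integer)
                        idx = Cint(i - 1)        # PETSc indices are 0-based
                        ccall((:VecSetValues, "libpetsc"), Cint,
                              (Ptr{Void}, Cint, Ptr{Cint}, Ptr{Float64}, Cint),
                              v.ptr, 1, Ref(idx), Ref(Float64(val)), 1)
                        val
                    end

                Reads (getindex) would go through VecGetValues the
                same way.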

                     What do you think, both about using the
                Petsc.jl file and the  overall approach?

                     Jared Crean





--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
