Hi
On Tue, 21 Feb 2012 14:40:52 -0600 (CST), Roy Stogner wrote:
> For instance, after discovering that ',' made a lousy delimiter
> between vector components (at least if you don't want to also
> inadvertently disable non-unary functions) in my first version, I'm
> now discovering that '#' makes
volume = 55,
pages = {918-930}
}
Thanks, Martin
--
Dr. Martin Lüthi lue...@vaw.baug.ethz.ch
VAW Glaciology http://www.glaciology.ethz.ch
ETH Zürich http://people.ee.ethz.ch/~luthi
8092 Zürich +41 44 632 4093 phone
Switzerland +41 4
Hi
MeshFunction currently does not work in parallel. When I try
interpolating a function I get
[yak:14801] *** An error occurred in MPI_Allreduce: the reduction operation
MPI_MAX is not defined on the MPI_CHAR datatype
[yak:14800] *** An error occurred in MPI_Allreduce: the reduction operation
Hi again
As you can tell, I'm compiling Libmesh on different machines. I have
now stumbled on some (trivial) errors when using the GNU compilers 3.3.5
(which I am more or less forced to use on one system).
All errors occur with svn:2813, and are probably due to some template
acrobatics that the compiler can
Hi
Thanks for your answers. While I could not figure out what was wrong
(I got no more information in debug mode), using PETSc 2.3.3-12 works
smoothly. The crucial point seems to be using the MPI compiler
wrappers.
> ./configure --with-cxx=mpicxx --with-cc=mpicc --with-f77=mpif77
Should that be
Hi there
After being able to compile SVN-Libmesh with the option
./configure --with-cxx=mpicxx --with-cc=mpicc --with-f77=mpif77
together with PETSC petsc-2.3.2-p10 (openmpi), I get the following
error message when running example 13 (it *has to be* number 13!):
*** Solving time step 0, time = 0.0
Hi
Roy Stogner writes:
> Do you have any specific files or URLs you think we should add? This
> is definitely a good idea, but I bet nobody's going to get around to
> it until the next time we're stuck hunting up documentation ourselves.
> If you've been through that wringer yourself recently
Hi
Libmesh supports a wide variety of data formats. For some formats the
documentation is easily accessible, but for others (e.g. UNV) one has to
hunt down a description of the file format. I think that it should
be mandatory to put the file format definitions into the doc tree (as has
been done f
Hi
Great work going on!
Roy Stogner writes:
> A parallel-friendly format (whether that means wacky multi-file
> partitioning or just eliminating element blocks) is different enough
> to merit if not require a new file extension.
Why not base the new file format on HDF5/PHDF5? That library is extr
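As a sketch of what an HDF5-based mesh file could look like, one hypothetical layout (an assumption for illustration, not an existing libMesh format) might be:

```
/mesh
  /nodes          float64 [n_nodes, 3]         x, y, z coordinates
  /elements       int64   [n_elem, n_per_elem] node connectivity
  /partitioning   int64   [n_elem]             owning processor id
```

With PHDF5, each processor could then write its own hyperslab of these datasets collectively, instead of funneling all output through rank 0.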