Yuesu Jin,
Do you have access to a newer compiler? gcc 4.8.5 is very old.
Best,
Bruno
On Wednesday, July 22, 2020 at 11:26:05 PM UTC-4, yuesu jin wrote:
>
> Dear Timo,
> The configuration in summary.log file is:
>
> deal.II configuration:
> #CMAKE_BUILD_TYPE: DebugRelease
> #
Hello!
I'm interested in applying a non-homogeneous Dirichlet boundary condition
to a specific edge. However, I'm unsure how to identify or specify a
particular edge or face to add the boundary condition to. Could you help
clear this up for me?
Thank you for your help,
McKenzie
Dear Deal.II devs and users,
In the latest release a lot of (great) work has been done to make complex
numbers more of a first-class citizen in deal, which has made my code a lot
more readable. Currently, I am stuck with one problem, though. Are there
any distributed datatypes for matrices that
On Sun, Jul 19, 2020 at 7:36 PM Wolfgang Bangerth
wrote:
> On 7/19/20 6:28 PM, Daniel Arndt wrote:
> >
> > The tutorial examples show only how to access values of the solution
> at
> > the quadrature points within each cell.
> >
> >
> https://github.com/dealii/dealii/wiki/Frequently-Asked
Dear Bruno,
Ok, I will try a new gcc for compilation. Thank you!
On Thu, Jul 23, 2020 at 7:57 AM Bruno Turcksin
wrote:
> Yuesu Jin,
>
> Do you have access to a newer compiler? gcc 4.8.5 is very old.
>
> Best,
>
> Bruno
>
> On Wednesday, July 22, 2020 at 11:26:05 PM UTC-4, yuesu jin wrote:
>>
>
Some additional information: If I try to compile deal with TrilinosScalar =
std::complex I get many errors like this:
[ 5%] Building CXX object
source/numerics/CMakeFiles/obj_numerics_release.dir/data_postprocessor.cc.o
[ 5%] Building CXX object
source/numerics/CMakeFiles/obj_numerics_release
Pascal,
The wrapped Trilinos matrices are based on Epetra, which only supports
double AFAICT. That's why you can't replace TrilinosScalar easily.
On the other hand, you should be able to compile PETSc with complex scalar
type and use that with MPI.
Best,
Daniel
On Thu, Jul 23, 2020 at 12:42 PM, wrote:
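For reference, Daniel's suggestion can be sketched as follows. This is only a sketch, assuming a from-source build; the paths are placeholders, not taken from this thread. PETSc's configure script accepts --with-scalar-type=complex, and deal.II's cmake picks up PETSc via PETSC_DIR/PETSC_ARCH:

```shell
# Build PETSc with complex scalars (placeholder paths; adjust to your system).
./configure --with-scalar-type=complex --with-mpi-dir=/path/to/mpi
make all

# Then point deal.II's cmake at that PETSc build.
cmake -DDEAL_II_WITH_PETSC=ON \
      -DPETSC_DIR=/path/to/petsc \
      -DPETSC_ARCH=arch-linux-c-opt \
      /path/to/dealii-source
```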
>
> However, I have two more related questions. BTW, I am a newbie in C++
> programming. So my questions may seem absurd.
>
>1. Now that we have a vector holding all nodal point coordinates, and
>another vector holding all nodal point values of the solution. How do we
>access every noda
McKenzie,
I'm interested in applying a non-homogeneous Dirichlet boundary condition
> to a specific edge. However, I'm unsure how to identify or specify a
> particular edge or face to add the boundary condition to. Could you help
> clear this up for me?
What do you know about that particular edg
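One common deal.II pattern for this (a sketch of the usual approach, not necessarily what the reply above goes on to say): assign a boundary_id to the faces that make up the edge of interest, then hand a function for that id to VectorTools::interpolate_boundary_values. The geometric test and MyBoundaryValues are placeholders; this requires the deal.II library and is not standalone-runnable:

```
// Sketch: mark boundary faces whose center lies on x = 0 with boundary id 1.
for (auto &cell : triangulation.active_cell_iterators())
  for (auto &face : cell->face_iterators())
    if (face->at_boundary() && std::abs(face->center()[0]) < 1e-12)
      face->set_boundary_id(1);

// Apply a non-homogeneous Dirichlet value on that id.
std::map<types::global_dof_index, double> boundary_values;
VectorTools::interpolate_boundary_values(dof_handler,
                                         1,                  // the id set above
                                         MyBoundaryValues(), // user-defined Function
                                         boundary_values);
```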
On Thu, Jul 23, 2020 at 12:43 PM Daniel Arndt
wrote:
> However, I have two more related questions. BTW, I am a newbie in C++
>> programming. So my questions may seem absurd.
>>
>>1. Now that we have a vector holding all nodal point coordinates, and
>>another vector holding all nodal point
Dear all,
I had tried to implement KDTree in the step_1 tutorial, and the header
file for kdtree is added to the code. It is as follows:
void first_grid()
{
Triangulation<2> triangulation;
GridGenerator::hyper_cube(triangulation);
triangulation.refine_global(4);
Poi
Heena,
You are missing an include. Try adding #include <deal.II/numerics/kdtree.h> (see https://dealii.org/current/doxygen/deal.II/kdtree_8h_source.html).
Best,
Bruno
On Thursday, July 23, 2020 at 2:55:53 PM UTC-4, heena patel wrote:
>
>
> Dear all,
> I had tried to implement KDTree in step_1
> tutor
Dear all,
The problem has been solved. The GCC version was too old to compile the
new version of deal.II. I switched to a newer gcc, which gave a very
good result. Thanks!
Best regards
On Thu, Jul 23, 2020 at 11:56 AM yuesu jin wrote:
> Dear Bruno,
> Ok, I will try a new gcc for com
Hi Daniel,
oh, I'm really sorry for asking whether that works. I had seen that neither
the PETSc nor the Trilinos sparse matrices are templated, and assumed that
if the more modern version (Trilinos) doesn't work with complex numbers,
trying PETSc wouldn't be very promising. But you are right, I will try that.
>
> We need to update function v based on solution u in the following loop.
>
> for ( x in vector_of_nodal_points )
> v(x) = f(x, u(x))
>
> where v(x) and u(x) are the nodal point value for functions v and u,
> respectively, and f() is some function depending on function u as well as
> the locati
Dear all,
I installed the Deal.II on a cluster.
The first thing I found is that
/dealii-9.2.0/cmake/configure/configure_1_mpi.cmake
automatically sets the DEAL_II_WITH_MPI argument to "off", so the
MPI_FOUND check cannot run. I switched it on and cmake can find the MPI
library.
The
Yuesu Jin,
You don't need to compile your own PETSc, but you need to use the same MPI
library as the one that PETSc is using. There are very hard-to-debug
problems that appear when PETSc and deal.II use different MPI libraries. I
think that you want to use this MPI library
project/cacds/apps/
Dear Bruno,
Thank you very much! I am using mpi/gcc right now. I will change to
Intel mpi library.
Best regards
On Thu, Jul 23, 2020 at 9:08 PM Bruno Turcksin
wrote:
> Yuesu Jin,
>
> You don't need to compile your own PETSc, but you need to use the same MPI
> library as the one that PETSc i
Dear Bruno,
I had already added the kdtree.h header file; please
check the question again. But it seems it does not read KDTree; something
is not compatible between the class and the header file.
Regards,
Heena
On Thu, Jul 23, 2020 at 9:03 PM Bruno Turcksin
wrote:
> Heena,
>
> You ar
Dear all,
I am installing deal.II with PETSc on a cluster. After I compiled the
PETSc library (the arch folder name is arch-linux-c-debug) and gave the
cmake argument -DPETSC_DIR=/home/yjin6/petsc,
cmake cannot find the PETSc library .so file (it is in the folder), but it
does find that the version is 3.13.
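A likely fix (an assumption on my part, since the thread is truncated here): deal.II's PETSc detection usually also needs the arch folder name via PETSC_ARCH, which the message above gives as arch-linux-c-debug. The deal.II source path is a placeholder:

```shell
# Sketch: pass both the PETSc directory and the arch folder name.
cmake -DDEAL_II_WITH_PETSC=ON \
      -DPETSC_DIR=/home/yjin6/petsc \
      -DPETSC_ARCH=arch-linux-c-debug \
      /path/to/dealii-9.2.0
```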
If you are using version 9.3pre of deal.II, kdtree was removed. Use RTree
instead, which is faster and more flexible.
Luca
> Il giorno 24 lug 2020, alle ore 05:41, heena patel ha
> scritto:
>
>
> Dear Bruno,
>I had already added the kdtree.h header file, check
>