Dear PyFR developers,

I was wondering if you could help me with my test case. I have a modest OLCF Summit allocation to test the strong scaling of the ACM solver for an internal flow simulation, and I am setting it up in PyFR using the files of Loppi et al.

My target geometry is a double elbow, but I am starting with a straight pipe. I mesh with

gmsh -o pipe.msh -f msh22 -3 pipe.geo

and then import the mesh and partition it into 2 subdomains using PyFR. Running with the CUDA backend, however, fails with

AttributeError: 'CUDAMatrixSlice' object has no attribute 'ioshape'

(full traceback below). I am attaching my geo and ini files. Please let me know if anything strikes you; my guess is that I am still messing something up with the mesh.

Best wishes,
Robert

--
Dr Robert Manson-Sawko
Research Staff Member, IBM Research Europe
Daresbury Laboratory, Keckwick Lane, Warrington WA4 4AD, United Kingdom
Email (IBM): [email protected]
Email (STFC): [email protected]
Phone (office): +44 (0) 1925 60 3967
Phone (mobile): +44 759 301 0452
Profile page: http://researcher.watson.ibm.com/researcher/view.php?person=uk-RSawko
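For completeness, the full command sequence is sketched below. The gmsh invocation is the one quoted above; the PyFR steps use its standard `import`, `partition`, and `run` subcommands, with the attached file names and the current directory assumed as the partition output location:

```shell
# Mesh the pipe geometry in 3D, writing a Gmsh version-2.2 mesh
gmsh -o pipe.msh -f msh22 -3 pipe.geo

# Convert the Gmsh mesh to PyFR's native format
pyfr import pipe.msh pipe.pyfrm

# Partition the mesh into 2 subdomains (output to the current directory)
pyfr partition 2 pipe.pyfrm .

# Run the case with the CUDA backend
pyfr run -b cuda pipe.pyfrm incompressible.ini
```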
Traceback (most recent call last):
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/pyfr", line 267, in <module>
    main()
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/pyfr", line 112, in main
    args.process(args)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/pyfr", line 245, in process_run
    args, NativeReader(args.mesh), None, Inifile.load(args.cfg)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/pyfr", line 226, in _process_common
    solver = get_solver(backend, rallocs, mesh, soln, cfg)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/solvers/__init__.py", line 16, in get_solver
    return get_integrator(backend, systemcls, rallocs, mesh, initsoln, cfg)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/integrators/__init__.py", line 36, in get_integrator
    return integrator(backend, systemcls, rallocs, mesh, initsoln, cfg)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/integrators/dual/phys/controllers.py", line 8, in __init__
    super().__init__(*args, **kwargs)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/integrators/dual/phys/base.py", line 19, in __init__
    initsoln, cfg, self._stepper_coeffs, self._dt
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/integrators/dual/pseudo/__init__.py", line 55, in get_pseudo_integrator
    initsoln, cfg, tcoeffs, dt)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/integrators/dual/pseudo/multip.py", line 121, in __init__
    initsoln, cfg, tcoeffs, dt)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/integrators/dual/pseudo/pseudocontrollers.py", line 12, in __init__
    super().__init__(*args, **kwargs)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/integrators/dual/pseudo/base.py", line 51, in __init__
    nregs=self.nregs, cfg=cfg)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/solvers/base/system.py", line 64, in __init__
    self._gen_kernels(eles, int_inters, mpi_inters, bc_inters)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/solvers/base/system.py", line 193, in _gen_kernels
    kernels[pn, kn].append(kgetter())
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/solvers/baseadvec/elements.py", line 61, in <lambda>
    out=slicem(self._scal_fpts, s)
  File "/ccs/proj/cfd141/apps/pyfr/1.10.0/pyfr/solvers/base/elements.py", line 131, in _slice_mat
    if len(mat.ioshape) >= 3:
AttributeError: 'CUDAMatrixSlice' object has no attribute 'ioshape'
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
Attachments: pipe.geo, incompressible.ini
