Ruochun – I read your email, super thoughtful.
Circling back to Sabrina – a lot here depends on how big the problem is.
The FSI solver in Chrono has made huge progress over the last 12 months – Radu 
and Luning and Huzaifa can speak to that.
If we are talking about 1000 DEM particles here, then I think that the easiest 
way to go is to simply simulate the entire thing in Chrono: the DEM part, using 
DEME; the CFD side, in Chrono::FSI. The solution would be entirely on the GPU. 
We never optimized the communication for DEM-FSI “on-the-GPU” simulation since 
we’ve never been faced with such requests. But the big deal here is that the 
memory for DEME and FSI is GPU resident and therefore can draw on the TB/s 
bandwidth and small latency of device memory (compared to host-device traffic). 
I truly think that if there were funding to do this GPU-GPU, FSI-DEME 
co-simulation, a full-blown Chrono solution would be top notch.
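The memory-residency point above can be made concrete with some back-of-envelope arithmetic. The bandwidth figures below are rough, assumed values (PCIe-4.0-x16-class host-device traffic vs HBM-class device memory), not measurements:

```python
# Back-of-envelope: per-step cost of moving DEM state (pos + vel, 6 doubles
# per particle) across the coupling interface, on-device vs over the bus.
n_particles = 1_000_000
bytes_per_particle = 6 * 8                   # pos (3) + vel (3), float64
payload = n_particles * bytes_per_particle   # 48 MB per exchange

pcie_bw = 25e9    # B/s, assumed host-device bandwidth
hbm_bw = 2e12     # B/s, assumed device-memory bandwidth

t_pcie = payload / pcie_bw
t_hbm = payload / hbm_bw
print(f"host-device: {t_pcie*1e3:.2f} ms/step, on-device: {t_hbm*1e6:.1f} us/step")
print(f"ratio: {t_pcie/t_hbm:.0f}x")
```

Under these assumed numbers, even at a million particles the state exchange is tens of microseconds if both solvers stay GPU-resident, versus a couple of milliseconds per step over the bus, which is the argument for keeping everything on the device.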
However, if for the problem at hand Sabrina needs say 1,000,000 DEM particles, 
that’s a different story. I think no matter what approach is taken in that 
case, it’s going to be really, really slow if one fully resolves the dynamics 
of both the particles and the fluid.
Dan
---------------------------------------------
Bernard A. and Frances M. Weideman Professor
NVIDIA CUDA Fellow
Department of Mechanical Engineering
Department of Computer Science
University of Wisconsin - Madison
4150ME, 1513 University Avenue
Madison, WI 53706-1572
608 772 0914
http://sbel.wisc.edu/
http://projectchrono.org/
---------------------------------------------

From: projectchrono@googlegroups.com <projectchrono@googlegroups.com> On Behalf 
Of Ruochun Zhang
Sent: Saturday, August 23, 2025 8:00 AM
To: ProjectChrono <projectchrono@googlegroups.com>
Subject: Re: [chrono] DEM-Engine SPH model integration


Hi Sabrina,

This is very ambitious indeed. I can comment on it based on what I know, and 
you can decide if it is relevant for your research.

First, I hope the APIs provided in DEM-Engine are enough to allow your 
inclusion of the thermal and electrostatic systems. It seems they are, but feel 
free to post threads here if you later discover other needs.

The biggest bottleneck in regolith–fluid simulations is the enormous scale 
required, and that's why Dan suggested using another, unified model for it. But 
since your focus is on building a comprehensive model, not an engineering 
solution for a particular problem, that's not an option, and I assume you'd 
want two-way coupling (i.e., as much coupling as possible) in your simulation. 
I'd also assume you don't need extreme fluid velocities, say above Mach 0.3. Then 
the biggest question is: since your DEM side model is already heavy, how much 
emphasis would you like to put on the fluid part? Or, put another way, I think 
it's a problem of what fluid–solid ping-pong paradigm to use, not what package 
to use. One thing is for sure: none of the approaches will be “convenient” to 
make happen.

Using SPH is fine, but I suspect you'll need markers much smaller than DEM 
particles, so limiting the overall problem scale is important. It may face more 
challenges if the Reynolds number is high. Also, it would involve the 
integration between two GPU packages, which is a more serious software 
engineering task, and there might be people who have tried that on 
DualSPHysics' forum. I'd say if you go this route, you are certainly treating 
the fluid part no less seriously than the DEM part, and consulting the 
developers there beforehand is certainly needed.

FVM- or FEA-based CFD solvers are fine too, and I can imagine myself 
building/using a PETSc-based solver for this task. The key would be to update 
the DEM particle-represented boundary (if moving mesh) or track/mark the moving 
boundary-influenced nodes (if immersed boundary), which has very little to do 
with DEM itself — it only needs particle pos/vel info, which DEM-Engine can 
certainly supply. I'd probably recommend an immersed boundary approach for 
reasons I'll give in the LBM-related part. This is also how I imagined 
DEM-Engine users would do fluid co-simulation. As you will have a lot of things 
to do on the host anyway (mesh adjustment, node marking...), you'll use 
DEM-Engine's tracker to bring the information to the host, update the mesh and 
fluid solver, run it, and then feed the fluid force back to DEM-Engine. This 
should position you more as a user of computing packages, rather than a solver 
developer. This approach can be used regardless of whether you think the fluid 
is an emphasis, as you can always choose to use fewer features of the solver to 
make the fluid part easier and faster, or do the opposite. But you probably 
won't modify the fluid solver much, so you may have less flexibility in the 
coding.
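The host-side ping-pong loop described above might look like the sketch below. The classes and method names here (TrackerStub, mark_immersed_nodes, drag_on_particles, and so on) are stand-ins for illustration only, not actual DEM-Engine or CFD-solver API:

```python
# Sketch of a host-mediated DEM-fluid co-simulation loop, with stubs
# standing in for the DEM tracker and the immersed-boundary fluid solver.
import numpy as np

class TrackerStub:
    """Stand-in for a DEM-side tracker exposing particle state on the host."""
    def __init__(self, n):
        self.pos = np.zeros((n, 3))
        self.vel = np.full((n, 3), 0.1)
    def positions(self): return self.pos
    def velocities(self): return self.vel
    def apply_forces(self, f): self._ext_force = f  # placeholder for the DEM step

class FluidSolverStub:
    """Stand-in for an immersed-boundary CFD solver."""
    def mark_immersed_nodes(self, pos, vel): pass   # tag particle-covered nodes
    def advance(self, dt): pass                     # one fluid step
    def drag_on_particles(self, pos): return np.full_like(pos, 1e-3)

dem = TrackerStub(n=100)
fluid = FluidSolverStub()
dt_fluid = 1e-3

for step in range(10):
    # 1. pull particle pos/vel from the DEM solver to the host
    p, v = dem.positions(), dem.velocities()
    # 2. mark the boundary-influenced fluid nodes, then advance the fluid
    fluid.mark_immersed_nodes(p, v)
    fluid.advance(dt_fluid)
    # 3. feed the fluid forces back to the DEM side and let it advance
    dem.apply_forces(fluid.drag_on_particles(p))
```

The point of the structure is that the fluid solver never needs to know anything about DEM internals; it only consumes positions and velocities and returns forces.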

You can also write your own fluid solver, but I think most likely that means 
the fluid is not a main focus of the research you want to present. And if you 
do, like Dan said, I would say LBM is a good choice. I only recently became 
interested in LBM’s usage in related co-simulations. Two main benefits:

  1.  It's fully Eulerian, therefore easy to use alongside DEM, as the DEM 
particles are the only moving part. For the LBM part, you just mark the DEM 
particle-shadowed grid points as solid. It's similar to why I think the 
immersed boundary is better for your use case. The method is also in general 
easy to implement. You can literally ask ChatGPT to write one for you, after 
you read the basics of it.
  2.  It's massively parallel, and should go well with DEM-Engine on GPUs.
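The solid-marking step in item 1 really is simple; here is a minimal sketch on a fixed lattice, with made-up grid dimensions and particle data:

```python
# Mark the lattice nodes "shadowed" by DEM particles as solid, the way an
# LBM (or immersed-boundary) grid would; all numbers here are illustrative.
import numpy as np

nx, ny = 64, 64
x, y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")

# hypothetical DEM particles: (center_x, center_y, radius) in lattice units
particles = [(20.0, 30.0, 5.0), (45.0, 12.0, 8.0)]

solid = np.zeros((nx, ny), dtype=bool)
for cx, cy, r in particles:
    solid |= (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2

# the LBM streaming/collision step would then treat 'solid' nodes with,
# e.g., a bounce-back rule; the mask is rebuilt each step as particles move
print(f"{solid.sum()} of {nx * ny} nodes shadowed by particles")
```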

The downside is that LBM is certainly much less used and appreciated than, say, 
FVM. While it should be very serviceable for you, convincing the community 
might be another issue. You could, of course, implement an FVM solver yourself — 
it's again very doable if you don't aim too high. It really doesn't matter if 
it's fully on GPU and only exchanges device arrays with DEM (“implementing a 
fluid model within DEM itself”), or if it brings data to the host for 
processing; I think in the grand scheme of such an ambitious project, it's a 
minor issue and we can always resolve it later if it matters.

As for the software we can provide: Publishing a GPU-based LBM solver is a 
possibility in the longer term, but you have a PhD to finish, so it doesn't 
seem you can wait for us. You could write it yourself, as making one that is at 
least usable is not too hard. I do have a plan to provide a performance-centric 
FEA/FVM-based fluid solver on GPU relatively soon. If you are going to spend a 
couple of months looking into the DEM model before having to consider fluid, 
then the timing may line up. It should naturally work well in co-simulation 
with DEM-Engine or Chrono, as it comes from the same software family and also 
allows for step-wise advancing of the simulation. However, as it stands, we cannot make 
promises to you about a ready-to-use DEM-capable fluid solution right now.
Let me know if you have further questions,
Ruochun
On Friday, August 22, 2025 at 1:23:00 AM UTC+8 sabry....@gmail.com wrote:
Thank you for your reply,
I was already aware of the possibility offered by Chrono, but I necessarily 
have to continue using DEM-Engine, since my entire master's thesis was developed 
on it. By customizing the CUDA kernels, I was able to implement a thermal model 
and modify the electrostatic one. The goal was to build a comprehensive 
regolith model, not just a mechanical one, and moving to Chrono would mean 
losing this work. For my PhD, I will also need to extend what I have done so 
far to include interactions with plasma, which makes it essential to keep the 
electrostatic model. Thank you again for your response and for the suggestion 
regarding DEM-LBM. I now look forward to any comments from Ruochun as well.

Best regards,
Sabrina
On Thursday, August 21, 2025 at 18:46:13 UTC+2, Dan Negrut wrote:
Sabrina,
In theory, you can do this with the SPH solver in Chrono; hopefully my colleague 
Radu will comment on this. It'd require very long simulation times, because the 
number of SPH particles needed to capture the dynamics of the grains would be 
really large.
Another way to do it is DEM-LBM. Chrono has no support for this, and no plans to 
implement it in the immediate future. The simulation times would probably be very 
long, but it'd be a nice approach. If Ruochun sees this, he might comment on the 
idea.
Lastly, you can homogenize this and represent the regolith–fluid interactions 
through a continuum and then use the CRM solver in Chrono. You’d need to have 
the right material model, which means that you’ll have to go beyond the 
hypo-elastoplastic material model that we have there right now (Drucker-Prager 
plasticity, with no cap).
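For reference, the Drucker-Prager yield surface mentioned above has the standard form

```latex
f(\boldsymbol{\sigma}) = \sqrt{J_2} + \alpha\, I_1 - k \le 0
```

where $I_1 = \operatorname{tr}(\boldsymbol{\sigma})$ is the first stress invariant, $J_2$ is the second invariant of the deviatoric stress, and $\alpha$, $k$ are material constants. "With no cap" means the surface stays open along the hydrostatic axis, so purely hydrostatic compression never yields; a cap model closes it off.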
Dan

From: projec...@googlegroups.com <projec...@googlegroups.com> On Behalf Of 
Sabrina Lanfranco
Sent: Thursday, August 21, 2025 11:36 AM
To: ProjectChrono <projec...@googlegroups.com>
Subject: [chrono] DEM-Engine SPH model integration


Hello everyone,
I am currently using DEM-Engine to model planetary regolith in scenarios 
involving interactions with space exploration objects. I would now like to 
extend this modeling to include the study of regolith–fluid interactions.

In your opinion, what would be the most convenient approach: integrating DEM 
with a solver such as DualSPHysics, or directly implementing a fluid model 
within DEM itself? In both cases, this would require modifications to the DEM 
codebase. That is why I am writing here, hoping to get some feedback from the 
developers: perhaps there is already something undocumented, or maybe you have 
already considered an approach in this direction.
--
You received this message because you are subscribed to the Google Groups 
"ProjectChrono" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to projectchron...@googlegroups.com.
To view this discussion visit 
https://groups.google.com/d/msgid/projectchrono/fa00b521-e8a1-4dd6-be54-7e37358cd230n%40googlegroups.com.

