Hi Sumaya,

About ParaView MPI, what I meant is that, if memory is the issue, you can
consider running pvserver on a high-RAM (potentially multi-node) server,
then connecting the ParaView client to it (via File > Connect, or
something like paraview --server-url=cs://host:11111) to view the
remotely rendered scenes. That way you are not bound by your local
machine's memory. You will have to work out this workflow yourself,
though.
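In case it helps, here is a minimal, untested sketch of that
client/server workflow. The host name, port, and file path below are
placeholders, and pvserver needs an MPI-enabled build for the multi-node
case:

    # On the remote server (shell), something like:
    #   mpirun -np 8 pvserver --server-port=11111
    # Then, from pvpython on your local machine (or use File > Connect in
    # the ParaView GUI):
    from paraview.simple import *

    Connect("remote-server", 11111)                   # placeholder host/port
    data = OpenDataFile("/path/to/wheel_output.vtk")  # placeholder path
    Show(data)
    Render()

With this setup the heavy data and the rendering stay on the server, and
only the rendered images are shipped back to your client.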
I am not familiar with rendering CSV input in a Houdini environment, if
that is what you are contemplating, but I would not expect any problems
with CSV, as it is just a general-purpose format.

Thank you,
Ruochun

On Wednesday, October 29, 2025 at 4:18:18 AM UTC+8 sumaya wrote:

> Hi Ruochun,
>
> Thank you for your response. Could you elaborate on your answer
> regarding the use of a partitioned view or MPI?
> Have the files (CSV / VTK) ever been tested in a Houdini environment?
>
> Thanks
>
> On Wednesday, October 15, 2025 at 8:17:56 AM UTC-4 Ruochun Zhang wrote:
>
>> Hi Sumaiya,
>>
>> Apparently I meant GRC-1, not GRC-3, in the previous message. It would
>> be interesting to develop a GRC-3 representation though!
>>
>> Ruochun
>>
>> On Wednesday, October 15, 2025 at 8:09:45 PM UTC+8 Ruochun Zhang wrote:
>>
>>> Hi Sumaiya,
>>>
>>> 1. I don't know about the 58 GB figure; it sounds a bit larger than I
>>> would expect, but you could be right. When you visualize it in
>>> ParaView, consider visualizing the points only, without generating
>>> glyphs; or, if you would like glyphs, keep the mode set to "All
>>> Points" but reduce the Theta and Phi resolution to maybe 3 or so,
>>> which will greatly reduce the memory used (a pvpython sketch of these
>>> settings follows the quoted thread). However, if the output size
>>> keeps growing, at some point you will have to use some sort of MPI or
>>> partitioned view.
>>>
>>> 2. Sorry for the lack of comments. Part3 is still about preparing a
>>> GRC-3 particle bed, and it requires finishing Part2 first. It makes
>>> several copies of the results from Part2, puts them side by side to
>>> form a larger material patch, and then lets gravity do the work.
>>> After settling, it compresses the resultant material bed so the top
>>> is a bit more even/flat, then saves the settled material to a file.
>>> Note that this file is very large, representing a 4 m × 2 m soil bin,
>>> big enough for a rover to run on. If you don't need a test
>>> environment this big, you can safely ignore Part3.
>>>
>>> Thank you,
>>> Ruochun
>>>
>>> On Wednesday, October 15, 2025 at 1:48:33 AM UTC+8 sumaya wrote:
>>>
>>>> Dear Chrono Users,
>>>>
>>>> *Part 1: ParaView*
>>>>
>>>> I would like to ask about your experience running DEM simulations in
>>>> Chrono. I have successfully generated files for a wheel drawbar-pull
>>>> simulation using the Chrono DEM engine. The simulation contains
>>>> close to a million particles, as mentioned in the comments of the
>>>> wheel_DP.exe file, and the total file size is around 58 GB.
>>>>
>>>> However, when I try to visualize the results in ParaView, after
>>>> applying all the recommended filters from the GitHub documentation,
>>>> ParaView becomes unresponsive. I have also tried running ParaView on
>>>> the compute-node cluster, but I encounter the same issue. The crash
>>>> specifically occurs when I change the glyph mode to "All Points."
>>>>
>>>> Do you have any suggestions on how to fix this issue, or can you
>>>> recommend alternative software that can handle and visualize
>>>> millions of particles without crashing?
>>>>
>>>> *Part 2: GRCPrep Files*
>>>>
>>>> I noticed that the DEMdemo_GRCPrep_Part3.cpp file does not contain
>>>> comments, and I would like to better understand its purpose.
>>>> Currently, the WheelDP executable takes the output of
>>>> DEMdemo_GRCPrep_Part2 as input. Could someone explain the difference
>>>> between using Part 2 and Part 3?
>>>>
>>>> Thank you,
>>>> Sumaiya
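As an aside for anyone scripting this: the glyph settings suggested
earlier in the thread can also be applied from pvpython. This is only a
sketch; the CSV path and the x/y/z column names are assumptions and must
be matched to your DEM output:

    from paraview.simple import *

    # Load a particle CSV and turn its rows into points.
    # Column names are assumptions; match them to your file's header.
    reader = CSVReader(FileName=["/path/to/particles.csv"])
    points = TableToPoints(Input=reader)
    points.XColumn = "x"
    points.YColumn = "y"
    points.ZColumn = "z"

    # Glyph in "All Points" mode, but with very coarse spheres
    # (Theta/Phi resolution of 3) to keep memory usage down.
    glyph = Glyph(Input=points, GlyphType="Sphere")
    glyph.GlyphMode = "All Points"
    glyph.GlyphType.ThetaResolution = 3
    glyph.GlyphType.PhiResolution = 3

    Show(glyph)
    Render()

Showing the points directly (skipping the Glyph filter entirely) is
cheaper still, as noted above.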
