Hi again, community! I have been testing ParaView 5.2.0 in CAVE mode with fairly large data (1+ billion points) while remotely connected to some HPC servers.
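For reference, my CAVE setup is driven by a .pvx configuration file passed to pvserver; it is roughly along these lines (the machine names and wall coordinates below are placeholders, not my actual values):

```xml
<?xml version="1.0" ?>
<pvx>
  <Process Type="client" />
  <Process Type="server">
    <!-- One Machine entry per display wall / pvserver rank.
         LowerLeft/LowerRight/UpperRight give the wall corners
         in physical coordinates. -->
    <Machine Name="cave-node-left"
             Environment="DISPLAY=:0"
             LowerLeft="-1 -1 -1"
             LowerRight="-1 -1  1"
             UpperRight="-1  1  1" />
    <Machine Name="cave-node-front"
             Environment="DISPLAY=:0"
             LowerLeft="-1 -1  1"
             LowerRight=" 1 -1  1"
             UpperRight=" 1  1  1" />
  </Process>
</pvx>
```

Each Machine entry corresponds to one pvserver rank, so for a file like this the server side would be launched with something like `mpirun -np 2 pvserver cave.pvx`.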
PROBLEM: In CAVE mode, I get the error below when I try to change the representation from "Outline" to "Points". The same issue occurs when doing other things as well, like applying a Glyph (Sphere) filter.

ERROR:
terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc

The issue does not occur outside CAVE mode, i.e., when I simply connect a ParaView client to the HPC servers and don't pass any pvx file.

I have read some similar online discussions pointing towards memory issues, but that is hard for me to believe given that a) I have hundreds of gigabytes of memory and most of it remains free even with my big data loaded, and b) the issue doesn't occur when not in CAVE mode.

Has anyone experienced similar issues? Thanks for all the help!

--
Faiz Abidi | Master's Student at Virginia Tech | www.faizabidi.com | +1-540-998-6636
_______________________________________________
Powered by www.kitware.com

Visit other Kitware open-source projects at http://www.kitware.com/opensource/opensource.html

Please keep messages on-topic and check the ParaView Wiki at: http://paraview.org/Wiki/ParaView

Search the list archives at: http://markmail.org/search/?q=ParaView

Follow this link to subscribe/unsubscribe: http://public.kitware.com/mailman/listinfo/paraview
