You could try running a wb_command operation such as smoothing on any dense timeseries dataset; if everything is working correctly, it should use multiple cores.
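A minimal sketch of that kind of test, assuming an HCP-style subject layout (the subject ID, run name, smoothing sigmas, and output filename below are placeholders, not from this thread):

    # Watch CPU usage (e.g. with top or htop) while this runs; it should load several cores.
    wb_command -cifti-smoothing \
        100307/MNINonLinear/Results/rfMRI_REST1_LR/rfMRI_REST1_LR_Atlas.dtseries.nii \
        4 4 COLUMN smoothed_test.dtseries.nii \
        -left-surface 100307/MNINonLinear/fsaverage_LR32k/100307.L.midthickness.32k_fs_LR.surf.gii \
        -right-surface 100307/MNINonLinear/fsaverage_LR32k/100307.R.midthickness.32k_fs_LR.surf.gii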
Peace,
Matt

From: [email protected] on behalf of Gengyan Zhao <[email protected]>
Date: Monday, May 23, 2016 at 10:57 AM
To: "[email protected]" <[email protected]>
Subject: [HCP-Users] Questions about Running Diffusion Preprocessing in Parallel

Hello HCP Masters,

I have a question about "DiffusionPreprocessingBatch.sh". I want to run it in a multi-threaded manner and use as many cores as possible. There is a line in the script saying:

#Assume that submission nodes have OPENMP enabled (needed for eddy - at least 8 cores suggested for HCP data)

What shall I do to enable OpenMP, or is OpenMP ready to go? Currently the pipeline has just been run with SGE on an Ubuntu 14.04 machine with 32 cores.

Thank you very much.

Best,
Gengyan
Research Assistant
Medical Physics, UW-Madison
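As background on the OpenMP part of the question: the thread count for OpenMP-enabled binaries (e.g. FSL's eddy_openmp) is normally controlled by the OMP_NUM_THREADS environment variable, and under SGE a parallel environment request reserves the matching number of slots on one node. A rough sketch; the parallel environment name "smp" and the core count of 8 are site-specific examples, not taken from this thread:

    # Reserve 8 slots on a single node (the PE name depends on your SGE setup):
    #   qsub -pe smp 8 <job_script>
    # Inside the job script, let OpenMP-enabled programs use those cores:
    export OMP_NUM_THREADS=8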
