Is it having problems reading files (i.e., is the process in the D state)?

Matt.
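One way to check the D-state question above is with ps; the snippet below is a minimal sketch (it inspects its own shell's PID so it is runnable as-is; in practice you would substitute the PID of the stuck pipeline or wb_command process):

```shell
#!/bin/sh
# Check whether a process is in the D (uninterruptible sleep) state,
# which usually means it is blocked on disk or network I/O
# (e.g. a slow or hung NFS mount).
# $$ (this shell's own PID) is used so the snippet runs anywhere;
# replace it with the PID of the process you are investigating.
pid=$$
state=$(ps -o stat= -p "$pid" | tr -d ' ')
echo "PID $pid state: $state"
case "$state" in
  D*) echo "uninterruptible sleep: likely waiting on I/O" ;;
  R*) echo "running on CPU" ;;
  S*) echo "interruptible sleep (waiting normally)" ;;
  *)  echo "other state" ;;
esac
```

The one-letter state codes are the same ones top and ps display in their STAT column, so a quick `ps -eo pid,stat,comm | grep ' D'` works as well.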
From: ARMAN PRAFUL KULKARNI <apkulkar...@wisc.edu>
Date: Wednesday, October 10, 2018 at 3:55 PM
To: Matt Glasser <glass...@wustl.edu>, Timothy Coalson <tsc...@mst.edu>
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Error Running HCP's PostFreeSurferPipelineBatch.sh

Hi Matt,

Currently, this process is using 0.7% of the CPU, so CPU load does not appear to be the issue. As for where the pipeline currently is: the output log (PostFreeSurferPipeline.sh.o) shows it has been stuck on the following for the past hour or more:

--------
parsed 'var * -1' as '(var*(-1))'
parsed 'var * -1' as '(var*(-1))'
parsed 'var * -1' as '(var*(-1))'
parsed 'abs(thickness)' as 'abs(thickness)'
parsed 'thickness > 0' as '(thickness>0)'
parsed 'ln(spherereg / sphere) / ln(2)' as '(ln((spherereg/sphere))/ln(2))'
parsed 'ln(var) / ln (2)' as '(ln(var)/ln(2))'
parsed 'ln(var) / ln (2)' as '(ln(var)/ln(2))'
parsed '(atlas + individual) > 0' as '((atlas+individual)>0)'
parsed 'var * -1' as '(var*(-1))'
parsed 'var * -1' as '(var*(-1))'
parsed 'var * -1' as '(var*(-1))'
parsed 'abs(thickness)' as 'abs(thickness)'
parsed 'thickness > 0' as '(thickness>0)'
parsed 'ln(spherereg / sphere) / ln(2)' as '(ln((spherereg/sphere))/ln(2))'
parsed 'ln(var) / ln (2)' as '(ln(var)/ln(2))'
parsed 'ln(var) / ln (2)' as '(ln(var)/ln(2))'
parsed '(atlas + individual) > 0' as '((atlas+individual)>0)'
--------

Also, these are the most recent lines in the error log (PostFreeSurferPipeline.sh.e):

--------
WARNING: name collision in input name 'WM-RH-UNKNOWN', changing one to 'WM-RH-UNKNOWN_1'
reading colortable from annotation file...
colortable with 36 entries read (originally /autofs/space/terrier_001/users/nicks/freesurfer/average/colortable_desikan_killiany.txt)
reading colortable from annotation file...
colortable with 76 entries read (originally /autofs/space/birn_044/users/christophe_atlas_rebuild//scripts_2008/Simple_surface_labels2009.txt)
reading colortable from annotation file...
colortable with 36 entries read (originally /autofs/space/terrier_001/users/nicks/freesurfer/average/colortable_desikan_killiany.txt)
reading colortable from annotation file...
colortable with 76 entries read (originally /autofs/space/birn_044/users/christophe_atlas_rebuild//scripts_2008/Simple_surface_labels2009.txt)
--------

Sincerely,
Arman

________________________________
From: Glasser, Matthew <glass...@wustl.edu>
Sent: Wednesday, October 10, 2018 3:28:28 PM
To: ARMAN PRAFUL KULKARNI; NEUROSCIENCE tim
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Error Running HCP's PostFreeSurferPipelineBatch.sh

What is using the CPU?

Matt.

From: <hcp-users-boun...@humanconnectome.org> on behalf of ARMAN PRAFUL KULKARNI <apkulkar...@wisc.edu>
Date: Wednesday, October 10, 2018 at 3:27 PM
To: Timothy Coalson <tsc...@mst.edu>
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Error Running HCP's PostFreeSurferPipelineBatch.sh

Hi Tim,

I have the updated version of Workbench now and have tried running the pipeline again. PostFreeSurferPipelineBatch.sh has now been running for over 5 hours. To my knowledge, prior to this update it took only around 45 minutes when successful. Is it normal for this version of the pipeline to take this long? Thanks again.
Sincerely,
Arman

________________________________
From: Timothy Coalson <tsc...@mst.edu>
Sent: Tuesday, October 9, 2018 6:13:44 PM
To: ARMAN PRAFUL KULKARNI
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Error Running HCP's PostFreeSurferPipelineBatch.sh

You need to update Workbench to the latest release; that option is a recent addition to the command that is failing.

Tim

On Tue, Oct 9, 2018 at 6:08 PM, ARMAN PRAFUL KULKARNI <apkulkar...@wisc.edu> wrote:

Hi,

I have been running the HCP Pipelines (v3.27.0) on one subject's unprocessed data from the HCP-1200 dataset. Below is what I have in the SetUpHCPPipeline.sh file. I have been able to run PreFreeSurferPipelineBatch.sh and FreeSurferPipelineBatch.sh with no issues. However, when running PostFreeSurferPipelineBatch.sh, I get the following error after around 10 minutes: "ERROR: Unexpected parameter: -local-affine-method", and I am not sure how to proceed. I appreciate any insight into this problem.

Sincerely,
Arman

-------
#!/bin/bash

echo "This script must be SOURCED to correctly setup the environment prior to running any of the other HCP scripts contained here"

# Set up FSL (if not already done so in the running environment)
# Uncomment the following 2 lines (remove the leading #) and correct the FSLDIR setting for your setup
#export FSLDIR=/usr/share/fsl/5.0
#. ${FSLDIR}/etc/fslconf/fsl.sh

# Let FreeSurfer know what version of FSL to use
# FreeSurfer uses FSL_DIR instead of FSLDIR to determine the FSL version
export FSL_DIR="${FSLDIR}"

# Set up FreeSurfer (if not already done so in the running environment)
# Uncomment the following 2 lines (remove the leading #) and correct the FREESURFER_HOME setting for your setup
#export FREESURFER_HOME=/usr/local/bin/freesurfer
#source ${FREESURFER_HOME}/SetUpFreeSurfer.sh > /dev/null 2>&1

# Set up specific environment variables for the HCP Pipeline
export HCPPIPEDIR=${HOME}/xxxx/HCP/Pipelines
#export CARET7DIR=${HOME}/tools/workbench/bin_rh_linux64
export CARET7DIR=/usr/local/workbench/bin_linux64
export HCPPIPEDIR_Templates=${HCPPIPEDIR}/global/templates
export HCPPIPEDIR_Bin=${HCPPIPEDIR}/global/binaries
export HCPPIPEDIR_Config=${HCPPIPEDIR}/global/config
export HCPPIPEDIR_PreFS=${HCPPIPEDIR}/PreFreeSurfer/scripts
export HCPPIPEDIR_FS=${HCPPIPEDIR}/FreeSurfer/scripts
export HCPPIPEDIR_PostFS=${HCPPIPEDIR}/PostFreeSurfer/scripts
export HCPPIPEDIR_fMRISurf=${HCPPIPEDIR}/fMRISurface/scripts
export HCPPIPEDIR_fMRIVol=${HCPPIPEDIR}/fMRIVolume/scripts
export HCPPIPEDIR_tfMRI=${HCPPIPEDIR}/tfMRI/scripts
export HCPPIPEDIR_dMRI=${HCPPIPEDIR}/DiffusionPreprocessing/scripts
export HCPPIPEDIR_dMRITract=${HCPPIPEDIR}/DiffusionTractography/scripts
export HCPPIPEDIR_Global=${HCPPIPEDIR}/global/scripts
export HCPPIPEDIR_tfMRIAnalysis=${HCPPIPEDIR}/TaskfMRIAnalysis/scripts
export MSMBINDIR=${HCPPIPEDIR}/global/templates/MSMAll
-------

_______________________________________________
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users

________________________________
The materials in this message are private and may contain Protected Healthcare Information or other information of a sensitive nature. If you are not the intended recipient, be advised that any unauthorized use, disclosure, copying or the taking of any action in reliance on the contents of this information is strictly prohibited. If you have received this email in error, please immediately notify the sender via telephone or return mail.
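Given Tim's advice to update Workbench, a quick sanity check is to confirm which wb_command binary the pipeline will pick up and what version it is. This is a minimal sketch: the default path comes from the setup script in the thread, and the "not found" message is our own, not pipeline output.

```shell
#!/bin/sh
# Report which Connectome Workbench binary the HCP scripts will invoke
# and print its version. CARET7DIR mirrors the variable used in
# SetUpHCPPipeline.sh; the fallback path below is the one from that script.
CARET7DIR="${CARET7DIR:-/usr/local/workbench/bin_linux64}"
if [ -x "$CARET7DIR/wb_command" ]; then
    "$CARET7DIR/wb_command" -version
else
    echo "wb_command not found or not executable in $CARET7DIR"
fi
```

If the reported version predates the release that introduced -local-affine-method, the "Unexpected parameter" error in the thread is expected, and updating Workbench (and CARET7DIR, if the new install lives elsewhere) should resolve it.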
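The setup script's warning that it "must be SOURCED" matters for the environment variables it exports. The following self-contained demonstration (using a throwaway stand-in script in /tmp, not the real SetUpHCPPipeline.sh) shows why executing it in a child shell does not work:

```shell
#!/bin/sh
# Demonstrate why SetUpHCPPipeline.sh must be sourced ("."), not executed:
# exports made in a child shell vanish when that shell exits, while
# sourcing runs the script inside the current shell.
demo=/tmp/demo_setup.$$.sh
printf 'export HCPPIPEDIR=/opt/HCP/Pipelines\n' > "$demo"

unset HCPPIPEDIR
sh "$demo"                                  # executed in a child shell
echo "after executing: HCPPIPEDIR='${HCPPIPEDIR:-unset}'"   # prints 'unset'

. "$demo"                                   # sourced into this shell
echo "after sourcing:  HCPPIPEDIR='${HCPPIPEDIR}'"  # prints '/opt/HCP/Pipelines'
rm -f "$demo"
```

The same logic applies to FSLDIR and FREESURFER_HOME: if any of them are empty when the batch scripts run, sourcing was likely skipped.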