Yes, that is particularly true when using the latest version of the pipelines.
There are also files in T2w and T1w that could be deleted, but deleting them
will not save as much space as Mike’s suggestion.
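
If you want to see where the space is actually going before deleting anything,
a quick sketch along these lines can help (the paths are illustrative):

    # Inspect per-entry disk usage in the structural directories before
    # deciding what to delete. $subj is a placeholder subject directory.
    subj=/path/to/100307
    du -sh "$subj"/T1w/* "$subj"/T2w/* 2>/dev/null | sort -rh | head -n 20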

Peace,

Matt.

From: "Harms, Michael" <mha...@wustl.edu<mailto:mha...@wustl.edu>>
Date: Wednesday, February 21, 2018 at 12:18 PM
To: "Cook, Philip" 
<coo...@pennmedicine.upenn.edu<mailto:coo...@pennmedicine.upenn.edu>>, 
"hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
<hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>>
Cc: Matt Glasser <glass...@wustl.edu<mailto:glass...@wustl.edu>>
Subject: Re: [HCP-Users] Cleaning up intermediate files from the minimal 
pre-processing pipelines


Hi,
While the documentation is overall very good, I don’t know if I’d rely on that
PDF for a detailed list of all the files that we recommend “keeping”. For that,
you could download and unpack the packages for a subject with complete data
(e.g., 100307) and see what you get.

As a relatively simple clean-up, I *think* that if you keep the entire
contents of $subj/T1w and $subj/MNINonLinear, you’ll have most of what you need
for any further downstream processing, while achieving substantial space
savings. That is, most of the intermediates in the fMRI processing end up in
the $subj/$task directories, and I think that any that have been deemed
important (e.g., .native.func.gii) have been copied to
$subj/MNINonLinear/Results/$task. @Matt: Can you confirm that?
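
In the meantime, one way to spot-check that on your own data is to look for
intermediate files that lack a same-named counterpart under Results. A minimal
sketch (it assumes the copied files keep their basenames, which is worth
verifying):

    # Report any .native.func.gii files in the intermediate fMRI directory
    # that have no same-named copy under MNINonLinear/Results.
    # $subj and $task are placeholders; adjust the pattern as needed.
    subj=/path/to/100307
    task=rfMRI_REST1_LR
    for f in "$subj/$task"/*.native.func.gii; do
        [ -e "$f" ] || continue         # glob matched nothing
        base=$(basename "$f")
        if [ ! -e "$subj/MNINonLinear/Results/$task/$base" ]; then
            echo "not in Results: $base"
        fi
    done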

E.g., for a subject from the HCP Young Adult study, the output from the MPP of
a single REST run (e.g., $subj/MNINonLinear/Results/rfMRI_REST1_LR) is about
3.7 GB, whereas the contents of $subj/rfMRI_REST1_LR are 28 GB.
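
So a rough sketch of the clean-up itself might look like the following (a dry
run by default; the keep-list is an assumption you should adjust for your own
processing):

    # Remove per-task intermediate directories at the top level of the
    # subject directory, keeping T1w, T2w, and MNINonLinear.
    # The echo makes this a dry run; drop it once the output looks right.
    subj=/path/to/100307
    for d in "$subj"/*/; do
        case "$(basename "$d")" in
            T1w|T2w|MNINonLinear) ;;    # keep
            *) echo rm -rf "$d" ;;      # prints what would be removed
        esac
    done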

Cheers,
-MH

--
Michael Harms, Ph.D.
-----------------------------------------------------------
Conte Center for the Neuroscience of Mental Disorders
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.                        Tel: 314-747-6173
St. Louis, MO  63110                         Email: mha...@wustl.edu

From: <hcp-users-boun...@humanconnectome.org> on behalf of "Cook, Philip"
<coo...@pennmedicine.upenn.edu>
Date: Wednesday, February 21, 2018 at 11:49 AM
To: "hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>" 
<hcp-users@humanconnectome.org<mailto:hcp-users@humanconnectome.org>>
Subject: [HCP-Users] Cleaning up intermediate files from the minimal 
pre-processing pipelines

Hi,

I am trying to reduce disk usage after running the HCP minimal pre-processing 
pipelines. I would like to clean up intermediate files but retain things needed 
for ongoing analysis. As a reference, I have found a list of file names in

    WU-Minn HCP 900 Subjects Data Release: Reference Manual
    Appendix III - File Names and Directory Structure for 900 Subjects Data
    
https://www.humanconnectome.org/storage/app/media/documentation/s900/HCP_S900_Release_Appendix_III.pdf

I would like to retain these and clean up the remainder of the output. Are 
there any scripts available to help with this?


Thanks

_______________________________________________
HCP-Users mailing list
HCP-Users@humanconnectome.org<mailto:HCP-Users@humanconnectome.org>
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
