It is also worth noting that uncompressed data takes the same amount of space 
in RAM as it does on disk.
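A rough back-of-envelope consistent with this point (a sketch only: it assumes 
the standard 91282 HCP grayordinates stored as 4-byte floats, and the 1200 GB 
merged file mentioned later in this thread) shows why a 300 GB node is not 
enough:

```python
# Illustrative memory estimate for -cifti-correlation on the merged file
# discussed in this thread.  Assumptions (not from the original emails):
# 91282 grayordinates (standard HCP CIFTI count), 4-byte float storage.

GB = 1024 ** 3

input_bytes = 1200 * GB                  # merged 700-subject timeseries, per the thread
grayordinates = 91282                    # assumed standard HCP grayordinate count
output_bytes = grayordinates ** 2 * 4    # dense correlation matrix (the .dconn)

print(f"input  ~ {input_bytes / GB:.0f} GB")   # ~1200 GB
print(f"output ~ {output_bytes / GB:.1f} GB")  # ~31 GB
print(f"total  ~ {(input_bytes + output_bytes) / GB:.0f} GB")
```

Under those assumptions the output dconn itself is only ~31 GB; it is the 
uncompressed input timeseries that dominates the memory requirement.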

Peace,

Matt.

From: <[email protected]> on behalf of "Harms, Michael" <[email protected]>
Date: Friday, February 24, 2017 at 9:00 AM
To: Xavier Guell Paradis <[email protected]>, [email protected]
Subject: Re: [HCP-Users] Memory required for -cifti-correlation of 700 subjects?


Hi,
I seriously doubt that a dense connectome from those 787 subjects is going to 
differ in any meaningful way from the one that we’ve already computed using the 
820 subjects with complete rfMRI data.  So, I’d just use what we’ve provided.  
If you do want exact correspondence in the subject groups, you’ll need to do 
the computation in the manner outlined in that documentation.

cheers,
-MH

--
Michael Harms, Ph.D.
-----------------------------------------------------------
Conte Center for the Neuroscience of Mental Disorders
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.   Tel: 314-747-6173
St. Louis, MO 63110     Email: [email protected]

From: <[email protected]> on behalf of Xavier Guell Paradis <[email protected]>
Date: Friday, February 24, 2017 at 8:55 AM
To: Michael Harms <[email protected]>, [email protected]
Subject: Re: [HCP-Users] Memory required for -cifti-correlation of 700 subjects?

Hi Michael,
Thank you for your message. I am interested in calculating resting-state 
functional connectivity in a group including only the subjects who completed 
all tasks (n=787). This is why I would like to generate a new dconn file.
Thank you,
Xavier.
________________________________
From: Harms, Michael [[email protected]]
Sent: Friday, February 24, 2017 9:44 AM
To: Xavier Guell Paradis; [email protected]
Subject: Re: [HCP-Users] Memory required for -cifti-correlation of 700 subjects?


Hi,
Let’s step back.  Why can’t you use the group dense connectome that we’ve 
already computed and provided?

As noted in our documentation
(https://www.humanconnectome.org/documentation/S900/820_Group-average_rfMRI_Connectivity_December2015.pdf),
computing the dense connectome optimally is not trivial (and involves quite a 
bit more than a single -cifti-correlation operation).

cheers,
-MH

--
Michael Harms, Ph.D.
-----------------------------------------------------------
Conte Center for the Neuroscience of Mental Disorders
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.   Tel: 314-747-6173
St. Louis, MO 63110     Email: [email protected]

From: <[email protected]> on behalf of Xavier Guell Paradis <[email protected]>
Date: Friday, February 24, 2017 at 8:30 AM
To: [email protected]
Subject: [HCP-Users] Memory required for -cifti-correlation of 700 subjects?

Dear HCP experts,
After demeaning and merging the resting-state files of 700 subjects (resulting 
in a 1200 GB file), I would like to run -cifti-correlation to get a .dconn 
file. I am using a computational cluster, and even on a node with 300 GB of 
memory the command fails (I get the message: "Exceeded job memory limit, Job 
step aborted: waiting up to 32 seconds for job step to finish"). I have tried 
running without -mem-limit, as well as with a -mem-limit as low as 5, and I 
still get the same message.

Do you know if it is possible to use -cifti-correlation on such a large file 
(700 subjects merged), and if so, how much memory would be required?

Thank you very much,
Xavier.

_______________________________________________
HCP-Users mailing list
[email protected]
http://lists.humanconnectome.org/mailman/listinfo/hcp-users

________________________________
The materials in this message are private and may contain Protected Healthcare 
Information or other information of a sensitive nature. If you are not the 
intended recipient, be advised that any unauthorized use, disclosure, copying 
or the taking of any action in reliance on the contents of this information is 
strictly prohibited. If you have received this email in error, please 
immediately notify the sender via telephone or return mail.
