find it.
Thanks again,
Qinqin Lee
On 03/20/2017 20:03, Glasser, Matthew <glass...@wustl.edu> wrote:
In individual subjects, you can use the midthickness surface coordinates as the
3D coordinates.
Peace,
Matt.
From: <hcp-users-boun...@humanconnectome.org> on behalf of
Dear HCP teams,
I am trying to threshold the surface data in a *.dscalar.nii file, but after thresholding I don't know how to link the data remaining in the matrix to their spatial coordinates.
Is there a matrix that links each vertex index in the data matrix to its spatial location? If there is
https://github.com/nipy/nibabel
Tim
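Matt's suggestion above can be sketched in Python with nibabel (the library Tim links): load the midthickness `*.surf.gii` to get per-vertex XYZ coordinates, threshold the dscalar values, and index the coordinates with the surviving vertex indices. The file names in the comments and the demo values below are illustrative, not actual HCP paths; a minimal sketch under those assumptions:

```python
import numpy as np

# With nibabel the real loading step would look roughly like this
# (file names are hypothetical):
#   import nibabel as nib
#   scalars = nib.load("sub.dscalar.nii").get_fdata()[0]
#   coords  = nib.load("sub.L.midthickness.surf.gii").darrays[0].data
def vertices_above_threshold(scalars, coords, thresh):
    """Return (vertex indices, xyz coordinates) where scalars > thresh."""
    idx = np.where(scalars > thresh)[0]
    return idx, coords[idx]

# Synthetic demo: 5 vertices with made-up values and coordinates.
scalars = np.array([0.1, 0.9, 0.4, 1.2, 0.0])
coords = np.arange(15, dtype=float).reshape(5, 3)  # one (x, y, z) per vertex
idx, xyz = vertices_above_threshold(scalars, coords, 0.5)
```

One caveat: a dscalar row spans both hemispheres plus subcortex, so the column index is not directly a surface vertex number; in nibabel the brain-model axis (`img.header.get_axis(1)`) records which vertex each column maps to.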
On Mon, Mar 20, 2017 at 8:39 AM, Irisqql0922 <irisqql0...@163.com> wrote:
Sorry, I thought Matt had misunderstood my words. OK, I will check the file
*midthickness*.surf.gii.
Thank you Donna, and thank you Matt^_^
Qinqin Lee
On 03/20/2017 21:25, Di
Dear HCP teams,
I need the head motion parameters (e.g., framewise displacement, FD) for my task fMRI analysis, and data on subjects' brain size to analyze the structural data (both will be used as regressors).
So I wonder whether there is any file on AWS or ConnectomeDB that includes this information
(I cannot
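As background for the FD request above: framewise displacement is commonly computed Power-style from the six rigid-body motion parameters. A minimal sketch, assuming translations in mm and rotations already in radians (HCP's motion regressor files reportedly store rotations in degrees, so convert first); the demo array is synthetic:

```python
import numpy as np

def framewise_displacement(motion, radius=50.0):
    """Power-style framewise displacement.

    motion: (T, 6) array, columns = 3 translations (mm) followed by
    3 rotations (radians). Rotations become arc length on a sphere of
    `radius` mm (50 mm is the conventional choice).
    """
    d = np.abs(np.diff(motion, axis=0))            # backward differences
    d[:, 3:] *= radius                             # radians -> mm of displacement
    return np.concatenate([[0.0], d.sum(axis=1)])  # first frame has FD = 0

demo = np.zeros((3, 6))
demo[1, 0] = 1.0   # 1 mm translation between frames 0 and 1
demo[2, 3] = 0.1   # frame 2: translation returns to 0, rotation moves 0.1 rad
fd = framewise_displacement(demo)
```

Frame 2's FD sums the 1 mm translation back plus 0.1 rad x 50 mm = 5 mm of rotational displacement.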
Dear hcp teams,
I am sorry to bother you again with the same problem.
I used the default options and mounted the data successfully. But when I checked
/s3/hcp, I found that it contains only 900 subjects. Obviously, it is not the
latest 1200-subject release.
Since I want to analyse the latest version of
Dear HCP teams,
I am now trying to use Amazon S3 on NITRC-CE, and I am stuck mounting the
data from the hcp-openaccess bucket to my instance. After I filled in the blanks, the
system told me I had mounted it successfully. But when I check the folder through the
terminal, it shows nothing.
I
unting it in the terminal like "sudo mount
/s3mnt" and see what error you get. You may have entered the authentication
incorrectly, or requested a new authentication token, but then entered the old,
expired one.
Tim
On Tue, May 9, 2017 at 2:12 AM, Irisqql0922 <irisqql0...@163.co
CP_900 listing works, but the HCP_1200 listing does not, then we will
need to arrange for you to get different credentials.
Tim
On 05/15/2017 08:48 AM, Irisqql0922 wrote:
Dear hcp teams,
I am sorry to bother you again with the same problem.
I used the default options and mounted the data successfully. But
:48 AM, Irisqql0922 <irisqql0...@163.com> wrote:
...
I use command:
: > ~/.passwd-s3fs
If this is really the command you used, then it wouldn't work: you need "echo"
at the start of it, like this:
echo : > ~/.passwd-s3fs
Please look at the file's contents with somethi
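For context on the echo fix above: the `~/.passwd-s3fs` file that s3fs reads must contain a single `ACCESS_KEY_ID:SECRET_ACCESS_KEY` line and must not be readable by other users. A sketch with obviously fake credentials (substitute the real AWS key pair):

```shell
# Fake example credentials; replace with the real ACCESS_KEY_ID:SECRET_ACCESS_KEY.
echo 'AKIAFAKEKEYID:fakeSecretAccessKey123' > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs   # s3fs rejects credential files with open permissions
cat ~/.passwd-s3fs         # sanity-check the contents
```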
orking for both the 900 subjects release and the
1200 subjects release.
If the HCP_900 listing works, but the HCP_1200 listing does not, then we will
need to arrange for you to get different credentials.
Tim
On 05/15/2017 08:48 AM, Irisqql0922 wrote:
Dear hcp teams,
I am sorry to bother you
Dear HCP teams,
I notice that you have run the ICA-FIX denoising process on the rfMRI data, but it
seems there is no such process for task fMRI.
I am now working on the WM (working memory) task data and want to find the related
SNR data for some further analysis. So I wonder whether there is any file containing SNR values
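If no precomputed SNR file turns up, a common fallback is temporal SNR (tSNR) computed directly from the task time series: the per-vertex mean over time divided by the standard deviation over time. A minimal numpy sketch (the demo array is synthetic, not HCP data):

```python
import numpy as np

def tsnr(timeseries):
    """Temporal SNR: mean/std over the time axis for each vertex/voxel.

    timeseries: (timepoints, vertices) array.
    """
    mean = timeseries.mean(axis=0)
    std = timeseries.std(axis=0)
    # Guard against constant time series (std == 0) to avoid dividing by zero.
    return np.divide(mean, std, out=np.zeros_like(mean), where=std > 0)

# Synthetic 3-timepoint, 2-vertex demo; vertex 1 is constant over time.
demo = np.array([[100.0, 10.0],
                 [102.0, 10.0],
                 [ 98.0, 10.0]])
values = tsnr(demo)
```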
Laboratory of Cognitive Neuroscience and Learning
Irisqql0922
irisqql0...@163.com
___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users