Re: [HCP-Users] hcp_fix bug

2017-03-22 Thread Glasser, Matthew
Please include the error message; otherwise I don’t know what you are referring to.

Peace,

Matt.

From:  on behalf of Timothy Hendrickson
Date: Wednesday, March 22, 2017 at 5:17 PM
To: "hcp-users@humanconnectome.org"
Subject: [HCP-Users] hcp_fix bug

I am getting errors very similar to Nallin's. Has this been resolved?


Timothy Hendrickson
Department of Psychiatry
University of Minnesota
Bioinformatics and Computational Biology M.S. Candidate
Office: 612-624-6441
Mobile: 507-259-3434 (texts okay)

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users



___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Question regarding Twins in HCP

2017-03-22 Thread Yann Le Guen
HI Jenn,

Thank you for the detailed and clear reply. Looking forward to the next update.

Regards, Yann

> On 22 Mar 2017, at 22:59, Elam, Jennifer  wrote:
> 
> Hi Yann and Briana,
> Due to discussions right before the release, Pedigree_ID was removed from the 
> released data because it was deemed confusing, being based solely on 
> self-reported family relatedness data. The S1200 Reference Manual that was 
> originally put on the website was not the most up-to-date version (which had 
> Pedigree_ID removed), but we have now fixed that. 
> 
> We are working on getting a "FamilyID" measure back into the database and 
> Reference Manual next week; this number will list all MotherIDs and FatherIDs 
> associated with subjects who are siblings, so that families with half-siblings 
> will be easily identified. 
> 
> MotherIDs and FatherIDs are all biological parents, either confirmed by 
> genetics if we have genotyping for a subject and their siblings, or 
> self-reported biological parents if not. FamilyID is constructed entirely from 
> this biological-parent information. Whether a subject has genotyping data or 
> not will also be released as a "HasGT" variable.
> 
> We did not ask subjects about who they grew up with or their household 
> environment specifically, so unfortunately there's not a way to look at 
> shared environmental influence with the data we have.
> 
> Best,
> Jenn
> 
> 
> Jennifer Elam, Ph.D.
> Scientific Outreach, Human Connectome Project
> Washington University School of Medicine
> Department of Neuroscience, Box 8108
> 660 South Euclid Avenue
> St. Louis, MO 63110
> 314-362-9387 
> e...@wustl.edu 
> www.humanconnectome.org 
> From: Yann Le Guen
> Sent: Monday, March 6, 2017 6:13:38 AM
> To: Elam, Jennifer
> Cc: hcp-users@humanconnectome.org 
> Subject: Re: [HCP-Users] Question regarding Twins in HCP
>  
> Dear All,
> 
> Another follow-up question regarding the family structure of the HCP data. The 
> S1200 reference manual mentions (quote below) a Pedigree_ID field, which seems 
> to be missing from the current "Restricted Data" csv file; or is it somewhere 
> else?
> 
> In addition, apart from this field, is there any other field confirming that 
> the individuals share the same household environment?
> 
> Best regards, Yann
> 
> "With the S1200 release, we have now verified the self reports for subjects 
> for which we have
> GWAS genetic data. We have updated these measures to Twin_Stat (original self 
> report),
> ZygositySR (original self-reported zygosity), ZygosityGT (twin zygosity 
> verified by genotyping),
> Pedigree_ID (new, family ID based on original, self-reported Mother_ID and 
> Father_ID),
> Mother_ID (verified by genotyping, if available), and Father_ID (verified by 
> genotyping, if
> available). Therefore, for some subjects, Mother_ID or Father_ID has changed 
> from what they
> were in previous releases based on the genotyping data."
> 
> https://www.humanconnectome.org/documentation/S1200/HCP_S1200_Release_Reference_Manual.pdf
>  
> 
> 
> 2017-03-03 16:37 GMT+01:00 Elam, Jennifer:
> Hi Ben,
> As described in the S1200 Reference Manual, if the column ZygosityGT is blank 
> for a set of twins, we are missing genetic material/GWAS data on one or both 
> of the twin pair, so unfortunately we don't have more data to share on the 64 
> twin pairs that do not have that measure. 
> 
> Best,
> Jenn
> 
> Jennifer Elam, Ph.D.
> Scientific Outreach, Human Connectome Project
> Washington University School of Medicine
> Department of Neuroscience, Box 8108
> 660 South Euclid Avenue
> St. Louis, MO 63110
> 314-362-9387 
> e...@wustl.edu 
> www.humanconnectome.org 
> From: Benjamin Risk
> Sent: Friday, March 3, 2017 9:16:56 AM
> To: Elam, Jennifer
> Cc: D. van der Linden; hcp-users@humanconnectome.org 
> 
> Subject: Re: [HCP-Users] Question regarding Twins in HCP
>  
> I have a follow-up question on ZygosityGT. In the new data release, there 
> appear to be 64 twin pairs without zygosity verified by genotyping data (and 
> 244 with ZygosityGT). Are there plans to make genotyping-based zygosity 
> available for these individuals? 
> 
> Thank you,
> Ben 
> 
> On Tue, Feb 28, 2017 at 11:56 AM, Elam, Jennifer wrote:
> Hi Dimitri,
> Since your message was in response to an offline thread, for the benefit of 
> the HCP-Users list, let me explain what you are referencing regarding 
> dizygotic (DZ) twins that were "mislabeled" as monozygotic (MZ). 

[HCP-Users] hcp_fix bug

2017-03-22 Thread Timothy Hendrickson
I am getting errors very similar to Nallin's. Has this been resolved?


Timothy Hendrickson
Department of Psychiatry
University of Minnesota
Bioinformatics and Computational Biology M.S. Candidate
Office: 612-624-6441
Mobile: 507-259-3434 (texts okay)

___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] Question regarding Twins in HCP

2017-03-22 Thread Elam, Jennifer
Hi Yann and Briana,

Due to discussions right before the release, Pedigree_ID was removed from the 
released data because it was deemed confusing, being based solely on 
self-reported family relatedness data. The S1200 Reference Manual that was 
originally put on the website was not the most up-to-date version (which had 
Pedigree_ID removed), but we have now fixed that.


We are working on getting a "FamilyID" measure back into the database and 
Reference Manual next week; this number will list all MotherIDs and FatherIDs 
associated with subjects who are siblings, so that families with half-siblings 
will be easily identified.


MotherIDs and FatherIDs are all biological parents, either confirmed by genetics 
if we have genotyping for a subject and their siblings, or self-reported 
biological parents if not. FamilyID is constructed entirely from this 
biological-parent information. Whether a subject has genotyping data or not will 
also be released as a "HasGT" variable.


We did not ask subjects about who they grew up with or their household 
environment specifically, so unfortunately there's not a way to look at shared 
environmental influence with the data we have.


Best,

Jenn


Jennifer Elam, Ph.D.
Scientific Outreach, Human Connectome Project
Washington University School of Medicine
Department of Neuroscience, Box 8108
660 South Euclid Avenue
St. Louis, MO 63110
314-362-9387
e...@wustl.edu
www.humanconnectome.org



From: Yann Le Guen 
Sent: Monday, March 6, 2017 6:13:38 AM
To: Elam, Jennifer
Cc: hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Question regarding Twins in HCP

Dear All,

Another follow-up question regarding the family structure of the HCP data. The 
S1200 reference manual mentions (quote below) a Pedigree_ID field, which seems 
to be missing from the current "Restricted Data" csv file; or is it somewhere 
else?

In addition, apart from this field, is there any other field confirming that 
the individuals share the same household environment?

Best regards, Yann

"With the S1200 release, we have now verified the self reports for subjects for 
which we have
GWAS genetic data. We have updated these measures to Twin_Stat (original self 
report),
ZygositySR (original self-reported zygosity), ZygosityGT (twin zygosity verified 
by genotyping),
Pedigree_ID (new, family ID based on original, self-reported Mother_ID and 
Father_ID),
Mother_ID (verified by genotyping, if available), and Father_ID (verified by 
genotyping, if
available). Therefore, for some subjects, Mother_ID or Father_ID has changed 
from what they
were in previous releases based on the genotyping data."

https://www.humanconnectome.org/documentation/S1200/HCP_S1200_Release_Reference_Manual.pdf

2017-03-03 16:37 GMT+01:00 Elam, Jennifer:

Hi Ben,

As described in the S1200 Reference Manual, if the column ZygosityGT is blank 
for a set of twins, we are missing genetic material/GWAS data on one or both of 
the twin pair, so unfortunately we don't have more data to share on the 64 twin 
pairs that do not have that measure.
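For anyone who wants to reproduce those counts themselves, something along these 
lines could work as a rough sketch. The file name RESTRICTED.csv, the "Twin" 
coding of Twin_Stat, and the assumption that the header row and the relevant 
fields contain no quoted commas are placeholders and assumptions of mine, not 
HCP specifications; Twin_Stat and ZygosityGT are the column names mentioned in 
this thread.

# Count twin subjects with and without a ZygosityGT value, then halve to get
# approximate pair counts (CSV name and Twin_Stat coding are assumptions).
awk -F',' '
NR == 1 {
  for (i = 1; i <= NF; i++) { if ($i == "Twin_Stat") t = i; if ($i == "ZygosityGT") z = i }
  next
}
t && $t == "Twin" { if ($z == "") blank++; else withgt++ }
END {
  printf "twin subjects with ZygosityGT: %d (~%d pairs)\n", withgt, withgt / 2
  printf "twin subjects without:         %d (~%d pairs)\n", blank, blank / 2
}
' RESTRICTED.csv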


Best,

Jenn

Jennifer Elam, Ph.D.
Scientific Outreach, Human Connectome Project
Washington University School of Medicine
Department of Neuroscience, Box 8108
660 South Euclid Avenue
St. Louis, MO 63110
314-362-9387
e...@wustl.edu
www.humanconnectome.org



From: Benjamin Risk
Sent: Friday, March 3, 2017 9:16:56 AM
To: Elam, Jennifer
Cc: D. van der Linden; 
hcp-users@humanconnectome.org
Subject: Re: [HCP-Users] Question regarding Twins in HCP

I have a follow-up question on ZygosityGT. In the new data release, there 
appear to be 64 twin pairs without zygosity verified by genotyping data (and 
244 with ZygosityGT). Are there plans to make genotyping-based zygosity 
available for these individuals?

Thank you,
Ben

On Tue, Feb 28, 2017 at 11:56 AM, Elam, Jennifer wrote:

Hi Dimitri,

Since your message was in response to an offline thread, for the benefit of the 
HCP-Users list, let me explain what you are referencing regarding dizygotic 
(DZ) twins that were "mislabeled" as monozygotic (MZ). Zygosity and other 
family structure information currently available in ConnectomeDB (S900 release 
data) is based on self report. In the upcoming 1200 Subjects Release (S1200), 
we will be introducing a new measure, ZygosityGT, which is zygosity verified by 
the genotyping data available.


When we compare self-reported zygosity (the ZygositySR measure in the S1200) of 
the 244 HCP twin pairs that have ZygosityGT, we find that 36 twin pairs 
that self-reported to be DZ were 

Re: [HCP-Users] Language task variables

2017-03-22 Thread Burgess, Gregory
Hi Yann,

Thanks for looking into this. We will take some time to review the data files 
and processing scripts. If there’s need for an update, we’ll announce that to 
this list, and post the information on the known issues wiki page.

Thanks,
--Greg


Greg Burgess, Ph.D.
Staff Scientist, Human Connectome Project
Washington University School of Medicine
Department of Psychiatry
Phone: 314-362-7864
Email: gburg...@wustl.edu

> On Mar 21, 2017, at 10:45 AM, LE GUEN Yann  wrote:
>
> Dear mailing list,
>
>
> Given that I received no answers, I went to check it myself in the unprocessed 
> data. The outcome is that the Math and Story Difficulty Levels are indeed 
> swapped. It probably happened during the parsing of the file 
> {$Subject}_3T_LANGUAGE_run{*}_TAB.txt into the file LANGUAGE_Stats.csv.
>
> How to check it yourself (a scripted sketch of this check appears after the quoted message below):
>
> Go first in
>
> {$Subject}/unprocessed/3T/tfMRI_LANGUAGE_LR/LINKED_DATA/EPRIME/
> or
>  {$Subject}/unprocessed/3T/tfMRI_LANGUAGE_RL/LINKED_DATA/EPRIME/
>
> Open, respectively, the files
> {$Subject}_3T_LANGUAGE_run1_TAB.txt and {$Subject}_3T_LANGUAGE_run2_TAB.txt
>
> Check the columns CurrentMathLevel and CurrentStoryLevel; you will notice 
> that the first one contains numbers between roughly 6 and 12 and the second 
> one between roughly 1 and 5.
>
> Now
> {$Subject}/unprocessed/3T/tfMRI_LANGUAGE_LR/LINKED_DATA/EPRIME/EVs/
> or
>  {$Subject}/unprocessed/3T/tfMRI_LANGUAGE_RL/LINKED_DATA/EPRIME/EVs/
>
> Open the file
> LANGUAGE_Stats.csv
>
> Notice that in this case the Math level corresponds to the Story level and 
> vice versa.
>
> The same problem appears in the HCP-distributed unrestricted data file, in 
> which Language_Task_Math_Avg_Difficulty_Level spans the range of 
> Language_Task_Story_Avg_Difficulty_Level.
>
>
> If I am not mistaken, it would be good to note this in the HCP known 
> issues.
>
>
> Another point that I asked about in a previous email is the Pedigree_ID 
> field, which is not included in the restricted data file even though it is 
> mentioned in the S1200 HCP reference manual. If it could also be included, it 
> would help me a lot.
>
> Best regards, Yann
>
>
> From: LE GUEN Yann
> Sent: Monday, 20 March 2017 10:11
> To: hcp-users@humanconnectome.org
> Subject: [HCP-Users] Language task variables
>
> Dear HCP users,
>
> How can I check whether the task-related scores 
> Language_Task_Story_Avg_Difficulty_Level and 
> Language_Task_Math_Avg_Difficulty_Level have been switched with each other in 
> the HCP unrestricted data? For example, where can I find the definition of 
> the levels? The two variables do not seem to span the same range.
>
> I'm asking this question given the following observations:
>
> Story_Acc has a correlation of 0.69 with Math_Diff_Lvl but only 0.29 with 
> Story_Diff_Lvl.
> Math_Acc has a correlation of 0.94 with Story_Diff_Lvl and only 0.35 with 
> Math_Diff_Lvl.
> Meanwhile, Story_Acc and Math_Acc have a correlation of 0.34, and 
> Story_Diff_Lvl with Math_Diff_Lvl of 0.38.
>
>
> Overall, observing correlations of 0.3-0.4 between all these variables is 
> consistent with the literature on the subject.
>
> However, the correlations between accuracy and difficulty level seem to be 
> swapped between Story and Math.
>
> Could you please kindly advise me if I am mistaken?
>
> Best regards, Yann
>
> ___
> HCP-Users mailing list
> HCP-Users@humanconnectome.org
> http://lists.humanconnectome.org/mailman/listinfo/hcp-users
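
A scripted version of Yann's check, as a sketch: it assumes the *_TAB.txt 
E-Prime exports are tab-delimited with a header row containing CurrentMathLevel 
and CurrentStoryLevel, and the path below is a placeholder for your own subject 
directory.

# Print the observed range of CurrentMathLevel and CurrentStoryLevel
# (tab-delimited header row assumed; adjust the path to your data).
awk -F'\t' '
NR == 1 {
  for (i = 1; i <= NF; i++) { if ($i == "CurrentMathLevel") m = i; if ($i == "CurrentStoryLevel") s = i }
  next
}
m && $m != "" { if (minm == "" || $m + 0 < minm) minm = $m + 0; if ($m + 0 > maxm) maxm = $m + 0 }
s && $s != "" { if (mins == "" || $s + 0 < mins) mins = $s + 0; if ($s + 0 > maxs) maxs = $s + 0 }
END {
  print "CurrentMathLevel range: ", minm, "-", maxm
  print "CurrentStoryLevel range:", mins, "-", maxs
}
' ${Subject}/unprocessed/3T/tfMRI_LANGUAGE_LR/LINKED_DATA/EPRIME/${Subject}_3T_LANGUAGE_run1_TAB.txt

The same kind of check on LANGUAGE_Stats.csv (adjusting the field separator and 
column names to that file's header) should show the swapped ranges Yann 
describes.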




___
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users


Re: [HCP-Users] film_gls surface mode error Newmat: incompatible dimensions

2017-03-22 Thread Blazej Baczkowski
Works!!!

Thanks a lot!

Best, Blazej

On 22 March 2017 at 13:11, Glasser, Matthew  wrote:

> You can use wb_command -metric-dilate first and then reapply the medial
> wall mask on the outputs.
>
> Peace,
>
> Matt.
>
> From: Blazej Baczkowski 
> Date: Wednesday, March 22, 2017 at 4:01 AM
> To: Matt Glasser 
>
> Cc: "hcp-users@humanconnectome.org" 
> Subject: Re: [HCP-Users] film_gls surface mode error Newmat: incompatible
> dimensions
>
> Dear Matt
>
> You were right. The values on the medial wall are all zeros and the time
> series are constant. Do you perhaps have a reliable solution to this?
>
> Many thanks in advance!!
> Best, Blazej
>
>
> On 21 March 2017 at 21:51, Glasser, Matthew  wrote:
>
>> Are there any zeros on the medial wall?  This issue could occur because
>> of timeseries with all zeros, as I recall.
>>
>> Peace,
>>
>> Matt.
>>
>> From: Blazej Baczkowski 
>> Date: Tuesday, March 21, 2017 at 1:22 PM
>> To: Matt Glasser 
>> Cc: "hcp-users@humanconnectome.org" 
>>
>> Subject: Re: [HCP-Users] film_gls surface mode error Newmat:
>> incompatible dimensions
>>
>> Thanks Matt!
>>
>> Yes, the surface and timeseries files display correctly in the workbench.
>>
>> Could you tell me what you mean by "mask the medial wall"? I don't know
>> what I could do here.
>>
>> It appears to me that FILM does not even read the surface file -- I
>> receive the same error even when no input2 ("--in2") is specified, while I
>> believe it should say "file missing".
>>
>> Best, Blazej
>>
>> On 21 March 2017 at 18:03, Glasser, Matthew  wrote:
>>
>>> I wouldn’t recommend smoothing with the inflated surface, but rather the
>>> midthickness surface (which you can create by averaging the white and pial
>>> surface coordinates).  As for your command, it looks correct to me.  Do the
>>> surface and timeseries files display correctly in Connectome Workbench?  Is
>>> the medial wall masked?
>>>
>>> Peace,
>>>
>>> Matt.
>>>
>>> From: Blazej Baczkowski 
>>> Date: Tuesday, March 21, 2017 at 9:00 AM
>>> To: Matt Glasser 
>>> Cc: "HCP-Users@humanconnectome.org" 
>>> Subject: Re: [HCP-Users] film_gls surface mode error Newmat:
>>> incompatible dimensions
>>>
>>> Hi Matt,
>>>
>>> Here are two commands that produce the error:
>>>
>>> 1) film_gls --in=test.lh.func.gii --in2=lh.inflated.surf.gii
>>> --mode=surface --rn=SURF_STATS --con=design.con --pd=design.mat --ar
>>>
>>> 2) film_gls --in=test.lh.func.gii --in2=lh.inflated.surf.gii
>>> --mode=surface --rn=SURF_STATS --con=design.con --pd=design.mat --thr=0.1
>>> --ms=15 --epith=5 --sa
>>>
>>> While this one runs smoothly:
>>>
>>> film_gls --in=test.lh.func.gii --in2=lh.inflated.surf.gii --mode=surface
>>> --rn=SURF_STATS --con=design.con --pd=design.mat --noest
>>>
>>> Thanks!!
>>> Blazej
>>>
>>> On 21 March 2017 at 14:50, Glasser, Matthew  wrote:
>>>
 Please post your film_gls command line.

 Peace,

 Matt.

 From:  on behalf of Blazej
 Baczkowski 
 Date: Tuesday, March 21, 2017 at 8:33 AM
 To: "HCP-Users@humanconnectome.org" 
 Subject: [HCP-Users] film_gls surface mode error Newmat: incompatible
 dimensions

 Hello everyone,

 I encounter a problem with FSL's "film_gls" (v5.0.9) when performing
 first level GLM analysis in surface mode. Data were projected to fsaverage5
 space (10242 vertices). It appears that the problem occurs when FILM
 estimates the autocorrelation, because it works perfectly fine with
 "--noest" option.

 Here is the respective error message:

 *paradigm.getDesignMatrix().Nrows()=413*
 *paradigm.getDesignMatrix().Ncols()=6*
 *sizeTS=413*
 *numTS=10242*
 *Calculating residuals...*
 *Completed*
 *Estimating residual autocorrelation...*
 *Fitting autoregressive model...*


 *An exception has been thrown*
 *Logic error:- detected by Newmat: incompatible dimensions*

 *Trace: SubMatrix(=); AutoCorrEstimator::fitAutoRegressiveModel.*

 My data has 413 timepoints and the design matrix consists of 6 columns
 (enclosed). The volumetric analysis with the same design matrix and
 autocorrelation estimation does not produce any error. The design matrix
 was created with FSL FEAT/Glm Setup.

 Many thanks in advance for any help.

 Best regards, Blazej


 ___
 HCP-Users mailing list
 HCP-Users@humanconnectome.org
 http://lists.humanconnectome.org/mailman/listinfo/hcp-users



Re: [HCP-Users] film_gls surface mode error Newmat: incompatible dimensions

2017-03-22 Thread Glasser, Matthew
You can use wb_command -metric-dilate first and then reapply the medial wall 
mask on the outputs.
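
For readers following along, a minimal sketch of that sequence, assuming a 
left-hemisphere GIFTI timeseries and a cortex ROI metric that is 1 on cortex and 
0 on the medial wall; all file names, the 10 mm distance, and the film_gls 
output name are placeholders based on the commands earlier in the thread, not 
prescribed values.

# 1) Dilate the timeseries so all-zero medial-wall vertices get filled before
#    FILM's autocorrelation fit (zero values are treated as "bad" by default;
#    the 10 mm distance and -nearest are illustrative choices).
wb_command -metric-dilate test.lh.func.gii lh.midthickness.surf.gii 10 test.lh.dil.func.gii -nearest

# 2) Run film_gls on the dilated data (same options as before, input swapped).
film_gls --in=test.lh.dil.func.gii --in2=lh.midthickness.surf.gii --mode=surface --rn=SURF_STATS --con=design.con --pd=design.mat --ar

# 3) Reapply the medial-wall mask to each output metric of interest
#    (lh.cortex.roi.func.gii and the pe1 output name are hypothetical).
wb_command -metric-mask SURF_STATS/pe1.func.gii lh.cortex.roi.func.gii SURF_STATS/pe1_masked.func.gii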

Peace,

Matt.

From: Blazej Baczkowski
Date: Wednesday, March 22, 2017 at 4:01 AM
To: Matt Glasser
Cc: "hcp-users@humanconnectome.org"
Subject: Re: [HCP-Users] film_gls surface mode error Newmat: incompatible dimensions

Dear Matt

You were right. The values on the medial wall are all zeros and the time series 
are constant. Do you perhaps have a reliable solution to this?

Many thanks in advance!!
Best, Blazej


On 21 March 2017 at 21:51, Glasser, Matthew wrote:
Are there any zeros on the medial wall?  This issue could occur because of 
timeseries with all zeros, as I recall.
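
One quick way to check for such vertices, as a sketch (file names are 
placeholders; wb_command -metric-reduce collapses the timeseries across columns):

# Per-vertex standard deviation of the timeseries; constant vertices
# (e.g. all zeros on the medial wall) come out as 0 and can be inspected in wb_view.
wb_command -metric-reduce test.lh.func.gii STDEV test.lh.stdev.func.gii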

Peace,

Matt.

From: Blazej Baczkowski
Date: Tuesday, March 21, 2017 at 1:22 PM
To: Matt Glasser
Cc: "hcp-users@humanconnectome.org"
Subject: Re: [HCP-Users] film_gls surface mode error Newmat: incompatible dimensions

Thanks Matt!

Yes, the surface and timeseries files display correctly in the workbench.

Could you tell me what you mean by "mask the medial wall"? I don't know what I 
could do here.

It appears to me that FILM does not even read the surface file -- I receive the 
same error even when no input2 ("--in2") is specified, while I believe it 
should say "file missing".

Best, Blazej

On 21 March 2017 at 18:03, Glasser, Matthew wrote:
I wouldn’t recommend smoothing with the inflated surface, but rather the 
midthickness surface (which you can create by averaging the white and pial 
surface coordinates).  As for your command, it looks correct to me.  Do the 
surface and timeseries files display correctly in Connectome Workbench?  Is the 
medial wall masked?
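
A one-line sketch of that averaging step with Connectome Workbench (the 
FreeSurfer-style GIFTI file names are placeholders):

# Average white and pial vertex coordinates into a midthickness surface.
wb_command -surface-average lh.midthickness.surf.gii -surf lh.white.surf.gii -surf lh.pial.surf.gii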

Peace,

Matt.

From: Blazej Baczkowski
Date: Tuesday, March 21, 2017 at 9:00 AM
To: Matt Glasser
Cc: "HCP-Users@humanconnectome.org"
Subject: Re: [HCP-Users] film_gls surface mode error Newmat: incompatible dimensions

Hi Matt,

Here are two commands that produce the error:

1) film_gls --in=test.lh.func.gii --in2=lh.inflated.surf.gii --mode=surface 
--rn=SURF_STATS --con=design.con --pd=design.mat --ar

2) film_gls --in=test.lh.func.gii --in2=lh.inflated.surf.gii --mode=surface 
--rn=SURF_STATS --con=design.con --pd=design.mat --thr=0.1 --ms=15 --epith=5 
--sa

While this one runs smoothly:

film_gls --in=test.lh.func.gii --in2=lh.inflated.surf.gii --mode=surface 
--rn=SURF_STATS --con=design.con --pd=design.mat --noest

Thanks!!
Blazej

On 21 March 2017 at 14:50, Glasser, Matthew wrote:
Please post your film_gls command line.

Peace,

Matt.

From:  on behalf of Blazej Baczkowski
Date: Tuesday, March 21, 2017 at 8:33 AM
To: "HCP-Users@humanconnectome.org"
Subject: [HCP-Users] film_gls surface mode error Newmat: incompatible dimensions

Hello everyone,

I encounter a problem with FSL's "film_gls" (v5.0.9) when performing first-level 
GLM analysis in surface mode. Data were projected to fsaverage5 space 
(10242 vertices). It appears that the problem occurs when FILM estimates the 
autocorrelation, because it works perfectly fine with the "--noest" option.

Here is the respective error message:

paradigm.getDesignMatrix().Nrows()=413
paradigm.getDesignMatrix().Ncols()=6
sizeTS=413
numTS=10242
Calculating residuals...
Completed
Estimating residual autocorrelation...
Fitting autoregressive model...


An exception has been thrown
Logic error:- detected by Newmat: incompatible dimensions

Trace: SubMatrix(=); AutoCorrEstimator::fitAutoRegressiveModel.

My data has 413 timepoints and the design matrix consists of 6 columns 
(enclosed). The volumetric analysis with the same design matrix and 
autocorrelation estimation does not produce any error. The design matrix was 
created with FSL FEAT/Glm Setup.

Many thanks in advance for any help.

Best regards, Blazej



___
HCP-Users mailing list
HCP-Users@humanconnectome.org

Re: [HCP-Users] film_gls surface mode error Newmat: incompatible dimensions

2017-03-22 Thread Blazej Baczkowski
Dear Matt

You were right. The values on the medial wall are all zeros and the time
series are constant. Do you perhaps have a reliable solution to this?

Many thanks in advance!!
Best, Blazej


On 21 March 2017 at 21:51, Glasser, Matthew  wrote:

> Are there any zeros on the medial wall?  This issue could occur because of
> timeseries with all zeros, as I recall.
>
> Peace,
>
> Matt.
>
> From: Blazej Baczkowski 
> Date: Tuesday, March 21, 2017 at 1:22 PM
> To: Matt Glasser 
> Cc: "hcp-users@humanconnectome.org" 
>
> Subject: Re: [HCP-Users] film_gls surface mode error Newmat: incompatible
> dimensions
>
> Thanks Matt!
>
> Yes, the surface and timeseries files display correctly in the workbench.
>
> Could you tell me what you mean by "mask the medial wall"? I don't know
> what I could do here.
>
> It appears to me that FILM does not even read the surface file -- I
> receive the same error even when no input2 ("--in2") is specified, while I
> believe it should say "file missing".
>
> Best, Blazej
>
> On 21 March 2017 at 18:03, Glasser, Matthew  wrote:
>
>> I wouldn’t recommend smoothing with the inflated surface, but rather the
>> midthickness surface (which you can create by averaging the white and pial
>> surface coordinates).  As for your command, it looks correct to me.  Do the
>> surface and timeseries files display correctly in Connectome Workbench?  Is
>> the medial wall masked?
>>
>> Peace,
>>
>> Matt.
>>
>> From: Blazej Baczkowski 
>> Date: Tuesday, March 21, 2017 at 9:00 AM
>> To: Matt Glasser 
>> Cc: "HCP-Users@humanconnectome.org" 
>> Subject: Re: [HCP-Users] film_gls surface mode error Newmat:
>> incompatible dimensions
>>
>> Hi Matt,
>>
>> Here are two commands that produce the error:
>>
>> 1) film_gls --in=test.lh.func.gii --in2=lh.inflated.surf.gii
>> --mode=surface --rn=SURF_STATS --con=design.con --pd=design.mat --ar
>>
>> 2) film_gls --in=test.lh.func.gii --in2=lh.inflated.surf.gii
>> --mode=surface --rn=SURF_STATS --con=design.con --pd=design.mat --thr=0.1
>> --ms=15 --epith=5 --sa
>>
>> While this one runs smoothly:
>>
>> film_gls --in=test.lh.func.gii --in2=lh.inflated.surf.gii --mode=surface
>> --rn=SURF_STATS --con=design.con --pd=design.mat --noest
>>
>> Thanks!!
>> Blazej
>>
>> On 21 March 2017 at 14:50, Glasser, Matthew  wrote:
>>
>>> Please post your film_gls command line.
>>>
>>> Peace,
>>>
>>> Matt.
>>>
>>> From:  on behalf of Blazej
>>> Baczkowski 
>>> Date: Tuesday, March 21, 2017 at 8:33 AM
>>> To: "HCP-Users@humanconnectome.org" 
>>> Subject: [HCP-Users] film_gls surface mode error Newmat: incompatible
>>> dimensions
>>>
>>> Hello everyone,
>>>
>>> I encounter a problem with FSL's "film_gls" (v5.0.9) when performing
>>> first level GLM analysis in surface mode. Data were projected to fsaverage5
>>> space (10242 vertices). It appears that the problem occurs when FILM
>>> estimates the autocorrelation, because it works perfectly fine with
>>> "--noest" option.
>>>
>>> Here is the respective error message:
>>>
>>> *paradigm.getDesignMatrix().Nrows()=413*
>>> *paradigm.getDesignMatrix().Ncols()=6*
>>> *sizeTS=413*
>>> *numTS=10242*
>>> *Calculating residuals...*
>>> *Completed*
>>> *Estimating residual autocorrelation...*
>>> *Fitting autoregressive model...*
>>>
>>>
>>> *An exception has been thrown*
>>> *Logic error:- detected by Newmat: incompatible dimensions*
>>>
>>> *Trace: SubMatrix(=); AutoCorrEstimator::fitAutoRegressiveModel.*
>>>
>>> My data has 413 timepoints and the design matrix consists of 6 columns
>>> (enclosed). The volumetric analysis with the same design matrix and
>>> autocorrelation estimation does not produce any error. The design matrix
>>> was created with FSL FEAT/Glm Setup.
>>>
>>> Many thanks in advance for any help.
>>>
>>> Best regards, Blazej
>>>
>>>
>>> ___
>>> HCP-Users mailing list
>>> HCP-Users@humanconnectome.org
>>> http://lists.humanconnectome.org/mailman/listinfo/hcp-users