I would use the Conte69 average fiducial for TFCE/cluster area computation 
purposes.  Think of it as a neutral, unbiased atlas surface.

The topology files only define neighbor relationships, so on a standard mesh, 
the same topo will work with a variety of configurations that are on that mesh. 
 The ones I gave you should be fine.

One thing I don't recall talking about is generating composite files of the 
depth metric/shape files.  (Metric and shape are identical in data format.  
Metric was intended more for overlay/functional, while surface_shape is 
intended more for anatomical measures like depth, curvature, thickness, etc.  
But the metric menu has more features than the surface_shape menu, so I 
sometimes purposely use metric.  For this purpose, either is fine.)

The ANOVA test wants a composite file for each treatment/group (maybe that is 
what you meant by factor level).  So at some point you need to generate 
composite files, concatenating your subjects into one composite per group (see 
the rough sketch after the link below).

http://brainmap.wustl.edu/pub/donna/FREESURFER/SCRIPTS/2009_10/SCRIPTS/gen_composite_filcav.sh
login pub
password download
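
Roughly, the composite step looks like this (just a sketch -- the group and 
subject file names here are made up, and the sample script above shows the 
exact options it uses):

    # concatenate the per-subject shape files into one composite per group
    caret_command -metric-composite \
        GROUP1.LH.composite.surface_shape \
        subj01.LH.surface_shape subj02.LH.surface_shape subj03.LH.surface_shape

If I remember right, the output file name comes first and the input files 
follow, but verify the argument order against the sample script or the 
caret_command help before running it.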

In that example, Depth was the second of multiple columns per subject.  I don't 
recall what it is for the fs_LR stream.  But if you run caret_command 
-metric-information on one of your surface_shape files, you'll find out which 
column has Depth.
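
For example (substitute one of your own shape files for the made-up name 
below):

    caret_command -metric-information subj01.LH.164k_fs_LR.surface_shape

The listing shows each column's number and name, so you can spot the Depth 
column.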


On Nov 21, 2013, at 4:58 PM, Eshita Shah <[email protected]> wrote:

> Hi Donna, 
> 
> Thank you for the files. I seem to be understanding so far what the sample 
> script is doing but I do have a few questions. For the data file input into 
> ANOVA using caret_stats, I notice it's in a different format than in 
> caret_command ANOVA. I just want to clarify that each data file is still a 
> metric file that contains all of the subjects for one factor level. Secondly, 
> I realize I am using the fs_LR average open topo files you provided earlier, 
> but for the average fiducial coordinate file, should I also be using the 
> Conte69 average? I know you pointed out that my data is less comparable to 
> the fs_LR standard mesh data, so I am curious as to whether I should just 
> generate my own average fiducial file and use that instead. 
> 
> Let me know if I'm heading the right way. 
> 
> Thanks for all your help, 
> Eshita 
> 
> 
> 
> On Tue, Nov 19, 2013 at 2:35 PM, Donna Dierker <[email protected]> 
> wrote:
> Here are the caret_stats and jre zip files:
> 
> http://brainvis.wustl.edu/pub/donna/SCRIPTS/caret6.zip
> http://brainvis.wustl.edu/pub/donna/SCRIPTS/linux_java.zip
> login pub
> password download
> 
> A sample script that calls TFCE is here:
> 
> http://brainmap.wustl.edu/pub/donna/SCRIPTS/SHAPE/depth_anova.LH.CHILD.sh
> 
> I am sure you will have more questions.  You can try searching caret-users 
> for TFCE, but if you get stuck, I'll be here in the morning.
> 
> 
> 
> On 11/19/2013 02:56 PM, Eshita Shah wrote:
>> Hi Donna, 
>> 
>> I would definitely be interested in using the TFCE method. Where can I 
>> download caret_stats and the JRE? 
>> 
>> Thank you, 
>> Eshita 
>> 
>> 
>> On Tue, Nov 19, 2013 at 11:40 AM, Donna Dierker <[email protected]> 
>> wrote:
>> Hi Eshita,
>> 
>> You don't need to create an average topo of your subjects, because your data 
>> is on the 164k fs_LR standard mesh, so the open topology files in the link I 
>> provided below are all you will need to define the neighbor relationships 
>> between the vertices.
>> 
>> You do need to make a decision or two, though:  The caret_command 
>> -metric-anova-one-way feature is a valid test, but it requires a 
>> cluster-forming threshold (e.g., whatever f-stat corresponds to p=.01 or 
>> p=.025/hem).  It can make a big difference which cluster-forming threshold 
>> you use, as is described here:
>> 
>> http://www.jneurosci.org/content/suppl/2010/02/12/30.6.2268.DC1/Supplemental_Material.pdf
>> page 6 and supplementary material figure 7
>> 
>> Instead, we now use Threshold-Free Cluster Enhancement (TFCE), which 
>> essentially integrates over the whole range of f-stats:
>> 
>> http://brainvis.wustl.edu/wiki/index.php/Caret:Documentation:Statistics:TFCE_Implementation
>> 
>> Smith SM, Nichols TE., "Threshold-free cluster enhancement: addressing 
>> problems of smoothing, threshold dependence and localisation in cluster 
>> inference." Neuroimage. 2009 Jan 1;44(1):83-98. PMID: 18501637 
>> http://www.ncbi.nlm.nih.gov/pubmed/18501637
>> 
>> Using TFCE requires downloading caret_stats and the java runtime engine 
>> (JRE) that has been shown to work well with it.  (Some JREs hang or get 
>> bogged down.)
>> 
>> These features aren't documented in tutorials, but at least two others have 
>> managed to get it to work.
>> 
>> If you're fine with the caret_command feature, you should be good to go.
>> 
>> Donna
>> 
>> 
>> On Nov 19, 2013, at 12:26 PM, Eshita Shah <[email protected]> wrote:
>> 
>> > Hi Donna,
>> >
>> > The above script was helpful, thanks. My main concern now is to run the 
>> > ANOVA test (using caret_command -metric-anova-one-way). You stated earlier 
>> > that I don't need to worry about the open topo file, but to input into 
>> > ANOVA should I be creating an average topo file of all my subjects?
>> >
>> > Please let me know. Thank you for your patience and help.
>> >
>> > Eshita
>> >
>> >
>> > On Fri, Nov 15, 2013 at 3:50 PM, Donna Dierker 
>> > <[email protected]> wrote:
>> > Scroll down eshita
>> >
>> > From: [email protected] <[email protected]>;
>> > To: <[email protected]>;
>> > Subject: Failure Notice
>> > Sent: Fri, Nov 15, 2013 10:41:08 PM
>> >
>> > Sorry, we were unable to deliver your message to the following address.
>> >
>> > <[email protected]>:
>> > No MX or A records for brainvis.wustl.edu
>> >
>> > --- Below this line is a copy of the message.
>> >
>> > Date: Fri, 15 Nov 2013 15:15:15 -0600
>> > From: Donna Dierker <[email protected]>
>> > To: "Caret, SureFit, and SuMS software users" <[email protected]>
>> > Subject: Re: [caret-users] Different Node Numbers
>> >
>> > What you're doing seems reasonable.  If this is all you're trying to do,
>> > you don't need to worry about the medial wall borders, because the open
>> > topo file will exclude the medial wall vertices.
>> >
>> > The preborder.sh script only works with the PALS 74k mesh.  This one
>> > might be more helpful:
>> >
>> > http://brainvis.wustl.edu/pub/donna/FREESURFER/SCRIPTS/2013_11/gen_depth.sh
>> > login pub
>> > password download
>> >
>> > Replace the subject directories named like SAIS* with your own list.
>> >
>> >
>> > On 11/15/2013 02:28 PM, Eshita Shah wrote:
>> > > Hi Donna,
>> > >
>> > > Thank you, that clears up a lot. Just to clarify, my main purpose is
>> > > to use Caret to generate sulcal depth maps for each subject and run
>> > > statistical analysis (such as ANOVA) to compare the results of two
>> > > groups. I have completed importing all my data to the fs_LR standard
>> > > mesh, and now I have to generate sulcal depth maps and run analyses.
>> > > To do this, I am planning on running the "generate depth" function
>> > > from PALS-B12/preborder.sh. Will this generate the appropriate input
>> > > parameters for running the one-way ANOVA test through caret_command?
>> > >
>> > > Here is a list of what's needed:
>> > >
>> > >  METRIC STATISTICS ONE-WAY ANOVA
>> > >      caret_command -metric-statistics-anova-one-way
>> > >          <fiducial-coord-file>
>> > >          <open-topo-file>
>> > >          <distortion-metric-shape-file>
>> > >          <distortion-column-number>
>> > >          <output-file-names-prefix>
>> > >          ...
>> > >          <metric-file-names>
>> > >
>> > > After the import of the data into fs_LR mesh, I have multiple coord
>> > > files, shape files, and closed topo files. I do not have specific
>> > > paint or border files. So I am not exactly sure how to exclude the
>> > > paint files that label the medial wall from my analysis, like you
>> > > stated previously. I found a command in postborder.sh that converts
>> > > the closed topo file into an open topo file using the roi file that's
>> > > generated for the medial wall. Any guidance on doing that would be
>> > > appreciated.
>> > >
>> > > Please let me know if I'm going on the right track to generate the
>> > > sulcal depth maps and run my analyses.
>> > >
>> > > Thank you,
>> > > Eshita
>> > >
>> > >
>> > > On Nov 15, 2013, at 10:07 AM, Donna Dierker <[email protected]> wrote:
>> > >
>> > > >Hi Eshita,
>> > > >
>> > > >topi = topology, and I hate auto-correct
>> > > >You mean the surface_shape/metric files -- not the topology, right?
>> > > >
>> > > >When all your data is on a standard mesh, as Caret requires to do group
>> > > >analysis (except for summary stats that are not vertex-wise, e.g., 
>> > > >gyrification
>> > > >index), then typically the stats tests use a single mean midthickness 
>> > > >and open
>> > > >topology for computing the areas that are used for the TFCE/cluster
>> > > >distributions.
>> > > >
>> > > >You typically don't need to generate an open topo file for each subject 
>> > > >on the
>> > > >standard mesh.  You have a single paint/label file that labels the 
>> > > >medial wall
>> > > >vertices, so you can exclude them from your analysis.  Or in this case, 
>> > > >you
>> > > >have an open topology file that will accomplish the same thing.  If 
>> > > >they're all
>> > > >on standard mesh, then your standard open topi file should look 
>> > > >reasonable with
>> > > >all your standard mesh midthickness surfaces.
>> > > >
>> > > >An example of a time when you would need an open topi is when you want 
>> > > >the
>> > > >gyrification index computed on the native mesh.  Then you can use a 
>> > > >strategy
>> > > >like the PALS_B12.LR/postborder.sh
>> > > >(http://brainmap.wustl.edu/pub/donna/FREESURFER/SCRIPTS/2011_10/PALS_B12.LR/postborder.sh
>> > > > ; login pub, password download) uses in the freesurfer to PALS 
>> > > > pipeline, which
>> > > >essentially writes a border around the standard mesh medial wall; 
>> > > >"unprojects"
>> > > >the border on the standard mesh midthickness; and then projects it on 
>> > > >the
>> > > >native mid thickness.  When you do that, you need to project to/from the
>> > > >spherical or ellipsoid, so that your medial wall border points don't 
>> > > >get hosed.
>> > > >
>> > > >You can use your own study-specific mean midthickness, but then you 
>> > > >should
>> > > >compute your own mean distortion metric to go with it.  Besides being a 
>> > > >bit of
>> > > >a hassle, it strikes me that your results are slightly less comparable 
>> > > >to
>> > > >others computed on the fs_LR standard mesh that used the more standard 
>> > > >Conte69
>> > > >mean mid thickness/distortion.  I don't use study-specific files for 
>> > > >this
>> > > >purpose, unless the populations are so different (e.g., baby vs adult) 
>> > > >that you
>> > > >can't use the adult.
>> > > >
>> > > >Visualization is another story.  I almost always show the 
>> > > >study-specific mean
>> > > >midthickness for each group in morphometry studies.
>> > > >
>> > > >Donna
>> > >
>> > >
>> > >
>> > >
>> > > On Thu, Nov 14, 2013 at 1:38 PM, Eshita Shah <[email protected]> wrote:
>> > >
>> > >    Hi Donna,
>> > >
>> > >
>> > >
>> > >    Thank you for your help. From what I understand, the topology files 
>> > > from each subject have to be compiled (using caret_command 
>> > > -metric-composite) into one, and that file is then entered into the 
>> > > ANOVA test. My question however, is regarding the generation of the open 
>> > > topo files for each subject. I have used freesurfer_to_fs_LR pipeline, 
>> > > and I'm wondering if the topo files generated (there is only one for 
>> > > each subject) are indeed the open topo file, because looking at the 
>> > > script for freesurfer_to_fs_LR it looks like it is the closed topo file. 
>> > > Any advice on how to obtain an open topo file?
>> > >
>> > >
>> > >    Also, to clarify, the first parameter in the ANOVA test is a fiducial 
>> > > coord file. I have used caret_command -surface-average to generate an 
>> > > average fiducial file for each hemisphere. Is this correct?
>> > >
>> > >
>> > >    Please let me know.
>> > >
>> > >
>> > >
>> > >
>> > >    Thanks,
>> > >
>> > >    Eshita
>> > >
>> > >    On Wed, Nov 13, 2013 at 10:50 PM, Donna Dierker wrote:
>> > >
>> > >    >Hi Eshita,
>> > >
>> > >    >
>> > >    >I always use the open topology for this purpose (i.e., excludes only 
>> > > medial
>> > >    >wall vertices).  The files here will be helpful:
>> > >
>> > >    >http://brainmap.wustl.edu/pub/donna/FREESURFER/SCRIPTS/2013_11/TFCE_164k/
>> > >    >login pub
>> > >    >password download
>> > >    >
>> > >    >You can get them all in this zip file:
>> > >    >
>> > >    
>> > > >http://brainmap.wustl.edu/pub/donna/FREESURFER/SCRIPTS/2013_11/TFCE_164k.zip
>> > >    >
>> > >    >Just to explain what is going on, the areas in the TFCE/cluster 
>> > > computations
>> > >    >are computed on the Conte69 mean mid thickness, with the open topo 
>> > > file
>> > >    >(excluding medial wall).  The distortion maps pump up the areal 
>> > > value where
>> > >    >substantial smoothing occurs as a result of averaging individuals' 
>> > > coordinate
>> > >    >files (e.g., high 3D variability).  The intent is to make the areas 
>> > > more like
>> > >    >an individual's area would be in that region.  For folks not attuned 
>> > > to what
>> > >    >you are doing, this is during group analysis.
>> > >
>> > >    >
>> > >    >Cheers,
>> > >    >
>> > >    >Donna
>> > >
>> > >    >
>> > >    >On Tue, Nov 12, 2013 at 1:18 PM, Eshita Shah <[email protected]> wrote:
>> > >
>> > >        Hello,
>> > >
>> > >        Thanks for your input. I successfully was able to use
>> > >        freesurfer_to_fs_LR Pipeline to import my FreeSurfer files
>> > >        into caret, however when I try running ANOVA, it asks for
>> > >        certain files that have not been generated by the pipeline.
>> > >        Specifically, how do I generate the
>> > >        "distortion-metric-shape-file" that is being asked for?
>> > >        Lastly, is the .topo file that is generated via the pipeline
>> > >        the open topo file or closed? Previously I was able to generate
>> > >        the closed topo file, so I'm not sure if the
>> > >        freesurfer_to_fs_LR pipeline does the same. The parameter
>> > >        required for the ANOVA analysis is the open topo file.
>> > >
>> > >        Thank you,
>> > >        Eshita Shah
>> > >
>> > >
>> > >        On Thu, Nov 7, 2013 at 4:12 PM, Rouhollah Abdollahi <[email protected]> wrote:
>> > >
>> > >            Hi
>> > >            Actually the code imports the original data from freesurfer
>> > >            to caret, so you will automatically have different node
>> > >            numbers for different subjects and hemispheres. To get the
>> > >            same mesh you can use the Freesurfer_to_fs_LR Pipeline, which
>> > >            is available on the caret website. It imports all the data
>> > >            to the same mesh, which here is the fs_LR mesh.
>> > >            Hope it helps
>> > >            Best
>> > >            Rouhi
>> > >
>> > >            On Nov 8, 2013 12:06 AM, "Eshita Shah" <[email protected]> wrote:
>> > >
>> > >                Hello,
>> > >
>> > >                I have just recently started using Caret, and I am
>> > >                running the freesurfer2caret.sh script in order to
>> > >                import my FreeSurfer files into Caret as well as
>> > >                generate sulcal depth for all subjects. I tried doing
>> > >                a One-Way ANOVA test, but I've realized that the
>> > >                number of nodes in the metric/surface_shape files for
>> > >                the two subjects are different. How is it possible
>> > >                that the same script is creating files with different
>> > >                node numbers? Also, within each subject, sometimes the
>> > >                node numbers for the left and right hemisphere are
>> > >                different as well. How can I resolve this issue so I
>> > >                can successfully run ANOVA on my subjects?
>> > >
>> > >                Any help would be appreciated.
>> > >
>> > >                Thank you,
>> > >                Eshita Shah
>> > >
>> > >
>> > >
>> > >
>> > >
>> > >
>> > > --
>> > > Eshita Shah
>> > > University of California, Los Angeles | 2014
>> > > B.S. Neuroscience
>> > > [email protected] <mailto:[email protected]>
>> > >
>> > >
>> > >
>> > >
>> 
>> 
> 
> 


_______________________________________________
caret-users mailing list
[email protected]
http://brainvis.wustl.edu/mailman/listinfo/caret-users
