No, that is not what I meant.

I meant that, based on the filename, I would have expected that only
clusters that DID survive multiple comparisons correction (based on the
supra-threshold cluster test described in Nichols & Holmes 2002, adapted
to the surface) would be non-zero in your *Clusters*metric; however, your
observation that more than one cluster was non-zero in that file is
inconsistent with that expectation.  The significance report (.txt file)
listed only one significant cluster.

So I would not trust the *Clusters*metric to contain corrected clusters
unless:

* Your significance report was in error (e.g., it was from a different test).
* You had some other metric column showing as an underlay, thereby
erroneously leading you to believe *Clusters*metric had more than one
non-zero cluster.

If your goal is to display only the significant cluster, this can be done
manually by loading the unthresholded tmap (i.e., the metric w/o clusters
in the name), using Surface: ROI to select the nodes below -2.66,
disconnecting islands, inverting the selection, and setting the metric to
zero.

> Hi Donna,
>
> Thanks for your quick reply. So that means that the clusters I see when I
> load T-Map*TMapClusters.metric are all clusters in the real T-Map file
> that exceed my T-threshold, which means that their significance is not
> corrected for multiple comparisons, right?
>
> Thanks again,
>
> Julia
>
>> Hi Julia,
>>
>> Again, it has been years since I have used the cluster-based test,
>> having switched to TFCE.  I'd probably have to dig around to find an old
>> dataset like that.
>>
>> But from what I remember, there was a tmap*metric that had a column with
>> the unthresholded t-map, but then there was also a tmap*Clusters*metric
>> that zeroed out everything that wasn't in a significant cluster.  Based
>> on your filename, I'd guess yours was the
>> latter type, so it should have everything NOT in a significant cluster
>> zeroed out.  If not, I'd wonder whether you have some underlay or
>> secondary overlay that is set to a different column/metric.
>>
>> At any rate, worst case, you can use Surface: ROI to threshold the
>> metric at -2.66 and select only nodes connected to the currently
>> selected node (where you ID a node inside the significant cluster).
>>
>> Donna
>>
>> On 12/01/2010 01:07 PM, Julia Bender wrote:
>>> Hi Donna,
>>>
>>> Thanks for updating the code. I've rerun my stats with different
>>> thresholds (T=2.66, p=.05). My *_SignificantClusters.txt file tells
>>> me that there is only one significant cluster:
>>>
>>> TMap
>>> ----
>>> Column    Thresh  Num-Nodes          Area  Area-Corrected     COG-X     COG-Y     COG-Z   P-Value
>>>      1    -2.660       4203   2775.767090     2827.737549    40.409   -12.974   -27.389  0.002000
>>>
>>> But when I load the T-Map*TMapClusters.metric file onto my surface I
>>> can see several clusters. Are those the uncorrected significant
>>> clusters?
>>> TMap
>>> ----
>>> Column    Thresh  Num-Nodes          Area  Area-Corrected     COG-X     COG-Y     COG-Z   P-Value
>>>      1    -2.660       4203   2775.767090     2827.737549    40.409   -12.974   -27.389  0.002000
>>>      1     2.660        435    219.168182      220.613358    28.160   -52.169    56.452  0.258000
>>>      1     2.660        184    153.448578      158.166473    44.849   -24.530    49.967  0.365000
>>>      1     2.660        229    153.459183      149.640244    35.705   -20.871    50.301  0.394000
>>>      1     2.660        153    106.938622      107.896500    45.769    -0.412    39.610  0.541000
>>>      1     2.660        142     90.582855       88.234703    27.666   -23.322   -13.406  0.641000
>>>      1     2.660        143     81.723625       80.440567    49.349   -24.353    -2.452  0.683000
>>>      1     2.660        181     67.850945       68.252045    19.675   -10.697    58.506  0.755000
>>>      1     2.660        151     70.042465       67.372482    38.421   -37.124    46.450  0.757000
>>>      1    -2.660         48     58.272926       59.562595     7.732    17.195   -16.283  0.804000
>>>      1     2.660         59     57.713173       56.583248    38.887    13.286    33.244  0.821000
>>>      1    -2.660         43     40.167896       41.627560    29.441   -80.470   -15.188  0.898000
>>>      1     2.660         70     31.806927       31.897322    21.620   -64.665    49.982  0.937000
>>>      1     2.660         57     29.297220       30.987967     5.613    -8.313    65.568  0.941000
>>>      1     2.660         43     27.218239       26.574488    16.105    30.667    49.513  0.960000
>>>      1     2.660         31     14.005635       13.789747    37.211    -5.401    -5.390  0.993000
>>>      1     2.660         21     12.719942       12.593773    50.215   -25.851    41.114  0.994000
>>>      1     2.660         13     10.736663       10.448134    23.142    39.879    31.981  0.997000
>>>      1     2.660          9      5.134305        5.243526    17.070   -60.562    62.295  0.999000
>>>      1     2.660          2      1.532340        1.510908    23.168   -34.540    -5.696  0.998000
>>>
>>> It looks like it.
>>>
>>> Is there a way to display only the corrected significant clusters?
>>>
>>> Thanks a lot for your help,
>>>
>>> Julia
>>>
>>>
>>>> All of the statistical tests have been corrected so they should now
>>>> produce the correct P-Value.
>>>>
>>>> An updated caret distribution, v5.616, is now available for download
>>>> from
>>>> http://brainvis.wustl.edu/wiki/index.php/Caret:Download
>>>> (username=Rabbit
>>>>   password=Carrot).
>>>>
>>>> The effect is generally very minimal (2/iterations -- .0004 using our
>>>> typical 5k).
>>>>
>>>> On 11/08/2010 09:18 AM, Donna Dierker wrote:
>>>>
>>>>> Julia,
>>>>>
>>>>> I see what you mean.  Based on the report you uploaded, the p-values
>>>>> listed here seem off by .002:
>>>>>
>>>>>     1     2.500       1712   1229.060425     1242.687012   -27.858
>>>>> -75.308   -13.617  0.036000
>>>>>     1     2.500       1960   1059.480713     1062.812378   -21.474
>>>>> -62.266    50.506  0.044000
>>>>>
>>>>> I'll ask John about it.
>>>>>
>>>>> Donna
>>>>>
>>>>> On 11/08/2010 04:07 AM, Julia Bender wrote:
>>>>>
>>>>>> Hi Donna,
>>>>>>
>>>>>> thanks for your answers. I've uploaded
>>>>>> "T-Map_LH_cCue_EndoLeft.metric_TMap_Significant_Clusters.txt" which
>>>>>> is the
>>>>>> example file we've been talking about. As far as I see it the
>>>>>> cluster
>>>>>>
>>>>>> TMap
>>>>>> ----
>>>>>> Column    Thresh  Num-Nodes          Area  Area-Corrected     COG-X     COG-Y     COG-Z   P-Value
>>>>>>      1     2.500       1712   1229.060425     1242.687012   -27.858   -75.308   -13.617  0.036000
>>>>>>
>>>>>> fits in between rank 37 (area corrected 1273.066772) and 38
>>>>>> (area-corrected 1240.362915).
>>>>>>
>>>>>> Thanks for your help,
>>>>>>
>>>>>> Julia
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>> On 11/04/2010 09:57 AM, Julia Bender wrote:
>>>>>>>
>>>>>>>
>>>>>>>> Hi Donna,
>>>>>>>>
>>>>>>>> thank you so much for your detailed reply. It helped a great deal.
>>>>>>>> Three things remain somewhat unclear to me:
>>>>>>>>
>>>>>>>> 1. I think I figured out why the list of permuted clusters in
>>>>>>>> xx.metric_TMap_Significant_Clusters.txt only has 696 instead of
>>>>>>>> 1000
>>>>>>>> (=number of iterations) columns in the example contrast. My input
>>>>>>>> columns
>>>>>>>> were n=13, which would allow up to 2^13=8192 iterations. But only
>>>>>>>> 696 of
>>>>>>>> the 1000 iterations produce clusters with a T-Value above my
>>>>>>>> defined
>>>>>>>> threshold. These are the ones listed in the output file. Is that
>>>>>>>> possible?
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>> Sure, it certainly is possible for the data to not survive
>>>>>>> threshold,
>>>>>>> but I'm not used to it.
>>>>>>>
>>>>>>>
>>>>>>>> 2. The significant cluster in the example has an area-corrected
>>>>>>>> value of
>>>>>>>> 1242.687012 and is assigned a p-value of 0.036. This means that it
>>>>>>>> resides
>>>>>>>> on rank 36 out of my 1000 iterations, right? When I looked up the
>>>>>>>> 100
>>>>>>>> largest clusters list (my predefined P-threshold was .1) in my
>>>>>>>> output
>>>>>>>> file
>>>>>>>> the significant cluster would actually take rank 38. What am I
>>>>>>>> getting
>>>>>>>> wrong here?
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>> Keep in mind that the p-values are based on the distribution built
>>>>>>> by
>>>>>>> the random tmaps, but the area of clusters on the real tmap is
>>>>>>> typically
>>>>>>> in between two areas on the randomized list.
>>>>>>>
>>>>>>> If your real tmap cluster is bigger than the 38th biggest random
>>>>>>> cluster, but smaller than the 37th, then I'd expect it to have a
>>>>>>> p-value
>>>>>>> of 38/iterations.
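
The rank rule Donna describes can be written out concretely. This is a sketch of the stated rule only (rank of the real cluster's corrected area within the per-iteration null maxima), not Caret's actual source:

```python
def corrected_p(real_area, null_max_areas, iterations):
    """Corrected cluster p-value per the rank rule: the position the
    real cluster's (area-corrected) size would take in the descending
    list of per-iteration maximum null cluster areas, divided by the
    iteration count."""
    # rank = 1 + number of null maxima at least as large as the real area
    rank = 1 + sum(a >= real_area for a in null_max_areas)
    return rank / iterations
```

So a real cluster falling between the 37th and 38th biggest null clusters gets rank 38 and, with 1000 iterations, p = 0.038.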
>>>>>>>
>>>>>>> If this is not happening, I don't know why.  Upload your
>>>>>>> significance
>>>>>>> report, and I'll have a look:
>>>>>>>
>>>>>>> http://pulvinar.wustl.edu/cgi-bin/upload.cgi
>>>>>>>
>>>>>>>
>>>>>>>> 3.The correction for multiple comparisons is done by thresholding
>>>>>>>> all
>>>>>>>> real
>>>>>>>> T-map clusters above the predefined T-Value with the smallest
>>>>>>>> iterations*alpha cluster of the permutation distribution according
>>>>>>>> to
>>>>>>>> area-corrected? The corrected P-value for real T-map clusters that
>>>>>>>> survive
>>>>>>>> this threshold is derived from the rank the clusters hold in the
>>>>>>>> total
>>>>>>>> iterations list according to area-corrected?
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>> I would put it differently.  Any clusters in the real tmap
>>>>>>> surviving
>>>>>>> the
>>>>>>> threshold that exceed the minimum significance cut-off in corrected
>>>>>>> area
>>>>>>> are significant.
>>>>>>>
>>>>>>>
>>>>>>>> Again, thanks a lot for your answers!!
>>>>>>>>
>>>>>>>> Julia
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>> Hi Julia,
>>>>>>>>>
>>>>>>>>> Bear with me, because it's been years since I used the
>>>>>>>>> caret_command
>>>>>>>>> tests, which are mostly cluster tests.  We switched from cluster
>>>>>>>>> to
>>>>>>>>> TFCE
>>>>>>>>> a year or so ago:
>>>>>>>>>
>>>>>>>>> http://brainvis.wustl.edu/wiki/index.php/Caret:Documentation:Statistics#Threshold-Free_Cluster_Enhancement_.28TFCE.29
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> But those tests are in caret_stats, which is a separate
>>>>>>>>> tool/package
>>>>>>>>> based on Java.  Let me know if you want to know more about that.
>>>>>>>>>
>>>>>>>>> My inline replies below reflect my best recollection of the
>>>>>>>>> cluster
>>>>>>>>> tests.
>>>>>>>>>
>>>>>>>>> Donna
>>>>>>>>>
>>>>>>>>> On 10/08/2010 08:16 AM, Julia Bender wrote:
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> Hi Donna,
>>>>>>>>>>
>>>>>>>>>> I've looked more into the "caret_command
>>>>>>>>>> -metric-statistics-one-sample-t-test" output files. I'm having
>>>>>>>>>> trouble
>>>>>>>>>> understanding what all the information means. Maybe you can
>>>>>>>>>> correct
>>>>>>>>>> me:
>>>>>>>>>>
>>>>>>>>>> 1. xx.metric_TMap.metric:
>>>>>>>>>> Map of T-Values for the wholebrain (?) for the contrast defined
>>>>>>>>>> in the
>>>>>>>>>> xx.metric input files. Those are T-Values exceeding the T-Value
>>>>>>>>>> thresholds
>>>>>>>>>> and alpha level I specified in
>>>>>>>>>> "caret_command -metric-statistics-one-sample-t-test" (e.g.,
>>>>>>>>>> -300000 2.5 0.1 in my case).
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> They are not thresholded.  The 2.5 and 0.1 affect the cluster
>>>>>>>>> size,
>>>>>>>>> which affects downstream outputs.  But this TMap is
>>>>>>>>> unthresholded.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>>  They are not corrected for multiple comparisons(?).
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> No, definitely not.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> This is the one I should load onto my surface. When I load
>>>>>>>>>> that TMap, why can I still see T-Values below my defined
>>>>>>>>>> threshold?
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> Right.  I like to use the unthresholded t/f-map for my figures,
>>>>>>>>> but
>>>>>>>>> generate a border around the clusters that were significant, and
>>>>>>>>> show
>>>>>>>>> the border overlaid on the unthresholded t-map.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> 2.xx.metric_ShuffledTMap.metric:
>>>>>>>>>> Distribution of T-Values derived from permuting + and - on each
>>>>>>>>>> element
>>>>>>>>>> in
>>>>>>>>>> xx in N iterations (in my case 1000)
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> Right, and less than 1000, depending on how many columns there
>>>>>>>>> are
>>>>>>>>> in
>>>>>>>>> your input composite metric.
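
The sign-flipping being discussed can be sketched as follows. The data layout (a list of per-subject node-value lists), function name, and use of the stdlib `statistics` module are illustrative; this is not Caret's implementation:

```python
import random
import statistics

def shuffled_tmaps(data, iterations, seed=0):
    """Sign-flip permutation for a one-sample t-test: each iteration
    multiplies every subject's map by a random +/-1 and recomputes the
    one-sample t-statistic at every node (how the ShuffledTMap columns
    are conceptually built)."""
    rng = random.Random(seed)
    n_subj = len(data)           # data: one list of node values per subject
    n_nodes = len(data[0])
    maps = []
    for _ in range(iterations):
        signs = [rng.choice((-1, 1)) for _ in range(n_subj)]
        tmap = []
        for node in range(n_nodes):
            vals = [s * subj[node] for s, subj in zip(signs, data)]
            mean = statistics.mean(vals)
            sd = statistics.stdev(vals)
            # t = mean / standard error; guard against zero variance
            tmap.append(mean / (sd / n_subj ** 0.5) if sd else 0.0)
        maps.append(tmap)
    return maps
```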
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> 3.xx.metric_TMap_Significant_Clusters.txt:
>>>>>>>>>> This is what the help page says:
>>>>>>>>>> "1. Find the biggest cluster in each column of the permutation
>>>>>>>>>> T-Map
>>>>>>>>>> metric/shape file and sort them by cluster size."
>>>>>>>>>> I see two lists of clusters in the output file. I assume the
>>>>>>>>>> one that is the result of this sorting is the lower one. It has
>>>>>>>>>> about 700 rows, depending on xx. Why does it not have 1000
>>>>>>>>>> rows, one for each permutation?
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> See Sample Report: Two Sample T-Test here:
>>>>>>>>>
>>>>>>>>> http://brainvis.wustl.edu/OLD/courses/stats_neurosci/2008_BMEcourse/BME_dld_talk.htm
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Yours is a one-sample t-test, but I think the cluster lists will
>>>>>>>>> be the
>>>>>>>>> same.
>>>>>>>>>
>>>>>>>>> If your last list of clusters has fewer than 1000 rows, then you
>>>>>>>>> had
>>>>>>>>> fewer than 10 columns in your input composite metric.  If n is
>>>>>>>>> the
>>>>>>>>> number of columns, and 2 raised to the n is less than your input
>>>>>>>>> iterations, then Caret will stop at 2^n.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>>  Are the clusters sorted by Num-Nodes, Area or Area corrected?
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> Descending area-corrected.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>>  In my files
>>>>>>>>>> they seem kind of sorted by both... How is a cluster defined?
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> By your input thresholds.  The permuted t-maps are thresholded at
>>>>>>>>> the
>>>>>>>>> level specified, and then clusters of contiguous supra-threshold
>>>>>>>>> nodes
>>>>>>>>> are found.  Only the largest in each iteration is saved.  Then
>>>>>>>>> they are
>>>>>>>>> listed in descending order of area-corrected size.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> "2. Find the largest (alpha)(iterations) clusters in the
>>>>>>>>>> Permutation
>>>>>>>>>> T-Map
>>>>>>>>>> and use its cluster size as the Significant Cluster Cutoff."
>>>>>>>>>> I assume this is the list of T-Values right below the above
>>>>>>>>>> list of clusters. Why does it contain clusters with a P-Value
>>>>>>>>>> that is above my defined alpha?
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> The last table lists the largest cluster for each iteration,
>>>>>>>>> regardless of its p-value.  The second table is what you want.
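
The cutoff rule quoted in step 2 ("find the largest (alpha)(iterations) clusters ... use its cluster size as the Significant Cluster Cutoff") can be sketched as follows (illustrative only, not Caret's code):

```python
def significance_cutoff(null_max_areas, alpha, iterations):
    """Significant Cluster Cutoff: take the top alpha*iterations
    per-iteration maximum null cluster areas and use the smallest of
    them as the minimum area a real cluster must reach."""
    k = max(1, int(round(alpha * iterations)))  # e.g. 0.05 * 1000 -> 50
    top = sorted(null_max_areas, reverse=True)[:k]
    return top[-1]
```

This matches the sample report below, where the "Significant Area" equals the area of the smallest cluster in the top-alpha*iterations list.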
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> "3. Find clusters in the Real T-Map file." This must be the
>>>>>>>>>> upper list of clusters (containing far fewer rows than the
>>>>>>>>>> lower one).
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> Correct.  That link above shows this the best.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> "4. Report all clusters in Real T-Map file that are larger than
>>>>>>>>>> Significant Cluster Cutoff." This is the list of T-Values below
>>>>>>>>>> this list, containing only clusters that are bigger than the
>>>>>>>>>> cluster with the highest P-Value found in 2. and that pass the
>>>>>>>>>> alpha and T-Value thresholds I specified in
>>>>>>>>>> "caret_command -metric-statistics-one-sample-t-test".
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> Again, here is the relevant excerpt from the link above:
>>>>>>>>>
>>>>>>>>> Significant Area:    306.226 <--- area of smallest cluster
>>>>>>>>> listed in next section
>>>>>>>>>
>>>>>>>>> Shuffled TMap <--- Top alpha*iterations biggest clusters are
>>>>>>>>> listed below, in descending area-corrected sequence
>>>>>>>>> (i.e., the smallest of which determines the significance area
>>>>>>>>> cut-off)
>>>>>>>>> -------------
>>>>>>>>> Column    Thresh  Num-Nodes          Area  Area-Corrected     COG-X     COG-Y     COG-Z   P-Value
>>>>>>>>>    821     2.660       2300   1439.637207     1559.143188   -37.249     0.007    -3.456
>>>>>>>>>    150     2.660        792    598.519104      858.216492   -49.142   -32.369    11.918
>>>>>>>>>    548     2.660       1380    641.563843      643.790466   -35.004     2.519   -14.981
>>>>>>>>> ... (middle biggest alpha*iterations entries omitted)
>>>>>>>>>    681    -2.660        279    249.739059      312.872681   -48.189    -9.576    10.352
>>>>>>>>>    649     2.660        237    198.468857      311.602844   -16.170   -79.223    30.149
>>>>>>>>>    633    -2.660        217    119.337837      306.226257   -42.479   -48.846    41.505
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> TMap <--- Significant real tmap clusters (i.e., >= significant
>>>>>>>>> area) are listed here; no entries here means no clusters were
>>>>>>>>> significant.
>>>>>>>>> ----
>>>>>>>>> Column    Thresh  Num-Nodes          Area  Area-Corrected     COG-X     COG-Y     COG-Z   P-Value
>>>>>>>>>      3     2.660        344    229.353973      362.322113   -45.677   -27.442    18.039  0.029000
>>>>>>>>>
>>>>>>>>> Shuffled TMap <--- All iterations' max clusters are listed
>>>>>>>>> below, in descending area-corrected sequence.
>>>>>>>>> -------------
>>>>>>>>> Column    Thresh  Num-Nodes          Area  Area-Corrected     COG-X     COG-Y     COG-Z   P-Value
>>>>>>>>>    821     2.660       2300   1439.637207     1559.143188   -37.249     0.007    -3.456
>>>>>>>>>    150     2.660        792    598.519104      858.216492   -49.142   -32.369    11.918
>>>>>>>>>    548     2.660       1380    641.563843      643.790466   -35.004     2.519   -14.981
>>>>>>>>> ... (middle biggest alpha*iterations entries omitted)
>>>>>>>>>      3     2.660          1      0.475079        0.472114   -18.520   -24.418   -21.858  0.999000
>>>>>>>>>      3    -2.660          1      0.474738        0.456814   -21.478   -36.579   -17.077  0.999000
>>>>>>>>>      3     2.660          1      0.000000        0.000000    -6.822   -46.751     8.581  0.999000
>>>>>>>>>
>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> 4. T-Map_LH_cCue_EndoL_vs_Cue_ExoL.metric_TMapClusters.metric
>>>>>>>>>> Is this the map of clusters defined above?
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>> This is what I believe you thought xx.metric_TMap.metric was, but
>>>>>>>>> it is
>>>>>>>>> unthresholded.
>>>>>>>>>
>>>>>>>>> This TMapClusters one zeroes out all nodes that are NOT within a
>>>>>>>>> significant cluster.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>> I'm sorry, I know these are a lot of questions. Thanks a lot
>>>>>>>>>> for your help!
>>>>>>>>>>
>>>>>>>>>> Julia
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>> Julia,
>>>>>>>>>>>
>>>>>>>>>>> I looked at your report and your t-map, which is consistent
>>>>>>>>>>> with the caret_command -metric-information output you included
>>>>>>>>>>> below.
>>>>>>>>>>> Just making sure you understand this part of the report:
>>>>>>>>>>>
>>>>>>>>>>> TMap
>>>>>>>>>>> ----
>>>>>>>>>>> Column    Thresh  Num-Nodes          Area  Area-Corrected     COG-X     COG-Y     COG-Z   P-Value
>>>>>>>>>>>      1     2.500       3064   2223.228027     2245.943848   -30.317   -73.796   -12.228  0.012000
>>>>>>>>>>>      1     2.500       3372   1865.999878     1863.883423   -21.516   -62.842    46.382  0.017000
>>>>>>>>>>>      1     2.500       1557    681.308838      674.981384   -32.967    -5.873    48.701  0.059000
>>>>>>>>>>>
>>>>>>>>>>> These are the clusters in your real t-map that were
>>>>>>>>>>> significant at the 0.1 alpha you specified, using the 2.5
>>>>>>>>>>> threshold.  (Note that all the significant clusters were at
>>>>>>>>>>> the positive end.)
>>>>>>>>>>> I believe the reason you saw different max/min in the Caret
>>>>>>>>>>> GUI was that you had the permuted t-map loaded, instead of the
>>>>>>>>>>> real one.  In your message below, you said,
>>>>>>>>>>> "Adjustment:Column: permuted T-Values,Threshold type".  There
>>>>>>>>>>> is nothing about permuted in the file you uploaded.  If you
>>>>>>>>>>> were viewing the permuted/shuffled t-map, this would also
>>>>>>>>>>> explain why little would survive a low threshold.
>>>>>>>>>>> But we don't necessarily (or even usually) use the same
>>>>>>>>>>> values we used for cluster thresholds as the threshold for
>>>>>>>>>>> displaying t-maps, e.g., for publication purposes.  I think we
>>>>>>>>>>> like to see some color differentiation beyond the cluster
>>>>>>>>>>> threshold max.  If they are the same, the color will saturate
>>>>>>>>>>> at the max.  Sometimes we'll use a p-value derived from the
>>>>>>>>>>> degrees of freedom and get a corresponding t-value from that,
>>>>>>>>>>> and use that for thresholding.  Other times we might just use,
>>>>>>>>>>> say, +/-4.0 or higher, depending on how big the values get in
>>>>>>>>>>> the data.  Usually we'll use a symmetric scale (i.e., -x to +x
>>>>>>>>>>> -- rather than different min/max).
>>>>>>>>>>>
>>>>>>>>>>> Donna
>>>>>>>>>>>
>>>>>>>>>>> On 08/11/2010 09:12 AM, Julia Bender wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Hi Donna,
>>>>>>>>>>>>
>>>>>>>>>>>> I've just uploaded the two files. Thanks for your help!
>>>>>>>>>>>>
>>>>>>>>>>>> Julia
>>>>>>>>>>>>> Julia,
>>>>>>>>>>>>>
>>>>>>>>>>>>> It will be easier for me to get my head around your
>>>>>>>>>>>>> question if I can get two files:
>>>>>>>>>>>>>
>>>>>>>>>>>>> * T-Map_LH_cCue_Endo.metric_TMap.metric (whatever the final
>>>>>>>>>>>>> output metric was, but NOT the permuted/shuffled tmap file).
>>>>>>>>>>>>> * The report named something like *Significance*.txt
>>>>>>>>>>>>>
>>>>>>>>>>>>> Could you upload those here:
>>>>>>>>>>>>> http://pulvinar.wustl.edu/cgi-bin/upload.cgi
>>>>>>>>>>>>>
>>>>>>>>>>>>> My brain would be ever so grateful.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Donna
>>>>>>>>>>>>>
>>>>>>>>>>>>> On 08/11/2010 07:07 AM, Julia Bender wrote:
>>>>>>>>>>>>>> Hey everyone,
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I'm a bit confused about how to threshold my T-Maps in
>>>>>>>>>>>>>> caret5. I created the maps with the following command:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> /usr/local/caret/bin_linux/caret_command
>>>>>>>>>>>>>> -metric-statistics-one-sample-t-test $EACHMETRIC
>>>>>>>>>>>>>> $FIDUCIAL_COORD $OPEN_TOPO $SURFACE_SHAPE 3
>>>>>>>>>>>>>> T-Map_$EACHMETRIC -300000.0 2.5 0.10 10 1 1000 0 4
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> So I put the negative threshold to -300000 and the positive
>>>>>>>>>>>>>> threshold to 2.5. When I look at the resulting Tmap.metric
>>>>>>>>>>>>>> files it gives me something like this:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Filename: T-Map_LH_cCue_Endo.metric_TMap.metric
>>>>>>>>>>>>>> Number of Nodes: 73730
>>>>>>>>>>>>>> Number of Columns: 1
>>>>>>>>>>>>>> Column      Minimum      Maximum           Mean     Sample Dev     % Positive     % Negative   Column Name
>>>>>>>>>>>>>>      1       -9.785        6.076         -0.950          2.639         36.234         63.766   T-Values
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> As far as I understand, this means the maximum negative T
>>>>>>>>>>>>>> value in this metric is -9.785 and the maximum positive T
>>>>>>>>>>>>>> value is 6.076. When I open the file in caret though (Color
>>>>>>>>>>>>>> mapping: Auto Scale, Display mode: Both, Display Color Bar,
>>>>>>>>>>>>>> Threshold Adjustment: Column: permuted T-Values, Threshold
>>>>>>>>>>>>>> type: Column) the bar tells me that my maximum negative
>>>>>>>>>>>>>> value is -3.7 and my maximum positive value is 2.6. This
>>>>>>>>>>>>>> also holds when I adjust the thresholds in the fields below
>>>>>>>>>>>>>> to -2 and 2, when almost all activation disappears.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Which is the correct information?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Thanks a lot for your help!
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Julia
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Dipl. Psych. Julia Bender
>>>>>>>>>>>>>> Humboldt Universität zu Berlin
>>>>>>>>>>>>>> Mathematisch - Naturwissenschaftliche Fakultät II
>>>>>>>>>>>>>> Institut für Psychologie, Abt. Klinische Psychologie
>>>>>>>>>>>>>> Unter den Linden 6
>>>>>>>>>>>>>> D-10099 Berlin
>>
>> _______________________________________________
>> caret-users mailing list
>> [email protected]
>> http://brainvis.wustl.edu/mailman/listinfo/caret-users
>>
>
>
> Dipl. Psych. Julia Bender
> Humboldt Universität zu Berlin
> Mathematisch - Naturwissenschaftliche Fakultät II
> Institut für Psychologie, Abt. Klinische Psychologie
> Unter den Linden 6
> D-10099 Berlin
>

