On 9/3/19 12:05 PM, cody samth wrote:
> Hello,
>
> I have a question regarding how to graph results that came from a
> vertex-wise analysis using the commands mri_glmfit and mri_glmfit-sim.
>
> I was interested in investigating an interaction effect between groups
> and my variable of interest (continuous) while covarying for three
> nuisance variables. After running mri_glmfit and mri_glmfit-sim to
> correct for multiple comparisons, I visualized the results and found
> significant clusters.
>
> I'm interested in graphing these results. Based on archived questions
> to this mailing list, the individual values for each cluster can be
> found in the ocn.dat file, and the cluster information can be found
> in the summary file. My analysis looked at volume, thickness, and
> surface area. Since mean volume and area are difficult to interpret, I
> want to convert the values to a total measure. It has been suggested
> in the past that this can be done by multiplying each individual's
> values in the ocn.dat file by the number of vertices in the cluster.
> However, from my understanding, this could also be done by altering the
> mri_segstats command (which mri_glmfit-sim automatically runs) to
> include the --accumulate option.
>
> When I use these two methods to convert mean area and volume to total
> area and volume, the results are different.
>
> My first question is: 1) Shouldn't these values be identical? The
> values from multiplying mean volume by the number of vertices are
> roughly around 3500, whereas using --accumulate in mri_segstats gives
> around 2500. What could be causing this discrepancy?

They should not be identical, but the reason is fairly convoluted. When you
get a cluster after running mri_glmfit-sim, that cluster is on fsaverage,
which is an average of 40 subjects. The area of a vertex is computed as the
average of the areas of the vertices from the 40 subjects that mapped into
that vertex.
This is the number that is used to compute the surface area of the cluster
in the summary file. Now, when you map your subjects into fsaverage space,
they may have more or less surface area mapping into that cluster relative
to the 40 subjects (it looks like more, from #2 below). Also, you probably
smoothed the surface area, which could have an unpredictable effect.

> 2) If my cluster has a size of 1500 mm^2 (in a model for area), does it
> make sense that every individual's values, after extraction and
> conversion to total area, are larger than the cluster size?

Yes, see above.

> 3) The ocn.dat files contain the input values, meaning they're raw and
> would need to be corrected statistically (in a similar way to how I
> modeled it in FreeSurfer) before graphing, right?

Not sure what you mean by "corrected" here. In general, you need to be very
careful when you extract data from a cluster. It would be circular to run
the same test that you used to generate the cluster, though this happens a
lot (see "Voodoo Correlations" by Ed Vul).
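To make the mean-to-total conversion concrete, here is a minimal sketch of the arithmetic being discussed. All numbers are fabricated for illustration; the column names ("NVtxs", "Size(mm^2)") are assumptions about the cluster summary file, and in practice you would read the per-subject means from the ocn.dat file rather than hard-coding them.

```python
# Hypothetical illustration of converting per-subject cluster MEANS
# (as reported in mri_glmfit-sim's ocn.dat) into TOTALS by multiplying
# by the cluster's vertex count. All values below are made up.

n_vertices = 1200            # cluster vertex count (assumed "NVtxs" column)
cluster_area_fsavg = 1500.0  # cluster size on fsaverage (assumed "Size(mm^2)")

# ocn.dat holds one row per subject: the mean of the input map
# (e.g., smoothed per-vertex area, mm^2/vertex) over the cluster.
mean_values = [3.0, 2.5, 3.5]  # fabricated per-subject means

# total measure = mean value x number of vertices in the cluster
totals = [m * n_vertices for m in mean_values]

for m, t in zip(mean_values, totals):
    print(f"mean {m:.2f} mm^2/vertex -> total {t:.1f} mm^2")

# Note: these per-subject totals need not match cluster_area_fsavg.
# The fsaverage vertex areas are averages over the 40 subjects used to
# build fsaverage, while each subject contributes their own (smoothed)
# area values when mapped into that space, so the totals can all exceed
# the nominal cluster size, as in point #2 above.
```

This also shows why every subject's total can be larger than the 1500 mm^2 cluster size: the totals are driven by each subject's own mapped area, not by fsaverage's vertex areas.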

_______________________________________________ Freesurfer mailing list Freesurfer@nmr.mgh.harvard.edu https://mail.nmr.mgh.harvard.edu/mailman/listinfo/freesurfer