Hi Donna,

I have run 5000 iterations of the permutation, but when I try to use the significance-threshold-free command I get some errors. I have uploaded the file LEFT-RIGHT.VertexArea.tmap.random.5000x.metric; could you please help me check the errors?
Apr 16, 2012 11:10:52 AM edu.wustl.caret.niftilib.NiftiXmlFileEntityResolver resolveEntity
INFO: Resolving: http://www.nitrc.org/frs/download.php/1594/gifti.dtd
Apr 16, 2012 11:10:52 AM edu.wustl.caret.niftilib.NiftiXmlFileEntityResolver resolveEntity
INFO: Ignoring null http://www.nitrc.org/frs/download.php/1594/gifti.dtd
Apr 16, 2012 11:10:52 AM edu.wustl.caret.giftijlib.GiftiDataArrayFile readFile
SEVERE: SAX Exception
org.xml.sax.SAXException: While reading line 1, column 1: Content is not allowed in prolog.
        at edu.wustl.caret.niftilib.NiftiXmlDefaultHandler.fatalError(NiftiXmlDefaultHandler.java:45)
        at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.fatalError(ErrorHandlerWrapper.java:180)
        at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:441)
        at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:368)
        at com.sun.org.apache.xerces.internal.impl.XMLScanner.reportFatalError(XMLScanner.java:1375)
        at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:996)
        at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:607)
        at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:488)
        at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:835)
        at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:764)
        at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:123)
        at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1210)
        at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:568)
        at edu.wustl.caret.giftijlib.GiftiDataArrayFile.readFile(GiftiDataArrayFile.java:724)
        at edu.wustl.caret.files.GiftiDataArrayBasedFile.readGiftiXmlFormat(GiftiDataArrayBasedFile.java:976)
        at edu.wustl.caret.files.AbstractFile.readFilesData(AbstractFile.java:1292)
        at edu.wustl.caret.files.AbstractFile.readFileOnDisk(AbstractFile.java:1199)
        at edu.wustl.caret.files.AbstractFile.readFile(AbstractFile.java:972)
        at edu.wustl.caret.statistics.StatisticOperationSignificanceTFCE.executeOperation(StatisticOperationSignificanceTFCE.java:319)
        at edu.wustl.caret.statistics.StatisticMain.main(StatisticMain.java:120)

ERROR: edu.wustl.caret.giftijlib.GiftiException: org.xml.sax.SAXException: While reading line 1, column 1: Content is not allowed in prolog.

Thanks a lot,
Gang

> I realized coord anova doesn't do what you need, because it's more like a
> two-sample t-test than a paired t-test (but you can have more than two
> groups). It doesn't take the pairing into account. As far as I know,
> caret_stats doesn't do what you need it to do for the coords.
>
> If you can use something like local shape mancova
> (http://www.insight-journal.org/browse/publication/694) to generate your
> stat maps, then you can feed them to caret_stats' TFCE. But again, we're
> back to the randomization problem. I don't think that tool will generate
> the randomized stat maps using a strategy like those in Nichols & Holmes'
> primer paper.
>
> But you can do a paired diff on the depth maps. Just make sure the
> composites have the columns in corresponding sequence.
>
> Sorry to get your hopes up on the coords.
>
> On Apr 5, 2012, at 4:40 PM, Donna Dierker wrote:
>
>> I put three scripts here:
>>
>> http://brainmap.wustl.edu/pub/donna/US/UNC/
>> login pub
>> password download
>>
>> For the coords, you'll need a different test -- coord anova:
>>
>> http://brainvis.wustl.edu/wiki/index.php/Caret:Documentation:Statistics#Coordinate_Difference_Analysis_of_Variance
>>
>> One of those scripts does that.
>> But first, you'll need to x-flip either the right or the left. I propose
>> x-flipping the left to right, resulting in mostly positive values for x.
>> See flip-x.sh.
>>
>> For your depth, you need to make a composite from your depth files, and
>> the gen_distortion.sh script does that for distortion. You'll want to
>> recompute your distortion using something like that, because the order
>> of the mean fiducial and the individual's fiducial is important. The
>> mean is the first argument to the distortion command. Else you end up
>> with mostly negative values, as you apparently did, and as I did when I
>> first recomputed my term12 version 2 distortion metric. Easy mistake to
>> make, but we want to pump up areas -- not shrink them down.
>>
>> Anyway, that script shows how to generate a composite and compute an
>> average. You only need the composite for your depth case, but the latter
>> comes in handy.
>>
>> (The gen_distortion.sh script didn't exist when it first came up the
>> other day. I just needed to do it, so now it does.)
>>
>> Donna
>>
>> On Apr 5, 2012, at 10:17 AM, gangli wrote:
>>
>>> Hi Donna,
>>>
>>> I would like to test the paired difference between the left and right
>>> hemispheres on 3D coordinate position and sulcal depth. Please find the
>>> uploaded input files and help me figure out how to do this.
>>>
>>> Thanks a lot.
>>>
>>> Gang
>>>
>>> My versions of caret_command can read them all, and the coords/metric
>>> all have roughly 164k nodes -- all good.
>>> But I still don't have your command line, or inputs like I'd expect for
>>> a paired t-test, if that's what you're doing.
>>> For each subject, I have a metric that has a column for 3D variability
>>> and a column for areal distortion. I doubt that you want a paired
>>> difference between these disparate measures.
>>> What you need, as input to the paired t-test, is a composite
>>> metric/shape with one column per subject. One composite might have left
>>> sulcal depth, the other right, for example. "It doesn't get any more
>>> paired than the left and right hemispheres of a person's brain." --Tom
>>> Nichols
>>>
>>> But it could be thickness at different timepoints, twins' thickness
>>> maps -- anything you can pair. But the two composite input files should
>>> have equal numbers of columns, and those columns should be paired
>>> (i.e., the first column of each file corresponds to one another; the
>>> second column of each file corresponds to one another; and so on).
>>>
>>> Hope this helps.
>>>
>>> On Apr 4, 2012, at 1:02 PM, gangli wrote:
>>>
>>>> Hi Donna,
>>>>
>>>> I have uploaded several generated subjects along with metric and shape
>>>> files. Can you please help me figure out how to average and compose
>>>> the shape and metric files?
>>>>
>>>> Thanks a lot.
>>>>
>>>> Gang
>>>>
>>>> Hi Gang,
>>>>
>>>> Could you provide the full command line and/or output from running the
>>>> script? It might not work in Windows, but on a Linux or Mac OS X
>>>> command line you would do:
>>>>
>>>> paired.sh >& paired.log
>>>>
>>>> Then upload paired.log here:
>>>>
>>>> http://pulvinar.wustl.edu/cgi-bin/upload.cgi
>>>>
>>>> Depending on what information that gives me, I may want to see more
>>>> information, like your mean midthickness, distortion metric, composite
>>>> scalar files, etc.
>>>>
>>>> Also, it would be helpful if, when you reply, you keep the prior
>>>> message history in your reply, so that I don't have to go to the
>>>> archives to remind myself of what you have already done. It is easy to
>>>> get users confused -- particularly when I am using this stuff heavily
>>>> myself.
>>>>
>>>> Thanks much,
>>>> Donna
>>>>
>>>> On Apr 4, 2012, at 9:07 AM, gangli wrote:
>>>>
>>>>> I am using caret_distribution_Windows32.v5.65 and the input is the
>>>>> *.metric file generated in the previous discussion. Thanks.
>>>>> Regards,
>>>>> Gang
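Donna's requirement above -- a composite per hemisphere with one column per subject, columns in corresponding order -- can be sketched numerically. This is a minimal illustration with synthetic NumPy arrays (the shapes and values are made up; it does not read Caret metric files): each "composite" is nodes x subjects, and the paired t statistic is computed per node from the per-subject left-right differences.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_subjects = 100, 8   # stand-ins for surface nodes and subjects

# Two composites: column k of each array belongs to the same subject k.
left = rng.normal(2.0, 0.5, (n_nodes, n_subjects))
right = left + rng.normal(0.1, 0.2, (n_nodes, n_subjects))

# Paired t-test per node: operate on the per-subject differences,
# which only makes sense if the columns are in corresponding sequence.
d = left - right                               # shape (n_nodes, n_subjects)
t = d.mean(axis=1) / (d.std(axis=1, ddof=1) / np.sqrt(n_subjects))
print(t.shape)                                 # one t value per node
```

Shuffling the columns of one array but not the other would silently break the pairing without changing the arrays' shapes, which is why the column order of the two composite files matters.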
_______________________________________________
caret-users mailing list
caret-users at brainvis.wustl.edu
http://brainvis.wustl.edu/mailman/listinfo/caret-users
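A note on the SAX error at the top of the thread: "Content is not allowed in prolog" is the standard Xerces message when the bytes handed to the XML parser do not begin with an XML document at all -- for example a binary or differently formatted file, or stray bytes before the `<?xml` declaration. A quick sanity check can be done before invoking caret_stats; this sketch is illustrative only (the helper name and the stand-in files are not part of Caret):

```python
def looks_like_gifti_xml(path):
    """Return True if the file starts like a GIFTI XML document."""
    with open(path, "rb") as f:
        head = f.read(64).lstrip()
    return head.startswith(b"<?xml") or head.startswith(b"<GIFTI")

# Demo with hypothetical stand-in files:
with open("good.metric", "wb") as f:
    f.write(b'<?xml version="1.0"?><GIFTI></GIFTI>')
with open("bad.metric", "wb") as f:
    f.write(b"\x00\x01not-xml-at-all")

print(looks_like_gifti_xml("good.metric"))  # True
print(looks_like_gifti_xml("bad.metric"))   # False
```

If the uploaded LEFT-RIGHT.VertexArea.tmap.random.5000x.metric fails a check like this, the problem is the file's contents rather than the significance-threshold-free command itself.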
