> On May 18, 2016, at 1:25 PM, Manousos Klados <[email protected]> wrote:
>
> Dear HCP community,
>
> I am sending you a question from one of my students, because he cannot
> subscribe to the mailing list. His email is the following:
>
> I am currently studying tfMRI data, and I am especially interested in the
> Working Memory task. However, I have a question regarding the onsets of
> the WM condition blocks. For example, in subject 100307 the EV directory
> (path: 100307/MNINonLinear/Results/tfMRI_WM_LR/EVs) contains the .txt
> files with the onsets and durations of the various conditions and trials
> (as stated in the manual). Let's assume that we want to study the 2-back
> face condition (2bk_faces.txt). The onset is at 79.208 sec with a duration
> of 27.5 sec.
>
> The first part of my question is how to retrieve this information from
> the BOLD time series (405 frames/run). Is there a fixed way to retrieve
> the frames of interest in that specific interval?
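For the frame-index part of the question, the conversion from seconds to
frames is just division by the repetition time; here is a minimal sketch in
Python, assuming the HCP 3T tfMRI TR of 0.72 s (the dense time-series
filename is only illustrative):

    # Map an EV onset/duration in seconds onto frame indices, assuming TR = 0.72 s.
    import numpy as np

    TR = 0.72                        # seconds per frame for HCP 3T tfMRI
    onset, duration = 79.208, 27.5   # values from EVs/2bk_faces.txt

    first_frame = int(np.floor(onset / TR))             # ~frame 110 of 405
    last_frame = int(np.ceil((onset + duration) / TR))  # ~frame 149 of 405

    # To slice those frames out of the dense time series (time x grayordinates):
    # import nibabel as nib
    # data = nib.load("tfMRI_WM_LR_Atlas.dtseries.nii").get_fdata()
    # block = data[first_frame:last_frame, :]
    #
    # Note that simply slicing frames ignores the hemodynamic delay, which is
    # one reason the usual approach is a GLM with convolved predictors, as
    # described below.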
Traditionally, a GLM is used to estimate activation in response to each task
condition. We used FSL's FEAT (http://fsl.fmrib.ox.ac.uk/fsl/fslwiki/FEAT) to
estimate activation, and the resulting parameter estimates for each subject
(averaged over the two task runs) are available for download in ConnectomeDB.

> The last part of my question concerns the onsets of the correct trials in
> this task. More specifically, the file 2bk_cor.txt contains the onsets of
> the correct trials; however, I would expect those onsets to lie in the
> interval [79.208, 79.208+27.5], since the trials are presented within the
> condition block. Is there a way to match this information to the 2-back
> face block onset?

If you want to run your own GLM using different predictors, you will need to
create those predictors and GLMs on your own. You could extract the
2bk-face-specific correct trials from the file containing all correct 2bk
trials (i.e., 2bk_cor.txt); a sketch of that extraction is appended at the
end of this message. Or you could code your predictors in any number of ways
by processing the information in the TAB.txt files directly. See the
documentation at:
http://www.humanconnectome.org/documentation/S900/HCP_S900_Release_Appendix_VI.pdf

Running a custom GLM will be a little trickier, since it will require some
custom modification of the TaskfMRIAnalysis pipeline within the HCP
Pipelines. The pipelines and some documentation are available on GitHub:
https://github.com/Washington-University/Pipelines

An example of how these scripts might be modified was provided in Practical
#7 at last year's HCP Course:
https://www.humanconnectome.org/courses/2015/exploring-the-human-connectome.php

Hope this helps!
--Greg

____________________________________________________________________
Greg Burgess, Ph.D.
Staff Scientist, Human Connectome Project
Washington University School of Medicine
Department of Neuroscience
Phone: 314-362-7864
Email: [email protected]

> Best regards,
> Bill.
>
> Manousos Klados, PhD
> Max Planck Institute for Human Cognitive & Brain Sciences
> Research Group of Neuroanatomy and Connectivity
> Phone: +49(0)-341-9940-2507
> Mobile: +49(0)-176-6988-1781
> Email: [email protected]
> Website: http://www.mklados.com
> Skype: mklados | Twitter: @mklados
> Address: Stephanstraße 1a, D-04103 Leipzig, Germany
>
> Join us at our next event: SAN 2016 Conference

_______________________________________________
HCP-Users mailing list
[email protected]
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
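A minimal sketch of the 2bk_cor.txt extraction mentioned above, assuming
numpy, the standard three-column (onset, duration, amplitude) EV format, and
the EV directory path from the original question; the output filename is
just an example:

    # Keep only the correct 2-back trials whose onsets fall inside a 2-back
    # face block. Paths and the output name below are illustrative.
    import numpy as np

    ev_dir = "100307/MNINonLinear/Results/tfMRI_WM_LR/EVs"

    faces = np.loadtxt(f"{ev_dir}/2bk_faces.txt", ndmin=2)  # 2-back face blocks
    correct = np.loadtxt(f"{ev_dir}/2bk_cor.txt", ndmin=2)  # all correct 2-back trials

    in_face_block = np.zeros(len(correct), dtype=bool)
    for onset, duration, _amplitude in faces:
        in_face_block |= (correct[:, 0] >= onset) & (correct[:, 0] < onset + duration)

    # Write a new three-column EV file restricted to the face blocks.
    np.savetxt(f"{ev_dir}/2bk_cor_faces.txt", correct[in_face_block], fmt="%.3f")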
