If you want to run wb_command (the command-line program) from the terminal, execute the file "wb_command" in the distribution's "bin_macosx64" directory (<some-directory-on-your-computer>/workbench/bin_macosx64/wb_command). wb_command.app is an OS X bundle (a directory that contains the program and related files).

John Harwell
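[For anyone driving wb_command from MATLAB rather than the terminal (as the GIfTI + wb_command route in the FAQ does), here is a minimal sketch of checking that the binary can be found and executed; the install path below is an assumption, so adjust it to wherever you extracted Workbench.]

    % Assumed install location -- adjust to your own Workbench directory.
    wb = '/Applications/workbench/bin_macosx64/wb_command';

    % Running wb_command with no arguments prints its usage/subcommand listing,
    % which is a quick way to confirm MATLAB can find and execute the binary.
    system(wb);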
On Dec 2, 2015, at 6:43 AM, David Dalmazzo <davm...@gmail.com> wrote:

Hi,
I'm trying to run wb_command.app on OS X 10.11.1, but the app does not respond, and neither does the terminal. I'm running on root:xnu-3247.10.11~1/RELEASE_X86_64 x86_64. Am I missing a first step?
Thanks for the help.

David

On Wed, Nov 25, 2015 at 12:00 AM, Timothy Coalson <tsc...@mst.edu> wrote:

This part of your call:

, 'readdata', false);

is telling ft_read_cifti not to load the data matrix; that is why it isn't there.

If you are just as comfortable in C++, then yes, CiftiLib is likely the best solution; it is directly based on the code we use in Connectome Workbench for CIFTI files. If what you need to do with the dconn organizes well with the on-disk structure of the matrix (i.e., not needing more than a few rows in memory at once, and never needing to read columns), then it will work efficiently with almost no memory footprint (as long as you don't tell it to read the entire file into memory).

I don't think we distribute parcellated versions of the connectome yet, as there hasn't exactly been a consensus on a parcellation for the entire cortex. You can use -cifti-parcellate to apply a parcellation to the dconn, if you have one that you favor.

Tim
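[To make the last two points concrete, here is a minimal MATLAB sketch of parcellating the dense connectome with wb_command and then loading the much smaller parcellated matrix with ft_read_cifti, leaving 'readdata' at its default so the matrix is actually returned. The file names, the label file, and the wb_command path are placeholders, and the exact -cifti-parcellate arguments should be checked against the wb_command help before use.]

    % Placeholders -- substitute your own paths and a parcellation you favor.
    wb     = '/Applications/workbench/bin_macosx64/wb_command';
    dconn  = 'HCP_S500_R468_MIGPd4500ROW.dconn.nii';
    labels = 'my_parcellation.dlabel.nii';

    % Parcellate both dimensions of the dense matrix (dconn -> pdconn -> pconn).
    system(sprintf('%s -cifti-parcellate %s %s COLUMN temp.pdconn.nii', wb, dconn, labels));
    system(sprintf('%s -cifti-parcellate temp.pdconn.nii %s ROW parcellated.pconn.nii', wb, labels));

    % The parcellated matrix is small enough to load whole; leave 'readdata'
    % at its default (true) so ft_read_cifti returns the connectivity matrix.
    pconn = ft_read_cifti('parcellated.pconn.nii');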
On Tue, Nov 24, 2015 at 5:45 AM, David Dalmazzo <davm...@gmail.com> wrote:

Hello,
Reading with FieldTrip using this line:

mycifti = ft_read_cifti('/Users/David/Desktop/HumanConnectomeProject/HCP_S500_R468_rfMRI_MIGP_d4500ROW_zcorr/HCP_S500_R468_MIGPd4500ROW.dconn.nii', 'readdata', false);

I get: dimord (pos_pos), hdr (1x1 struct), unit ('mm'), brainstructure, brainstructurelabel, dim, pos, transform. But the connectivity matrix is not there.

I'm going to check the C++ library; it seems to be the solution. It seems very difficult to work with this 33 GB file. Is there any other file in the HCP datasets with connectome information, one that has a connectivity matrix or an array of region-to-region connections? The Hagmann version seems too old for the models we need to implement.

Thanks for the help,
David

On Mon, Nov 23, 2015 at 9:14 PM, Rose Tharail John <mailboxofr...@gmail.com> wrote:

Thank you!

On Mon, Nov 23, 2015 at 3:03 PM, Timothy Coalson <tsc...@mst.edu> wrote:

To get a dconn loaded in MATLAB, you of course need to have a large amount of available memory. Freezing is what I would expect if you don't have enough memory and start using a lot of swap space (some ways of loading might initially load it as double precision and need twice the memory).

As you mention C++ specifically as your end goal, have a look at CiftiLib, as it is a C++ library for reading and writing CIFTI files, and it has better support and features than the currently available MATLAB CIFTI reading/writing (for instance, on-disk reading/writing rather than reading the entire file into memory first, and easy construction and full support of dimension mappings of all supported types):

https://github.com/Washington-University/CiftiLib

Tim

On Mon, Nov 23, 2015 at 11:18 AM, David Dalmazzo <davm...@gmail.com> wrote:

Hello,
I work in the Specs Lab at Pompeu Fabra University as a PhD student. I'm building an app for connectome visualisation and brain-activity simulation called BrainX3. The first version uses the Hagmann dataset, based on 998 nodes and ~14,000 bidirectional connections.

For the new version I would like to use the HCP S500 dataset. My main problem is that, following the two options below on how to get CIFTI files into MATLAB, I can't find a solution. The first option, using FieldTrip, gives me some errors, but I'm already in contact with Robert Oostenveld, who is helping me. The second option, using GIfTI + wb_command, just crashes/freezes my computer.

https://wiki.humanconnectome.org/display/PublicData/HCP+Users+FAQ#HCPUsersFAQ-2.HowdoyougetCIFTIfilesintoMATLAB

My question is: what is the best way to extract the data? Maybe I'm just missing a tutorial where it's well explained. My main purpose is to build a C++ connectome visualization app with much higher resolution than Hagmann's.

Right now I'm on OS X 10.11.1 and MATLAB R2015b (8.6.0.267246) 64-bit.

Thanks for the support,

David

_______________________________________________
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users