I am using Workbench v1.3. I have tried R v3.3, R v3.4, and R v3.5. None of them 
let me install the specific package versions required to run hcp_fix ('kernlab' 
0.9-24, 'party' 1.0-25, 'e1071' 1.6-7, and 'randomForest' 4.6-12), so I suspect 
the error I am getting now comes from this. I have pasted the error below:

Error in ctreefit(object = object, controls = controls, weights = weights,  : 
  no slot of name "remove_weights" for this object of class "TreeGrowControl"
Calls: eval -> eval -> <Anonymous> -> ctreefit
Execution halted

Please advise on which R version I should install to make this work. I have 
downloaded the package sources and am installing them in R as root as follows:

> install.packages('/Users/alayar/Downloads/kernlab_0.9-24.tar',
+                  dependencies=TRUE)
Warning message:
package ‘/Users/alayar/Downloads/kernlab_0.9-24.tar’ is not available (for R 
version 3.3.3)
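
For what it is worth, the warning above is what install.packages() prints when 
it treats its argument as a CRAN package name rather than a file path; passing 
repos=NULL makes R install from the local tarball instead. A minimal sketch 
from the shell (assuming the file is the .tar.gz from the CRAN Archive; 
browsers sometimes unpack it to a plain .tar):

# repos=NULL tells R the argument is a local source tarball, not a package name
sudo Rscript -e "install.packages('/Users/alayar/Downloads/kernlab_0.9-24.tar.gz', repos=NULL, type='source')"

# Equivalent, using R's command-line installer
sudo R CMD INSTALL /Users/alayar/Downloads/kernlab_0.9-24.tar.gz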

Thanks!

-L

> On May 27, 2018, at 4:24 PM, Glasser, Matthew <glass...@wustl.edu> wrote:
> 
> There is no relationship between R and Workbench (or R and MATLAB).  As for a 
> Workbench/MATLAB incompatibility, we would need to know which versions of 
> Workbench and MATLAB you are using in order to debug this.
> 
> Peace,
> 
> Matt.
> 
> From: Marta Moreno <mmorenoort...@icloud.com>
> Date: Sunday, May 27, 2018 at 3:22 PM
> To: "st...@fmrib.ox.ac.uk" <st...@fmrib.ox.ac.uk>
> Cc: Matt Glasser <glass...@wustl.edu>, HCP Users <hcp-users@humanconnectome.org>
> Subject: Re: [HCP-Users] error running hcp_fix
> Subject: Re: [HCP-Users] error running hcp_fix
> 
> Thanks, but it is not working with the newer versions either, because the R 
> packages 'kernlab' 0.9-24, 'party' 1.0-25, 'e1071' 1.6-7, and 'randomForest' 
> 4.6-12 are incompatible with R 3.3 and 3.5, at least in my experience. So 
> could you please let me know which R version I need to install so that it is 
> compatible with Workbench and also with the packages listed above that are 
> needed to run hcp_fix? I am using a Mac Pro, with Workbench v1.3.
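> 
> (For completeness, a sketch of pulling those exact versions straight from 
> the CRAN source archive, assuming the standard 
> .../src/contrib/Archive/<pkg>/<pkg>_<version>.tar.gz URL layout and the 
> dashed version numbers used in the tarball names:)
> 
> for pkg in kernlab_0.9-24 party_1.0-25 e1071_1.6-7 randomForest_4.6-12; do
>   # ${pkg%%_*} strips the version suffix, leaving the package name for the URL
>   sudo Rscript -e "install.packages('https://cran.r-project.org/src/contrib/Archive/${pkg%%_*}/${pkg}.tar.gz', repos=NULL, type='source')"
> done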
> 
> This is becoming a nightmare so I would really appreciate your help.
> 
> Thanks!,
> 
> -L
> 
>> On May 27, 2018, at 3:04 AM, st...@fmrib.ox.ac.uk wrote:
>> 
>> Hi - we've seen this in the past with specific combinations of matlab 
>> version and workbench version.  I'm not quite sure if the very latest 
>> versions of both have the issue or not.
>> Cheers.
>> 
>> 
>>> On 27 May 2018, at 02:32, Marta Moreno <mmorenoort...@icloud.com> wrote:
>>> 
>>> I found the following error; please advise. (Before it, there are some 
>>> warnings about functions that have the same name as a MATLAB builtin; I 
>>> have pasted the output from the last warning through the first error.)
>>> 
>>> Which files need to be included in ‘CIFTIMatlabReaderWriter’ for 
>>> settings.sh?
>>> 
>>> Warning: Function subsref has the same name as a MATLAB builtin. We suggest 
>>> you
>>> rename the function to avoid a potential name conflict. 
>>> > In path (line 109)
>>>   In fix_3_clean (line 45) 
>>> /bin/bash: /usr/local/workbench/bin_macosx64: is a directory
>>> Error using read_gifti_file_standalone (line 20)
>>> [GIFTI] Loading of XML file
>>> /private/var/folders/j2/__433pcd02l1cw9qkydkqs5h0000gn/T/tp6555679429695603.gii
>>> failed.
>>> 
>>> Error in gifti (line 71)
>>>                 this = read_gifti_file_standalone(varargin{1},giftistruct);
>>> 
>>> Error in ciftiopen (line 31)
>>> cifti = gifti([tmpfile '.gii']);
>>> 
>>> Error in fix_3_clean (line 46)
>>>   BO=ciftiopen('Atlas.dtseries.nii',WBC);
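>>> 
>>> (The "/bin/bash: /usr/local/workbench/bin_macosx64: is a directory" line 
>>> above suggests the workbench command configured for fix points at the bin 
>>> directory rather than at the wb_command executable inside it. A sketch of 
>>> the relevant settings.sh lines, with variable names assumed from the fix 
>>> settings.sh template and paths assumed from this install:)
>>> 
>>> # Point at the wb_command binary itself, not its containing directory
>>> FSL_FIX_WBC=/usr/local/workbench/bin_macosx64/wb_command
>>> # CIFTI MATLAB reader/writer directory shipped with fix (path assumed)
>>> FSL_FIX_CIFTIRW=/usr/local/fix1.065/CIFTIMatlabReaderWriter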
>>> 
>>> Thanks!
>>> 
>>> -L
>>> 
>>> 
>>>> On May 26, 2018, at 6:55 PM, Marta Moreno <mmorenoort...@icloud.com> wrote:
>>>> 
>>>> You were right; that problem is now gone, but I still cannot find my 
>>>> clean.dtseries.
>>>> 
>>>> Here is the output; I could not find any error in the .ica folder:
>>>> hcp_fix RS_fMRI_1.nii.gz 2000
>>>> processing FMRI file RS_fMRI_1 with highpass 2000
>>>> running highpass
>>>> running MELODIC
>>>> running FIX
>>>> FIX Feature extraction for Melodic output directory: RS_fMRI_1_hp2000.ica
>>>>  create edge masks
>>>>  run FAST
>>>>  registration of standard space masks
>>>>  extract features
>>>> FIX Classifying components in Melodic directory: RS_fMRI_1_hp2000.ica 
>>>> using training file: /usr/local/fix1.065/training_files/HCP_hp2000.RData 
>>>> and threshold 10
>>>> FIX Applying cleanup using cleanup file: 
>>>> RS_fMRI_1_hp2000.ica/fix4melview_HCP_hp2000_thr10.txt and motion cleanup 
>>>> set to 1
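>>>> 
>>>> (To check whether the MATLAB cleanup stage actually ran, it may help to 
>>>> read its log and search for any cleaned output; a sketch, assuming the 
>>>> log lands inside the .ica directory as the redirection in the trace 
>>>> further below suggests:)
>>>> 
>>>> cat RS_fMRI_1_hp2000.ica/.fix.log   # errors from fix_3_clean, if any
>>>> find . -name '*clean*'              # any cleaned output written so far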
>>>> 
>>>> Please advise.
>>>> 
>>>> Thanks!
>>>> 
>>>> -L
>>>> 
>>>> 
>>>>> On May 26, 2018, at 2:52 PM, Glasser, Matthew <glass...@wustl.edu> wrote:
>>>>> 
>>>>> Perhaps you are running out of memory.
>>>>> 
>>>>> Peace,
>>>>> 
>>>>> Matt.
>>>>> 
>>>>> From: <hcp-users-boun...@humanconnectome.org> on behalf of Marta Moreno 
>>>>> <mmorenoort...@icloud.com>
>>>>> Date: Saturday, May 26, 2018 at 1:50 PM
>>>>> To: HCP Users <hcp-users@humanconnectome.org>
>>>>> Subject: [HCP-Users] error running hcp_fix
>>>>> 
>>>>> Dear experts,
>>>>> 
>>>>> I am getting the following error. Please advise:
>>>>> 
>>>>> hcp_fix RS_fMRI_1.nii.gz 2000
>>>>> processing FMRI file RS_fMRI_1 with highpass 2000
>>>>> running highpass
>>>>> running MELODIC
>>>>> running FIX
>>>>> FIX Feature extraction for Melodic output directory: RS_fMRI_1_hp2000.ica
>>>>>  create edge masks
>>>>>  run FAST
>>>>>  registration of standard space masks
>>>>>  extract features
>>>>> FIX Classifying components in Melodic directory: RS_fMRI_1_hp2000.ica 
>>>>> using training file: /usr/local/fix1.065/training_files/HCP_hp2000.RData 
>>>>> and threshold 10
>>>>> FIX Applying cleanup using cleanup file: 
>>>>> RS_fMRI_1_hp2000.ica/fix4melview_HCP_hp2000_thr10.txt and motion cleanup 
>>>>> set to 1
>>>>> sh: line 1: 10513 Killed: 9               
>>>>> /Applications/MATLAB_R2017b.app/bin/matlab -nojvm -nodisplay -nodesktop 
>>>>> -nosplash -r "addpath('/usr/local/fix1.065'); 
>>>>> addpath('/usr/local/fsl/etc/matlab'); fix_3_clean('.fix',0,1,2000)" >> 
>>>>> .fix.log 2>&1
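>>>>> 
>>>>> ("Killed: 9" means the OS terminated the MATLAB process, which fits 
>>>>> running out of memory. Re-running that stage by hand, with the command 
>>>>> copied from the line above, shows the failure on screen instead of 
>>>>> burying it in .fix.log; the cd into the .ica directory is an assumption 
>>>>> about hcp_fix's working directory:)
>>>>> 
>>>>> cd RS_fMRI_1_hp2000.ica
>>>>> /Applications/MATLAB_R2017b.app/bin/matlab -nojvm -nodisplay -nodesktop -nosplash \
>>>>>   -r "addpath('/usr/local/fix1.065'); addpath('/usr/local/fsl/etc/matlab'); fix_3_clean('.fix',0,1,2000)"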
>>>>> 
>>>>> Thanks,
>>>>> 
>>>>> -L
>>>>> 
>> 
>> ---------------------------------------------------------------------------
>> Stephen M. Smith, Professor of Biomedical Engineering
>> Head of Analysis,  Oxford University FMRIB Centre
>> 
>> FMRIB, JR Hospital, Headington, Oxford  OX3 9DU, UK
>> +44 (0) 1865 222726  (fax 222717)
>> st...@fmrib.ox.ac.uk    http://www.fmrib.ox.ac.uk/~steve
>> ---------------------------------------------------------------------------
>> 
>> Stop the cultural destruction of Tibet <http://smithinks.net/>


_______________________________________________
HCP-Users mailing list
HCP-Users@humanconnectome.org
http://lists.humanconnectome.org/mailman/listinfo/hcp-users
