Antoine,
One workflow that works fast for me on large data files is to first load the
whole file with mgetl, then remove all empty lines using isempty in a loop (as
shown below), process the header block, isolate the data block and save it to a
temporary backup file on disk with mputl, and finally read that backup file
back very efficiently with fscanfMat.
tlines = mgetl(fid, -1);          // reads lines until end of file into a 1-column text vector
bool   = ~cellfun(isempty, tlines);
tlines = tlines(bool);            // removes empty lines
function out_text = cellfun(fun, in_text)
    // Applies function fun to the input text (column vector of strings), line by line
    n = size(in_text, 1);
    for i = 1:n
        out_text(i) = fun(in_text(i));
    end
endfunction
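Putting the remaining steps together, here is a minimal sketch; the number of
header lines nhead and the temporary file name are placeholders you would adapt
to your actual file format:

nhead = 5;                            // placeholder: number of header lines in your file
header = tlines(1:nhead);             // header block, to be processed as needed
datalines = tlines(nhead+1:$);        // isolate the data block
tmpfile = TMPDIR + "/datablock.txt";  // temporary backup file in the session temp directory
mputl(datalines, tmpfile);            // write the data block back to disk
M = fscanfMat(tmpfile);               // fast numeric read of the backup file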
Regards,
Rafael