Correction: after having obtained ‘tx’ as below, just do:    M=evstr(tx);

From: users [] On Behalf Of Rafael Guerra
Sent: Friday, October 14, 2016 12:53 PM
To: Users mailing list for Scilab <>
Subject: Re: [Scilab-users] using csvRead

Hello Philipp,

Say that after mopen you got all your text input into array of strings ‘txt’:

txt = [
"01.12.2015, 01:15:00.12, 1.1, -2.2";
"03.12.2015, 11:15:00.12, -11.1, 2.5";
"12.12.2015, 21:15:00.12, 5.1, 6.2"];

Then do the following:

tx=txt(2:$);              // get rid of header line
tx1=part(tx,1:24);  // get date and time
tx2=part(tx,25:$);  // get numeric data values
// Now get rid of separators:
tx1 = strsubst(tx1,'.',' ');
tx1 = strsubst(tx1,':',' ');
tx1 = strsubst(tx1,',',' ');
tx2 = strsubst(tx2,',',' ');
tx = [tx1 tx2]; // regroups all data but now with numeric values only

Use mputl to write ‘tx’ to a temporary disk file, then use fscanfMat to read
the large disk file (now fully numeric) into a Scilab numeric matrix like a breeze.
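For example, an untested sketch of that round trip (TMPDIR is Scilab’s
per-session temporary directory; the file name is hypothetical):

tmpf = TMPDIR + "/clean_data.txt";   // hypothetical temp file name
mputl(tx, tmpf);                     // write one text line per row of 'tx'
M = fscanfMat(tmpf);                 // parse the fully numeric file into a matrix
mdelete(tmpf);                       // remove the temporary file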


From: users [] On Behalf Of Philipp 
Sent: Friday, October 14, 2016 11:50 AM
To: Users mailing list for Scilab 
Subject: Re: [Scilab-users] using csvRead

Dear Denis,

yes, that's the way I do it right now.

use mopen --> open file for reading
use mgetl --> read data, result = array of strings
use strsplit --> split  string Array as desired

use evstr() --> convert string to double
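The steps above could be sketched like this (untested; the file name and the
assumption of purely numeric, comma-separated fields after the header are
hypothetical):

fd = mopen("data.txt", "r");           // hypothetical input file
lines = mgetl(fd);                     // whole file as a column of strings
mclose(fd);
lines = lines(2:$);                    // drop the header line
n = size(lines, "*");
M = zeros(n, 4);                       // assuming 4 numeric fields per line
for i = 1:n
    parts = strsplit(lines(i), ",");   // strsplit works on ONE string at a time
    M(i, :) = evstr(parts)';           // convert each field to double
end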

The disadvantage:

As far as I know, strsplit() can handle only one string.
Hence I use a for-loop to split each line of the initial string array into a 
group of strings and convert each part into a double.

OK for a few data... but it may take long for a lot of data.

Read the data from file and try to avoid the string-to-double conversion.
fscanfMat() won't do it, because the data does not contain only numerical 
values with "." as the decimal sign.
So I tried csvRead.
using the help I find:      separator :  a 1-by-1 matrix of strings, the field 
separator used.

Note:   STRINGS   = plural.

So I wonder if it is possible to have more than one separator here.
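As far as I can tell, the separator argument takes a single string (it is
1-by-1 even though the type is "matrix of strings"), but csvRead also has a
substitute argument (an n-by-2 matrix of strings) whose replacements are
applied before conversion, which may help with extra separators. An untested
sketch, with a hypothetical file name and layout:

subst = [":" ","];                                  // treat ':' as a separator too
M = csvRead("data.txt", ",", ".", "double", subst); // non-numeric cells become %nan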


mclose( :-) )
