Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)

2020-06-19 Thread Antoine Monmayrant
On Friday, June 19, 2020 at 15:59 CEST, Rafael Guerra wrote:
 
> The script runs fine on Win10, Scilab 6.1, 64 GB RAM.
> But by the way, the a28.mat file seems to be only 342 MB.

I suspect that it does work fine with 64 GB of RAM: that's how I managed to
work around my initial problem.
(As for the file size: 2^28 doubles take about 2.1 GB in memory, but the -v7
MAT format is compressed, hence the much smaller a28.mat.)
The script should be adapted to your available RAM: you should try to increase n.
But in your case, I think you have enough RAM to reach Scilab's maximum variable
size (is it something like 2^31-1 elements?) before triggering this bug.
Ideally, this should be tested on RAM-limited hardware (~8 GB).
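
For what it's worth, here is a minimal, untested sketch of how one might pick
the largest n to try from the available memory (assuming getmemory() is
available on your platform and returns [free, total] in kilobytes):

[free_kb, total_kb] = getmemory();   // assumed: free and total system memory, in KB
budget = free_kb * 1024 / 2;         // only target about half of the free memory
nmax = floor(log2(budget / 8));      // 2^nmax doubles of 8 bytes fit in that budget
nmax = min(nmax, 30);                // stay well below the assumed 2^31-1 element limit
n = [24:2:nmax];
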
The issue is that there might be a malloc() somewhere that fails, while the
code then uses the memory without checking whether the malloc() was successful
or not.
If you have gazillions of GB of available RAM, you never hit this problem.

Antoine

> 
> 
> -Original Message-
> From: users  On Behalf Of Stéphane Mottelet
> Sent: Friday, June 19, 2020 10:15 AM
> To: users@lists.scilab.org
> Subject: Re: [Scilab-users] scilab 6.1 crashes when trying to load big 
> matfile (loadmatfile)
> 
> It's the loadmatfile call that crashes (here on Scilab-branch-6.1 under OSX,
> with 16 GB of RAM):
> 
>    "savematfile('a28.mat','a28');"
> scilab-cli-bin(3310,0x115f1f5c0) malloc: can't allocate region
> *** mach_vm_map(size=18446744071562067968) failed (error code=3)
> scilab-cli-bin(3310,0x115f1f5c0) malloc: *** set a breakpoint in 
> malloc_error_break to debug
> 
> Same crash with Scilab-branch-6.1 under Ubuntu 18.04, with 128 GB of RAM:
> 
>    "a28=[1:2^28];"
> ATTENTION : Option -v7 ajoutée.
> 
>    "savematfile('a28.mat','a28');"
> Segmentation fault (core dumped)
> 
> S.
> 
> On 19/06/2020 at 08:57, Antoine Monmayrant wrote:
> > Hello all,
> >
> >
> > Here is a small script that systematically crashes scilab on my machine:
> >
> >
> > 
> >
> > // on my machine with 8 GB of RAM and the usual workload, n=28 crashes Scilab
> > n=[24,26,28];
> >
> > for i=n
> >     disp(' '+string(i)+' ');
> >     execstr('a'+string(i)+'=[1:2^'+string(i)+'];')
> >     disp('a'+string(i)+'=[1:2^'+string(i)+'];')
> >     execstr("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
> >     disp("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
> >     execstr("loadmatfile(''a"+string(i)+".mat'');")
> >     disp("loadmatfile(''a"+string(i)+".mat'');")
> >     disp(' OK ');
> > end
> >
> 



Re: [Scilab-users] Plotting a New Xpoly Directly Inside a Compound

2020-06-19 Thread Antoine Monmayrant
> Again, it is not clear why you do not just fuse all the data together into
> one matrix and use plot2d() to plot it all at once.

From my own experience, this might be the case when you want to track the
progression of a lengthy calculation: you periodically add the new data on top
of the previous results. It is also the case when you work with experimental
data that arrives slowly, in successive batches: you receive a batch of, say,
~10 curves every couple of minutes.
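
As a minimal sketch of that second use case (with made-up data), each new batch
simply gets its own plot2d() call on top of the existing figure, which is
exactly what creates one compound per call:

x = linspace(0, 1, 200)';      // common abscissa for all curves
clf();
for batch = 1:5                // pretend 5 batches arrive over time
    ybatch = rand(200, 10);    // hypothetical batch of ~10 new curves
    plot2d(x, ybatch);         // plotted on top of the previous results
end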

Antoine



Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)

2020-06-19 Thread Rafael Guerra
The script runs fine on Win10, Scilab 6.1, 64 GB RAM.
But by the way, the a28.mat file seems to be only 342 MB.


-Original Message-
From: users  On Behalf Of Stéphane Mottelet
Sent: Friday, June 19, 2020 10:15 AM
To: users@lists.scilab.org
Subject: Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile 
(loadmatfile)

It's the loadmatfile call that crashes (here on Scilab-branch-6.1 under OSX,
with 16 GB of RAM):

   "savematfile('a28.mat','a28');"
scilab-cli-bin(3310,0x115f1f5c0) malloc: can't allocate region
*** mach_vm_map(size=18446744071562067968) failed (error code=3)
scilab-cli-bin(3310,0x115f1f5c0) malloc: *** set a breakpoint in 
malloc_error_break to debug

Same crash with Scilab-branch-6.1 under Ubuntu 18.04, with 128 GB of RAM:

   "a28=[1:2^28];"
ATTENTION : Option -v7 ajoutée.

   "savematfile('a28.mat','a28');"
Segmentation fault (core dumped)

S.

On 19/06/2020 at 08:57, Antoine Monmayrant wrote:
> Hello all,
>
>
> Here is a small script that systematically crashes scilab on my machine:
>
>
> 
>
> // on my machine with 8 GB of RAM and the usual workload, n=28 crashes Scilab
> n=[24,26,28];
>
> for i=n
>     disp(' '+string(i)+' ');
>     execstr('a'+string(i)+'=[1:2^'+string(i)+'];')
>     disp('a'+string(i)+'=[1:2^'+string(i)+'];')
>     execstr("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
>     disp("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
>     execstr("loadmatfile(''a"+string(i)+".mat'');")
>     disp("loadmatfile(''a"+string(i)+".mat'');")
>     disp(' OK ');
> end
>



Re: [Scilab-users] Plotting a New Xpoly Directly Inside a Compound

2020-06-19 Thread P M
Hello Roland,

If you use xpoly(), it means you know your data before you plot it, right?
--> I mean: you do not click in the figure to get the coordinates.
--> So why not plot all the data at once?

E.g.: can you not plot all polylines in one single plot2d() command?

If you can't, you may try the following approach.
Assumption: all data is of the same length, i.e. all lines are drawn with the
same number of points.

1.) You may access the data that composes the plot by searching for
    *.children.data
    (* = I do not recall from memory how deep you have to search), e.g.:
    plot2d();
    a = gca();
    a.children.children.data   or   a.children.children.children.data

2.) Save the *.children.data separately in an X-matrix and a Y-matrix.

3.) Add the data that you would otherwise plot with xpoly() to these matrices.

4.) xdel()

5.) Replot everything with plot2d(X-matrix, Y-matrix).

This should give you all lines at the same level of the plot hierarchy.
I did not check, but I would guess that the line order would not change
either.

It should be possible to wrap this into a function and call it each time a
curve would otherwise be added with xpoly(); see the sketch below.
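
Here is a minimal, untested sketch of such a function, assuming a bare plot2d()
axes (no legend) and curves that all share the same number of points; the names
add_curve, xnew and ynew are made up for the example:

function add_curve(xnew, ynew)
    a = gca();
    c = a.children(1);            // the compound created by plot2d (assumes no legend)
    nc = size(c.children, "*");   // number of existing polylines
    X = []; Y = [];
    for k = nc:-1:1               // children are stored in reverse drawing order
        d = c.children(k).data;   // one polyline: [x y] columns
        X = [X, d(:,1)];
        Y = [Y, d(:,2)];
    end
    X = [X, xnew(:)];             // append the curve that xpoly() would have drawn
    Y = [Y, ynew(:)];
    delete(a.children);           // clear the axes instead of xdel()
    plot2d(X, Y);                 // everything ends up in one single compound
endfunction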


Again, it is not clear why you do not just fuse all the data together into
one matrix and use plot2d() to plot it all at once.


Best Regards,
Philipp







On Sat., June 13, 2020 at 20:05, RolandB wrote:

> Hi,
>
> If one plots several curves into a 2D axes using plot2d(), the curves end
> up as polyline children of a compound, which is itself a child of the
> corresponding axes.
> Adding further polylines one by one using xpoly() puts these polylines
> outside the compound, at the same level as the compound itself, i.e. as
> children of the corresponding axes.
>
> Is there a simple way to have the polyline generated directly within the
> existing compound?
> I know that I can glue several polylines into a compound, but I would have
> to unglue the compound first and then glue all polylines together back into
> one compound.
> Having many (!) curves makes this quite slow, especially as glue() reverses
> the order of the polylines, so I would have to do a second unglue();glue()
> in order to preserve the order they had before ungluing (BTW, is this a bug
> or is the reversal of the order a desired feature?).
>
> On the other hand, what kind of advantage does the compound have at all,
> assuming I wouldn't need the possibility of making all of the curves in the
> compound visible or invisible by just modifying the visibility of the
> compound?
> Ok, I can address all curves by axes.children.children(1:$) without having
> to care about the legend, which would be together with the curves in
> axes.children(1:$) if all the polylines were not in a compound.
>
> But any additional advantage for using a compound?
>
> Regards,
> Roland
>
>
>
>
> --
> Sent from:
> http://mailinglists.scilab.org/Scilab-users-Mailing-Lists-Archives-f2602246.html


Re: [Scilab-users] {EXT} Re: scilab 6.1 crashes when trying to load big matfile (loadmatfile)

2020-06-19 Thread Dang Ngoc Chan, Christophe
Hello,

> From: On behalf of Antoine Monmayrant
> Sent: Friday, June 19, 2020 08:58
>
> Here is a small script that systematically crashes scilab on my machine:

It works fine for me, no crash.
Windows 7, RAM 16 GB, 12.9 GB used at peak.

[a, b] = getdebuginfo()
 a  =

  "Memory in use:  39 %"
  "Total Physical Memory (Kbytes): 16673068"
  "Free Physical Memory (Kbytes): 10007928"
  "Total Paging File (Kbytes): 33344236"
  "Free Paging File (Kbytes): 24405280"
  "Total Virtual Memory (Kbytes): 8589934464"
  "Free Virtual Memory (Kbytes): 8586180260"
  "Free Extended Memory (Kbytes):   0"
  "Operating System: Windows Seven x64"
  "Intel(R) Xeon(R) CPU E5-1620 v3 @ 3.50GHz"
  "Number of processors: 8"
  "Number of Video cards: 1"
  "Video card #0: NVIDIA Quadro K2200"
  "Primary Video card driver version: 24.21.14.1163"
  "Screen size: 1920 x 1200 32 bits"
  "Number of Monitors: 2"
  "Path: [...]
 b  =

  "Version: scilab-6.1.0"
  "Compilation date: Feb 25 2020"
  "Compilation time: 11:34:32"
  "Compiler Architecture: X64"
  "Compiled with Microsoft compiler (191627035)"
  "BLAS library optimized version: MKL"
  "XML version: 2.9.1"
  "Tcl/Tk: Enable"
  "TCL version: 8.5.9"
  "TK version: 8.5.9"
  "Path separator: ;"
  "Directory separator: \"
  "PCRE Version: 8.21"

--
Christophe Dang Ngoc Chan
Mechanical calculation engineer



Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)

2020-06-19 Thread Stéphane Mottelet
It's the loadmatfile call that crashes (here on Scilab-branch-6.1 under OSX,
with 16 GB of RAM):


  "savematfile('a28.mat','a28');"
scilab-cli-bin(3310,0x115f1f5c0) malloc: can't allocate region
*** mach_vm_map(size=18446744071562067968) failed (error code=3)
scilab-cli-bin(3310,0x115f1f5c0) malloc: *** set a breakpoint in 
malloc_error_break to debug


Same crash with Scilab-branch-6.1 under Ubuntu 18.04, with 128 GB of RAM:

  "a28=[1:2^28];"
ATTENTION : Option -v7 ajoutée.

  "savematfile('a28.mat','a28');"
Segmentation fault (core dumped)

S.

On 19/06/2020 at 08:57, Antoine Monmayrant wrote:

Hello all,


Here is a small script that systematically crashes scilab on my machine:




// on my machine with 8 GB of RAM and the usual workload, n=28 crashes Scilab
n=[24,26,28];

for i=n
    disp(' '+string(i)+' ');
    execstr('a'+string(i)+'=[1:2^'+string(i)+'];')
    disp('a'+string(i)+'=[1:2^'+string(i)+'];')
    execstr("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
    disp("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
    execstr("loadmatfile(''a"+string(i)+".mat'');")
    disp("loadmatfile(''a"+string(i)+".mat'');")
    disp(' OK ');
end



You'll have to adapt the maximum value for n depending on your available RAM.


Antoine


On 18/06/2020 13:06, Stéphane Mottelet wrote:

Hello Antoine,

I made a 4 GB file with Matlab:

>> a=rand(645,645,645);
>> b=rand(645,645,645);
>> c={a,b};
>> save("c.mat","-v7.3","c")

and managed to load it successfully in Scilab:

--> loadmatfile("c.mat");

--> c
 c  =

  [645x645x645 constant]  [645x645x645 constant]

Maybe the structure I tried is too simple (just two hypermatrices in 
a cell) so please give us a representative failing example.


S.

On 17/06/2020 at 13:22, Antoine Monmayrant wrote:

Hello All,

I cannot open a large matfile (~3.4 GB) in Scilab.
Scilab always dies with an error message that is extremely
instructive: "Killed".

It's a bit cumbersome to share this big file, so do you have any idea
on how to investigate this issue and try to locate the root cause?

As a side note, it might be a problem related to either the size of
the variables or their nature: cells containing hypermatrices.
The file itself is not corrupted (I managed to open it in Matlab and
to save each hypermatrix into individual files that were imported into
Scilab with no problem).


Cheers,

Antoine








--
Stéphane Mottelet
Ingénieur de recherche
EA 4297 Transformations Intégrées de la Matière Renouvelable
Département Génie des Procédés Industriels
Sorbonne Universités - Université de Technologie de Compiègne
CS 60319, 60203 Compiègne cedex
Tel : +33(0)344234688
http://www.utc.fr/~mottelet



Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)

2020-06-19 Thread Antoine Monmayrant

Hello all,


Here is a small script that systematically crashes scilab on my machine:




// on my machine with 8 GB of RAM and the usual workload, n=28 crashes Scilab
n=[24,26,28];

for i=n
    disp(' '+string(i)+' ');
    execstr('a'+string(i)+'=[1:2^'+string(i)+'];')
    disp('a'+string(i)+'=[1:2^'+string(i)+'];')
    execstr("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
    disp("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
    execstr("loadmatfile(''a"+string(i)+".mat'');")
    disp("loadmatfile(''a"+string(i)+".mat'');")
    disp(' OK ');
end



You'll have to adapt the maximum value for n depending on your available RAM.


Antoine


On 18/06/2020 13:06, Stéphane Mottelet wrote:

Hello Antoine,

I made a 4 GB file with Matlab:

>> a=rand(645,645,645);
>> b=rand(645,645,645);
>> c={a,b};
>> save("c.mat","-v7.3","c")

and managed to load it successfully in Scilab:

--> loadmatfile("c.mat");

--> c
 c  =

  [645x645x645 constant]  [645x645x645 constant]

Maybe the structure I tried is too simple (just two hypermatrices in a 
cell) so please give us a representative failing example.


S.

On 17/06/2020 at 13:22, Antoine Monmayrant wrote:

Hello All,

I cannot open a large matfile (~3.4 GB) in Scilab.
Scilab always dies with an error message that is extremely
instructive: "Killed".

It's a bit cumbersome to share this big file, so do you have any idea
on how to investigate this issue and try to locate the root cause?

As a side note, it might be a problem related to either the size of
the variables or their nature: cells containing hypermatrices.
The file itself is not corrupted (I managed to open it in Matlab and
to save each hypermatrix into individual files that were imported into
Scilab with no problem).


Cheers,

Antoine







Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)

2020-06-19 Thread Stéphane Mottelet

I tried to generate this kind of data file with the following Matlab script:

c1=fillcell();
c2=fillcell();
c3=fillcell();

save("c.mat","-v7.3","c1","c2","c3")

function c = fillcell()
    c={};
    for j=1:11
        c{j}=rand(711,711,33);
    end
end

The c.mat file is 4.18 GB and I managed to load it in Scilab.

How much physical memory do you have on your machine?
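
For reference, a quick sketch of how to check that from inside Scilab (the
exact labels returned by getdebuginfo() may differ between platforms):

[a, b] = getdebuginfo();
disp(a(grep(a, "Physical Memory")));   // e.g. "Total/Free Physical Memory (Kbytes): ..."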

S.


On 18/06/2020 at 13:28, Antoine Monmayrant wrote:

Hello,


I tried but failed to generate similar data.
What I had was a mat file containing mainly 3 cells.
Each cell contains 11 hypermatrices of size 711x711x31 or 711x711x33.
I won't have access to the machine that can play with this data in the 
coming days, so I am not sure I can help in the short term.


Antoine

On 18/06/2020 13:06, Stéphane Mottelet wrote:

Hello Antoine,

I made a 4 GB file with Matlab:

>> a=rand(645,645,645);
>> b=rand(645,645,645);
>> c={a,b};
>> save("c.mat","-v7.3","c")

and managed to load it successfully in Scilab:

--> loadmatfile("c.mat");

--> c
 c  =

  [645x645x645 constant]  [645x645x645 constant]

Maybe the structure I tried is too simple (just two hypermatrices in 
a cell) so please give us a representative failing example.


S.

On 17/06/2020 at 13:22, Antoine Monmayrant wrote:

Hello All,

I cannot open a large matfile (~3.4 GB) in Scilab.
Scilab always dies with an error message that is extremely
instructive: "Killed".

It's a bit cumbersome to share this big file, so do you have any idea
on how to investigate this issue and try to locate the root cause?

As a side note, it might be a problem related to either the size of
the variables or their nature: cells containing hypermatrices.
The file itself is not corrupted (I managed to open it in Matlab and
to save each hypermatrix into individual files that were imported into
Scilab with no problem).


Cheers,

Antoine





--
Stéphane Mottelet
Ingénieur de recherche
EA 4297 Transformations Intégrées de la Matière Renouvelable
Département Génie des Procédés Industriels
Sorbonne Universités - Université de Technologie de Compiègne
CS 60319, 60203 Compiègne cedex
Tel : +33(0)344234688
http://www.utc.fr/~mottelet
