Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)
Script runs fine on Win10, Scilab 6.1, 64 GB RAM. By the way, the file a28.mat seems to be only 342 MB.

-----Original Message-----
From: users On Behalf Of Stéphane Mottelet
Sent: Friday, June 19, 2020 10:15 AM
To: users@lists.scilab.org
Subject: Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)

It's the loadmatfile that crashes (here on Scilab-branch-6.1 under OSX, with 16 GB RAM):

"savematfile('a28.mat','a28');"
scilab-cli-bin(3310,0x115f1f5c0) malloc: can't allocate region
*** mach_vm_map(size=18446744071562067968) failed (error code=3)
scilab-cli-bin(3310,0x115f1f5c0) malloc: *** set a breakpoint in malloc_error_break to debug

Same crash with Scilab-branch-6.1 under Ubuntu 18.04, with 128 GB RAM:

"a28=[1:2^28];"
ATTENTION : Option -v7 ajoutée.
"savematfile('a28.mat','a28');"
Segmentation fault (core dumped)

S.

_______________________________________________
users mailing list
users@lists.scilab.org
http://lists.scilab.org/mailman/listinfo/users
Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)
It's the loadmatfile that crashes (here on Scilab-branch-6.1 under OSX, with 16 GB RAM):

"savematfile('a28.mat','a28');"
scilab-cli-bin(3310,0x115f1f5c0) malloc: can't allocate region
*** mach_vm_map(size=18446744071562067968) failed (error code=3)
scilab-cli-bin(3310,0x115f1f5c0) malloc: *** set a breakpoint in malloc_error_break to debug

Same crash with Scilab-branch-6.1 under Ubuntu 18.04, with 128 GB RAM:

"a28=[1:2^28];"
ATTENTION : Option -v7 ajoutée.
"savematfile('a28.mat','a28');"
Segmentation fault (core dumped)

S.

On 19/06/2020 08:57, Antoine Monmayrant wrote:
> Hello all,
>
> Here is a small script that systematically crashes Scilab on my machine:
>
> // on my machine with 8 GB of RAM and usual workload, n=28 crashes Scilab
> n=[24,26,28];
> for i=n
>     disp(' '+string(i)+' ');
>     execstr('a'+string(i)+'=[1:2^'+string(i)+'];')
>     disp('a'+string(i)+'=[1:2^'+string(i)+'];')
>     execstr("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
>     disp("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
>     execstr("loadmatfile(''a"+string(i)+".mat'');")
>     disp("loadmatfile(''a"+string(i)+".mat'');")
>     disp(' OK ');
> end
>
> You'll have to adapt the maximum value for n depending on your available RAM.
>
> Antoine

--
Stéphane Mottelet
Ingénieur de recherche
EA 4297 Transformations Intégrées de la Matière Renouvelable
Département Génie des Procédés Industriels
Sorbonne Universités - Université de Technologie de Compiègne
CS 60319, 60203 Compiègne cedex
Tel : +33(0)344234688
http://www.utc.fr/~mottelet
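[Editorial note] The failed mach_vm_map size in the transcript above is suggestive. A quick sanity check (a sketch only; the 32-bit-overflow reading is an assumption on my part, not something confirmed in this thread):

```python
# The failed allocation size reported by malloc on OSX:
failed_size = 18446744071562067968

# a28 = [1:2^28] holds 2^28 doubles, i.e. 2^31 bytes (2 GiB):
nbytes = 2**28 * 8
assert nbytes == 2**31

# If that byte count were computed in a signed 32-bit integer it would
# overflow to -2^31; reinterpreted as an unsigned 64-bit size_t, that is
# 2^64 - 2^31 -- exactly the size mach_vm_map was asked for.
assert failed_size == 2**64 - 2**31
print("consistent with a 32-bit signed overflow of the byte count")
```

This does not prove where the overflow happens, but it explains why the crash appears exactly at n=28 (the first n for which 2^n doubles exceed 2^31 - 1 bytes) regardless of how much RAM the machine has.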
Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)
Hello all,

Here is a small script that systematically crashes Scilab on my machine:

// on my machine with 8 GB of RAM and usual workload, n=28 crashes Scilab
n=[24,26,28];
for i=n
    disp(' '+string(i)+' ');
    execstr('a'+string(i)+'=[1:2^'+string(i)+'];')
    disp('a'+string(i)+'=[1:2^'+string(i)+'];')
    execstr("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
    disp("savematfile(''a"+string(i)+".mat'',''a"+string(i)+"'');")
    execstr("loadmatfile(''a"+string(i)+".mat'');")
    disp("loadmatfile(''a"+string(i)+".mat'');")
    disp(' OK ');
end

You'll have to adapt the maximum value for n depending on your available RAM.

Antoine

On 18/06/2020 13:06, Stéphane Mottelet wrote:
> Hello Antoine,
>
> I made a 4 GB file with Matlab:
>
> >> a=rand(645,645,645);
> >> b=rand(645,645,645);
> >> c={a,b};
> >> save("c.mat","-v7.3","c")
>
> and managed to load it successfully in Scilab:
>
> --> loadmatfile("c.mat");
> --> c
>  c  =
>   [645x645x645 constant]   [645x645x645 constant]
>
> Maybe the structure I tried is too simple (just two hypermatrices in a
> cell), so please give us a representative failing example.
>
> S.
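[Editorial note] For scale, a back-of-the-envelope check of the vectors the script creates (assuming Scilab stores [1:2^n] densely as 8-byte doubles):

```python
# Size of a'n' = [1:2^n] stored as float64 (8 bytes per element).
for n in (24, 26, 28):
    print(f"n={n}: {2**n * 8 / 2**30:g} GiB per copy")
# n=24 -> 0.125 GiB, n=26 -> 0.5 GiB, n=28 -> 2 GiB
```

So at n=28 each copy of the variable is 2 GiB; on a machine with 8 GB of RAM and a normal workload, even a couple of transient copies during save/load leave little headroom.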
Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)
I tried to generate this kind of data file with the following Matlab script:

c1=fillcell();
c2=fillcell();
c3=fillcell();
save("c.mat","-v7.3","c1","c2","c3")

function c = fillcell()
    c={};
    for j=1:11
        c{j}=rand(711,711,33);
    end
end

The c.mat file is 4.18 GB and I managed to load it in Scilab. How much physical memory do you have on your machine?

S.

On 18/06/2020 13:28, Antoine Monmayrant wrote:
> Hello,
>
> I tried but failed to generate similar data.
> What I had was a mat file containing mainly 3 cells.
> Each cell contains 11 hypermatrices that are 711x711x31 or 711x711x33.
> I won't have access to the machine that can play with this data in the
> coming days, so I am not sure I can help in the short term.
>
> Antoine
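[Editorial note] A sanity check on the reported file size, counting the uncompressed payload (-v7.3 MAT-files are HDF5 containers and may come out slightly smaller on disk):

```python
# 3 cells x 11 hypermatrices of 711 x 711 x 33 float64 values.
elems = 711 * 711 * 33
total_bytes = 3 * 11 * elems * 8
print(round(total_bytes / 1e9, 2), "GB")  # -> 4.4 GB
```

That is in the same ballpark as the 4.18 GB file Stéphane reports, and comfortably above Antoine's ~3.4 GB failing file.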
Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)
Hello Antoine,

I made a 4 GB file with Matlab:

>> a=rand(645,645,645);
>> b=rand(645,645,645);
>> c={a,b};
>> save("c.mat","-v7.3","c")

and managed to load it successfully in Scilab:

--> loadmatfile("c.mat");
--> c
 c  =
  [645x645x645 constant]   [645x645x645 constant]

Maybe the structure I tried is too simple (just two hypermatrices in a cell), so please give us a representative failing example.

S.

On 17/06/2020 13:22, Antoine Monmayrant wrote:
> Hello All,
>
> I cannot open large matfiles in Scilab (~3.4 GB).
> Scilab always dies with an error message that is extremely
> instructive: "Killed".
>
> It's a bit cumbersome to share this big file, so do you have any idea
> on how to investigate this issue and try to locate the root cause?
>
> As a side note, it might be a problem related to either the size of
> the variables or their nature: cells containing hypermatrices.
> The file itself is not corrupted (I managed to open it in Matlab and
> to save each hypermatrix into individual files that were imported
> into Scilab with no problem).
>
> Cheers,
>
> Antoine
Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)
On 17/06/2020 13:51, Stéphane Mottelet wrote:
> Hello Antoine,
>
> Did you try to save a single "hypermatrix in a cell" in Matlab and then
> load it in Scilab?

No, just the hypermatrix, not a cell containing only one hypermatrix.
Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)
Hello Antoine,

There might be some extra copies at loading time that overflow the memory; what's the amount of memory you have available?

Also, to investigate, could you please either share a script used to generate similar data or describe the data structure used?

Thanks,

Clément

> -----Original Message-----
> From: users On Behalf Of Stéphane Mottelet
> Sent: Wednesday, June 17, 2020 1:52 PM
> To: users@lists.scilab.org
> Subject: Re: [Scilab-users] scilab 6.1 crashes when trying to load big
> matfile (loadmatfile)
>
> Hello Antoine,
>
> Did you try to save a single "hypermatrix in a cell" in Matlab and then
> load it in Scilab?
>
> S.
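[Editorial note] Clément's "extra copies" hypothesis is easy to quantify. A rough sketch (the copy count k is hypothetical; nothing in the thread establishes how many transient copies loadmatfile actually makes):

```python
# Peak-memory estimate for loading a ~3.4 GB MAT-file if k transient
# copies of the data coexist during the load.
data_gb = 3.4
for k in (1, 2, 3):
    print(f"k={k}: ~{k * data_gb:.1f} GB peak")
```

On a machine with 8 GB of RAM, two to three coexisting copies already exceed physical memory, which is enough for the Linux OOM killer to terminate the process with exactly the terse "Killed" message Antoine sees.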
Re: [Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)
Hello Antoine,

Did you try to save a single "hypermatrix in a cell" in Matlab and then load it in Scilab?

S.

On 17/06/2020 13:22, Antoine Monmayrant wrote:
> Hello All,
>
> I cannot open large matfiles in Scilab (~3.4 GB).
> Scilab always dies with an error message that is extremely
> instructive: "Killed".
>
> It's a bit cumbersome to share this big file, so do you have any idea
> on how to investigate this issue and try to locate the root cause?
>
> As a side note, it might be a problem related to either the size of
> the variables or their nature: cells containing hypermatrices.
> The file itself is not corrupted (I managed to open it in Matlab and
> to save each hypermatrix into individual files that were imported
> into Scilab with no problem).
>
> Cheers,
>
> Antoine
[Scilab-users] scilab 6.1 crashes when trying to load big matfile (loadmatfile)
Hello All,

I cannot open large matfiles in Scilab (~3.4 GB).
Scilab always dies with an error message that is extremely instructive: "Killed".

It's a bit cumbersome to share this big file, so do you have any idea on how to investigate this issue and try to locate the root cause?

As a side note, it might be a problem related to either the size of the variables or their nature: cells containing hypermatrices.
The file itself is not corrupted (I managed to open it in Matlab and to save each hypermatrix into individual files that were imported into Scilab with no problem).

Cheers,

Antoine