Thanks to Ye Xianjin for the suggestions.

The SizeOf.jar library may indeed have some problems. I ran a simple
test; the code is:

     import net.sourceforge.sizeof.SizeOf  // from SizeOf.jar; the JVM is started with -javaagent:SizeOf.jar

     val n = 1  // also tried 5, 10, 100, 1000

     // Array of (Int, Array[Int]) pairs
     val arr1 = new Array[(Int, Array[Int])](n)
     for (i <- 0 until arr1.length) {
        arr1(i) = (i, new Array[Int](43))
     }
     println(SizeOf.humanReadable(SizeOf.deepSizeOf(arr1)))

     // Plain array of Array[Int], for comparison
     val arr2 = new Array[Array[Int]](n)
     for (i <- 0 until arr2.length) {
        arr2(i) = new Array[Int](43)
     }
     println(SizeOf.humanReadable(SizeOf.deepSizeOf(arr2)))


I varied n; the results were:

     n      arr1 (deepSizeOf)   arr2 (deepSizeOf)
     1      1016.0b             216.0b
     5      1.9140625Kb         1000.0b
     10     3.0625Kb            1.9296875Kb
     100    23.8046875Kb        19.15625Kb
     1000   231.2265625Kb       191.421875Kb
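
For comparison with the byte counts below, the Kb figures can be
converted back to raw bytes; a quick sketch, assuming
SizeOf.humanReadable uses binary units (1 Kb = 1024 b):

     // Convert SizeOf's human-readable output back to bytes
     // (assuming humanReadable uses 1 Kb = 1024 b).
     val arr1Bytes = 231.2265625 * 1024  // 236776.0 b for arr1 at n = 1000
     val arr2Bytes = 191.421875 * 1024   // 196016.0 b for arr2 at n = 1000

Even at n = 1, SizeOf reports 1016.0b for arr1, far more than the
264 b that SizeEstimator reports below for the same structure.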


As suggested by Ye Xianjin, I also tried SizeEstimator
(https://github.com/phatak-dev/java-sizeof/blob/master/src/main/scala/com/madhukaraphatak/sizeof/SizeEstimator.scala).
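For reference, here is roughly how I invoked it on the same two
arrays; a minimal sketch, assuming the library's entry point is
SizeEstimator.estimate(obj: AnyRef): Long, as in the linked source:

     import com.madhukaraphatak.sizeof.SizeEstimator

     // Measure the same arr1 and arr2 as above; estimate returns
     // the deep size in bytes.
     println(SizeEstimator.estimate(arr1))
     println(SizeEstimator.estimate(arr2))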
The results, in bytes, were:

     n      arr1     arr2
     1      264      216
     5      1240     1000
     10     2456     1976
     100    24416    19616
     1000   227216   182576

It seems that SizeEstimator computes the memory usage correctly.
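
As a rough sanity check, the arr2 number at n = 1 can be reproduced by
hand. This is a back-of-the-envelope sketch, assuming a 64-bit HotSpot
JVM with compressed oops (12-byte object headers plus a 4-byte length
field for arrays, 4-byte references, everything aligned to 8 bytes);
the exact constants vary by JVM:

     // Layout arithmetic for n = 1; the constants are assumptions
     // about the JVM object layout, not output from either tool.
     def align(b: Long): Long = (b + 7) / 8 * 8

     val innerArr = align(12 + 4) + align(43 * 4) // Array[Int](43): 16 + 176 = 192 b
     val outerArr = align(12 + 4) + align(1 * 4)  // 1-slot reference array: 16 + 8 = 24 b
     val arr2Size = outerArr + innerArr           // 216 b, matching SizeEstimator

The remaining 264 - 216 = 48 b that arr1 needs at n = 1 is the
per-element overhead of the Tuple2 wrapper plus the boxed Integer key
(Tuple2 with a reference-typed second component falls back to the
generic class, so the Int is boxed), which is why the pair array is
consistently larger.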


