Hi,

Maybe the weighting algorithm isn't correct here. Do you know the size of the blob id?
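As a quick check (just a sketch, assuming the document _id matches the path quoted below, i.e. including the "4:" depth prefix), something like this in the mongo shell should print the character length of the stored value:

// look up the document for the path from the quoted mail; the _id format
// here is an assumption based on the path shown below
var doc = db.nodes.findOne({ _id: "4:/oak:index/lucene/:data/_5_Lucene41_0.tim" });
// character length of the jcr:data value for the revision key from the mail
print(doc["jcr:data"]["r13fd2c82e10-0-1"].length);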
Regards,
Thomas

On 7/15/13 11:13 AM, "Chetan Mehrotra" <chetan.mehro...@gmail.com> wrote:

>Hi,
>
>I was trying to get an estimate of the size [1] of the various node
>documents in MongoDB for a fresh CQ installation. The largest node was
>for the path '4:/oak:index/lucene/:data/_5_Lucene41_0.tim', weighing
>upwards of 6 MB. It has one binary property, jcr:data:
>
>"jcr:data" : {
>    "r13fd2c82e10-0-1" :
>"[\":blobId:00fc1f3fd76c1715424c4f4.....00031aea1\"]"
>}
>
>The value stored above is very large. Before digging in further I wanted
>to check whether this is expected or whether the blob id should be
>something smaller?
>
>Regards
>Chetan
>
>[1] var max = 0;
>var maxObj;
>db.nodes.find().forEach(function(obj) {
>    var curr = Object.bsonsize(obj);
>    if (max < curr) {
>        max = curr;
>        maxObj = obj;
>    }
>});
>print(max);
>printjson(maxObj);
>
>Chetan Mehrotra