AbdelouahabKhelifati opened a new issue #12245:
URL: https://github.com/apache/druid/issues/12245


   I am interested in Druid's compression performance. I created a datasource in 
Druid and loaded my data into it (5 GB of _csv_-formatted time-series data); 
here's the dashboard: 
   
   <img width="1410" alt="image" 
src="https://user-images.githubusercontent.com/15266242/153200539-b244bdf3-6aa1-4705-9155-724ef1ba89f6.png">
   
   
   The dashboard shows that the storage size of the _bafu_ datasource is 1.5 GB, 
which doesn't seem realistic. 
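
   For a rough sanity check, the two figures above (taking the 5 GB input and 
1.5 GB stored size at face value) imply roughly a 3.3x compression ratio:

   ```python
   # Sizes reported above; treated as exact only for this rough estimate.
   raw_gb = 5.0      # input CSV size
   stored_gb = 1.5   # datasource size shown in the dashboard
   ratio = raw_gb / stored_gb
   print(f"apparent compression ratio: {ratio:.2f}x")  # prints 3.33x
   ```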
   
   Am I looking at the compression in the right place? How can I measure the 
on-disk size of the datasource? 
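
   One way to check the stored size directly (a sketch, assuming a standard Druid 
SQL setup and taking the `bafu` datasource name from the dashboard above) is to 
query Druid's `sys.segments` system table, which reports per-segment sizes in 
bytes:

   ```sql
   -- Sum the on-disk size of all segments belonging to the 'bafu'
   -- datasource; "size" in sys.segments is reported in bytes.
   SELECT "datasource",
          SUM("size") AS total_bytes,
          SUM("size") / 1024.0 / 1024.0 / 1024.0 AS total_gb
   FROM sys.segments
   WHERE "datasource" = 'bafu'
   GROUP BY "datasource";
   ```

   This should agree with the figure the web console's dashboard shows for the 
datasource.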


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


