Hi,

The logstash_index_optimize.py script should help you reclaim some disk
space on your old, read-only indices.
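For reference, what such a script boils down to is an `_optimize` call (the Elasticsearch 1.x API) against each old index to force-merge its segments. A minimal sketch, assuming a cluster reachable at `localhost:9200` and the usual daily `logstash-YYYY.MM.DD` index naming (both are assumptions, not from the script itself):

```python
import urllib.request

def optimize_url(host, index, max_num_segments=1):
    """Build the ES 1.x _optimize URL that force-merges an index's
    segments; fewer segments generally means less disk overhead."""
    return "http://%s/%s/_optimize?max_num_segments=%d" % (
        host, index, max_num_segments)

def optimize_index(host, index):
    """Issue the optimize request. Only safe on indices that are no
    longer being written to, e.g. yesterday's logstash-* indices."""
    req = urllib.request.Request(
        optimize_url(host, index), data=b"", method="POST")
    return urllib.request.urlopen(req)  # network call; needs a running cluster

# Just show the URL that would be hit:
print(optimize_url("localhost:9200", "logstash-2014.02.10"))
# → http://localhost:9200/logstash-2014.02.10/_optimize?max_num_segments=1
```

Running it over yesterday's index once a day (after Logstash has rolled over to a new one) is the typical pattern.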

Otherwise, unfortunately Logstash and Elasticsearch cannot roll up indexed
data into per-minute or per-hour summaries. If disk space is the issue, I am
afraid there is no other choice but to shorten your retention policy or to
buy more disks.
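Shortening the retention policy with daily Logstash indices just means deleting the indices older than the cutoff, each with a `DELETE /<index>` request. A minimal sketch of the selection logic (the 3-day window and the index names are made-up examples):

```python
import datetime

def expired_indices(names, today, retention_days):
    """Return the logstash-YYYY.MM.DD indices older than the cutoff."""
    cutoff = today - datetime.timedelta(days=retention_days)
    expired = []
    for name in names:
        day = datetime.datetime.strptime(name, "logstash-%Y.%m.%d").date()
        if day < cutoff:
            expired.append(name)  # candidate for DELETE /<index>
    return expired

names = ["logstash-2014.02.08", "logstash-2014.02.12", "logstash-2014.02.13"]
print(expired_indices(names, datetime.date(2014, 2, 13), 3))
# → ['logstash-2014.02.08']
```

The crashdump scripts you linked do essentially this; the point is that deletion of whole days is the granularity you get, not per-hour aggregation.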


On Thu, Feb 13, 2014 at 4:24 PM, Krystian Kulasza <
[email protected]> wrote:

> Hello,
>
> I connected a few Logstash agents to Elasticsearch, and they collect
> under 1 GB of data per day. I want to keep all data for the last 1 to 3
> days, but older data I would like to aggregate. I don't need per-minute
> data; one data point per hour, for example, would be enough.
>
> I tried the scripts at
> https://github.com/crashdump/logstash-elasticsearch-scripts but they did
> not solve my problem.
>
> Does anybody know how I can reduce the amount of data Elasticsearch keeps?
>
> --
> You received this message because you are subscribed to the Google Groups
> "elasticsearch" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to [email protected].
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/elasticsearch/a5c0f258-68c6-4270-bf64-279dc27db6a9%40googlegroups.com
> .
> For more options, visit https://groups.google.com/groups/opt_out.
>



-- 
Adrien Grand

