I would rather use something like syslog, logstash, or filebeat for this
type of job instead of Ansible...
Regards,
JYL
On 15/05/2020 at 09:53, rakesh rathore wrote:
Hi Team,
Can anyone solve this for me, please?
Imagine there are many servers running in a production setup, each
storing some data at a particular location. The file name on each server
is unique and follows a specific pattern: it starts with the hostname,
then a date and timestamp, and ends with .TXT. As soon as a file reaches
100 MB, a remote server should fetch it and store it locally. The file
will keep being appended to on the respective server, but the remote
server will keep tracking the file size and keep syncing it to its local
directory.
Suppose there are 10 production servers, named host1 to host10, and a
remote server named rm1. A script on rm1 will keep checking the size of
the file being written on each of hosts 1 to 10. As soon as a file
reaches 100 MB, it should be synced to rm1. The script should make sure
it only downloads the latest data, not data it has previously
downloaded. For example, if the file has grown to 1.6 GB but rm1 has
already downloaded 1.5 GB of it, only the newly appended 100 MB should
be sent over the network to rm1, not the entire 1.6 GB.
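The "only download the latest data" requirement boils down to remembering a byte offset: keep track of how much has already been copied and transfer only what lies past it. Here is a minimal local sketch of that logic in shell (file names are illustrative; over the network, rsync's --append mode gives the same effect):

```shell
#!/bin/sh
# sync_new_bytes SRC DEST: append to DEST only the bytes of SRC
# that lie beyond DEST's current size.
sync_new_bytes() {
    src=$1; dest=$2
    already=0
    [ -f "$dest" ] && already=$(wc -c < "$dest")
    total=$(wc -c < "$src")
    if [ "$total" -gt "$already" ]; then
        # tail -c +N starts output at byte N (1-based),
        # so this skips the $already bytes we already have.
        tail -c +"$((already + 1))" "$src" >> "$dest"
    fi
}
```

Because the destination file's own size is the offset, re-running the function after the source grows copies only the new tail, and a run with no new data is a no-op.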
It's a free exam test question. This is what I am trying:
- hosts: Production
  tasks:
    - fetch:
        src: /home/backup
        dest: /root/filename.TXT
        flat: yes
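A side note on the attempt above: the fetch module copies the whole file on every run, so by itself it cannot satisfy the "only new data" requirement. A closer fit is a polling script on rm1 using rsync --append, which transfers only the bytes the local copy does not yet have. A minimal sketch, where the hostnames, the /home/backup path, and the file glob are assumptions, not details from the exercise:

```shell
#!/bin/sh
# Sketch of a polling script for rm1 (e.g. run from cron).
# Assumes passwordless SSH to each host and rsync on both ends.
THRESHOLD=$((100 * 1024 * 1024))   # 100 MB in bytes
for host in host1 host2 host3 host4 host5 host6 host7 host8 host9 host10; do
    # List the hostname-date-timestamp .TXT files on the remote host
    files=$(ssh -o BatchMode=yes -o ConnectTimeout=5 "$host" \
        'ls /home/backup/*.TXT 2>/dev/null' 2>/dev/null)
    for remote_file in $files; do
        size=$(ssh -o BatchMode=yes "$host" "wc -c < '$remote_file'")
        if [ "$size" -ge "$THRESHOLD" ]; then
            mkdir -p "/root/$host"
            # --append transfers only the bytes past the local copy's length
            rsync --append "$host:$remote_file" "/root/$host/"
        fi
    done
done
```

If you want to stay inside Ansible, the ansible.posix.synchronize module wraps rsync and can pass extra flags such as --append through its rsync_opts parameter.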
--
You received this message because you are subscribed to the Google Groups "Ansible
Project" group.