What system of collecting logs should I use?


Good afternoon.
I have a small fleet of servers: about six running Ubuntu, plus two host machines, one on KVM (Proxmox) and one on ESXi, which also run Ubuntu VMs.

What, in your opinion, is the best way to collect the logs?
Mainly from the Ubuntu machines, though it would also be nice to pull more data from the hypervisors themselves (that's secondary, however; what I mostly want is what happens inside the VMs).

Right now one of the machines aggregates the logs with plancom, which is convenient, but there seems to be a 500 MB total limit.
Some days, many times that volume of logs arrives.

How can I rework the system so that it not only collects data from all the machines, but also charts/visualizes it, à la Splunk or Elasticsearch?

By the way, I also tried the latter, but it didn't work out for me.
Are there alternatives? Preferably free.

Thanks for any replies!

2 Answers

0 like 0 dislike
Graylog
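For context: Graylog can ingest standard syslog, so each Ubuntu machine can forward its logs with a one-line rsyslog rule. A minimal sketch (the host `graylog.example.com` and port `1514` are placeholders; point them at the syslog input you configure in Graylog):

```
# /etc/rsyslog.d/90-graylog.conf
# Forward all facilities/priorities to Graylog over TCP ("@@" = TCP, "@" = UDP)
*.* @@graylog.example.com:1514
```

After dropping this file in place, restart rsyslog (`systemctl restart rsyslog`) and the machine's logs should start appearing in Graylog's search UI.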
0 like 0 dislike
A popular option is the ELK stack (Elasticsearch, Logstash, Kibana).
Logstash parses ("grinds") different types of logs and ships them to Elasticsearch; Kibana is the UI drawn on top of that database. On each server a Filebeat agent runs and sends the logs to the Logstash server.
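The agent side of that setup can be sketched in a few lines of Filebeat configuration. This is a minimal example, assuming a recent Filebeat version and a Logstash server at `logstash.example.com:5044` (both the hostname and the log paths are placeholders to adapt):

```yaml
# /etc/filebeat/filebeat.yml
# Tail the standard Ubuntu syslog files and ship them to Logstash.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/syslog
      - /var/log/auth.log

output.logstash:
  hosts: ["logstash.example.com:5044"]  # placeholder; your Logstash beats input
```

On the Logstash side you would define a matching `beats` input on port 5044 and an `elasticsearch` output; Kibana then reads the resulting indices.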

There is another, possibly simpler, option if the servers run your own software: use a library/API to send logs directly to a cloud service. There are all sorts of logging-as-a-service offerings such as Loggly, Papertrail, and many others. Then you don't have to spend your own server capacity on processing and viewing logs.
