ELK Stack (Elasticsearch, Logstash and Kibana) with a concurrent remote syslog server?

I'm building a log analysis service to monitor mainly our pfSense firewalls, XenServer hypervisors, FreeBSD/Linux servers and Windows servers.

There's a lot of documentation on the internet about the ELK stack and how to make it work nicely, but I would like to use it in a different manner, and I don't know whether that's a good solution or just a waste of time and disk space.

I already have a FreeBSD 10.2 machine acting as a remote syslog server, and my idea is to simply concentrate all the logs on this machine and then have the syslog server forward them with logstash-forwarder to the ELK server.
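
In case it helps picture the setup: on the ELK side I expect to receive the forwarded logs with the lumberjack input plugin, something like the sketch below (the port and certificate paths are just placeholders, not my real values):

    input {
      lumberjack {
        # placeholder port and paths; they must match whatever
        # logstash-forwarder on the syslog box is configured to ship to
        port => 5043
        ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
        ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
        type => "syslog"
      }
    }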

It's clear to me that this approach will raise the disk requirements for the setup, but on the other hand I will have only one machine with the logstash-forwarder daemon installed, which seems good to me.

But talking about problems: Logstash sets [host] to the hostname of the machine that sent it the log messages, so with this approach only one "server" shows up in ELK, the remote syslog server.
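
For example, a line as stored by the syslog server still carries the original sender inside the message itself (a hypothetical line, not from my real logs):

    Jan 12 14:03:27 firewall1 filterlog[2049]: block in on em0 ...

so the information is there; it's just not in [host].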

I'm aware that I can customize the Logstash configuration files, but I don't have the experience to know whether this is just a simple parser setting or whether it will compromise the entire ELK experience.

In the end, I just want some advice about this logging architecture: will it work, or should I go with another option?

Thanks in advance,


Solution 1:

Yes. It is possible to change the host field in Logstash with a ruby filter without much hassle.

    ruby {
        # take the fourth whitespace-separated token of the raw line as
        # the original sender's hostname (pre-5.x event API; on Logstash
        # 5.x+ use event.set('host', event.get('message').split(' ')[3]))
        code => "event['host'] = event['message'].split(' ')[3]"
    }

Here I assume that in the logs stored on the syslog server, the original host is the fourth whitespace-separated field (index 3 after the split), which matches the traditional "Mon DD HH:MM:SS hostname program[pid]: message" layout of a syslog file.
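
If you'd rather not depend on the field position, a grok-based variant should work too. This sketch uses the stock SYSLOGLINE pattern, which captures the original sender into a "logsource" field; it's untested against your exact log layout, so treat it as a starting point:

    filter {
        grok {
            # SYSLOGLINE ships with Logstash; it parses the timestamp,
            # logsource and program out of a classic syslog line
            match => { "message" => "%{SYSLOGLINE}" }
            overwrite => [ "message" ]
        }
        mutate {
            # replace [host] (currently the syslog relay) with the
            # hostname parsed out of the message itself
            replace => { "host" => "%{logsource}" }
        }
    }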