
Looking for an ELK virtual machine to collect Netflow.

andrea.meconi
Level 2

Hello.

A customer of mine is using an Elasticsearch/Logstash/Kibana (ELK) solution to collect Netflow.

I'm looking for a virtual appliance that I can simply modify to learn these tools.

Many thanks for your suggestions.

Regards.

5 Replies

Jason Kopacko
Level 4

Did you ever find anything? I built an entire environment for my company using ELK. It's amazing.

Do you mind elaborating, Jason?  I've been reading up on this and it looks interesting.  Thinking of building up a VM and playing around with it.

Details?  Screenshots?

Thanks for any information!

Scott,

I will try to write it out, high level. Keep in mind, I am not collecting Netflow (although it is easily possible). I have a full Riverbed Cascade suite set up for all my routers, switches, and firewalls.

What is it you are looking to accomplish? Logging? Alerting? Etc.

This would be for home/lab use initially, so mainly netflow/syslog.  The goal is for bandwidth monitoring and statistics.  But ideally I'd want to be able to see which host was talking to which endpoint & details (protocol, port, bytes, time, etc).

And in a utopia be able to start a PCAP from the web interface, but I know this is dreamland.

Right now I have mainly Cisco and Ubiquiti at home.  But I work with Riverbed at work, so I'm familiar with their lineup.

Any information you can provide would be appreciated.  I started to go down this how-to, but it appears to be very outdated so I didn't get very far.  Wasn't sure if it is worth it to figure it out from scratch.

Thanks again.

Ok understood.

First off, I'd forget the Netflow at the start. Once you have a working environment (which can be a much harder task than it seems), any of the special sauce will be easier after the fact.

So starting with capturing syslog is the best first goat rodeo.

That link is incredibly OLD and I would not follow it.

I have some notes I will post that I use for pushing out an ELK box, but first, let me articulate "my" design.

I have 30'ish locations and 3 data centers.

Each site (data center included) has a dedicated syslog-ng box. Every device, at each site, points its syslog to that IP address.

From there, syslog-ng pushes the sources up to Logstash (installed on the same box). All of the ELK stuff is Java, and it is a pain (and a security risk) to give those apps the root access they need to bind ports below 1024. Plus, syslog-ng is incredibly fast and powerful and functions as a one-to-many for me. The syslog-ng instance also pipes off a copy to a central Cacti instance for email alerting based on regex matches.
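To make the one-to-many part concrete, here is a rough syslog-ng sketch of that fan-out (the ports, IPs, and file name are placeholders, not my exact config):

    # /etc/syslog-ng/conf.d/forward.conf
    source s_net { udp(ip(0.0.0.0) port(514)); };

    # one-to-many: copy every message to the local Logstash and to central Cacti
    destination d_logstash { tcp("127.0.0.1" port(5514)); };
    destination d_cacti    { udp("10.0.0.50" port(514)); };

    log { source(s_net); destination(d_logstash); destination(d_cacti); };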

Once a log has been parsed by Logstash, it is pushed out to each data center's cluster of 10 Elasticsearch servers. I have a client node set up that hooks into the cluster for visualization of all the parsed data.
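The client node is just Elasticsearch with the data and master roles turned off, so all it does is coordinate queries for the visualization layer. Roughly, in 5.x syntax (cluster and host names are placeholders):

    # elasticsearch.yml on the client (coordinating-only) node
    cluster.name: dc1-logging
    node.master: false
    node.data: false
    node.ingest: false
    discovery.zen.ping.unicast.hosts: ["es-data-01", "es-data-02"]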

Bonus: all our Windows servers have the Elastic Winlogbeat client on them, pointing to the site-local Logstash instance, which then puts that data into the same pipeline to the data center clusters.
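The Winlogbeat side is only a few lines of winlogbeat.yml, something like this (the hostname and port are placeholders, and Logstash needs a matching beats input listening on that port):

    winlogbeat.event_logs:
      - name: Security
      - name: System

    output.logstash:
      hosts: ["logstash.site.local:5044"]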

Most of that sounds simple, and it is, but it's also incredibly scalable. Since my design and deployment last year, we've added 4 sites and 6 unique sources, which were a breeze to add into the path. If I have a problem, I can re-route the output pre-parse with syslog-ng or post-parse with Logstash.

Hopefully that all makes sense.

Here are my basic install notes:

I use Ubuntu 16.04

  • Remove rsyslog(installed by default)
  • Install syslog-ng
  • Configure syslog-ng to listen on whatever port(s) you need (see the sketch after this list).
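On 16.04 the first two steps are just a couple of apt commands (a rough sketch); the listener itself is the same kind of source {} block shown in the design description above, dropped into a file under /etc/syslog-ng/conf.d/:

    sudo apt-get remove --purge rsyslog
    sudo apt-get update && sudo apt-get install syslog-ng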

From there the fun begins: install Elasticsearch.
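It is the standard Elastic apt repo install; for the 5.x series on Ubuntu 16.04 that is roughly (adjust the repo version to whatever release you want, and you may also need apt-transport-https first):

    wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
    echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-5.x.list
    sudo apt-get update
    sudo apt-get install elasticsearch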

Those 4 commands and you are set up. There are a LOT of tweaks that you will want to do. I'll be happy to help you with those, but worry about that after you are up and running. You will need to systemctl or update-rc.d to make sure the service starts at boot.

Your Elasticsearch config file is [ elasticsearch.yml ].
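For a single-box lab you barely need to touch it; something like this is enough to get going (names are placeholders):

    # /etc/elasticsearch/elasticsearch.yml
    cluster.name: lab-elk
    node.name: elk-01
    network.host: 127.0.0.1       # stay bound to localhost behind the proxy
    http.port: 9200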

  • Next, install Kibana.
      • sudo apt-get update && sudo apt-get install kibana

You will need to systemctl or update-rc.d to make sure the service starts at boot.

Your Kibana config file is [ kibana.yml ].
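Again, a single box only needs a few lines (5.x syntax; values are placeholders):

    # /etc/kibana/kibana.yml
    server.host: "127.0.0.1"                      # only nginx talks to Kibana directly
    server.port: 5601
    elasticsearch.url: "http://127.0.0.1:9200"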

  • Next, install NGINX.
      • sudo apt-get -y install nginx

Nginx acts as a reverse proxy in front of Kibana; long story short, it's a good security measure.
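A minimal nginx site that fronts Kibana with basic auth looks roughly like this (the server name and htpasswd file are placeholders; create the users file with the htpasswd tool from apache2-utils):

    # /etc/nginx/sites-available/kibana
    server {
        listen 80;
        server_name elk.example.local;

        auth_basic "Restricted";
        auth_basic_user_file /etc/nginx/htpasswd.users;

        location / {
            proxy_pass http://127.0.0.1:5601;
            proxy_http_version 1.1;
            proxy_set_header Host $host;
        }
    }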

  • Finally, install Logstash.
      • sudo apt-get update && sudo apt-get install logstash

Your Logstash config file is [ logstash.yml ]. You will need to systemctl or update-rc.d to make sure the service starts at boot.
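The actual pipeline definitions go in separate files under /etc/logstash/conf.d/. A bare-bones pipeline that takes the feed from syslog-ng and writes it to Elasticsearch might look like this (the port and index name are assumptions):

    # /etc/logstash/conf.d/10-syslog.conf
    input {
      tcp { port => 5514 type => "syslog" }        # fed by the local syslog-ng
    }
    filter {
      if [type] == "syslog" {
        grok { match => { "message" => "%{SYSLOGLINE}" } }
      }
    }
    output {
      elasticsearch {
        hosts => ["127.0.0.1:9200"]
        index => "syslog-%{+YYYY.MM.dd}"
      }
    }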

Hopefully this helps you. I am glad to help however I can, as getting this up and running has been incredibly eye-opening for me as to all the data that can be gleaned. If you need help, you can post here or PM. I can talk on the phone or stand up a WebEx that we can hop on.
