
Logstash configurations for parsing various Cisco Security Technologies, Part 2: AMP


Today's logstash conf file is for AMP for Endpoints. You will need an API key created inside the AMP cloud dashboard. In each of the two .sh files, put your API key in place of yourkeyhere. You will also need to place the two .sh files I am providing in your /etc directory on Linux. The script needs an entry in /etc/crontab like so: */5 * * * * root /etc/ so it runs every 5 minutes. It will produce the file /etc/ which I am also providing; it builds the curl line with the proper previous-5-minutes query. Make sure you make logstash the owner of these files.

Last but not least is the amp.conf file for logstash, which takes the ingested JSON and parses it into fields for you. I was able to create some nice charts and graphs of the data in Kibana. Make sure you remove the .txt extension from the files before using them. If you get stuck anywhere, let me know.
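For reference, an amp.conf along the lines described might look like the following. This is a sketch only: the input path, field names, and index name are assumptions, not the author's actual config. It assumes the cron script writes the raw API response to a file, and that the /v1/events response nests events under a "data" array.

```
input {
  file {
    path => "/etc/amp_events.json"        # hypothetical path written by the cron script
    codec => "json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  # One logstash event per AMP event, assuming events arrive under a "data" array
  split { field => "data" }
  date {
    match => [ "[data][date]", "ISO8601" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "amp-%{+YYYY.MM.dd}"
  }
}
```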
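To illustrate the idea, here is a minimal sketch of what such a fetch script could look like. This is an assumption-laden example, not the author's actual .sh file: the variable names are made up, and it only prints the curl line rather than running it. It assumes GNU date (for the -d '5 minutes ago' syntax) and the AMP for Endpoints /v1/events endpoint with its start_date parameter.

```shell
#!/bin/sh
# Hypothetical sketch of the 5-minute fetch script (not the author's actual file).

API_KEY="yourkeyhere"   # replace with the API credentials from the AMP dashboard

# Start of the query window: five minutes ago, in ISO 8601 form (GNU date required)
START=$(date -u -d '5 minutes ago' '+%Y-%m-%dT%H:%M:%S+00:00')

# Build the curl line covering the previous five minutes of events
CURL_LINE="curl -s -u ${API_KEY} https://api.amp.cisco.com/v1/events?start_date=${START}"
echo "$CURL_LINE"
```

Run from /etc/crontab every five minutes, each invocation computes a fresh start_date, so the query window slides forward with the schedule.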



Hey, this was great information.


Are you concerned about the timing of logstash and your update script? Perhaps I'm not understanding it correctly, but I think you're missing between 0 and 299 seconds of data on each run.


Anyhow, comment #2: It looks like they have rabbitmq support, which logstash supports directly.  However, I wasn't able to get that to work.  I had the same issue as these guys:

I'm not sure if it is a Logstash bug or a Cisco bug. I'd lean toward Logstash, only because the API works when accessed by other methods.


To get the rabbitmq stream to work, I ended up writing my own Python script that attaches to the stream and forwards it to the real logstash process. I had trouble at first, but a Cisco TAC engineer gave me a test script showing the event stream as text output. I adapted that script to work for me, then used portions of your config to import the data. It would be cool if logstash could just collect it directly, but it keeps reverting to 'localhost' for some reason.
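A forwarder along those lines can be sketched as follows. This is a minimal sketch, not the TAC engineer's script: the pika library, the logstash host/port, and the queue name are all assumptions. In practice the AMQP URL and queue name come from the AMP event-stream API response, and logstash would listen with a tcp input using a json codec.

```python
import json
import socket

LOGSTASH_HOST = "127.0.0.1"   # hypothetical: where the real logstash tcp input listens
LOGSTASH_PORT = 5514          # hypothetical port

def to_logstash_line(body: bytes) -> bytes:
    """Re-serialize one AMQP message body as a single JSON line,
    so a logstash tcp input with a json codec can parse it."""
    event = json.loads(body)
    return (json.dumps(event) + "\n").encode("utf-8")

def forward(amqp_url: str, queue: str) -> None:
    # pika is assumed to be installed (pip install pika); the URL and queue
    # name come from the AMP event-stream API, not from this sketch.
    import pika
    conn = pika.BlockingConnection(pika.URLParameters(amqp_url))
    chan = conn.channel()
    sock = socket.create_connection((LOGSTASH_HOST, LOGSTASH_PORT))

    def on_message(ch, method, properties, body):
        # Forward each event to logstash, then acknowledge it on the stream
        sock.sendall(to_logstash_line(body))
        ch.basic_ack(delivery_tag=method.delivery_tag)

    chan.basic_consume(queue=queue, on_message_callback=on_message)
    chan.start_consuming()
```

The newline framing in to_logstash_line is what lets logstash treat each AMQP message as one event on the tcp input.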