01-28-2019 01:04 PM - edited 02-20-2020 09:07 PM
Hi,

I have AMP version - AMP for Endpoints Connector v1.9.1.603
Red Hat release 7.4

The AMP processes are among the top memory consumers on this host. Are there any suggestions around this? I already have a bunch of exclusions configured. A sorted process listing is below (a note on how this kind of snapshot can be reproduced follows the listing).

USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
puppet 18076 71.5 70.9 15841620 11546712 ? Sl 15:02 253:04 /usr/bin/java -Xms11g -Xmx11g -XX:MaxPermSize=1g -Djava.security.egd=/dev/urandom -XX:OnOutOfMemoryError=kill -9 %p -cp /opt/puppetlabs/server/apps/puppetserver/puppet-server-release.jar clojure.main -m puppetlabs.trapperkeeper.main --config /etc/puppetlabs/puppetserver/conf.d --bootstrap-config /etc/puppetlabs/puppetserver/services.d/,/opt/puppetlabs/server/apps/puppetserver/config/services.d/ --restart-file /opt/puppetlabs/server/data/puppetserver/restartcounter
root 740 6.6 22.2 8471408 3614112 ? Sl 20:01 3:34 /opt/cisco/amp/bin/ampscansvc -cl3 28
root 5619 0.0 0.2 303688 34076 ? Ssl 2018 3:45 /opt/puppetlabs/puppet/bin/ruby /opt/puppetlabs/puppet/bin/puppet agent
root 24891 0.7 0.1 87100 26928 ? S 20:54 0:00 /usr/bin/perl -x /etc/monitor/bin/ser_mon
root 13451 0.4 0.1 2350296 25260 ? Ssl Jan08 140:46 /opt/cisco/amp/bin/ampdaemon
root 25024 0.3 0.1 85516 24080 ? S 20:55 0:00 /usr/bin/perl -x /etc/monitor/bin/ser_mon
tet-sen+ 20392 0.2 0.1 457084 21404 ? Sl Jan06 84:56 tet-sensor -f sensor.conf
root 25029 0.1 0.1 85516 18272 ? S 20:55 0:00 /usr/bin/perl -x /etc/monitor/bin/ser_mon
root 25030 0.0 0.0 85516 14460 ? S 20:55 0:00 /usr/bin/perl -x /etc/monitor/bin/ser_mon
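For reference, a listing like the one above can be reproduced with ps sorted by resident memory. The exact command used is an assumption; this is just one way to take the same kind of snapshot on RHEL 7:

ps aux --sort=-rss | head -n 15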
01-28-2019 01:16 PM
It looks like AMP is using 22.2% of memory from that data. Depending on what you were doing at the time, that isn't unreasonably high. That being said, the best way to see what files AMP is processing is to run the connector in debug mode for about 30 minutes of normal use, generate a diagnostic file, and then review the fileops and execs output to see which files and processes are being monitored most heavily.
The steps to do so can be found here:
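For the Linux connector, the rough shape of those steps looks like the sketch below. The support-tool path is an assumption based on the default install location visible in the process list above, and the debug log level is normally switched on from the policy in the AMP Console rather than locally:

# 1. Set the connector log level to Debug in the policy applied to this
#    host (AMP Console), then let the host run ~30 minutes of normal use.

# 2. Generate a diagnostic bundle with the bundled support tool
#    (path/tool name assume the default Linux install location):
sudo /opt/cisco/amp/bin/ampsupport

# 3. In the resulting archive, review the fileops and execs output to see
#    which paths and processes are scanned most often; the top entries are
#    the usual candidates for additional exclusions.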
01-28-2019 01:26 PM
Thanks for the reply. I agree that particular reading isn't very high, but I did notice it taking even higher memory (see below). It's a concern because there weren't any out-of-memory issues on the host before the agent was deployed. As you suggest, I will also run debug.

PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
740 root 20 0 7961524 7.0g 948 S 91.1 45.0 3:18.04 ampscansvc
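To confirm whether the scan service keeps growing over time rather than just spiking during a scan, something like the following can sample its resident memory once a minute; the PID (740) is taken from the output above and would need to match the current ampscansvc process:

while true; do ps -o pid,rss,vsz,%mem,etime,comm -p 740; sleep 60; done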