High Load Average

Cisco Freak
Level 4

Hello,

I am seeing a very high load average on vManage. Is this expected for a 32-core device?

vManage:~$ top
top - 22:31:04 up 14 days, 15:58,  4 users,  load average: 472.39, 474.76, 471.29
Tasks: 388 total,   7 running, 381 sleeping,   0 stopped,   0 zombie
Cpu0  : 17.8%us, 15.4%sy,  0.0%ni, 52.9%id,  3.1%wa,  0.0%hi, 10.4%si,  0.4%st
Cpu1  : 15.4%us, 15.8%sy,  0.0%ni, 66.2%id,  1.3%wa,  0.0%hi,  0.9%si,  0.4%st
Cpu2  : 13.9%us, 15.7%sy,  0.0%ni, 67.5%id,  1.4%wa,  0.0%hi,  1.1%si,  0.4%st
Cpu3  : 60.0%us,  8.5%sy,  0.0%ni, 30.8%id,  0.8%wa,  0.0%hi,  0.0%si,  0.0%st
Cpu4  : 10.8%us, 14.7%sy,  0.0%ni, 72.5%id,  0.8%wa,  0.0%hi,  1.2%si,  0.0%st
Cpu5  : 39.1%us, 10.9%sy,  0.0%ni, 48.6%id,  0.4%wa,  0.0%hi,  0.7%si,  0.4%st
Cpu6  : 28.9%us, 13.6%sy,  0.0%ni, 56.4%id,  0.4%wa,  0.0%hi,  0.4%si,  0.4%st
Cpu7  : 14.2%us, 14.2%sy,  0.0%ni, 70.3%id,  0.4%wa,  0.0%hi,  0.4%si,  0.4%st
Cpu8  : 52.1%us,  8.6%sy,  0.0%ni, 39.0%id,  0.0%wa,  0.0%hi,  0.0%si,  0.3%st
Cpu9  :  9.1%us, 21.2%sy,  0.0%ni, 67.8%id,  0.4%wa,  0.0%hi,  1.1%si,  0.4%st
Cpu10 : 16.7%us, 18.3%sy,  0.0%ni, 63.8%id,  0.4%wa,  0.0%hi,  0.8%si,  0.0%st
Cpu11 : 88.8%us,  5.6%sy,  0.0%ni,  5.6%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
Cpu12 : 17.4%us, 15.9%sy,  0.0%ni, 64.8%id,  0.7%wa,  0.0%hi,  1.1%si,  0.0%st
Cpu13 : 52.4%us,  8.7%sy,  0.0%ni, 38.2%id,  0.3%wa,  0.0%hi,  0.0%si,  0.3%st
Cpu14 :  9.2%us, 15.4%sy,  0.0%ni, 73.7%id,  0.4%wa,  0.0%hi,  0.9%si,  0.4%st
Cpu15 :  6.7%us, 12.9%sy,  0.0%ni, 79.0%id,  0.5%wa,  0.0%hi,  0.5%si,  0.5%st
Cpu16 : 14.5%us, 15.6%sy,  0.0%ni, 68.0%id,  0.7%wa,  0.0%hi,  0.7%si,  0.4%st
Cpu17 :  8.8%us, 17.1%sy,  0.0%ni, 73.3%id,  0.5%wa,  0.0%hi,  0.0%si,  0.5%st
Cpu18 : 53.1%us,  7.9%sy,  0.0%ni, 38.7%id,  0.0%wa,  0.0%hi,  0.3%si,  0.0%st
Cpu19 : 15.6%us, 14.1%sy,  0.0%ni, 68.5%id,  0.4%wa,  0.0%hi,  1.1%si,  0.4%st
Cpu20 :  7.8%us, 15.3%sy,  0.0%ni, 75.0%id,  0.4%wa,  0.0%hi,  1.1%si,  0.4%st
Cpu21 : 11.7%us, 15.5%sy,  0.0%ni, 71.3%id,  0.4%wa,  0.0%hi,  0.8%si,  0.4%st
Cpu22 : 50.5%us,  7.2%sy,  0.0%ni, 41.6%id,  0.0%wa,  0.0%hi,  0.3%si,  0.3%st
Cpu23 : 48.6%us,  9.4%sy,  0.0%ni, 42.0%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
Cpu24 :  9.7%us, 12.7%sy,  0.0%ni, 75.7%id,  0.4%wa,  0.0%hi,  1.1%si,  0.4%st
Cpu25 : 12.8%us, 12.0%sy,  0.0%ni, 74.1%id,  0.4%wa,  0.0%hi,  0.4%si,  0.4%st
Cpu26 : 10.7%us, 13.5%sy,  0.0%ni, 75.0%id,  0.0%wa,  0.0%hi,  0.8%si,  0.0%st
Cpu27 :  8.1%us, 13.3%sy,  0.0%ni, 77.0%id,  0.0%wa,  0.0%hi,  1.2%si,  0.4%st
Cpu28 : 10.3%us, 16.6%sy,  0.0%ni, 71.9%id,  0.4%wa,  0.0%hi,  0.4%si,  0.4%st
Cpu29 : 15.5%us, 14.7%sy,  0.0%ni, 69.0%id,  0.0%wa,  0.0%hi,  0.4%si,  0.4%st
Cpu30 : 14.9%us, 14.1%sy,  0.0%ni, 69.5%id,  0.4%wa,  0.0%hi,  0.8%si,  0.4%st
Cpu31 : 10.3%us, 14.3%sy,  0.0%ni, 73.2%id,  1.1%wa,  0.0%hi,  0.7%si,  0.4%st
Mem:  60759504k total, 59652240k used,  1107264k free,  1261432k buffers
Swap:        0k total,        0k used,        0k free,  4183972k cached

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
25700 vmanage   20   0 40.1g  17g  13m S 1086 30.5 172260:16 java
22637 vmanage   20   0 23.1g 5.9g  14m S  193 10.3  43182:00 java
28499 root      20   0  445m 347m 3000 S  166  0.6   7250:13 python
28500 root      20   0  489m 392m 3052 R   30  0.7 695:49.99 python
 1448 vmanage   20   0  513g  16g 1.3g S   26 28.3   9166:46 java
 6214 root      20   0     8    4    0 R   16  0.0   0:00.47 resolvd
 5330 root      20   0  202m  55m 1752 S   12  0.1   1072:32 resolvd
 6224 root      20   0 10532  908  780 D    9  0.0   0:00.28 logrotate
 5426 root      20   0  315m  26m 3300 S    8  0.0   1542:13 vdaemon
 5354 root      20   0  316m  27m 3300 R    8  0.0   1613:48 vdaemon
 5362 root      20   0  314m  25m 3308 S    7  0.0   1400:28 vdaemon
 5374 root      20   0  332m  27m 3300 S    7  0.0   1575:55 vdaemon
 5390 root      20   0  315m  26m 3300 S    7  0.0   1517:45 vdaemon
 6203 root      20   0 13576 1140  968 S    7  0.0   0:00.20 finish
 6240 root      20   0   212    4    0 R    6  0.0   0:00.19 dmidecode
 5341 root      20   0  315m  27m 3344 S    6  0.0   1582:38 vdaemon
 5363 root      20   0  315m  26m 3300 S    5  0.0   1456:11 vdaemon
 5409 root      20   0  305m  25m 3300 S    5  0.0   1414:51 vdaemon
24471 vmanage   20   0 52.8g 3.0g  12m S    5  5.2   1065:55 java
 5455 root      20   0 62884 5628 1788 S    4  0.0   1076:01 vtracker
 6176 root      20   0     0    0    0 R    4  0.0   0:00.12 sh
 6220 root      20   0 13576  640  436 S    3  0.0   0:00.09 finish
 6226 root      20   0 13576  656  452 S    3  0.0   0:00.09 finish
 5324 root      20   0 93276 6876 2092 S    3  0.0 137:25.20 cfgmgr
 6194 root      20   0     8    4    0 R    3  0.0   0:00.08 python
  401 root      20   0     0    0    0 S    2  0.0 612:27.09 kjournald
    8 root      20   0     0    0    0 S    1  0.0 262:57.52 rcu_preempt
 4651 root      20   0 13860 1780 1244 S    1  0.0 379:30.67 sysmgr_stats_st
  272 root       0 -20     0    0    0 S    1  0.0 279:36.47 kworker/0:1H
 4479 root      20   0  790m  88m 3140 S    1  0.1 507:16.93 confd
  486 root      20   0 14368 2204 1144 S    1  0.0 142:20.71 run
  494 root      20   0  293m 6580 2508 S    1  0.0  93:54.80 sysmgrd
28501 root      20   0  109m  16m 2824 S    1  0.0   2:04.82 python
  462 root      20   0  7676  380  312 S    0  0.0  19:31.06 runsv
 5336 root      20   0 65456 5528 1628 S    0  0.0  10:21.58 snmpd
11453 root      20   0 10468  720  616 S    0  0.0   0:01.11 cmdptywrapper
14181 root      20   0 10468  720  612 S    0  0.0   0:04.46 cmdptywrapper
19732 root      20   0  8392  748  644 S    0  0.0   0:00.73 confd_cli
26574 root      20   0     0    0    0 S    0  0.0   8:41.36 kworker/23:1
    1 root      20   0  7808  624  536 S    0  0.0   0:27.94 runsvdir
vManage:~$
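
For context, the Linux load average counts runnable and uninterruptible tasks, so it is normally judged per core rather than as an absolute number. As a rough sanity check, here is a minimal sketch in Python (standard library only, runnable on any Linux host; nothing vManage-specific is assumed):

import os

# 1-, 5-, and 15-minute load averages, as reported by the kernel
load1, load5, load15 = os.getloadavg()

# Number of CPUs visible to the OS
cores = os.cpu_count()

# Rule of thumb: sustained load per core well above 1.0 means there are
# more runnable/uninterruptible tasks than the CPUs can service
print(f"1-min load per core: {load1 / cores:.1f}")

On this box that works out to roughly 472 / 32 ≈ 14.8 per core, far above the usual ~1.0-per-core rule of thumb, which is why the number surprised me.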

vManage# sh ver
18.4.302