10-23-2018 01:45 AM
Hi,
I'm developing an automated VA scanning solution for Cisco routers and need to query the openVuln API to fetch metadata about the vulnerabilities found.
Some devices have more than 10 vulnerabilities, and since my API calls are launched in parallel, this breaks the current limit of 10 calls per second.
We have over 10000 managed Cisco devices, so throttling the API queries is going to make things quite slow.
Is there any way to raise the API call limit, or alternatively to provide an API endpoint that can query multiple Advisories in a single API call?
Thanks & Regards
10-23-2018 02:28 AM
Hi there,
May I make a suggestion about the logic of your program?
Before making an API call, aggregate your device information. Group by platform and IOS version. While you may have 10000 distinct devices, you probably only have about 100 variants/combinations.
I wrote two scripts which did this: inventoryCollect.py gathered the device info from APIC-EM and organized it into a data structure; inventoryAnalyse.py took that data structure and queried openVuln for the unique platform/IOS combinations.
https://configif.wordpress.com/2017/07/14/apic-em-collectinventory/
https://configif.wordpress.com/2017/07/19/apic-em-inventoryanalyse/
I only tested it on a network with 600+ devices, but I never had an issue with results being produced slowly.
cheers,
Seb.
10-23-2018 04:45 AM
Hi Seb,
Thanks for your inputs.
I was thinking of grouping them by OS version as well, but given our heterogeneous estate of Cisco routers, I'm not 100% confident we won't hit the API rate limit.
So I found two working solutions which I tested this afternoon:
1. My app is developed in Golang, so I leverage goroutines to run my openVuln requests in parallel. What I can do is throttle each goroutine with a ticker channel:
// We set a rate limit to throttle the goroutines querying the openVuln API.
// This is to overcome the Cisco openVuln API rate limiting (10 calls / second).
rateLimit := time.Tick(200 * time.Millisecond)

// Add the number of found vulnerabilities to match the number of goroutines we're launching.
wg.Add(vulnCount)

// Loop to search for found vulnerabilities in the scan report and fetch
// metadata for each vulnerability in a goroutine.
for _, ruleResult := range scanReport.RuleResults {
    if ruleResult.RuleResult == "fail" {
        go func(r *ScanReportFileResult) {
            defer wg.Done()
            <-rateLimit
            vulnMeta, err := openvulnapi.GetVulnMetaData((*r).RuleIdentifier[0].ResultCiscoSA)
            if err == nil {
                vulnMetaSlice = append(vulnMetaSlice, (*vulnMeta)[0])
            } else {
                log.Println("error when fetching vulnerability metadata for:",
                    (*r).RuleIdentifier[0].ResultCiscoSA, ":", err)
            }
        }(ruleResult)
    }
}
wg.Wait()
With this solution, at least I don't get any rate limit error.
2. I've noticed openVuln has an API endpoint to pull the entire set of Cisco Security Advisories ( https://api.cisco.com/security/advisories/all.json ). So I could persist it in a DB and do the lookups there instead of going directly to the openVuln API.
Regards
10-23-2018 05:20 AM
I hadn't noticed the all.json call before. It would certainly be quicker to munge through the dataset on a local disk.
There is also the option of creating multiple service accounts for the openVuln API. You could allocate one to each thread to get past the calls-per-second limit. I don't think the API checks the source IP of the calls.
cheers,
Seb.