09-13-2017 02:21 PM - edited 03-01-2019 06:07 PM
We are running Prime Infrastructure v3.1, and have a series of Compliance jobs we are running against our infrastructure.
One of these jobs runs a series of checks against our layer 2 switches using profiles, policies, and rules. I just ran a job against all of our L2 switches, and it has 100,000 lines in it, most of them failures, which is OK.
If I use the GUI, I'm able to look at the output of these jobs, and see each and every success or failure.
I can easily, but manually, export these jobs to, say, an XLS file, and then run a script that I wrote that breaks down a 20,000-line report into one sentence:
On DATE, X devices were found to pass, Y devices failed, and there
were Z violations of individual rules.
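A minimal sketch of that kind of reduction, assuming a CSV export with hypothetical "Device" and "Status" columns (the actual export headers will differ; adjust to match):

```python
import csv
from collections import defaultdict

def summarize(rows):
    """Reduce compliance rows to (passed, failed, violations).

    Assumes each row has hypothetical 'Device' and 'Status' columns,
    where 'PASS' marks a passing check and anything else is a
    violation -- adapt the names to your actual export.
    """
    checks_by_device = defaultdict(list)
    for row in rows:
        checks_by_device[row["Device"]].append(row["Status"] == "PASS")
    passed = sum(1 for checks in checks_by_device.values() if all(checks))
    failed = len(checks_by_device) - passed
    violations = sum(
        1 for checks in checks_by_device.values() for ok in checks if not ok
    )
    return passed, failed, violations

if __name__ == "__main__":
    # Placeholder filename; the run date would come from elsewhere (see below).
    with open("compliance_export.csv", newline="") as f:
        p, x, v = summarize(csv.DictReader(f))
    print(f"{p} devices passed, {x} devices failed, {v} rule violations")
```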
I would like to track these four variables on a dashboard that show progress over time. We'll likely use Splunk or Tableau for the final dashboard, or I could easily hack something out in say, RRDTool.
Over time, we'd like to see improvement. We'd also use the report to target opportunities to re-educate people if/when they make mistakes.
Alternatively, it would be great to see this type of dashboard in Prime, or even APIC-EM, but this high-level view appears to be missing.
The problem isn't generating the data; I can run a job on a schedule in Prime. We're thinking daily, by the way, as Compliance is becoming a really big deal for us. The problem is the machinations we have to go through to get that data out of Prime and delivered to the dashboard.
I'm hoping that there is some way to get access to the report data via the API, or the CLI, and some means of delivering the exported report body elsewhere for processing.
Where are the reports stored on the Prime hardware?
What format are they in?
Are they accessible via the API?
Are they accessible via the CLI?
Is it possible to transfer the data externally?
Additionally, the GUI (Admin > Job Dashboard > Compliance Jobs) shows when a job ran in Prime. However, when the job is exported to a file, the resulting file name carries the time and date of the export, not the time and date the job ran. There does not appear to be any reference to the run date inside the exported file either, unless I generate it with a rule. I'd probably fall back on the date of the saved running config, but I view this as a bug, and one that seems to have a trivial fix.
09-13-2017 03:02 PM
Where are the reports stored on the Prime hardware?
/localdisk/ftp/reports
What format are they in?
csv
Are they accessible via the API?
Yes.
Are they accessible via the CLI?
Yes.
Is it possible to transfer the data externally?
Yes.
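Since the files are accessible over the REST API, here is a minimal sketch of pulling a resource from the v1 data API with HTTP basic auth. The host, credentials, and resource path are placeholders, and certificate checking is disabled on the assumption of Prime's self-signed certificate:

```python
import json
import ssl
import urllib.request

def api_url(host, resource):
    """Build a v1 data-API URL: <host>/webacs/api/v1/data/<resource>.json"""
    return f"{host}/webacs/api/v1/data/{resource}.json"

def fetch_json(host, resource, user, password):
    """GET a data-API resource and return the parsed JSON envelope.

    Disables TLS verification because Prime typically ships with a
    self-signed certificate; install the cert properly in production.
    """
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, host, user, password)
    opener = urllib.request.build_opener(
        urllib.request.HTTPBasicAuthHandler(mgr),
        urllib.request.HTTPSHandler(context=ctx),
    )
    with opener.open(api_url(host, resource)) as resp:
        return json.load(resp)

# Example (placeholder host and credentials):
# doc = fetch_json("https://192.168.115.187", "ApiHealthRecords/42",
#                  "apiuser", "secret")
```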
09-19-2017 06:46 AM - edited 09-19-2017 07:02 AM
Flavio, I appreciate your response, and I really appreciate that the answers are all "Yes".
That being said, I don't seem to have /localdisk/ftp/reports in v3.1. I do have:
Directory of disk:/

      20  Aug 11 2017 09:51:32  crash
    4096  Aug 14 2017 05:27:16  defaultRepo/
    4096  Aug 11 2017 15:26:00  ftp/
   16384  Apr 10 2016 23:35:13  lost+found/
    4096  Aug 11 2017 10:19:08  sftp/
    4096  Aug 11 2017 09:49:55  ssh/
    4096  Aug 11 2017 09:49:55  telnet/
   12288  Sep 19 2017 09:50:56  tftp/
The only thing in the ftp directory is something called reportsOnDemand. Not finding any of the reports I'm looking for, using "dir recursive".
Would be very interested in anything in JSON format, as we're trying to get some very particular data into Splunk.
09-19-2017 06:56 AM
Hello,
You can find detailed API documentation Here
Example of JSON:
https://192.168.115.187/webacs/api/v1/data/ApiHealthRecords/42.json
{
"queryResponse" : {
"@type" : "ApiHealthRecords",
"@responseType" : "getEntity",
"@requestUrl" : "https : \/\/192.168.115.187\/webacs\/api\/v1\/data\/ApiHealthRecords\/42",
"@rootUrl" : "https : \/\/192.168.115.187\/webacs\/api\/v1\/data",
"entity" : {
"@dtoType" : "apiHealthRecordsDTO",
"@type" : "ApiHealthRecords",
"@url" : "https : \/\/192.168.115.187\/webacs\/api\/v1\/data\/ApiHealthRecords\/15",
"apiHealthRecordsDTO" : {
"@displayName" : "String value",
"@id" : "15",
"@uuid" : "String value",
"clientIp" : {
"address" : "192.168.115.243"
},
"method" : "String value",
"path" : "String value",
"query" : "String value",
"requestAcceptHeader" : "String value",
"requestReceivedTime" : 2,
"responseSentTime" : 2,
"responseStatus" : 1,
"responseTime" : 2,
"userAgent" : "String value",
"username" : "String value"
}
}
}
}
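For Splunk, picking fields out of that envelope is straightforward; a sketch using an abbreviated copy of the sample above (the nesting path `queryResponse > entity > apiHealthRecordsDTO` is taken from the response shown):

```python
import json

# Abbreviated copy of the sample response above.
sample = """{
  "queryResponse": {
    "@type": "ApiHealthRecords",
    "entity": {
      "apiHealthRecordsDTO": {
        "@id": "15",
        "clientIp": {"address": "192.168.115.243"},
        "responseStatus": 1,
        "responseTime": 2
      }
    }
  }
}"""

doc = json.loads(sample)
# Walk the envelope down to the DTO that holds the actual record fields.
dto = doc["queryResponse"]["entity"]["apiHealthRecordsDTO"]
print(dto["clientIp"]["address"], dto["responseStatus"])
```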
09-19-2017 07:44 AM
Hi,
I'm using 3.1 and I see the following directory:
/localdisk/ftp/reports
Are you root?
09-27-2017 02:27 PM
Do you have the reports set to export automatically? If so, you should see them under the /localdisk directory. If you don't see them, you might need to configure the reports to run automatically and save to disk.
ade # ls -l
total 48
lrwxrwxrwx. 1 prime gadmin    20 Jul 28 15:27 crash -> /opt/CSCOlumos/crash
drwxrwxrwx. 2 prime gadmin  4096 Sep 27 04:24 defaultRepo
drwxrwxr-x. 3 prime gadmin  4096 Jul 28 15:17 ftp
drwxrw----. 2 prime gadmin 16384 Mar 24  2017 lost+found
drwxrwsr-x+ 3 prime gadmin  4096 Sep 27 14:34 sftp
drwxrwxr-x. 2 prime gadmin  4096 Jul 28 15:26 ssh
drwxrwxr-x. 2 prime gadmin  4096 Jul 28 15:26 telnet
drwxrwxr-x. 3 prime gadmin  4096 Sep 27 15:44 tftp
ade # du -s ftp/reports/*/
200   ftp/reports/CPUUtilization/
600   ftp/reports/Ethernet/
3856  ftp/reports/OTN/
92    ftp/reports/Physical/
09-28-2017 01:27 PM
I also don't know how to export the jobs automatically.
I was planning on extracting the JSON via, say
https://prime-b.mitre.org/webacs/api/v1/op/compliance/check.json?jobName=CR19111_M3_Compliance Audit Job_2_10_32_513_PM_9_18_2017
Such a query will pull the latest runID for the job.
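One wrinkle: the jobName in that query contains spaces, so it needs to be percent-encoded before most HTTP clients will accept it. A sketch, using the job name from the query above:

```python
from urllib.parse import quote

# Host and job name taken from the query above.
host = "https://prime-b.mitre.org"
job = "CR19111_M3_Compliance Audit Job_2_10_32_513_PM_9_18_2017"

# quote() percent-encodes the spaces; underscores and digits pass through.
url = (f"{host}/webacs/api/v1/op/compliance/check.json"
       f"?jobName={quote(job)}")
print(url)
```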
Also, I seem to know the shell password, but I don't have rights to perform an ls -l.
09-29-2017 05:44 AM
Well, you actually do have rights, but they don't allow you to see anything. :-)
The shell you fall into drops you into root's home directory. So what I do is run "cd", which puts me in my own home directory. From there you can do your usual administration.
09-29-2017 08:19 AM