01-07-2020 12:18 PM
Hi,
Our DNA backups are scheduled daily.
The analytics (NFS based) backups seem to be incremental, but the automation data (SFTP based) backups are not. Those backups add 15 GB to our server every day… How can I keep the used space under control?
I could create a script to delete backups on the server, but what exactly do I have to delete? The backups seem to be divided over 8 folders. And would DNA sync with that removal?
Or can I script something from DNA to delete old backups (via APIs)?
Thank you!
Kristof
06-21-2022 05:21 AM
On the remote backup server.
Sylvain.
06-04-2024 04:47 AM
Is there anything new on this subject? We hit a similar issue and would like to understand how to automate backup purging.
The problem is that our repository is somewhat security hardened, and there is no opportunity to even install Python of the needed version.
In such hardened environments, the operations team would expect DNAC's API to provide the needed calls.
06-04-2024 05:18 AM
The simplest way to handle it in that situation is simply to delete folders based on age in a crontab on your repository server. For example, this will delete any backup folders that are older than 7 days:
0 0 * * * find /{your directory}/* -maxdepth 0 -mtime +7 -exec rm -rf {} \;
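If you would rather keep a fixed number of backup sets than rely on a pure age cutoff, a small shell script along these lines could be run from cron instead. This is only a sketch: the path /data/dnac-backups and the retention count of 7 are placeholders you would replace with your own values, and it assumes each backup lands in its own top-level folder on the repository server.

#!/bin/sh
# Purge old backup folders on the repository server, keeping only
# the most recent RETAIN sets. Assumes each backup is a top-level
# directory under BACKUP_DIR and that directory mtime reflects when
# the backup was written.

BACKUP_DIR="/data/dnac-backups"   # adjust to your repository path
RETAIN=7                          # number of backup sets to keep

cd "$BACKUP_DIR" || exit 1

# List top-level directories newest first, skip the first $RETAIN,
# and remove the rest.
ls -1dt -- */ | tail -n +$((RETAIN + 1)) | while read -r dir; do
    echo "Removing old backup: $dir"
    rm -rf -- "$dir"
done

Either way, the cleanup has to run on the backup repository itself; DNAC simply sees fewer restore points the next time it reads the repository.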
04-28-2025 12:31 AM
I was hoping that managing backups could easily be done through the DNAC/CC APIs...
It seems that's not the case, as I couldn't find any available backup-related APIs.
I don't understand why everything isn't available via the API...?!