Hello:
Can anyone help with this? I attempted a domain backup in Virtualmin with the options "Backup virtual servers", "Download in browser", and "Single archive file".
Due to another bug or limitation in Virtualmin (documented elsewhere), the download ran at only about 450 Kb/s. I did not know about that limit when I started the backup.
My issue is that the domain backup was about 20 GB. The server, which is not underpowered by any means, took a while to write the tar or tar.gz file, and then the download started. Because of the speed limit I had to abort the download. Thankfully the abort option in Webmin worked, and server resource usage returned to normal.
The download had started as a 17 GB tar.gz file.
I have searched unsuccessfully, as root, for any large tar or tar.gz files (find / -name "*.tar", etc.). I did find this in /tmp/.webmin:
-rw-r--r-- 1 root root 44 Jun 14 16:13 668358_31882_4_backup.cgi
But it is a small file, with only these text contents: /domains ./virtualmin-backup ./backup.lock
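For reference, this is roughly how I have been searching. Note that the pattern must be quoted, or the shell may expand *.tar before find sees it; the staging directories in the du line are only guesses on my part (the small file above mentions ./virtualmin-backup, so I checked for that):

```shell
# Look for leftover archives over 1 GB on the root filesystem only (-xdev).
# The quoted patterns prevent the shell from expanding them prematurely.
find / -xdev -type f \( -name '*.tar' -o -name '*.tar.gz' \) -size +1G 2>/dev/null

# Check disk usage of locations where (I assume) Virtualmin might stage backups.
du -sh /tmp/.webmin /home/*/virtualmin-backup 2>/dev/null
```

Neither command turned up anything close to 17 GB for me.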
I need to understand the flow of the backup when the "download in browser" option is used. I think, but am not certain, that there are large temporary files somewhere which I would like to delete. Or does the abort option delete them?
Thanks for your help.