Date: 21-11-06 16:40
I have been backing up a MediaWiki server for several months using mysqldump (see the script below). What worries me is re-importing the data when I eventually need to, whether for a box migration or after a crash (either of which is probably coming soon). I'd love to hear your thoughts on how to make this easier, maybe by scripting the entire process in two parts (export/import).
# web tree backup (this is also where MediaWiki keeps its images/ directory)
tar -cf - /usr/local/www/data-dist/* | gzip -c > ~/bak/$1_www_data.tar.gz
# mysql db backup
mysqldump --user=root --password=******** wikidb > ~/bak/$1.mysql.dump
tar -cf - ~/bak/$1.mysql.dump | gzip -c > ~/bak/$1_mysql_dump.tar.gz
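For what it's worth, here is one way the two-part export/import split could look, written as shell functions. The paths, the `wikidb` name, and the masked root password are taken from the script above; the function names and the `RESTORE_ROOT` knob are my own invention, so treat this as a sketch rather than a drop-in script. (Tarring the single dump file adds nothing, so the sketch just gzips it directly.)

```shell
#!/bin/sh
# Sketch of a two-part backup/restore for the setup described above.
# Assumptions: MySQL root credentials work, and the target database
# ("wikidb") already exists and is empty before import_backup runs.
set -eu

BAK_DIR="${BAK_DIR:-$HOME/bak}"
WWW_DIR="${WWW_DIR:-/usr/local/www/data-dist}"
DB_NAME="wikidb"
DB_PASS="${DB_PASS:-********}"          # placeholder, as in the post
RESTORE_ROOT="${RESTORE_ROOT:-/}"       # where import unpacks the web tree

export_backup() {
    prefix="$1"
    mkdir -p "$BAK_DIR"
    # Web tree (includes the MediaWiki images/ directory)
    tar -cf - "$WWW_DIR" | gzip -c > "$BAK_DIR/${prefix}_www_data.tar.gz"
    # Database dump, then compress it (no need to tar a single file)
    mysqldump --user=root --password="$DB_PASS" "$DB_NAME" \
        > "$BAK_DIR/${prefix}.mysql.dump"
    gzip -c "$BAK_DIR/${prefix}.mysql.dump" \
        > "$BAK_DIR/${prefix}_mysql_dump.gz"
}

import_backup() {
    prefix="$1"
    # Unpack the web tree back where it came from
    gzip -dc "$BAK_DIR/${prefix}_www_data.tar.gz" | tar -xf - -C "$RESTORE_ROOT"
    # Re-import the SQL dump
    gzip -dc "$BAK_DIR/${prefix}_mysql_dump.gz" \
        | mysql --user=root --password="$DB_PASS" "$DB_NAME"
}
```

Since the functions read their paths at call time, you can point `BAK_DIR`/`WWW_DIR` at test directories and do a dry run before trusting it with the real wiki.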
As you can see, I'm not using the XML dump option. I would if I thought it would be helpful; please explain when it is.
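On the XML question: MediaWiki ships maintenance scripts that dump page text and history as database-agnostic XML, which is handy for moving content between installs or MySQL versions. But an XML dump does not carry users, uploaded files, or configuration, so it complements rather than replaces mysqldump. A sketch, wrapped in functions (run from the wiki's installation directory; exact behaviour may differ by MediaWiki version):

```shell
# dumpBackup.php and importDump.php ship in MediaWiki's maintenance/
# directory; the file-path arguments here are my own convention.
xml_export() {
    php maintenance/dumpBackup.php --full > "$1"   # all pages, full history
}
xml_import() {
    php maintenance/importDump.php < "$1"          # load into a fresh wiki
}
```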
Also, what about all of the graphic images on the server? I've been tar/gzipping them with the script as well, but I'm not sure how the images will fare through the tarring and untarring, or how to automate or ease the transition when moving the wiki, database, pictures and all.
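For the images specifically: tar stores files byte-for-byte, so binaries survive a tar/untar round trip unchanged; the real risks are missing the images/ directory in the archive path or losing file permissions. If you want hard evidence, a checksum pass before backup and after restore will prove it. A minimal sketch, assuming GNU `md5sum` (on FreeBSD you'd use `md5 -r` instead); the `checksum_tree` helper name is mine:

```shell
# Record a checksum for every file under a directory tree, with paths
# relative to that tree so the list is comparable on the new box.
checksum_tree() {
    ( cd "$1" && find . -type f -exec md5sum {} + | sort )
}
# Before backup:          checksum_tree /usr/local/www/data-dist > /tmp/www.md5
# After restore, verify:  ( cd /usr/local/www/data-dist && md5sum -c /tmp/www.md5 )
```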
Any tips or pointers on this topic are appreciated as I'll have to use this soon in the real world.
Use the source Luke...
Post Edited (21-11-06 11:41)