Mediawiki maintenance
Revision as of 14:18, 12 June 2022
Documentation
mediawiki static dump tools: https://meta.wikimedia.org/wiki/Static_version_tools
mediawiki dumpBackup xml: https://www.mediawiki.org/wiki/Manual:DumpBackup.php
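The dumpBackup.php manual linked above covers producing the XML dump that several of the tools below consume. A minimal invocation sketch; the install path is an assumption, adjust it for your setup:

```shell
# Sketch: produce an XML dump with MediaWiki's own dumpBackup.php.
# The install path is an assumption. --current dumps only the latest
# revision of each page; --full dumps the complete history.
dump_wiki_xml() {
    php /usr/local/www/mediawiki/maintenance/dumpBackup.php --current
}
# usage: dump_wiki_xml > ~/wiki-dump.xml
```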
Backups
Full Backups
To create a full backup, you'll need to:
Backup Database
mysqldump -u wiki -pPASSWORD wikidb > ~/wikidb-backup.sql
Backup Images
TODO
Backup LocalSettings.php
TODO
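The three steps above can be wrapped in one script. A minimal sketch, assuming the install path, DB user, and DB name shown (adjust for your setup); `run` is a small helper so you can preview the commands with `DRY_RUN=1` before running them for real:

```shell
#!/bin/sh
# Full-backup sketch: database dump, uploaded images, LocalSettings.php.
# WIKI_DIR, DB_USER, and DB_NAME are assumptions -- adjust to your install.
WIKI_DIR=${WIKI_DIR:-/usr/local/www/mediawiki}
DB_USER=${DB_USER:-wiki}
DB_NAME=${DB_NAME:-wikidb}
DEST=${DEST:-$HOME/wiki-backup-$(date +%Y-%m-%d)}

# With DRY_RUN=1 set, commands are printed instead of executed.
run() { if [ -n "$DRY_RUN" ]; then echo "+ $*"; else "$@"; fi; }

backup_wiki() {
    run mkdir -p "$DEST"
    # 1. database (-p with no value prompts, keeping the password
    #    off the command line, unlike the one-liner above)
    run sh -c "mysqldump -u $DB_USER -p $DB_NAME > $DEST/wikidb.sql"
    # 2. uploaded files live under images/ in the install directory
    run tar -czf "$DEST/images.tar.gz" -C "$WIKI_DIR" images
    # 3. configuration
    run cp "$WIKI_DIR/LocalSettings.php" "$DEST/"
}
```

Preview with `DRY_RUN=1 backup_wiki`, then run `backup_wiki` for real.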
Static HTML
Tools
mw2html
static-wiki
Home Grown
wikicode parsers
See page of mediawiki parsers here: http://www.mediawiki.org/wiki/Alternative_parsers
You can render wikicode to HTML using the wiki's actual parser:
echo "'''foo'''" | php ${your_wiki}/maintenance/parse.php --title foo
You can find the wiki's contents in the database:
SELECT * FROM text LIMIT 10 OFFSET 100;
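The `text` table holds raw blobs with no titles, so to fetch the current wikitext of a specific page you have to join through `page` and `revision`. A sketch, assuming the pre-1.35 schema (where `revision.rev_text_id` points at `text.old_id`; MediaWiki 1.35+ moved this into the slots/content tables) and the same `wiki`/`wikidb` credentials as the mysqldump example above:

```shell
# build_page_query TITLE -> SQL fetching that page's current wikitext.
# Schema assumption (pre-MW 1.35): revision.rev_text_id -> text.old_id.
build_page_query() {
    cat <<SQL
SELECT t.old_text
FROM page p
JOIN revision r ON r.rev_id = p.page_latest
JOIN text t     ON t.old_id = r.rev_text_id
WHERE p.page_namespace = 0   -- main namespace
  AND p.page_title = '$1';
SQL
}
# usage: build_page_query Main_Page | mysql -u wiki -p wikidb
```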
wget
Captures and corrects links (though, for me, not as relative links); it can technically capture CSS too.
wget --recursive \
     --page-requisites \
     --adjust-extension \
     --convert-links \
     --no-parent \
     -R "*Special*" \
     -R "Special*" \
     -R "*action=*" \
     -R "*printable=*" \
     -R "*oldid=*" \
     -R "*title=Talk:*" \
     -R "*limit=*" \
     "https://yourwiki.com"
Zim
xmldump2zim    create zimfile from a mediawiki XML dump
wget-2-zim     bash script to scrape mediawiki to zimfile
zim-tools      includes zimwriterfs, which dumps mediawiki to zimfile
mwoffliner     scrape a mediawiki to zimfile
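For the zim-tools route, a hedged sketch of a zimwriterfs call over a static HTML dump (e.g. the wget output above). Flag names follow zim-tools' help text and vary between versions (older releases use `--favicon`, newer ones `--illustration`); verify with `zimwriterfs --help`:

```shell
# make_zim HTML_DIR TITLE -> HTML_DIR.zim
# Assumes index.html and favicon.png exist inside HTML_DIR, and that
# your zim-tools version accepts these flag spellings.
make_zim() {
    zimwriterfs --welcome=index.html \
                --favicon=favicon.png \
                --language=eng \
                --title="$2" \
                --description="Offline copy of $2" \
                --creator=me \
                --publisher=me \
                "$1" "$1.zim"
}
# usage: make_zim ./yourwiki.com "My Wiki"
```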
Delete Revision History
cd /usr/local/www/mediawiki/maintenance
php deleteOldRevisions.php --delete
Run it without --delete first: it then only reports how many old revisions would be removed, without touching the database.