MediaWiki
Debug
LocalSettings.php

```php
$wgShowExceptionDetails = true;
error_reporting( -1 );
ini_set( 'display_errors', 1 );
$wgDebugToolbar = true;
$wgShowDebug = true;
$wgDevelopmentWarnings = true;
$wgShowDBErrorBacktrace = true;
$wgShowSQLErrors = true;
$wgDebugLogFile = "/tmp/mediawiki-debug.log";
```
```shell
$ php maintenance/eval.php
> print $wgLocaltimezone
Europe/Paris
```
Jobs
By default, jobs are run at the end of a web request.
It is recommended that you instead schedule the running of jobs completely in the background via cron or a systemd service.
LocalSettings.php

```php
# Set to 0 to prevent jobs from running during ordinary requests.
$wgJobRunRate = 0;
```
To check the number of jobs in the queue:

```
api.php?action=query&meta=siteinfo&siprop=statistics&format=jsonfm
```
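The same statistics endpoint can be queried and parsed from a script. A minimal sketch, assuming a wiki URL and a hard-coded sample response so the snippet is self-contained:

```shell
# Assumed api.php location; adjust to your wiki (assumption for illustration).
WIKI_URL="https://wiki.example.org/api.php"
# Normally fetched with:
#   curl -s "$WIKI_URL?action=query&meta=siteinfo&siprop=statistics&format=json"
# A sample response is hard-coded here so the snippet is self-contained.
response='{"query":{"statistics":{"pages":120,"articles":40,"edits":900,"jobs":17}}}'
# Extract the "jobs" field (jq would also work: jq .query.statistics.jobs)
jobs=$(printf '%s' "$response" | sed -n 's/.*"jobs":\([0-9]*\).*/\1/p')
echo "Jobs in queue: $jobs"
```

This makes it easy to alert when the queue grows beyond a threshold.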
```shell
# Execute all queued jobs manually
cd /var/www/mediawiki
sudo -u www-data php maintenance/run.php --script runJobs.php --conf LocalSettings.php
```
Systemd service
/etc/systemd/system/mw-jobs-queue.service

```ini
[Unit]
Description=Mediawiki jobs queue service

[Service]
ExecStartPre=/bin/sleep 60
ExecStart=/usr/local/bin/mw-jobs-runner
Nice=10
ProtectSystem=full
User=www-data
OOMScoreAdjust=200
StandardOutput=journal

[Install]
WantedBy=multi-user.target
```
/usr/local/bin/mw-jobs-runner

```bash
#!/bin/bash
MW_INSTALL_PATH="/var/www/mediawiki"
RUN_JOBS="$MW_INSTALL_PATH/maintenance/run.php --script runJobs.php --conf $MW_INSTALL_PATH/LocalSettings.php --maxtime=3600"

echo "Starting Mediawiki jobs runner..."
while true; do
    # Job types that need to run ASAP, no matter how many of them are in the queue.
    # These jobs should be very "cheap" to run.
    /usr/bin/php $RUN_JOBS --type="enotifNotify"
    # Everything else, limiting the number of jobs in each batch.
    # The --wait parameter pauses execution here until new jobs are added,
    # to avoid looping with nothing to do.
    /usr/bin/php $RUN_JOBS --wait --maxjobs=20
    # Wait a few seconds to let the CPU handle other things, such as web requests.
    echo "Waiting for 10 seconds..."
    sleep 10
done
```
Cron
```shell
# Add a cron task to run all jobs periodically.
# Edit the crontab for user www-data:
sudo -u www-data crontab -e
# Run runJobs.php every 5 minutes:
*/5 * * * * php /var/www/mediawiki/maintenance/run.php --script runJobs.php --conf /var/www/mediawiki/LocalSettings.php
```
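If a runJobs batch can take longer than the cron interval, runs may start to overlap. A common guard is to wrap the command in `flock` so a new run is skipped while the previous one still holds the lock (a sketch; the lock-file path is an assumption):

```
*/5 * * * * flock -n /tmp/mw-jobs.lock php /var/www/mediawiki/maintenance/run.php --script runJobs.php --conf /var/www/mediawiki/LocalSettings.php
```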
Force a refresh of the categories
```shell
sudo -u www-data php maintenance/run.php --script refreshLinks.php --conf LocalSettings.php
```
Backup
Before making a backup, freeze the site so that the backup is consistent.
Then back up both the database and the files.
Stop modifications to the site
LocalSettings.php

```php
# Locks the database and displays this message on edit pages
$wgReadOnly = 'Backup in progress, access will be restored shortly.';
```
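An alternative that avoids editing LocalSettings.php on every backup: MediaWiki's `$wgReadOnlyFile` setting puts the wiki in read-only mode whenever the named file exists, using the file's contents as the displayed message (the path below is an assumption):

```php
$wgReadOnlyFile = "/var/www/mediawiki/read-only.msg";
```

A backup script can then freeze the site with `echo 'Backup in progress' > read-only.msg` and unfreeze it by simply deleting the file.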
Database
```shell
# -x: lock all tables, -e: use extended INSERT syntax,
# -B: dump the named database(s), -r: write the result to a file
sudo mariadb-dump -x -e -B mediawiki -r mediawiki.sql
```
Script
backup_mediawiki.sh

```bash
#!/bin/bash
set -u
set -e
set -o pipefail

# configuration
wiki_path='/var/www/wiki'
backup_path='/root/backup/wiki'

# functions
freeze_website() {
    sed '2i\$wgReadOnly = "Backup in progress, access will be restored shortly.";' -i ${wiki_path}/LocalSettings.php
}

backup_files() {
    cd ${wiki_path%/*}
    tar cf - $(basename ${wiki_path}) | pigz > ${backup_path}/$(date +%Y-%m-%d-%H-%M-%S).tar.gz
    cd
}

backup_databases() {
    mysqldump mediawiki | pigz > ${backup_path}/$(date +%Y-%m-%d-%H-%M-%S)_wiki.sql.gz
}

unfreeze_website() {
    sed '2d' -i ${wiki_path}/LocalSettings.php
}

rotate_backups() {
    find ${backup_path} -type f -mtime +7 -name '*.gz' -delete
}

##################################################
freeze_website
backup_files
backup_databases
unfreeze_website
rotate_backups
```
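The freeze/unfreeze trick in the script relies on GNU sed inserting the `$wgReadOnly` line as line 2 of LocalSettings.php and deleting line 2 again afterwards. A self-contained demonstration against a throwaway file (the file contents are a made-up stand-in for a real LocalSettings.php):

```shell
tmp=$(mktemp)
printf '<?php\n$wgSitename = "Demo";\n' > "$tmp"
# freeze: insert the read-only line before line 2 (it becomes the new line 2)
sed '2i\$wgReadOnly = "Backup in progress";' -i "$tmp"
frozen=$(sed -n 2p "$tmp")
echo "$frozen"
# unfreeze: delete line 2 again, restoring the original file
sed '2d' -i "$tmp"
restored=$(sed -n 2p "$tmp")
echo "$restored"
rm -f "$tmp"
```

Note this assumes the opening `<?php` is the only content on line 1; if line 1 of your LocalSettings.php already contains settings, the `2d` in unfreeze would delete the wrong line.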
Reduce size of the database
```shell
# Delete the history of deleted pages
sudo -u www-data php maintenance/deleteArchivedRevisions.php --delete
# Delete old page revisions
# php maintenance/rebuildall.php
php maintenance/deleteOldRevisions.php --delete
```