[[User:Thiebaut|D. Thiebaut]] ([[User talk:Thiebaut|talk]]) 16:44, 13 January 2018 (EST)
----
<br />
{|
|
<bluebox>
This page lists the rough steps I followed to set up (in my case, move) an existing MediaWiki installation on AWS.  The list is mostly a reminder to myself of the steps I took.
</bluebox>
|
[[Image:MediaWiki.png|200px]]
|}
<br />
<br />
<onlydft>

=Resources=

* [[Media:dftwiki_to_EC2_diagrams.key.zip]]

=Organization=

[[Image:dftwiki_to_EC2.png|center|700px]]
<br />
[[Image:dftwiki_to_EC2_2.png|center|700px]]
<br />
[[Image:dftwiki_to_EC2_3.png|center|700px]]
<br />
=MediaWiki Installation/Upgrade=

* Follow the steps from [https://www.mediawiki.org/wiki/Manual:Upgrading https://www.mediawiki.org/wiki/Manual:Upgrading].
* Download the new MediaWiki release into a directory served by the Web server.
* Unzip/untar it.
* Create a new MySQL user/password/database on localhost (the same host as the Web server).
* Verify that the base directory of the MediaWiki installation is reachable from a browser by putting a phpinfo.php file in it and loading it in a browser.
* Point a browser to /mw-config in the MediaWiki Web directory and enter the database information.
* Update the LocalSettings.php file.
* If migrating, copy the extensions, skins, images, media, and Downloads directories from the old wiki to the newly installed wiki.
* Install the extensions one at a time in the extensions directory, loading each one at the end of LocalSettings.php (a sketch of these steps appears below).
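A minimal sketch of the download, database-creation, and phpinfo.php checks listed above, assuming MediaWiki 1.30.0, a Web root of /data/html, and placeholder credentials; adjust the version, paths, user, and password to the actual installation:

<source lang="bash">
# Fetch and unpack MediaWiki into the Web server's directory tree.
# The version, paths, and credentials below are placeholders.
cd /data/html
wget https://releases.wikimedia.org/mediawiki/1.30/mediawiki-1.30.0.tar.gz
tar -xzf mediawiki-1.30.0.tar.gz
mv mediawiki-1.30.0 dftwiki3

# Create the MySQL database and user on localhost.
mysql -u root -p <<'EOF'
CREATE DATABASE dftwiki3;
CREATE USER 'thiebaut'@'localhost' IDENTIFIED BY 'xxxxxxxx';
GRANT ALL PRIVILEGES ON dftwiki3.* TO 'thiebaut'@'localhost';
FLUSH PRIVILEGES;
EOF

# Quick check that the directory is served by Apache:
# browse to http://<host>/dftwiki3/phpinfo.php afterwards.
echo '<?php phpinfo();' | sudo tee /data/html/dftwiki3/phpinfo.php
</source>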
==Updated Extensions==

I had to update several extensions to match the new extension format.  Below is the updated ''onlydft'' extension, as an example:
<source lang="php">
<?php
// Adapted from http://www.mediawiki.org/wiki/Manual:Extensions (DFT)
// Registers the <onlydft> parser tag; its contents are shown only to user "thiebaut".

// Legacy (pre-ParserFirstCallInit) registration, kept for reference:
//if ( defined( 'MW_SUPPORTS_PARSERFIRSTCALLINIT' ) ) {
//    $wgHooks['ParserFirstCallInit'][] = 'onlydftSetup';
//} else { // Otherwise do things the old-fashioned way
//    $wgExtensionFunctions[] = 'onlydftSetup';
//}

$wgHooks['ParserFirstCallInit'][] = 'OnlyDftExtension::onParserSetup';

class OnlyDftExtension {

    public static function onParserSetup( Parser $parser ) {
        $parser->setHook( 'onlydft', 'OnlyDftExtension::onlyDftRender' );
    }

    public static function onlyDftRender( $input, array $args, Parser $parser, PPFrame $frame ) {
        global $wgUser;

        // The output depends on the logged-in user, so the page cannot be cached.
        $parser->disableCache();

        $userName = $wgUser->getName();
        $output   = $parser->recursiveTagParse( $input, $frame );
        if ( strcmp( strtolower( $userName ), "thiebaut" ) == 0 ) {
            $output = "<br /><hr />" . $output . "<br /><hr />";
        } else {
            $output = "<br /><center><font color=\"orange\">...</font></center><br />\n";
        }
        return $output;
    }
}
</source>
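A minimal sketch of how the extension would then be loaded at the end of LocalSettings.php, assuming the file above is saved as extensions/OnlyDft/OnlyDft.php (the path and file name are assumptions):

<source lang="php">
// At the end of LocalSettings.php.  The extension file name/path is an assumption.
require_once "$IP/extensions/OnlyDft/OnlyDft.php";
</source>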
=AWS=

==EC2==
* Go for a free-tier t2.micro instance with an attached 20-GB EBS volume (a launch sketch appears below).
* Set the OS to Ubuntu Server 16.04.
* EC2 Instance i-0ad8bd01648543063
* Install MediaWiki on it, without data.
* Install emacs.
* Make an AMI out of it.
* Move the images, extensions, skins, etc., to the newly installed MediaWiki.
* Install ddclient.  Map the domain to hadoop02.dyndns.org.
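A hedged sketch of the equivalent launch from the AWS CLI (the instance above was created through the console); the AMI id, key-pair name, and security group are placeholders:

<source lang="bash">
# Launch a free-tier t2.micro running Ubuntu Server 16.04 with a 20-GB EBS root volume.
# ami-xxxxxxxx, my-keypair, and sg-xxxxxxxx are placeholders.
aws ec2 run-instances \
    --image-id ami-xxxxxxxx \
    --instance-type t2.micro \
    --key-name my-keypair \
    --security-group-ids sg-xxxxxxxx \
    --block-device-mappings '[{"DeviceName":"/dev/sda1","Ebs":{"VolumeSize":20,"VolumeType":"gp2"}}]'
</source>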
==S3==
* Create a bucket for the backups (backup-dft-hadoop02).
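For reference, a one-line sketch of creating the same bucket from the AWS CLI; the bucket above was actually created through the console, and the region is an assumption:

<source lang="bash">
# Create the backup bucket; the region is an assumption.
aws s3 mb s3://backup-dft-hadoop02 --region us-east-1
</source>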
==Backups==
* Install duplicity and duply.
* In the process, create a GnuPG key for the encryption of the backups on S3.

  duply test create

: creates a profile named ''test'' for backing up the MediaWiki tree in /data/html/dftwiki3.
* All the information is in /home/ubuntu/.duply/test/conf.
* Duply conf file:
::<source lang="text">
GPG_KEY='88E405A1'
GPG_PW='auxmachines'
TARGET='s3://s3.amazonaws.com//backup-dft-hadoop02/dftwiki3'
TARGET_USER='AKIAINNPVCF6X257DRPQ'
TARGET_PASS='GE06l3VK1QHpxFB5Arywu6MJ3xaeWlNfDzoXdyns'
SOURCE='/data/html/dftwiki3'
MAX_AGE=1M
</source>
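With the profile in place, the backup can be run and checked by hand; a few typical duply invocations, run as the user that owns ~/.duply/test (the restore path is just an example):

<source lang="bash">
# Run a backup of /data/html/dftwiki3 to the S3 bucket defined in the profile.
duply test backup

# List the backup chains currently stored on S3.
duply test status

# Restore the latest backup into a scratch directory (path is an example).
duply test restore /tmp/dftwiki3-restore
</source>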
* Create a script in /usr/local/bin, called backupDftWiki3MySql.sh, that dumps the MediaWiki database to an sql file and stores it with the files backed up to S3:

<source lang="bash">
#! /bin/bash
# This will be called from a root cron job.
# Dump the dftwiki3 database into the directory that duply backs up to S3.
mysqldump --user=thiebaut --password=xxxxxxxxxx dftwiki3 > /data/html/dftwiki3/backups/dftwiki3.sql 2> /dev/null
cd /data/html/dftwiki3/backups
gzip -f dftwiki3.sql
</source>
==Route 53==
* Created the new domain dominiquefthiebaut.com.
* Create an A record to map it to the EC2 instance (a CLI sketch appears below).
* Find the public IP of the EC2 instance (35.153.83.30) and put it in the record's edit box.  Accept the defaults.
* Disable ddclient on the EC2 instance and with dyndns.
* Change the name of the host in /etc/hosts from hadoop02 to the new hostname.
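A hedged equivalent of the console steps above, using the AWS CLI; the hosted-zone id Z1EXAMPLE is a placeholder:

<source lang="bash">
# Map dominiquefthiebaut.com to the instance's public IP with an A record.
# Z1EXAMPLE is a placeholder for the real hosted-zone id.
aws route53 change-resource-record-sets \
    --hosted-zone-id Z1EXAMPLE \
    --change-batch '{"Changes":[{"Action":"UPSERT","ResourceRecordSet":
      {"Name":"dominiquefthiebaut.com.","Type":"A","TTL":300,
       "ResourceRecords":[{"Value":"35.153.83.30"}]}}]}'
</source>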
=Crontab=

==Backup MediaWiki EC2 on AWS S3==
* The crontab runs as root on the Ubuntu EC2 instance:

::<source lang="text">
# ------------------------------------------------
# use mysqldump to dump the mysql database for dftwiki3
# into a file called dftwiki3.sql in /data/html/dftwiki3/backups
0 3 * * * /usr/local/bin/backupDftWiki3MySql.sh

# ------------------------------------------------
# use duplicity (via duply) to back up /data/html/dftwiki3's main
# directories to aws s3.  The configuration, including which S3
# bucket to use, is in ~ubuntu/.duply/test/conf
0 4 * * * env HOME=/home/ubuntu duply test backup
</source>
=Script to Copy MediaWiki from Xgridmac2 to EC2=

==On Xgridmac2==
::<source lang="bash">
#! /bin/bash

# -------------------------------------------------------------------------
# copy the MySQL database from cirrus to xgridmac2
# -------------------------------------------------------------------------
# back up the dftwiki3 database to a local zipped sql file
cd ~/dftwiki3
ssh -t dthiebaut@www.science.smith.edu '( mysqldump -u thiebaut -ptaratata dftwiki3 )' > dftwiki3.sql 2> /dev/null
gzip -f dftwiki3.sql

# -------------------------------------------------------------------------
# copy the wiki directory tree on cirrus (only the important files) to xgridmac2
# -------------------------------------------------------------------------
# create zipped tar of dftwiki images and media
#ssh -t dthiebaut@www.science.smith.edu '( cd /var/www/departments/cs/dftwiki ; tar -czf /Users/dthiebaut/temp/dftwiki.tgz images media skins 2> /dev/null )' 2> /dev/null
#rsync -az -e ssh dthiebaut@www.science.smith.edu:temp/dftwiki3.tgz dftwiki.tgz 2> /dev/null

for dir in images media skins extensions ; do
  rsync -az -e ssh dthiebaut@www.science.smith.edu:/var/www/departments/cs/dftwiki/$dir . 2> /dev/null
done

# -------------------------------------------------------------------------
# copy dftwiki3.sql from xgridmac2 to the MySQL dftwiki3 database on
# the EC2 instance
# -------------------------------------------------------------------------
gunzip -c dftwiki3.sql.gz | ssh -t ubuntu@dominiquefthiebaut.com "cat - | mysql -u thiebaut -ptaratata dftwiki3 " 2> /dev/null

# -------------------------------------------------------------------------
# rsync the skins, images, and media directories from ~/dftwiki3/ to the EC2
# instance, in /data/html/dftwiki3/
# -------------------------------------------------------------------------
cd ~/dftwiki3/
for dir in images/ skins/ media/ ; do
    fullDir=/data/html/dftwiki3/$dir
    rsync -az -e ssh --rsync-path="sudo rsync" $dir ubuntu@dominiquefthiebaut.com:$fullDir
    ssh -t ubuntu@dominiquefthiebaut.com sudo chown -R www-data:www-data $fullDir 2> /dev/null
done
</source>
<br />

=Log of Transfer to Virtual Host=

== Current versions on wiki Special page ==

* http://www.science.smith.edu/dftwiki/index.php/Special:Version
* MediaWiki 1.20.5
* PHP 5.5.9-1ubuntu4.20 (apache2handler)
* MySQL 5.5.53-0ubuntu0.14.04.1
== dftwiki/includes/DefaultSettings.php ==

 /** Database host name or IP address */
 $wgDBserver = 'localhost';
 /** Database port number (for PostgreSQL) */
 $wgDBport = 5432;
 /** Name of the database */
 $wgDBname = 'my_wiki';
 /** Database username */
 $wgDBuser = 'wikiuser';
 /** Database user's password */
 $wgDBpassword = '';
 /** Database type */
 $wgDBtype = 'mysql';
 /** Whether to use SSL in DB connection. */
 $wgDBssl = false;
 /** Whether to use compression in DB connection. */
 $wgDBcompress = false;
== dftwiki/LocalSettings.php ==

 ## Database settings
 $wgDBtype          = "mysql";
 # change to localhost on 7/26/11
 #$wgDBserver        = "maven.smith.edu";
 #$wgDBserver        = "scinix.smith.edu";
 $wgDBserver        = "localhost";
 $wgDBname          = "dftwiki";
 $wgDBuser          = "thiebaut";
 $wgDBpassword      = "tataratata";
== Backup dftwiki on cirrus ==

* On xgridmac2:

 ssh -Y dthiebaut@www.science.smith.edu
 cd /var/www/departments/cs/dftwiki
 
 cd /var/www/departments/cs
 tar --ignore-failed-read -czf dftwiki_010418.tgz dftwiki/
 
 cd /var/www/departments/cs
 mysqldump --user=thiebaut --password=taratata dftwiki > dftwiki_010418.sql
 tar -czvf dftwiki_010418.sql.tgz dftwiki_010418.sql

* rsync the tgz files to xgridmac2, then to hadoop0:temp/.
== Add a VM with Ubuntu Server on hadoop0 ==

== Run VirtualBox on hadoop0 to create hadoop01 ==

== Once on hadoop01 ==
* Install ddclient.  /etc/ddclient.conf:

 # Configuration file for ddclient generated by debconf
 #
 # /etc/ddclient.conf
 
 protocol=dyndns2
 use=web, web=checkip.dyndns.com, web-skip='IP Address'
 server=members.dyndns.org
 login=dthiebaut
 password='super##man'
 hadoop01.dyndns.org
* sudo apt-get install emacs
* Set the network to bridged.
* Reset hadoop01.
* Edit /etc/apache2/mysql/mysqld…/*.conf and replace 127.0.0.1 with 0.0.0.0 to allow access from outside.
* On hadoop01:

 mysql -u root -p
* Create the user thiebaut with password taratata, and the dftwiki database:

     create user 'thiebaut'@'localhost' identified by 'taratata';
     grant all privileges on *.* to 'thiebaut'@'localhost';
 
     CREATE DATABASE dftwiki;
     grant all privileges on dftwiki.* to 'thiebaut'@'localhost';

* At the bash prompt, add "USE dftwiki;" as the first line of the mysqldump sql file:

 echo "USE dftwiki;" > newfile.sql
 cat dftwiki_010418.sql >> newfile.sql
 mv newfile.sql dftwiki_010418.sql
== Restore the database ==

* mysql -u thiebaut -p < dftwiki_010418.sql

:: (takes a long time)

* sudo apt-get install php5.6-gd php5.6-mysql php5.6-dom php5.6-cli php5.6-json php5.6-common php5.6-mbstring php5.6-opcache php5.6-readline

* Download mediawiki-1.27.4.
==dftwiki==

* dftwiki works.
==dftwiki2==

* Download MediaWiki 1.27.
* Install a new skin, Chameleon:
* Install composer.
* Do the installation of the Chameleon skin.
* Edit composer.json and add

     "mediawiki/chameleon-skin": "~1.0"

:: to the "require" section of composer.json.  Don't forget the comma before that line (a sketch appears below).

* composer update "mediawiki/chameleon-skin"
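A hedged sketch of what the relevant part of composer.json looks like after that edit; the other entry shown in "require" is a placeholder and will differ on a real installation:

<source lang="text">
{
    "require": {
        "php": ">=5.5.9",
        "mediawiki/chameleon-skin": "~1.0"
    }
}
</source>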
* Remove the discussion tab:
* Type "MediaWiki:Common.css" in the wiki search box, then edit the page and add

 #ca-talk {
     display: none !important;
 }

:: to it.
* Updated the onlydft and onlysmith extensions.
* The Meta Tag extension is not working…  It may need to be rewritten as a class.
== Upgrade to MW 1.30 ==

* Make dominique use www-data as its default group:

 sudo usermod -g www-data dominique

* Back up the mysql database and the directory tree:
* For mysql, back up dftwiki2 from hadoop0 into an sql file.
* For the directory tree, just create a zip file and rsync it somewhere else.

* Create a new database and copy the sql backup into it.

* Download mediawiki 1.30.
* Untar it.
* Put it in /var/www/html/dftwiki3.

* Point a browser to hadoop01.dyndns.org/mw-config.
* Give it the info on the database containing the copy of the working database.

* Create the Chameleon skin:
* emacs -nw composer.json, and add "mediawiki/chameleon-skin": "~1.0" in the "require" section.
* sudo composer update "mediawiki/chameleon-skin"

* Import/rsync one extension at a time into LocalSettings.php.

* Pick chameleon (with the navhead layout) as the default skin (see the LocalSettings.php sketch below).
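A hedged sketch of the LocalSettings.php lines that would make Chameleon (with the navhead layout) the default skin; the layout setting follows the Chameleon documentation and should be checked against the installed version:

<source lang="php">
// Load the Chameleon skin (installed through composer) and make it the default.
wfLoadSkin( 'chameleon' );
$wgDefaultSkin = 'chameleon';

// Use the "navhead" layout shipped with Chameleon (path/setting per its docs;
// verify against the installed version).
$egChameleonLayoutFile = __DIR__ . '/skins/chameleon/layouts/navhead.xml';
</source>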
=Install LAMP on Ubuntu EC2=

== Install LAMP on Ubuntu on AWS==
* Pick the free tier.
* Ubuntu server.

 sudo apt-get update
 sudo apt-get dist-upgrade
 sudo apt-get install apache2
 sudo apt-get install emacs
 sudo a2enmod rewrite
 sudo apt-get install php libapache2-mod-php php-mcrypt
 ip addr show eth0 | grep inet | awk '{ print $2; }' | sed 's/\/.*$//'
 sudo apt-get install mysql-server
 sudo adduser ubuntu www-data
 sudo emacs /etc/hosts -nw
 
 cat /etc/hosts
 
 127.0.0.1 localhost
 127.0.0.1 ip-172-30-0-30
 
 # The following lines are desirable for IPv6 capable hosts
 ::1 ip6-localhost ip6-loopback
 fe00::0 ip6-localnet
 ff00::0 ip6-mcastprefix
 ff02::1 ip6-allnodes
 ff02::2 ip6-allrouters
 ff02::3 ip6-allhosts
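The wiki ends up being served from /data/html/dftwiki3 (see the backup sections above); here is a hedged sketch of an Apache site file that would serve that tree. The actual configuration used is not recorded on this page, so the file name and directives are assumptions:

<source lang="text">
# /etc/apache2/sites-available/dftwiki3.conf  (file name is an assumption)
<VirtualHost *:80>
    ServerName dominiquefthiebaut.com
    DocumentRoot /data/html

    <Directory /data/html>
        Options FollowSymLinks
        AllowOverride All
        Require all granted
    </Directory>
</VirtualHost>
# enable with:  sudo a2ensite dftwiki3 && sudo systemctl reload apache2
</source>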
=Setup S3cmd on hadoop0=

* Install s3cmd.
* s3cmd --configure
::* Access Key: AKIAINNPVCF6X257DRPQ
::* Secret Key: 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
::* Encryption key: {eZDfswapkcoTXZoKBxLKw2ZmNNwiTKb
* Works.
* For the first backup, go to hadoop0, create an s3 directory on the 1TB drive, then a backup-dft-hadoop02 subdirectory, cd there, and issue the command:

 s3cmd sync -r s3://backup-dft-hadoop02 .

: This pulls the whole S3 bucket down locally.
* Created a crontab entry on hadoop0:

 0 5 * * 0 /usr/bin/s3cmd sync -q -r s3://backup-dft-hadoop02/ /mnt/1TB/s3/backup-dft-hadoop02/
</onlydft>
<br />
<br />
<br />
<br />
<br />
<br />
[[Category:AWS]][[Category:Mediawiki]][[Category:EC2]][[Category:S3]]