Tutorial: PhpRunner 9.8 and Photo Display

----

<onlydft>
=PhpRunner 9.8 on MacBook Pro=
 
* Get IP of MacBook (192.168.1.151); see the sketch below
* Start Parallels
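A minimal way to look up that address from the MacBook's terminal (en0 is an assumption; use whichever interface is actually active):

<source lang="bash">
# print the IPv4 address of the built-in interface (often en0 for Wi-Fi on a MacBook)
ipconfig getifaddr en0
</source>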
 
echo "rsyncing php dir to hadoop01"
 
echo "rsyncing php dir to hadoop01"
 
rsync -azv ~/Documents/PHPRunnerOutput/* dominique@hadoop01.dyndns.org:/var/www/html/photos/
 
rsync -azv ~/Documents/PHPRunnerOutput/* dominique@hadoop01.dyndns.org:/var/www/html/photos/
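# the block below keeps the files/ upload directory on the server group-writable
# (presumably so the web-server account can write uploaded photos into it)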
ssh -tt dominique@hadoop01.dyndns.org <<EOF
chmod g+w /var/www/html/photos/files
exit
EOF
  
 
# dump photo database to sql file
  
 
</source>
</onlydft>
 
 
<br />
 
=Create Photo Project=
 
* Style Editor
::* Bootstrap1/Darkly
* Output Directory
::* Z:\Documents\PHPRunnerOutput\
* Server Database Connections
::* Create new connection
:::* $host="localhost";
:::* $user="nohoSkies";
:::* $pwd="xxxxxxxxxxx";
:::* $port="";
:::* $sys_dbname="NohoSkies";
::* Use the new connection just created
<br />
==Setup Server==
* Make sure to create a '''files''' subdirectory and set its permissions to g+w, as sketched below.
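A minimal sketch of that step, assuming the site lives under /var/www/html/photos on the web server (the rsync target used earlier); run it on the server itself:

<source lang="bash">
# create the upload directory next to the generated PHP files and make it
# group-writable, so the web-server group can add photos to it
mkdir -p /var/www/html/photos/files
chmod g+w /var/www/html/photos/files
</source>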
==Setup PHPRunner Project on AWS==
* Connect to AWS.amazon.com
* Connect to EC2
* Verify that you can connect to the running EC2 instance using the PEM key in .ssh
* Create a '''photos''' directory in /data/html
* chown '''photos''' to '''ubuntu:www-data'''
* Create a '''files''' directory in photos
* chmod '''files''' to '''og+w''' (see the sketch below)
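The directory setup above boils down to a few commands on the EC2 instance; a sketch, assuming the same key and hostname used by the rsync commands that follow:

<source lang="bash">
# log into the EC2 instance with the PEM key
ssh -i ~/.ssh/mykeyDFT.pem ubuntu@dominiquefthiebaut.com

# then, on the instance:
sudo mkdir -p /data/html/photos/files        # photos directory plus its files/ subdirectory
sudo chown ubuntu:www-data /data/html/photos
sudo chmod og+w /data/html/photos/files      # let the web server write uploaded photos
</source>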
* rsync all the files to the /data/html/photos/ directory:

<source lang="bash">
cd ~/Documents/PHPRunnerOutput/
fullDir=/data/html/photos/
rsync -az --progress -e "ssh -i ~/.ssh/mykeyDFT.pem" \
      ~/Documents/PHPRunnerOutput/*                  \
      ubuntu@dominiquefthiebaut.com:$fullDir
</source>
* chown all the files to www-data:www-data:

<source lang="bash">
fullDir=/data/html/photos/
ssh -i "~/.ssh/mykeyDFT.pem" -t ubuntu@dominiquefthiebaut.com \
      sudo chown -R www-data:www-data $fullDir 2> /dev/null
</source>
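A quick way to confirm ownership and permissions came out right (a hypothetical spot-check, not part of the original notes):

<source lang="bash">
# both directories should now be owned by www-data, and files/ should be writable
ssh -i ~/.ssh/mykeyDFT.pem ubuntu@dominiquefthiebaut.com \
      "ls -ld /data/html/photos /data/html/photos/files"
</source>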
* Create the MySQL user and database on EC2:

<source lang="bash">
mysql -u root -p
create database NohoSkies;
CREATE USER 'nohoSkies'@'localhost' IDENTIFIED BY 'xxxxxxxx';
grant all privileges on NohoSkies . * to 'nohoSkies'@'localhost' ;
FLUSH PRIVILEGES;
quit
</source>
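To check that the new account works before pointing PHPRunner at it (a hypothetical check, not in the original notes):

<source lang="bash">
# prompts for the nohoSkies password and should list the (still empty) database
mysql -u nohoSkies -p -e "SHOW DATABASES;" | grep NohoSkies
</source>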
* Migrate the MySQL database from the MacBook to EC2:

<source lang="bash">
cd ~/Downloads/
echo "USE NohoSkies;" > NohoSkies.sql
mysqldump -u root -pvoidm%20  NohoSkies >> NohoSkies.sql

echo "rsyncing MySqlDump to EC2"
rsync -az -e "ssh -i ~/.ssh/mykeyDFT.pem" \
      NohoSkies.sql ubuntu@dominiquefthiebaut.com:Downloads/

echo "loading up MySQL dump in EC2 MySQL DB"
ssh -i "~/.ssh/mykeyDFT.pem" -tt ubuntu@dominiquefthiebaut.com <<EOF
mysql -u root -pvoidm%20 < ~/Downloads/NohoSkies.sql
exit
EOF
</source>
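To verify that the import went through, the tables can be listed on the EC2 side (again a hypothetical spot-check):

<source lang="bash">
# -t keeps a terminal attached so the mysql password prompt works
ssh -t -i ~/.ssh/mykeyDFT.pem ubuntu@dominiquefthiebaut.com \
      "mysql -u nohoSkies -p -e 'SHOW TABLES;' NohoSkies"
</source>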
==Cron Script to Back Up the EC2 Photos Directory and MySQL to S3==
<br />
::<source lang="bash">
#! /bin/bash
# ~/bin/backupNohoSkies.sh
# D. Thiebaut
# --------------------------------------------------------------------------------

# --------------------------------------------------------------------------------
# BACKUP MYSQL
# --------------------------------------------------------------------------------
#
echo mysqldump
mysqldump --user=nohoSkies --password=voidm%20 NohoSkies > /home/ubuntu/backups/NohoSkies/NohoSkies.sql
# 2> /dev/null

# copy to S3
cd /home/ubuntu/backups/NohoSkies
gzip -f NohoSkies.sql
echo uploading sql file to S3
s3cmd  put  NohoSkies.sql.gz  s3://backup-dft-hadoop02/NohoSkies/

# --------------------------------------------------------------------------------
# BACKUP PHP FILES (ON SUNDAYS ONLY)
# --------------------------------------------------------------------------------
cd /data/html/photos/
_DATE="$(LC_ALL=C date +%A)"
if test "$_DATE" = "Sunday"
then
  echo tar-zipping php files
  tar --exclude='./files' -czf /home/ubuntu/backups/NohoSkies/NohoSkiesPhp.tgz *
  echo uploading php files to s3
  s3cmd put /home/ubuntu/backups/NohoSkies/NohoSkiesPhp.tgz s3://backup-dft-hadoop02/NohoSkies/
fi

# --------------------------------------------------------------------------------
# SYNC PHOTOS ONLY
# --------------------------------------------------------------------------------
cd /data/html/photos/files/
s3cmd sync ./ s3://backup-dft-hadoop02/NohoSkies/files/

</source>
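To run this backup automatically, the script can be installed as a cron job for the ubuntu user on the EC2 instance. A sketch, assuming the script is saved as /home/ubuntu/bin/backupNohoSkies.sh (the path in its header comment) and that s3cmd has already been configured with s3cmd --configure; the 2:15 AM schedule is an arbitrary choice:

<source lang="bash">
chmod +x /home/ubuntu/bin/backupNohoSkies.sh

# edit the ubuntu user's crontab ...
crontab -e

# ... and add a line such as this one: run nightly at 2:15 AM and keep a log
15 2 * * * /home/ubuntu/bin/backupNohoSkies.sh >> /home/ubuntu/backups/backup.log 2>&1
</source>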
 
<br />
<br />
</onlydft>
<br />


D. Thiebaut (talk) 10:38, 25 May 2018 (EDT)



...