# Current Wiki Setup

## Hosting

Currently using a free-tier AWS EC2 instance with the following parameters:

- Instance type: t2.micro
- vCPUs: 1
- RAM: 1 GiB
- Region: us-west-1
- OS: Ubuntu 18.04 LTS
- Storage: EBS 30 GiB gp2 SSD

Protip: save the `.pem` key into `~/.ssh` and add an alias to `~/.zshrc` to quickly connect to the EC2 instance, like so:

```bash
alias dokuconnect="ssh -i ~/.ssh/<key-name>.pem <user>@<public-dns>.compute.amazonaws.com"
```

## Domains

Both hosted on Namecheap since it's cheaper than Route 53.

1. wiki.smirnov.nyc: set up an A record to point here
2. sergey.wiki: set up a CNAME and an A record

## Software

### NGINX

Stored the following config in `sites-available` and symlinked it into `sites-enabled`:

```nginx
server {
    listen 80;
    listen [::]:80;
    server_name sergey.wiki wiki.smirnov.nyc;
    return 301 https://$server_name$request_uri;
}

server {
    listen [::]:443 ssl;
    listen 443 ssl;
    server_name sergey.wiki wiki.smirnov.nyc;

    # Maximum file upload size is 4MB - change accordingly if needed
    client_max_body_size 4M;
    client_body_buffer_size 128k;

    root /var/www/dokuwiki;
    index doku.php;

    ssl_certificate /etc/letsencrypt/live/sergey.wiki/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/sergey.wiki/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot

    # Remember to comment the block below out while installing, and uncomment it when done.
    location ~ /(conf/|bin/|inc/) { deny all; }

    # Support for X-Accel-Redirect
    location ~ ^/data/ { internal; }

    location ~ ^/lib.*\.(js|css|gif|png|ico|jpg|jpeg)$ { expires 365d; }

    location / { try_files $uri $uri/ @dokuwiki; }

    location @dokuwiki {
        # rewrites "doku.php/" out of the URLs if you set the userewrite setting to .htaccess in the DokuWiki config page
        rewrite ^/_media/(.*) /lib/exe/fetch.php?media=$1 last;
        rewrite ^/_detail/(.*) /lib/exe/detail.php?media=$1 last;
        rewrite ^/_export/([^/]+)/(.*) /doku.php?do=export_$1&id=$2 last;
        rewrite ^/(.*) /doku.php?id=$1&$args last;
    }

    location ~ \.php$ {
        try_files $uri $uri/ /doku.php;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_param REDIRECT_STATUS 200;
        fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;
        # fastcgi_pass unix:/var/run/php5-fpm.sock; # old PHP version
    }
}
```

### DokuWiki

Running 2018-04-22c "Greebo" with a standard setup. An update to 2020-06-01 "Hogfather" is pending, but since it's still a release candidate and the first major release in two years, I'll hold off.

#### Plugins

Apart from the default extensions that come with DokuWiki, these plugins have greatly improved its usability for me:

- [Add new page](https://www.dokuwiki.org/plugin:addnewpage)
- [Advanced Dokuwiki](https://www.dokuwiki.org/plugin:advanced)
- [Dw2PDF](https://www.dokuwiki.org/plugin:Dw2Pdf)
- [GOTO](https://www.dokuwiki.org/plugin:goto)
- [Indexmenu](https://www.dokuwiki.org/plugin:indexmenu)
- [Markdowku](http://www.tolledomain.ch/dokuwiki/doku.php?id=projects:markdowku)
- [nspages](http://www.dokuwiki.org/plugin:nspages)

### S3 Backup

#### Initial setup

1. Create an S3 bucket with a unique name
2. Write down the Access Key ID and Secret Access Key from your user page on AWS
3. Install the AWS CLI on the EC2 instance
4. Install Python 3 and pip3
5. Run `aws configure` and enter your keys and region (a quick credentials check is sketched below)
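Before wiring up the backup script, it's worth confirming that the credentials saved by `aws configure` are visible to boto3 and that the bucket is reachable. A minimal sketch, assuming the same placeholder bucket name as in the backup script below:

```python
import boto3
from botocore.exceptions import ClientError

BUCKET_NAME = "your-bucket-name-here"  # placeholder, same as in the backup script

s3 = boto3.client("s3")
try:
    # head_bucket succeeds only if the bucket exists and the credentials can reach it
    s3.head_bucket(Bucket=BUCKET_NAME)
    print("Bucket reachable, credentials OK")
except ClientError as e:
    # e.g. "403" means no permission, "404" means the bucket does not exist
    print("S3 check failed:", e.response["Error"]["Code"])
```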
#### Python3 script

Install boto3 with `pip3 install boto3`, then modify the following file to your needs:

```python
import datetime
import os
import re
import tarfile

import boto3
from botocore.exceptions import ClientError

# Paths on the EC2 instance
WIKI_PATH = "/var/www/dokuwiki"
BACKUP_PATH = "~/backups"

# Not used directly; boto3 reads the credentials saved by `aws configure`
AWS_ACCESS_KEY = "your_access_key_here"
AWS_SECRET_KEY = "your_secret_key_here"

BUCKET_NAME = "your-bucket-name-here"
BUCKET_KEY_PREFIX = "dokuwiki/"

# Directories that hold the wiki content and configuration
TARGET_DIRS = ['conf', 'data/attic', 'data/media', 'data/meta', 'data/pages']

dirs = [WIKI_PATH + '/' + d for d in TARGET_DIRS]

# Timestamp the archive with the digits of the current ISO timestamp,
# e.g. wiki-20200601100000123456.tar.gz
datestring = re.sub('[^0-9]', '', datetime.datetime.now().isoformat())
filename = '{}/wiki-{}.tar.gz'.format(BACKUP_PATH, datestring)

# Make sure the local backup directory exists, then build the tarball
os.makedirs(os.path.expanduser(BACKUP_PATH), exist_ok=True)
with tarfile.open(os.path.expanduser(filename), 'w:gz') as tar:
    for d in dirs:
        tar.add(d, arcname=os.path.basename(d))

# Upload the tarball to S3 under the dokuwiki/ prefix
s3 = boto3.resource('s3')
bucket = s3.Bucket(BUCKET_NAME)
try:
    with open(os.path.expanduser(filename), 'rb') as f:
        bucket.put_object(Key=BUCKET_KEY_PREFIX + os.path.basename(filename), Body=f)
except ClientError as e:
    # The error code is a string such as 'AccessDenied' or 'NoSuchBucket'
    print(e.response['Error']['Code'])
```

#### Cron job

Scheduled to run daily at 10:00 UTC using `crontab`. Run `crontab -e` and add the following line:

```
0 10 * * * python3 ~/scripts/s3-backup.py
```
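Restoring is just the reverse of the backup: pull the newest `wiki-*.tar.gz` object from the bucket and unpack it. A rough sketch, assuming the same bucket name and `dokuwiki/` key prefix as the backup script; the `~/restore` target directory is only an example:

```python
import os
import tarfile

import boto3

BUCKET_NAME = "your-bucket-name-here"           # same placeholder as the backup script
BUCKET_KEY_PREFIX = "dokuwiki/"
RESTORE_PATH = os.path.expanduser("~/restore")  # example target directory

s3 = boto3.client("s3")

# List the backup objects and pick the newest one; the numeric datestring in the
# filename sorts lexicographically in chronological order
objects = s3.list_objects_v2(Bucket=BUCKET_NAME, Prefix=BUCKET_KEY_PREFIX).get("Contents", [])
if not objects:
    raise SystemExit("No backups found in the bucket")
latest = max(objects, key=lambda o: o["Key"])

os.makedirs(RESTORE_PATH, exist_ok=True)
local_file = os.path.join(RESTORE_PATH, os.path.basename(latest["Key"]))
s3.download_file(BUCKET_NAME, latest["Key"], local_file)

# Unpack; the archive contains conf/, attic/, media/, meta/ and pages/ at the top level,
# since the backup script strips the data/ prefix via arcname
with tarfile.open(local_file, "r:gz") as tar:
    tar.extractall(RESTORE_PATH)
print("Restored", latest["Key"], "to", RESTORE_PATH)
```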