
Local development and AWS

As of the time of writing, the site is hosted on Amazon Web Services (AWS), with the codebase itself held in GitHub.

So, how do you streamline development so that everyone can stay in sync?

GitHub has been set up with pipelines that ensure that when code is merged into Master, the deployment runs on the platform, including Configuration Management importing. Jeannie set that up and I have no idea how it works, so she will write an article on that.

This article is about syncing the databases.

When someone adds content to the Production site, we need an easy way to get the database down to our local instance.

Amazon is a lot of things, but user-friendly isn't one of them, so I wrote a script that does everything for us.

All we need to do is run the following from our local command line (I use WSL Ubuntu, Jeannie uses a Mac):

./scripts/local/pull-db.sh

And that creates a database snapshot on the AWS server, downloads it locally, and imports it.

First I will give you both scripts in full, and then explain each section in detail:

The Local Script
#!/bin/bash

# Set variables
EC2_USER="ec2-user"
EC2_HOST="ec2-xx-xx-xx-xx.ap-southeast-2.compute.amazonaws.com"
PEM_KEY_PATH="$HOME/.ssh/xxxxxxxx.pem"
REMOTE_SCRIPT="/var/www/html/scripts/remote/export-db.sh"
LOCAL_BACKUP_DIR="$HOME/dispatch-backup"

# Run the remote script and capture the SQL file path
SQL_FILE_PATH=$(ssh -i "$PEM_KEY_PATH" "$EC2_USER@$EC2_HOST" "bash $REMOTE_SCRIPT")

echo "Remote SQL file path: $SQL_FILE_PATH"

read -p "Is this the correct file to download? (y/n): " confirm
if [[ "$confirm" != "y" && "$confirm" != "Y" ]]; then
  echo "Aborting."
  exit 1
fi

# Make sure the local backup directory exists, then rsync the SQL file down.
# Without the mkdir, rsync to a missing directory would write a file named
# "dispatch-backup" instead, and the import step below would fail.
mkdir -p "$LOCAL_BACKUP_DIR"
rsync -avz -e "ssh -i $PEM_KEY_PATH" "$EC2_USER@$EC2_HOST:$SQL_FILE_PATH" "$LOCAL_BACKUP_DIR/"

# Check if rsync succeeded
if [[ $? -eq 0 ]]; then
  echo "File synced successfully. Deleting remote file..."
  ssh -i "$PEM_KEY_PATH" "$EC2_USER@$EC2_HOST" "rm -f '$SQL_FILE_PATH'"
  echo "Remote file deleted."
else
  echo "rsync failed. Remote file not deleted."
fi

# Ask if user wants to import the database locally
read -p "Do you want to import the database right away? (y/n): " confirm
if [[ "$confirm" != "y" && "$confirm" != "Y" ]]; then
  echo "Skipping local import. Done."
  exit 0
else
  echo "Importing database locally..."
  ddev drush sql-drop -y
  ddev drush sql-cli < "$LOCAL_BACKUP_DIR/$(basename "$SQL_FILE_PATH")"
  echo "Database imported."
fi
The Remote Script
#!/bin/bash

# Set variables
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")
EXPORT_PATH="/home/ec2-user/db_backups"
EXPORT_FILE="drupal_db_$TIMESTAMP.sql"

# Ensure the export directory exists
mkdir -p "$EXPORT_PATH"

# Export the Drupal database using Drush
/var/www/html/vendor/bin/drush sql-dump > "$EXPORT_PATH/$EXPORT_FILE"

# Output file path for rsync
echo "$EXPORT_PATH/$EXPORT_FILE"
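A quick aside on the TIMESTAMP variable: the format string produces filenames that are unique per run and sort chronologically. A minimal sketch (using GNU date's -d flag with a fixed date so the output is predictable):

```shell
#!/bin/bash
# Same format string the remote script uses; -d pins the date so you can
# see exactly what the filename fragment looks like.
date -d "2025-01-02 03:04:05" +"%Y%m%d_%H%M%S"
# → 20250102_030405, so the dump would be drupal_db_20250102_030405.sql
```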

As an overview this is what happens:

First, we set up all the key variables we are going to need.

Second, we run a command that SSHes into the AWS EC2 instance and runs the remote script.

Second and a half, the remote script runs, which simply does a drush sql-dump, creates a file named with the current timestamp, saves it in a specific location, and returns that location.
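The hand-off between the two scripts relies on command substitution: whatever the remote script writes to stdout becomes the value of SQL_FILE_PATH. That means the remote script must print only the path. A small local simulation of that contract (fake_remote is a stand-in for the SSH call, and the path is made up):

```shell
#!/bin/bash
# Stand-in for `ssh ... "bash $REMOTE_SCRIPT"`. Like the real remote script,
# it must echo ONLY the dump's path: command substitution captures every
# line of stdout, so any stray output would corrupt SQL_FILE_PATH.
fake_remote() {
  echo "/home/ec2-user/db_backups/drupal_db_20250101_120000.sql"
}

SQL_FILE_PATH=$(fake_remote)
echo "Remote SQL file path: $SQL_FILE_PATH"
# basename is what the local script later uses to build the local file path
echo "Local file name: $(basename "$SQL_FILE_PATH")"
```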

Third, we ask the user to confirm with a simple Y/N that we have the right file.

Fourth, we rsync the SQL file to the local file system.

Fifth, we make sure the rsync worked, and if it has, we delete the file from the remote server, because we don't want a bunch of backups cluttering up the space. If the rsync fails, the file remains, so you can SSH in yourself and rsync it manually if you want.
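That delete-only-on-success behaviour can be exercised without a server. Here's a self-contained sketch, using cp as a stand-in for rsync and a temp file as the "remote" dump:

```shell
#!/bin/bash
# Simulate the keep-on-failure logic: the "remote" dump is only deleted
# when the transfer succeeds.
workdir=$(mktemp -d)
echo "-- dump --" > "$workdir/remote.sql"

# A copy to an unwritable destination stands in for a failed rsync.
if cp "$workdir/remote.sql" /nonexistent-dir/local.sql 2>/dev/null; then
  rm -f "$workdir/remote.sql"   # transfer OK: clean up the "remote" copy
else
  echo "transfer failed. Remote file not deleted."
fi

[ -f "$workdir/remote.sql" ] && echo "remote dump still available for a manual pull"
```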

Sixth, the user is asked if they want to import it right away. This was a late addition to the script; it used to just bring the file down and I would manually use Drush to import it, but then I figured: why else would I want the database synced down if I'm not going to import it right away?


And that's it.

No need to trigger a job in Lagoon, download a .sql.gz file, extract it, and use ahoy to import it.

Just run the one command and boom, your local is updated.