This article explains how to create a tool that periodically backs up your Time Machine data and uploads it to an Amazon S3 bucket. By using a custom shell script along with Swif's scheduled command, you can automate the process, keeping your backups off local iCloud accounts and safely stored in the cloud. In this guide, the script is hosted on S3 and downloaded on demand by the scheduled command.
Overview
The solution involves:
Creating a snapshot of your Time Machine backup.
Archiving the backup directory into a compressed file.
Uploading the archive to an S3 bucket using the AWS CLI.
Scheduling the script to run at regular intervals with Swif’s scheduled command.
Downloading the backup script from an S3 link each time it runs, ensuring you’re always using the latest version.
Prerequisites
Before proceeding, ensure that you have the following:
Time Machine Backup Drive: Confirm the backup drive is mounted (e.g., /Volumes/TimeMachineBackup). Adjust the path if necessary.
AWS CLI Installed and Configured: Install the AWS CLI and configure it with credentials that have write access to your designated S3 bucket.
Swif Scheduled Command Setup: Familiarize yourself with Swif's scheduled command documentation to set up and run the script on a regular schedule.
S3 Hosting for the Script: Upload your backup script (detailed in the next section) to an S3 bucket or other location where it can be downloaded. Make sure the link is accessible to the system that will run the scheduled command.
Sufficient Local Disk Space: Temporary storage is needed for both the downloaded script and the archive file (stored here in /tmp).
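The prerequisites above can be verified with a small pre-flight script before you schedule anything. This is a sketch: the helper names (check_mounted, check_aws_cli) are illustrative, and the default path matches the example drive used in this guide.

```shell
#!/bin/bash
# Pre-flight checks for the prerequisites above. Helper names and the
# default backup path are illustrative; adjust to your environment.
set -u

check_mounted() {
  # Succeeds only if the given backup path exists as a directory.
  [ -d "$1" ]
}

check_aws_cli() {
  # Succeeds only if the aws command is available on PATH.
  command -v aws >/dev/null 2>&1
}

BACKUP_DIR="${1:-/Volumes/TimeMachineBackup}"

if check_mounted "${BACKUP_DIR}"; then
  echo "backup drive mounted: ${BACKUP_DIR}"
else
  echo "backup drive NOT mounted: ${BACKUP_DIR}"
fi

if check_aws_cli; then
  echo "aws CLI found"
else
  echo "aws CLI NOT found; install it and run 'aws configure' first"
fi
```

Credential validity can be checked separately with `aws sts get-caller-identity`, which fails quickly if the configured keys are invalid.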
The Backup and Upload Script
Below is a sample bash script that creates a Time Machine snapshot, compresses the backup directory, uploads the archive to your S3 bucket, and cleans up afterward. Save this script and then upload it to your S3 bucket (e.g., s3://your-s3-bucket/scripts/backup_tm_to_s3.sh).
#!/bin/bash
set -euo pipefail
# Configuration variables
# Path to your mounted Time Machine backup drive (adjust as needed)
TIMEMACHINE_BACKUP_DIR="/Volumes/TimeMachineBackup"
# Local temporary directory for the archive file
LOCAL_TMP_DIR="/tmp"
# AWS S3 bucket destination (keep the trailing slash so uploads land under this prefix)
S3_BUCKET="s3://your-s3-bucket/time-machine-backups/"
# Timestamp format for the backup file name
TIMESTAMP=$(date +'%Y-%m-%d_%H-%M-%S')
# Archive file name
ARCHIVE_NAME="tm_backup_${TIMESTAMP}.tar.gz"
ARCHIVE_PATH="${LOCAL_TMP_DIR}/${ARCHIVE_NAME}"
# Create a local APFS snapshot (optional; note this snapshots the local
# startup volume, not the backup drive archived below)
echo "Creating a local Time Machine snapshot..."
tmutil localsnapshot
# Archive the Time Machine backup directory into a compressed file
echo "Archiving Time Machine backup from ${TIMEMACHINE_BACKUP_DIR}..."
tar -czf "${ARCHIVE_PATH}" -C "$(dirname "${TIMEMACHINE_BACKUP_DIR}")" "$(basename "${TIMEMACHINE_BACKUP_DIR}")"
# Upload the archive file to the specified S3 bucket using AWS CLI
echo "Uploading ${ARCHIVE_NAME} to S3 bucket ${S3_BUCKET}..."
aws s3 cp "${ARCHIVE_PATH}" "${S3_BUCKET}"
# Remove the local archive file to free up space
echo "Cleaning up local archive..."
rm "${ARCHIVE_PATH}"
echo "Backup and upload completed successfully."
Script Explanation
Snapshot Creation:
The command tmutil localsnapshot creates a local APFS snapshot of your startup volume, capturing its latest state before the archive is made.
Archiving:
The script uses tar -czf to compress the Time Machine backup directory. A timestamp is added to the archive file name to ensure uniqueness.
Uploading:
The AWS CLI command aws s3 cp uploads the archive file to your specified S3 bucket. Ensure your AWS credentials are properly configured.
Cleanup:
The temporary archive file is removed after a successful upload to free up disk space.
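The archiving and naming steps can be rehearsed safely before touching a real backup drive. The sketch below stands in a scratch directory for /Volumes/TimeMachineBackup and uses the same tar invocation and file-name pattern as the script above; the paths are placeholders.

```shell
#!/bin/bash
# Dry run of the archive step using a scratch directory in place of the
# real backup drive, so the tar invocation and naming can be verified safely.
set -euo pipefail

DEMO_SRC=$(mktemp -d)              # stand-in for /Volumes/TimeMachineBackup
echo "sample data" > "${DEMO_SRC}/file.txt"

TIMESTAMP=$(date +'%Y-%m-%d_%H-%M-%S')
ARCHIVE_PATH="/tmp/tm_backup_${TIMESTAMP}.tar.gz"

# Same form as the script: archive the directory by name, from its parent.
tar -czf "${ARCHIVE_PATH}" -C "$(dirname "${DEMO_SRC}")" "$(basename "${DEMO_SRC}")"

echo "created ${ARCHIVE_PATH}"
tar -tzf "${ARCHIVE_PATH}"         # list contents to confirm the structure
```

Once the listing looks right, remove the scratch directory and archive; in the real script the archive is uploaded with aws s3 cp before cleanup.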
Scheduling the Script with Swif
Instead of assuming that the script is already present locally, you can configure Swif’s scheduled command to download the script from an S3 link each time it runs. Follow these steps:
Upload the Script to S3:
After creating the script as shown above, upload it to your S3 bucket (for example, to s3://your-s3-bucket/scripts/backup_tm_to_s3.sh). Ensure that the file is publicly accessible or that the system executing the command has permission to download it.
Configure the Scheduled Command:
In your Swif scheduled command settings, use a command similar to the following. This command downloads the script to a temporary location, makes it executable, and then runs it:
#!/bin/bash
set -euo pipefail
# Define URL of the backup script on S3
SCRIPT_URL="https://your-s3-bucket.s3.amazonaws.com/scripts/backup_tm_to_s3.sh"
# Define a local path for the script
LOCAL_SCRIPT="/tmp/backup_tm_to_s3.sh"
echo "Downloading the backup script from ${SCRIPT_URL}..."
curl -sSL "${SCRIPT_URL}" -o "${LOCAL_SCRIPT}"
echo "Making the script executable..."
chmod +x "${LOCAL_SCRIPT}"
echo "Executing the backup script..."
"${LOCAL_SCRIPT}"
Set the Schedule:
Using Swif’s interface, configure the above command to run at your desired interval. For example, to run the backup every day at 2 AM, set the appropriate schedule. Refer to Swif Scheduled Command Documentation for detailed scheduling options.
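When a scheduled run fails silently, a trace is invaluable. One option is to wrap the download-and-run command with simple timestamped logging; the log path and function name below are illustrative, not part of Swif.

```shell
#!/bin/bash
# Optional wrapper adding timestamped logging around the scheduled command,
# so failed runs leave a trace. Log path and function name are illustrative.
set -euo pipefail

LOG_FILE="/tmp/tm_backup_schedule.log"

log() {
  # Appends an ISO-style timestamp plus a message to the log file.
  echo "$(date +'%Y-%m-%dT%H:%M:%S') $*" >> "${LOG_FILE}"
}

log "scheduled run started"
# ... the curl / chmod / execute steps from the command above go here ...
log "scheduled run finished"
```

Reviewing the log after the first few scheduled runs confirms the command is firing on the interval you configured.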
Important Considerations
Storage Space:
Time Machine backups can be large. Monitor the space on both your backup drive and your S3 bucket to avoid unexpected storage costs.
Network Bandwidth:
Uploading large backups may consume significant bandwidth. Consider running the backup during off-peak hours.
Security:
Ensure that your S3 bucket permissions and AWS credentials restrict unauthorized access. When hosting your script on S3, consider using signed URLs or appropriate access controls.
Testing:
Test the entire process manually before scheduling it to verify that the script downloads, executes, and uploads the backup as expected.
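For the security point above, one way to scope the backup machine's credentials is an IAM policy that allows only uploads under the backup prefix. This is a sketch; the bucket name and prefix are the placeholders used throughout this guide, and you should adapt them to your own bucket.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBackupUploadsOnly",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::your-s3-bucket/time-machine-backups/*"
    }
  ]
}
```

Attached to the user or role whose keys the backup machine uses, this grants upload access only: the credentials cannot read, list, or delete existing objects in the bucket.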
Conclusion
By following this guide, you can automate the process of backing up your Time Machine data to Amazon S3. The approach leverages a downloadable script to ensure the most recent version is used for every scheduled run, while Swif’s scheduled command automates the process. This provides a robust cloud backup solution without relying on local iCloud accounts.
If you have any questions or need further assistance, please refer to our support documentation or contact our support team.