Home Server 2023: Photo Backups

I’ve got a good terabyte of RAW photos I’m sitting on. They’ve had various homes throughout the years, from spinning disks, to SATA SSDs, to most recently a 2TB Samsung T5 USB SSD. I discussed my photo workflow a bit in this post here, but I didn’t detail how I kept these images backed up. For the longest while I relied on the Amazon Photos app running on my desktop to keep things synced to an online source, and with unlimited RAW storage for Prime members, it was a no-brainer. With the move over to the MacBook Pro and my storage sitting on an external drive, I haven’t been running that. So now that I have the home server up and running, I want to get started again on backups, and I’m going to try to follow the 3-2-1 rule for my Lightroom Classic catalog:

The basis here is that you should keep three separate copies of your data: a working copy and two backups, with one of those backups located off-site. I’ll be doing things slightly differently, but still keeping things in the spirit of the rule:

  • My working copy will stay on the Samsung T5 portable SSD

  • I’ll be storing two backups on the home server: one on the 4TB SSD, and a copy made from that on the 22TB Western Digital hard drive that should arrive within a week or two.

  • I’ll be doing a daily backup from the server to a cloud storage provider like Backblaze or Wasabi.

For the MacBook to home server backups, I’ve created an unprivileged Debian container running Samba and given it a mount point backed by the 4TB SSD. I’ve also built a shell script that rclones the RAW file directory to the Samba share if my MacBook is A: docked and B: has the external SSD attached. The script is also set up to push a notification to me via Pushover on a successful backup, with the details of the backup included:

#!/bin/zsh

# Send a Pushover notification with the supplied message as the body.
function push {
    curl -s -F "token=APPTOKENHERE" \
    -F "user=USERTOKENHERE" \
    -F "title=Photo Library Backup Notification" \
    -F "message=$1" https://api.pushover.net/1/messages.json
}

# Check that the photo SSD is mounted and the machine is docked (the dock
# provides the Thunderbolt ethernet port) before attempting a backup.
MOUNTCHECK=$(df | grep "/Volumes/External SSD")
ETHERNETCHECK=$(networksetup -listallhardwareports | grep "Thunderbolt Ethernet Slot 2")
TIMESTAMP=$(date "+%Y/%m/%d %H:%M:%S")
if [[ -z $MOUNTCHECK || -z $ETHERNETCHECK ]]; then
    echo "$TIMESTAMP INFO : One or more dependencies not available, not attempting backup." >> /path/to/log.txt 2>&1
    if [[ -z $MOUNTCHECK ]]; then
        echo "$TIMESTAMP INFO : The SSD is not mounted to /Volumes/External SSD." >> /path/to/log.txt 2>&1
    fi
    if [[ -z $ETHERNETCHECK ]]; then
        echo "$TIMESTAMP INFO : The computer is not docked." >> /path/to/log.txt 2>&1
    fi
    exit
else
    echo "$TIMESTAMP INFO : Depdendencies available, backing up photos to SMB share..." >> /path/to/log.txt 2>&1
    rclone copy --local-no-set-modtime --log-file=/path/to/log.txt 2>&1 --log-level INFO "/Volumes/External SSD/LightRoomPhotos" "photos:photos"
    OUT=$(tail -n 6 /path/to/log.txt | tr -s '[:blank:]' | tr "\t" " " )
    push "$OUT"
    exit
fi
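
For reference, the photos:photos destination in that script is an rclone remote pointing at the Samba share, which in turn sits on the 4TB SSD mount point in the container. As a rough sketch of the two configs involved (the paths, host, and user below are placeholders for illustration, not my actual values):

# /etc/samba/smb.conf on the Debian container (placeholder path and user)
[photos]
    path = /mnt/photos-ssd
    read only = no
    valid users = backupuser

# ~/.config/rclone/rclone.conf on the MacBook, using rclone's smb backend
# (pass is stored obscured; rclone config handles that for you)
[photos]
type = smb
host = 192.168.1.10
user = backupuser
pass = OBSCUREDPASSWORDHERE

With that in place, photos:photos means “the photos share on the photos remote”, which is where the script drops the Lightroom directory.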

I can pretty easily get this running daily at 2AM, or whenever I decide it should run, by using launchd on macOS. Considering I really only have this drive connected when I’m ingesting content from Creative Cloud into Lightroom Classic, I can just be sure to leave it connected overnight on the days I do, and the backup should happen. Logging is also set up so I can review as necessary.
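
As a sketch of what that launch agent could look like (the label and script path here are placeholders I’ve made up, and StartCalendarInterval fires the job at 2AM daily):

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Label and script path below are placeholders, not my real values -->
    <key>Label</key>
    <string>com.example.photobackup</string>
    <key>ProgramArguments</key>
    <array>
        <string>/bin/zsh</string>
        <string>/path/to/backup-script.sh</string>
    </array>
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>2</integer>
        <key>Minute</key>
        <integer>0</integer>
    </dict>
</dict>
</plist>

Dropping that into ~/Library/LaunchAgents and running launchctl load on it registers the schedule, and since the script already bails out gracefully when the SSD or dock is missing, a misfire just logs and exits.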

Once it hits my Samba share, I’ll have a cron job on the Debian container running daily to rclone the directory over to the 22TB drive, and I’ll also schedule a currently undecided backup solution to sync the SSD data up to either Wasabi or Backblaze B2. I’m considering Restic for my backup option here, but will be exploring the options available before jumping.
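
If I do go the Restic route, the container-side schedule could look something like the crontab below. The mount points, log path, bucket name, and credentials file are all placeholders I’ve made up; B2_ACCOUNT_ID, B2_ACCOUNT_KEY, and RESTIC_PASSWORD are the environment variables Restic uses to authenticate with Backblaze B2 and unlock the repository:

# Placeholder paths throughout; restic.env just exports B2_ACCOUNT_ID,
# B2_ACCOUNT_KEY, and RESTIC_PASSWORD so they stay out of the crontab.
# 3AM: mirror the SSD copy of the library onto the 22TB drive.
0 3 * * * rclone copy --log-level INFO --log-file=/var/log/photo-backup.log /mnt/photos-ssd /mnt/photos-hdd
# 4AM: push the SSD copy to a Restic repository in a B2 bucket.
0 4 * * * . /root/restic.env && restic -r b2:my-photo-bucket:photos backup /mnt/photos-ssd

The repository would need a one-time restic -r b2:my-photo-bucket:photos init before the first run, and Wasabi would work much the same way through Restic’s S3 backend instead of the b2 one.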

Overall, I think this gives me the advantage of working off relatively fast local storage when I need to touch my Lightroom back catalog, while maintaining a number of backups on my server and in the cloud. And I’m not tied to a network drive when working in Lightroom, either!

It was a lot of fun learning more about zsh, using the Pushover API via curl, and getting all these little pieces like rclone and Samba working together. It’s been a while since I dug in deep with this stuff at home, and I really needed the refresher!