Security First — Part 23 of 30

Database Backups: Do This Before Everything Else

Written by claude-sonnet-4 · Edited by claude-sonnet-4
Tags: database · backups · supabase · data-loss · security · postgres


In September 2025, a fire broke out at the National Information Resources Service data center in South Korea. It started during routine maintenance -- workers were relocating lithium-ion batteries. About forty minutes after the batteries were disconnected, there was an explosion. The fire burned for twenty-two hours and required nearly 200 firefighters to contain.

When the smoke cleared, 858 terabytes of government data were gone. Tax filing systems, emergency service records, and tools used by 125,000 civil servants across more than 160 public-facing services -- all of it, destroyed.

The most damaging detail: the NIRS had no backup system. Officials had decided that the sheer volume of data made replication impractical. Less than 18 percent of affected systems were restored within a week. The director of the data center was later relieved of duty for negligence.

This is not a story about a freak event. It is a story about a decision that gets made every day by builders who assume their hosting provider will protect them. They are wrong.


Why Backups Come First

Most vibe coders I talk to think about backups the way people think about dentist appointments -- important in theory, easy to defer in practice. The database is running. Supabase is hosting it. Surely something is keeping it safe.

Here is the uncomfortable truth: your hosting provider's uptime SLA is a promise about availability, not about your data. Those are two completely different things. A provider can maintain 99.9% uptime and still lose your data to a botched migration, a ransomware attack, or an accidental delete by you or an AI tool running with too many permissions.

According to CrashPlan's 2026 data loss report, 67.7% of businesses experienced significant data loss in the past year. Ransomware was present in 44% of all data breaches in 2025, up from 32% the year before. And human error -- including accidental deletion -- accounts for 34% of SaaS data loss incidents.

Backups are not just a recovery tool you reach for after disaster. They are the foundation that makes recovery possible at all. Without them, you are not building a product. You are building on sand.


What Can Destroy Your Data

Before we get tactical, let us name the threats honestly:

Accidental deletes. You, a teammate, or an AI coding assistant runs a query without a WHERE clause. A migration script drops a table it should not. This is the most common cause, and it happens to experienced engineers too.

Bad migrations. You push a schema change that corrupts data in existing rows. Without a pre-migration snapshot, there is no going back.

Ransomware. Attackers encrypt your database and demand payment. Paying does not guarantee recovery. A clean, offsite backup means you do not need to negotiate.

Hosting incidents. Fires, hardware failures, provider outages. South Korea's NIRS thought their data center was safe. As the Unitrends State of Backup and Recovery Report 2025 notes, accidental deletion, misconfiguration, server hardware failure, and human error were the top causes of data loss in the past year -- not exotic attacks.

Provider changes. Free tiers get discontinued. Accounts get suspended. Projects get deleted after inactivity. If your only copy lives on a provider's server, you are one terms-of-service change away from losing everything.
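
The first two failure modes above can be blunted at the query level: wrap any destructive SQL in an explicit transaction and inspect the damage before committing. A minimal sketch -- the `sessions` table and the predicate are hypothetical examples, not your schema:

```shell
# Write the destructive change as a reviewable transaction script
# instead of typing it straight into a console.
cat > cleanup.sql <<'SQL'
BEGIN;

-- Hypothetical cleanup: table name and predicate are examples only.
DELETE FROM sessions WHERE expires_at < now();

-- Sanity check: how many rows survived? If this number looks wrong,
-- the ROLLBACK below undoes everything and nothing is lost.
SELECT count(*) AS remaining_rows FROM sessions;

ROLLBACK;  -- change to COMMIT only after the count looks right
SQL

# Run it interactively against your database:
# psql "$DATABASE_URL" -f cleanup.sql
```

Defaulting to ROLLBACK means the safe outcome requires no extra effort; you opt in to the destructive one.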


What Supabase Gives You (and What It Does Not)

Supabase does include automatic daily backups on paid plans. Here is exactly what you get, according to the official Supabase backup documentation:

  • Free plan: No automatic backups. Manual exports only.
  • Pro plan: 7 days of daily backups.
  • Team plan: 14 days of daily backups.
  • Enterprise plan: Up to 30 days of daily backups.

You can access these from Database > Backups in your Supabase dashboard, and restore your project to any available backup point.

For tighter recovery windows, Supabase offers Point-in-Time Recovery (PITR) as a paid add-on for Pro, Team, and Enterprise plans. With PITR enabled, Supabase backs up your database at two-minute intervals using WAL (Write Ahead Log) files. This means that instead of potentially losing 24 hours of data, your worst-case data loss shrinks to about two minutes. Note: enabling PITR replaces daily backups -- you do not need both.

Critical limitation: Free plan users get zero automatic backups. Supabase explicitly recommends free tier users "regularly export their data using the Supabase CLI db dump command and maintain off-site backups." If you are building anything that matters on the free plan, manual exports are not optional.


How to Export Your Database Right Now

Do not wait. Run this today.

Option 1: Supabase CLI

# Install the Supabase CLI if you haven't already
# (global npm installs are not supported -- install it as a dev
# dependency, or use Homebrew/Scoop per the Supabase CLI docs)
npm install supabase --save-dev

# Log in and link your project
npx supabase login
npx supabase link --project-ref YOUR_PROJECT_REF

# Export your database
npx supabase db dump -f backup_$(date +%Y%m%d).sql

Option 2: pg_dump directly

You can find your database connection string in your Supabase project settings under Database > Connection string.

pg_dump \
  --host=db.YOUR_PROJECT_REF.supabase.co \
  --port=5432 \
  --username=postgres \
  --dbname=postgres \
  --format=custom \
  --file=backup_$(date +%Y%m%d).dump

The --format=custom flag creates a compressed, selectively-restorable archive. Store the output file somewhere other than your local machine.
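
That selective restorability matters in practice: pg_restore can inspect a custom-format archive and pull back a single table without touching the rest. A sketch, assuming a dump file produced by the command above (the dated filename and the `users` table are examples):

```shell
# Example filename -- substitute whatever your dated dump is called.
DUMP="backup_20250101.dump"

if [ -f "$DUMP" ]; then
  # List everything inside the archive without restoring anything
  pg_restore --list "$DUMP"

  # Restore only the hypothetical users table into a scratch database
  pg_restore \
    --dbname=postgresql://postgres:PASSWORD@localhost:5432/scratch \
    --table=users \
    "$DUMP"
fi
```

A plain-SQL dump cannot do this; you would have to restore the whole file or hand-edit it.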


The 3-2-1 Backup Rule

The 3-2-1 rule has been the gold standard of backup strategy for decades. It means:

  • 3 copies of your data total (the live database plus two backups)
  • 2 different storage types (e.g., your hosting provider plus cloud storage)
  • 1 copy offsite (on a physically separate system from your primary hosting)

For a Supabase project, a practical 3-2-1 setup looks like this:

  1. Your live Supabase database (copy one)
  2. Supabase's automatic daily backups (copy two, different storage type -- but still on Supabase's infrastructure)
  3. A scheduled pg_dump uploaded to an S3 bucket or similar offsite cloud storage (copy three, genuinely offsite)

Here is a minimal bash script to automate this, scheduled via cron:

#!/bin/bash
set -euo pipefail

TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="myapp_${TIMESTAMP}.dump"

# Dump the database (custom format, compressed).
# pg_dump reads the password from the PGPASSWORD env var or ~/.pgpass,
# so set one of those up before scheduling this non-interactively.
pg_dump \
  -h db.YOUR_PROJECT_REF.supabase.co \
  -U postgres \
  -Fc \
  -f "/tmp/${BACKUP_FILE}" \
  postgres

# Upload to S3
aws s3 cp "/tmp/${BACKUP_FILE}" "s3://your-backup-bucket/db/${BACKUP_FILE}"

# Clean up local temp file
rm "/tmp/${BACKUP_FILE}"

Add this to your crontab to run nightly:

crontab -e
# Add this line:
0 2 * * * /path/to/backup.sh >> /var/log/db_backup.log 2>&1
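
Nightly uploads accumulate forever unless you prune them. One low-effort option is an S3 lifecycle rule that expires old copies automatically -- the bucket name, `db/` prefix, and 90-day window below are examples to tune, not recommendations:

```shell
# Expire offsite copies after 90 days so the bucket does not grow forever.
# Bucket name, prefix, and retention window are placeholders -- adjust them.
cat > lifecycle.json <<'JSON'
{
  "Rules": [
    {
      "ID": "expire-old-db-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "db/" },
      "Expiration": { "Days": 90 }
    }
  ]
}
JSON

# Apply it once:
# aws s3api put-bucket-lifecycle-configuration \
#   --bucket your-backup-bucket \
#   --lifecycle-configuration file://lifecycle.json
```

Keep the window longer than the gap between your restore tests, so a bad backup is caught before every good one has expired.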

A Backup You Have Never Tested Is Not a Backup

This is the rule that trips up even experienced teams. The Catalogic Software 2025 backup research found that while 95% of organizations have backup systems in place, fewer than 30% test them comprehensively.

Only 57% of backups succeed completely, and only 61% of restores succeed. That means nearly two in five restores fail -- and you will only discover yours is one of them when disaster has already struck and you are in full panic mode.

Test a restore at least once a month. Create a temporary Supabase project, restore your backup into it, and confirm your data is intact and your application can connect. Delete the test project when done. This takes about fifteen minutes and has saved countless teams from catastrophe.
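
A restore drill can be scripted so the monthly test costs even less willpower. A sketch, assuming a scratch Postgres (local or a throwaway Supabase project) and a recent custom-format dump; the connection string and the `users` table are placeholders:

```shell
# Monthly restore drill. SCRATCH_URL and the table name are placeholders.
SCRATCH_URL="postgresql://postgres:PASSWORD@localhost:5432/restore_test"

# Pick the newest dump matching the naming scheme from the backup script.
DUMP="$(ls -t myapp_*.dump 2>/dev/null | head -n 1 || true)"

if [ -n "$DUMP" ]; then
  # Restore into the scratch database, replacing any previous test data
  pg_restore --clean --if-exists --no-owner --dbname="$SCRATCH_URL" "$DUMP"

  # Sanity check: a core table should not come back empty
  psql "$SCRATCH_URL" -tAc "SELECT count(*) FROM users"
fi
```

If the count looks right and your app can connect to the scratch database, the backup is real; drop the scratch project and move on.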


What Not to Rely On

  • Provider uptime SLAs: These cover service availability, not data integrity.
  • "It's in the cloud so it's safe": Cloud providers lose data too. South Korea's government data lived in a professionally managed data center.
  • Version control for your schema: Git tracks your migration files, not your actual data rows.
  • AI-generated backups with no verification: If an AI assistant set up a backup script for you, test it manually before trusting it.

Your Action Checklist

  • Export your Supabase database right now using the CLI or pg_dump
  • Store that export somewhere other than your laptop (S3, Backblaze, Google Cloud Storage)
  • Upgrade to at least the Supabase Pro plan if you have any real users
  • Enable Point-in-Time Recovery if data loss of more than a few hours is unacceptable
  • Automate a nightly pg_dump to an offsite S3 bucket using the cron script above
  • Test a restore into a new Supabase project this week -- confirm it actually works
  • Set a calendar reminder to test restores monthly

Ask The Guild

Have you tested a database restore recently, and did anything surprise you? Have you set up automated offsite backups, and which storage provider did you use? Share your setup or your horror story in the community thread -- someone else will learn from both.


Tom Hundley is a software architect with 25 years of experience working with development teams and non-technical builders. This article is Part 23 of the Security First series.
