Database Operations (D1)

Cloudflare D1 database operations: creating tables, querying, migrations, backups, and troubleshooting

Overview

Verifieddit uses Cloudflare D1 (SQLite-compatible) for persistent storage. There are separate production and testing databases.

Databases

Name                         Environment   Binding
verifieddit-badges           Production    DB
verifieddit-badges-testing   Testing       DB

Prerequisites

  • Wrangler CLI: npx wrangler
  • Cloudflare account authenticated: npx wrangler login
  • Note: recent Wrangler versions default d1 execute to a local database; add --remote to target the deployed D1 instance.

Common Operations

List All D1 Databases

npx wrangler d1 list

Query a Database

# Simple query
npx wrangler d1 execute verifieddit-badges \
  --command "SELECT COUNT(*) FROM badges"

# Query with JSON output
npx wrangler d1 execute verifieddit-badges \
  --command "SELECT * FROM badges LIMIT 10" --json

# Testing database
npx wrangler d1 execute verifieddit-badges-testing \
  --command "SELECT * FROM users"
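
The --json output is convenient for scripting. A minimal Python sketch for pulling rows out of it; the payload shape here (an array of per-statement result objects with a results field) is assumed from current wrangler versions, so verify it against your installed release:

```python
import json

# Sample of what `wrangler d1 execute ... --json` emits (assumed shape).
raw = '''
[
  {
    "results": [
      {"id": "badge-1", "status": "verified"},
      {"id": "badge-2", "status": "pending"}
    ],
    "success": true,
    "meta": {"rows_read": 2}
  }
]
'''

def extract_rows(payload: str) -> list[dict]:
    """Flatten rows out of wrangler's per-statement result array."""
    return [row for stmt in json.loads(payload) for row in stmt["results"]]

rows = extract_rows(raw)
print(len(rows))           # number of rows across all statements
print(rows[0]["status"])
```

In practice you would pipe the command's stdout into such a script instead of embedding a sample string.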

Create a Table

npx wrangler d1 execute verifieddit-badges \
  --command "CREATE TABLE IF NOT EXISTS new_table (
    id TEXT PRIMARY KEY,
    data TEXT NOT NULL,
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP
  )"
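
Because D1 is SQLite-compatible, DDL can be sanity-checked against an in-memory SQLite database before it touches production. A sketch using the same new_table example as above:

```python
import sqlite3

ddl = """
CREATE TABLE IF NOT EXISTS new_table (
  id TEXT PRIMARY KEY,
  data TEXT NOT NULL,
  created_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
"""

con = sqlite3.connect(":memory:")
con.execute(ddl)

# Insert a row and confirm the DEFAULT fills in created_at.
con.execute("INSERT INTO new_table (id, data) VALUES (?, ?)", ("t1", "hello"))
row = con.execute("SELECT id, data, created_at FROM new_table").fetchone()
print(row[0], row[1])
```

Local SQLite and D1 can differ in version and enabled extensions, so treat this as a smoke test, not a guarantee.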

Insert Data

npx wrangler d1 execute verifieddit-badges \
  --command "INSERT INTO badges (id, url, domain, status) VALUES ('test-1', 'https://example.com', 'example.com', 'verified')"

Update Data

npx wrangler d1 execute verifieddit-badges \
  --command "UPDATE badges SET status = 'verified', verified_at = CURRENT_TIMESTAMP WHERE id = 'badge-id'"

Delete Data

npx wrangler d1 execute verifieddit-badges \
  --command "DELETE FROM badges WHERE id = 'badge-id'"

Migrations

Running a Migration File

# Create migration file
cat > migrations/002_add_thumbnail.sql << 'EOF'
ALTER TABLE badges ADD COLUMN original_thumbnail_url TEXT;
EOF

# Execute migration
npx wrangler d1 execute verifieddit-badges --file migrations/002_add_thumbnail.sql

# Also run on testing
npx wrangler d1 execute verifieddit-badges-testing --file migrations/002_add_thumbnail.sql

Migration Best Practices

  1. Always test migrations on the testing database first
  2. Use IF NOT EXISTS / IF EXISTS for idempotent migrations
  3. Keep migration files numbered sequentially
  4. SQLite supports DROP COLUMN only since version 3.35, and with restrictions (the column cannot be indexed or referenced by a constraint) – when it is unavailable, recreate the table
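
The recreate-table pattern for dropping a column can be rehearsed locally before it becomes a migration file. A sketch against in-memory SQLite; legacy_col is a hypothetical column used only for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE badges (id TEXT PRIMARY KEY, url TEXT NOT NULL, legacy_col TEXT);
INSERT INTO badges VALUES ('b1', 'https://example.com', 'old');

-- Recreate the table without legacy_col, copying the surviving columns.
CREATE TABLE badges_new (id TEXT PRIMARY KEY, url TEXT NOT NULL);
INSERT INTO badges_new (id, url) SELECT id, url FROM badges;
DROP TABLE badges;
ALTER TABLE badges_new RENAME TO badges;
""")

# Confirm the column is gone and the data survived.
cols = [r[1] for r in con.execute("PRAGMA table_info(badges)")]
print(cols)  # ['id', 'url']
```

The SQL inside executescript is exactly what would go into a numbered migration file for d1 execute --file.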

Query via Cloudflare API

For programmatic access outside of wrangler:

CF_TOKEN=$(pass cloudflare/api-token)
ACCOUNT_ID=$(pass cloudflare/account-id)
DATABASE_ID="<d1-database-id>"

curl -s -X POST \
  "https://api.cloudflare.com/client/v4/accounts/${ACCOUNT_ID}/d1/database/${DATABASE_ID}/query" \
  -H "Authorization: Bearer ${CF_TOKEN}" \
  -H "Content-Type: application/json" \
  --data '{
    "sql": "SELECT * FROM badges WHERE status = ? LIMIT ?",
    "params": ["verified", 10]
  }' | jq '.result[0].results'
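
The same request can be issued from Python. A sketch that only constructs the request (no network call), mirroring the endpoint and body shape of the curl example above; build_d1_query is a hypothetical helper name, and the bearer token is a placeholder:

```python
import json

API_BASE = "https://api.cloudflare.com/client/v4"

def build_d1_query(account_id: str, database_id: str,
                   sql: str, params: list) -> tuple[str, dict, bytes]:
    """Return (url, headers, body) for a D1 /query request."""
    url = f"{API_BASE}/accounts/{account_id}/d1/database/{database_id}/query"
    headers = {
        "Authorization": "Bearer <token>",  # substitute a real API token
        "Content-Type": "application/json",
    }
    body = json.dumps({"sql": sql, "params": params}).encode()
    return url, headers, body

url, headers, body = build_d1_query(
    "acct123", "db456",
    "SELECT * FROM badges WHERE status = ? LIMIT ?", ["verified", 10],
)
print(url)
```

Passing values through params rather than interpolating them into the SQL string keeps the query safe from injection, just as in the curl example.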

Backup and Export

Export Table Data

npx wrangler d1 execute verifieddit-badges \
  --command "SELECT * FROM badges" --json > badges-backup-$(date +%Y%m%d).json

Export Schema

npx wrangler d1 execute verifieddit-badges \
  --command "SELECT sql FROM sqlite_master WHERE type='table'" --json

Schema Reference

badges

CREATE TABLE badges (
  id TEXT PRIMARY KEY,
  url TEXT NOT NULL,
  domain TEXT NOT NULL,
  badge_type TEXT DEFAULT 'standard',
  status TEXT DEFAULT 'pending',
  created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
  verified_at DATETIME,
  c2pa_manifest TEXT,
  original_thumbnail_url TEXT
);

badge_images

CREATE TABLE badge_images (
  id TEXT PRIMARY KEY,
  badge_id TEXT NOT NULL,
  image_url TEXT NOT NULL,
  ai_score REAL,
  created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
  FOREIGN KEY (badge_id) REFERENCES badges(id)
);
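
One SQLite behavior worth knowing for badge_images: a stock SQLite connection does not enforce foreign keys unless PRAGMA foreign_keys is turned on, so local testing can silently accept orphan rows that stricter environments reject. A local sketch of the constraint in action:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # off by default in stock SQLite
con.executescript("""
CREATE TABLE badges (id TEXT PRIMARY KEY, url TEXT NOT NULL, domain TEXT NOT NULL);
CREATE TABLE badge_images (
  id TEXT PRIMARY KEY,
  badge_id TEXT NOT NULL,
  image_url TEXT NOT NULL,
  FOREIGN KEY (badge_id) REFERENCES badges(id)
);
INSERT INTO badges VALUES ('b1', 'https://example.com', 'example.com');
""")
con.execute("INSERT INTO badge_images VALUES ('i1', 'b1', 'https://example.com/a.png')")

# An image row pointing at a nonexistent badge is now rejected.
try:
    con.execute("INSERT INTO badge_images VALUES ('i2', 'missing', 'https://example.com/b.png')")
    ok = False
except sqlite3.IntegrityError:
    ok = True
print(ok)  # True
```

Verify separately whether your D1 environment enforces foreign keys; do not assume it matches the local default.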

users

CREATE TABLE users (
  clerk_id TEXT PRIMARY KEY,
  email TEXT,
  plan TEXT DEFAULT 'free',
  created_at DATETIME DEFAULT CURRENT_TIMESTAMP
);

Troubleshooting

  • “Table not found”: Run the migration to create the table. Check you’re querying the correct database (production vs testing).
  • “Database locked”: D1 has concurrent write limitations. Retry the operation.
  • Data mismatch between environments: Production and testing databases are independent. Migrations must be run on both.
  • JSON output truncated: Large result sets may be truncated. Use LIMIT and OFFSET for pagination.
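
The LIMIT/OFFSET pagination mentioned above can be sketched against local SQLite; with wrangler, the same SQL goes into successive --command invocations with an increasing OFFSET:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE badges (id TEXT PRIMARY KEY)")
con.executemany("INSERT INTO badges VALUES (?)",
                [(f"badge-{i:03d}",) for i in range(25)])

def fetch_page(page: int, page_size: int = 10) -> list[str]:
    """Stable pagination needs a deterministic ORDER BY."""
    rows = con.execute(
        "SELECT id FROM badges ORDER BY id LIMIT ? OFFSET ?",
        (page_size, page * page_size),
    ).fetchall()
    return [r[0] for r in rows]

pages = []
page = 0
while chunk := fetch_page(page):
    pages.append(chunk)
    page += 1
print([len(p) for p in pages])  # [10, 10, 5]
```

Without the ORDER BY, SQLite makes no ordering guarantee, so rows could repeat or go missing across pages.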