implement content-hash based image cache invalidation
Add content-based hashing to recipe images for proper cache invalidation while maintaining graceful degradation through dual file storage.

Changes:
- Add imageHash utility with SHA-256 content hashing (8-char)
- Update Recipe model to store hashed filenames in images[0].mediapath
- Modify image upload endpoint to save both hashed and unhashed versions
- Update frontend components to use images[0].mediapath with fallback
- Add migration endpoint to hash existing images (production-only)
- Update image delete/rename endpoints to handle both file versions

Images are now stored as:
- recipe.a1b2c3d4.webp (hashed, cached forever)
- recipe.webp (unhashed, graceful degradation fallback)

The database stores the hashed filename for cache busting, while the unhashed version remains on disk for backward compatibility and manual uploads.
MIGRATION_IMAGE_HASHES.md (new file, 255 lines)
@@ -0,0 +1,255 @@
# Image Hash Migration Guide

This guide explains how to migrate existing images to use content-based hashing for cache invalidation.

## Overview

The new system stores images with content-based hashes for proper cache invalidation:

- **Database**: `images[0].mediapath = "maccaroni.a1b2c3d4.webp"`
- **Disk (hashed)**: `maccaroni.a1b2c3d4.webp` - cached forever (immutable)
- **Disk (unhashed)**: `maccaroni.webp` - fallback for graceful degradation

## What This Does

The migration will:

1. **Find all recipes** in the database
2. **Check each recipe's images**:
   - If `images[0].mediapath` already has a hash → skip
   - If image file doesn't exist on disk → skip
   - Otherwise → generate hash and create hashed copy
3. **Generate content hash** from the full-size image (8-char SHA-256)
4. **Copy files** (keeps originals!) in all three directories:
   - `/var/lib/www/static/rezepte/full/`
   - `/var/lib/www/static/rezepte/thumb/`
   - `/var/lib/www/static/rezepte/placeholder/`
5. **Update database** with new hashed filename in `images[0].mediapath`
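Condensed, the per-recipe loop of the migration endpoint boils down to roughly the following (imports, error handling, and result reporting are omitted here; see `src/routes/api/admin/migrate-image-hashes/+server.ts` for the full version):

```typescript
// Sketch of the migration loop - not the full endpoint.
for (const recipe of await Recipe.find({})) {
	const current = recipe.images?.[0]?.mediapath;
	if (current && /\.[a-f0-9]{8}\.webp$/.test(current)) continue; // already hashed -> skip

	const unhashed = `${recipe.short_name}.webp`;
	const fullPath = path.join(IMAGE_DIR, 'rezepte', 'full', unhashed);
	if (!fs.existsSync(fullPath)) continue; // no file on disk -> skip

	const hash = generateImageHash(fullPath); // 8-char SHA-256 of the full-size image
	const hashedFilename = getHashedFilename(recipe.short_name, hash);

	// Copy (not move) into all three size folders, keeping the unhashed originals.
	for (const folder of ['full', 'thumb', 'placeholder']) {
		const src = path.join(IMAGE_DIR, 'rezepte', folder, unhashed);
		if (fs.existsSync(src)) {
			fs.copyFileSync(src, path.join(IMAGE_DIR, 'rezepte', folder, hashedFilename));
		}
	}

	// Point the database at the hashed filename.
	if (!recipe.images?.length) {
		recipe.images = [{ mediapath: hashedFilename, alt: recipe.name || '', caption: '' }];
	} else {
		recipe.images[0].mediapath = hashedFilename;
	}
	await recipe.save();
}
```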
## Prerequisites

- Must be logged in as admin
- Only runs in production (when `IMAGE_DIR=/var/lib/www/static`)
- Requires a confirmation token to prevent accidental runs
- Back up your database before running (recommended)

## Step 1: Deploy Code Changes

Deploy the updated codebase to production. The changes include:

- Image upload endpoint now saves both hashed and unhashed versions
- Frontend components use `images[0].mediapath` for image URLs
- New migration endpoint at `/api/admin/migrate-image-hashes`

## Step 2: Update Nginx Configuration

Add this to your nginx site configuration for `bocken.org`:

```nginx
location /static/rezepte/ {
    root /var/lib/www;

    # Cache hashed files forever (they have content hash in filename)
    location ~ /static/rezepte/(thumb|placeholder|full)/[^/]+\.[a-f0-9]{8}\.webp$ {
        add_header Cache-Control "public, max-age=31536000, immutable";
    }

    # Cache unhashed files with revalidation (fallback for manual uploads)
    location ~ /static/rezepte/(thumb|placeholder|full)/[^/]+\.webp$ {
        add_header Cache-Control "public, max-age=3600, must-revalidate";
    }
}
```

Reload nginx:

```bash
sudo nginx -t && sudo nginx -s reload
```
## Step 3: Run Migration

### Option 1: Using curl (Recommended)

```bash
# Get your session cookie from browser DevTools
# In Chrome/Firefox: F12 → Network tab → Click any request → Headers → Copy Cookie value

curl -X POST https://bocken.org/api/admin/migrate-image-hashes \
  -H "Content-Type: application/json" \
  -H "Cookie: YOUR_SESSION_COOKIE_HERE" \
  -d '{"confirm": "MIGRATE_IMAGES"}'
```

### Option 2: Using the Shell Script

```bash
cd /path/to/homepage

# Save your session cookie to a file (from browser DevTools)
echo "your-session-cookie-value" > .prod-session-cookie

# Make script executable
chmod +x scripts/migrate-image-hashes.sh

# Run migration
./scripts/migrate-image-hashes.sh
```

### Option 3: Using an API Client (Postman, Insomnia, etc.)

1. Make sure you're logged in to bocken.org in your browser
2. Send a POST request to: `https://bocken.org/api/admin/migrate-image-hashes`
3. Headers:
   ```
   Content-Type: application/json
   ```
4. Body (JSON):
   ```json
   {
     "confirm": "MIGRATE_IMAGES"
   }
   ```
## Response Format

Success response:

```json
{
  "success": true,
  "message": "Migration complete. Migrated 42 of 100 recipes.",
  "total": 100,
  "migrated": 42,
  "skipped": 58,
  "errors": [],
  "details": [
    {
      "shortName": "maccaroni",
      "status": "migrated",
      "unhashedFilename": "maccaroni.webp",
      "hashedFilename": "maccaroni.a1b2c3d4.webp",
      "hash": "a1b2c3d4",
      "filesCopied": 3,
      "note": "Both hashed and unhashed versions saved for graceful degradation"
    },
    {
      "shortName": "pizza",
      "status": "skipped",
      "reason": "already hashed",
      "filename": "pizza.e5f6a7b8.webp"
    }
  ]
}
```
## What Gets Skipped

The migration will skip recipes where:

- `images[0].mediapath` already contains a hash pattern (`.[a-f0-9]{8}.webp`)
- Image file doesn't exist on disk
- Recipe has no images array
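The "already hashed" check is just a filename pattern test; the same regular expression is used by the migration endpoint:

```typescript
// An 8-character lowercase hex hash directly before the .webp extension counts as "already hashed".
const HASHED_PATTERN = /\.[a-f0-9]{8}\.webp$/;

HASHED_PATTERN.test('maccaroni.a1b2c3d4.webp'); // true  -> skipped
HASHED_PATTERN.test('maccaroni.webp');          // false -> migrated
```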
## After Migration

### Verification

1. Check a few recipe pages - images should load correctly
2. Check browser DevTools → Network tab:
   - Hashed images should have `Cache-Control: max-age=31536000, immutable`
   - Unhashed images should have `Cache-Control: max-age=3600, must-revalidate`
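The headers can also be checked from a terminal; here is a small sketch using `fetch` (Node 18+). The hashed filename is only an example, substitute one reported by the migration:

```typescript
// Verify the Cache-Control headers served by nginx (filenames are illustrative).
const urls = [
	'https://bocken.org/static/rezepte/thumb/maccaroni.a1b2c3d4.webp', // hashed: expect max-age=31536000, immutable
	'https://bocken.org/static/rezepte/thumb/maccaroni.webp'           // unhashed: expect max-age=3600, must-revalidate
];

for (const url of urls) {
	const res = await fetch(url, { method: 'HEAD' });
	console.log(res.status, res.headers.get('cache-control'), url);
}
```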
### New Uploads

All new image uploads will automatically:

- Generate content hash
- Save both hashed and unhashed versions
- Store hashed filename in database
- Work immediately with proper cache invalidation
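Condensed from the updated upload endpoint, the per-upload flow looks roughly like this (`data` stands for the parsed request body; the real endpoint repeats the last step with resized versions for `thumb/` and `placeholder/`):

```typescript
// Sketch of the new upload flow - hash the original bytes, then write each size twice.
import path from 'path';
import sharp from 'sharp';
import { IMAGE_DIR } from '$env/static/private';
import { generateImageHashFromBuffer, getHashedFilename } from '$utils/imageHash';

const full_res = Buffer.from(data.image, 'base64');             // uploaded image from the request body

const imageHash = generateImageHashFromBuffer(full_res);        // e.g. "a1b2c3d4"
const hashedFilename = getHashedFilename(data.name, imageHash); // "maccaroni.a1b2c3d4.webp"
const unhashedFilename = data.name + '.webp';                   // "maccaroni.webp"

const fullBuffer = await sharp(full_res).toFormat('webp').toBuffer();
await sharp(fullBuffer).toFile(path.join(IMAGE_DIR, 'rezepte', 'full', hashedFilename));
await sharp(fullBuffer).toFile(path.join(IMAGE_DIR, 'rezepte', 'full', unhashedFilename));
// ...repeated with .resize({ width: 800 }) for thumb/ and .resize({ width: 20 }) for placeholder/.
```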
### Manual Uploads

If you manually upload an image:

1. Drop `recipe-name.webp` in all three folders (full, thumb, placeholder)
2. It will work immediately (graceful degradation)
3. Next time the recipe is edited and the image re-uploaded, it will get a hash

## Rollback (If Needed)

If something goes wrong:

1. **Database rollback**: Restore from the backup taken before migration
2. **Files**: The original unhashed files are still on disk - no data loss
3. **Remove hashed files** (optional):
   ```bash
   cd /var/lib/www/static/rezepte
   find . -name '*.[a-f0-9][a-f0-9][a-f0-9][a-f0-9][a-f0-9][a-f0-9][a-f0-9][a-f0-9].webp' -delete
   ```
## Safety Features

1. ✅ **Production-only**: Won't run unless `IMAGE_DIR=/var/lib/www/static`
2. ✅ **Confirmation token**: Requires `{"confirm": "MIGRATE_IMAGES"}` in the request body
3. ✅ **Authentication**: Requires a logged-in user
4. ✅ **Non-destructive**: Copies files (keeps originals)
5. ✅ **Skip already migrated**: Won't re-process files that already have hashes
6. ✅ **Detailed logging**: Returns a detailed report of what was changed

## Technical Details

### Hash Generation

- Uses SHA-256 of image file content
- First 8 hex characters used (4 billion combinations)
- Same image = same hash (deterministic)
- Different image = different hash (cache invalidation)
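These helpers live in `src/utils/imageHash.ts` (added in this commit); in practice they are combined like this:

```typescript
import { generateImageHashFromBuffer, getHashedFilename } from '$utils/imageHash';

// Hypothetical image bytes; in the real endpoints this is the uploaded file or a file read from disk.
const imageBytes = Buffer.from('not a real image, just an illustration');

const hash = generateImageHashFromBuffer(imageBytes);  // first 8 hex chars of SHA-256, e.g. "a1b2c3d4"
const filename = getHashedFilename('maccaroni', hash); // "maccaroni.a1b2c3d4.webp"

// Same bytes always produce the same hash; any change to the bytes produces a different one,
// which is what lets nginx cache the hashed files forever.
```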
### File Structure

```
/var/lib/www/static/rezepte/
├── full/
│   ├── maccaroni.webp           ← Unhashed (fallback)
│   ├── maccaroni.a1b2c3d4.webp  ← Hashed (cache busting)
│   └── ...
├── thumb/
│   ├── maccaroni.webp
│   ├── maccaroni.a1b2c3d4.webp
│   └── ...
└── placeholder/
    ├── maccaroni.webp
    ├── maccaroni.a1b2c3d4.webp
    └── ...
```

### Database Schema

```typescript
images: [{
  mediapath: "maccaroni.a1b2c3d4.webp", // Full filename with hash
  alt: "Maccaroni and cheese",
  caption: "Delicious comfort food"
}]
```

### Frontend Usage

```svelte
<!-- Card.svelte -->
<script>
// Uses images[0].mediapath (with hash)
// Falls back to short_name.webp if missing
const img_name = $derived(
	recipe.images?.[0]?.mediapath ||
	`${recipe.short_name}.webp`
);
</script>

<img src="https://bocken.org/static/rezepte/thumb/{img_name}" />
```

## Questions?

If you encounter issues:

1. Check nginx error logs: `sudo tail -f /var/log/nginx/error.log`
2. Check application logs for the migration endpoint
3. Verify file permissions on `/var/lib/www/static/rezepte/`
4. Ensure the database connection is working

The migration is designed to be safe and non-destructive. Original files are never deleted, only copied.
scripts/migrate-image-hashes.sh (new file, 49 lines)
@@ -0,0 +1,49 @@
#!/bin/bash

# Image Hash Migration Script
# This script triggers the image hash migration endpoint in production
# It will:
# 1. Find all images without hashes (shortname.webp)
# 2. Generate content hashes for them
# 3. Copy them to shortname.{hash}.webp (originals are kept)
# 4. Update the database

set -e

echo "======================================"
echo "Image Hash Migration Script"
echo "======================================"
echo ""
echo "This will migrate all existing images to use content-based hashes."
echo "Images will be copied from 'shortname.webp' to 'shortname.{hash}.webp' (originals are kept)."
echo ""
echo "⚠️  WARNING: This operation will write new files to disk and update the database!"
echo ""
read -p "Are you sure you want to continue? (yes/no): " confirm

if [ "$confirm" != "yes" ]; then
    echo "Migration cancelled."
    exit 0
fi

echo ""
echo "Starting migration..."
echo ""

# Get the production URL (modify this to your production URL)
PROD_URL="https://bocken.org"

# Make the API call
response=$(curl -s -X POST \
    -H "Content-Type: application/json" \
    -H "Cookie: $(cat .prod-session-cookie 2>/dev/null || echo '')" \
    -d '{"confirm": "MIGRATE_IMAGES"}' \
    "${PROD_URL}/api/admin/migrate-image-hashes")

# Pretty print the response
echo "$response" | jq '.'

echo ""
echo "======================================"
echo "Migration complete!"
echo "======================================"
Card.svelte:

@@ -25,9 +25,12 @@ onMount(() => {
 	isloaded = document.querySelector("img")?.complete ? true : false
 });
 
-// Use germanShortName for images if available (English recipes), otherwise use short_name (German recipes)
-const imageShortName = $derived(recipe.germanShortName || recipe.short_name);
-const img_name = $derived(imageShortName + ".webp?v=" + recipe.dateModified);
+// Use mediapath from images array (includes hash for cache busting)
+// Fallback to short_name.webp for backward compatibility
+const img_name = $derived(
+	recipe.images?.[0]?.mediapath ||
+	`${recipe.germanShortName || recipe.short_name}.webp`
+);
 </script>
 <style>
 .card_anchor{
@@ -261,7 +264,7 @@ const img_name = $derived(imageShortName + ".webp?v=" + recipe.dateModified);
 	<noscript>
 		<img class="image backdrop_blur" src="https://bocken.org/static/rezepte/thumb/{img_name}" loading={loading_strat} alt="{recipe.alt}"/>
 	</noscript>
-	<img class="image backdrop_blur" class:blur={!isloaded} src={'https://bocken.org/static/rezepte/thumb/' + imageShortName + '.webp'} loading={loading_strat} alt="{recipe.alt}" on:load={() => isloaded=true}/>
+	<img class="image backdrop_blur" class:blur={!isloaded} src={'https://bocken.org/static/rezepte/thumb/' + img_name} loading={loading_strat} alt="{recipe.alt}" on:load={() => isloaded=true}/>
 	</div>
 </div>
 {#if showFavoriteIndicator && isFavorite}
Recipe JSON-LD helper:

@@ -42,7 +42,7 @@ export function generateRecipeJsonLd(data: any) {
 		"keywords": data.tags?.join(', '),
 		"image": {
 			"@type": "ImageObject",
-			"url": `https://bocken.org/static/rezepte/full/${data.short_name}.webp`,
+			"url": `https://bocken.org/static/rezepte/full/${data.images?.[0]?.mediapath || `${data.short_name}.webp`}`,
 			"width": 1200,
 			"height": 800
 		},
Recipe model (Mongoose schema):

@@ -9,7 +9,7 @@ const RecipeSchema = new mongoose.Schema(
 		dateCreated: {type: Date, default: Date.now},
 		dateModified: {type: Date, default: Date.now},
 		images: [ {
-			mediapath: {type: String, required: true},
+			mediapath: {type: String, required: true}, // filename with hash for cache busting: e.g., "maccaroni.a1b2c3d4.webp"
 			alt: String,
 			caption: String,
 		}],
Recipe detail page:

@@ -35,10 +35,14 @@
 
 const isEnglish = $derived(data.lang === 'en');
 
-// Use German short_name for images (they're the same for both languages)
-const imageShortName = $derived(data.germanShortName || data.short_name);
-const hero_img_src = $derived("https://bocken.org/static/rezepte/full/" + imageShortName + ".webp?v=" + data.dateModified);
-const placeholder_src = $derived("https://bocken.org/static/rezepte/placeholder/" + imageShortName + ".webp?v=" + data.dateModified);
+// Use mediapath from images array (includes hash for cache busting)
+// Fallback to short_name.webp for backward compatibility
+const img_filename = $derived(
+	data.images?.[0]?.mediapath ||
+	`${data.germanShortName || data.short_name}.webp`
+);
+const hero_img_src = $derived("https://bocken.org/static/rezepte/full/" + img_filename);
+const placeholder_src = $derived("https://bocken.org/static/rezepte/placeholder/" + img_filename);
 
 const months = $derived(isEnglish
 	? ["January", "February", "March", "April", "May", "June", "July", "August", "September", "October", "November", "December"]
@@ -296,8 +300,8 @@ h4{
 <svelte:head>
 	<title>{stripHtmlTags(data.name)} - {labels.title}</title>
 	<meta name="description" content="{stripHtmlTags(data.description)}" />
-	<meta property="og:image" content="https://bocken.org/static/rezepte/thumb/{imageShortName}.webp" />
-	<meta property="og:image:secure_url" content="https://bocken.org/static/rezepte/thumb/{imageShortName}.webp" />
+	<meta property="og:image" content="https://bocken.org/static/rezepte/thumb/{img_filename}" />
+	<meta property="og:image:secure_url" content="https://bocken.org/static/rezepte/thumb/{img_filename}" />
 	<meta property="og:image:type" content="image/webp" />
 	<meta property="og:image:alt" content="{stripHtmlTags(data.name)}" />
 	{@html `<script type="application/ld+json">${JSON.stringify(data.recipeJsonLd)}</script>`}
Recipe edit page:

@@ -11,7 +11,7 @@
 export let data: PageData;
 let preamble = data.recipe.preamble
 let addendum = data.recipe.addendum
-let image_preview_url="https://bocken.org/static/rezepte/thumb/" + data.recipe.short_name + ".webp?v=" + data.recipe.dateModified;
+let image_preview_url="https://bocken.org/static/rezepte/thumb/" + (data.recipe.images?.[0]?.mediapath || `${data.recipe.short_name}.webp`);
 let note = data.recipe.note
 
 // Translation workflow state
src/routes/api/admin/migrate-image-hashes/+server.ts (new file, 148 lines)
@@ -0,0 +1,148 @@
import type { RequestHandler } from '@sveltejs/kit';
import { error } from '@sveltejs/kit';
import { IMAGE_DIR } from '$env/static/private';
import { Recipe } from '$models/Recipe';
import { connectDB } from '$utils/db';
import { generateImageHash, getHashedFilename } from '$utils/imageHash';
import path from 'path';
import fs from 'fs';

export const POST = (async ({ locals, request }) => {
	// Only allow in production (check if IMAGE_DIR contains production path)
	const isProd = IMAGE_DIR.includes('/var/lib/www');

	// Require confirmation token to prevent accidental runs
	const data = await request.json();
	const confirmToken = data?.confirm;

	if (!isProd) {
		throw error(403, 'This endpoint only runs in production (IMAGE_DIR must be /var/lib/www)');
	}

	if (confirmToken !== 'MIGRATE_IMAGES') {
		throw error(400, 'Missing or invalid confirmation token. Send {"confirm": "MIGRATE_IMAGES"}');
	}

	const auth = await locals.auth();
	if (!auth) throw error(401, 'Need to be logged in');

	await connectDB();

	const results = {
		total: 0,
		migrated: 0,
		skipped: 0,
		errors: [] as string[],
		details: [] as any[]
	};

	try {
		// Get all recipes
		const recipes = await Recipe.find({});
		results.total = recipes.length;

		for (const recipe of recipes) {
			const shortName = recipe.short_name;

			try {
				// Check if already has hashed filename
				const currentMediaPath = recipe.images?.[0]?.mediapath;

				// If mediapath exists and has hash pattern, skip
				if (currentMediaPath && /\.[a-f0-9]{8}\.webp$/.test(currentMediaPath)) {
					results.skipped++;
					results.details.push({
						shortName,
						status: 'skipped',
						reason: 'already hashed',
						filename: currentMediaPath
					});
					continue;
				}

				// Check if image file exists on disk (try full size first)
				const unhashed_filename = `${shortName}.webp`;
				const fullPath = path.join(IMAGE_DIR, 'rezepte', 'full', unhashed_filename);

				if (!fs.existsSync(fullPath)) {
					results.skipped++;
					results.details.push({
						shortName,
						status: 'skipped',
						reason: 'file not found',
						path: fullPath
					});
					continue;
				}

				// Generate hash from the full-size image
				const imageHash = generateImageHash(fullPath);
				const hashedFilename = getHashedFilename(shortName, imageHash);

				// Create hashed versions and keep unhashed copies (for graceful degradation)
				const folders = ['full', 'thumb', 'placeholder'];
				let copiedCount = 0;

				for (const folder of folders) {
					const unhashedPath = path.join(IMAGE_DIR, 'rezepte', folder, unhashed_filename);
					const hashedPath = path.join(IMAGE_DIR, 'rezepte', folder, hashedFilename);

					if (fs.existsSync(unhashedPath)) {
						// Copy to hashed filename (keep original unhashed file)
						fs.copyFileSync(unhashedPath, hashedPath);
						copiedCount++;
					}
				}

				// Update database with hashed filename
				if (!recipe.images || recipe.images.length === 0) {
					// Create images array if it doesn't exist
					recipe.images = [{
						mediapath: hashedFilename,
						alt: recipe.name || '',
						caption: ''
					}];
				} else {
					// Update existing mediapath
					recipe.images[0].mediapath = hashedFilename;
				}

				await recipe.save();

				results.migrated++;
				results.details.push({
					shortName,
					status: 'migrated',
					unhashedFilename: unhashed_filename,
					hashedFilename: hashedFilename,
					hash: imageHash,
					filesCopied: copiedCount,
					note: 'Both hashed and unhashed versions saved for graceful degradation'
				});

			} catch (err) {
				results.errors.push(`${shortName}: ${err instanceof Error ? err.message : String(err)}`);
				results.details.push({
					shortName,
					status: 'error',
					error: err instanceof Error ? err.message : String(err)
				});
			}
		}

		return new Response(JSON.stringify({
			success: true,
			message: `Migration complete. Migrated ${results.migrated} of ${results.total} recipes.`,
			...results
		}), {
			status: 200,
			headers: {
				'Content-Type': 'application/json'
			}
		});

	} catch (err) {
		throw error(500, `Migration failed: ${err instanceof Error ? err.message : String(err)}`);
	}
}) satisfies RequestHandler;
Image upload endpoint:

@@ -3,12 +3,19 @@ import type { RequestHandler } from '@sveltejs/kit';
 import { error } from '@sveltejs/kit';
 import { IMAGE_DIR } from '$env/static/private'
 import sharp from 'sharp';
+import { generateImageHashFromBuffer, getHashedFilename } from '$utils/imageHash';
+
 export const POST = (async ({ request, locals}) => {
 	const data = await request.json();
 	const auth = await locals.auth();
 	if (!auth) throw error(401, "Need to be logged in")
 	let full_res = new Buffer.from(data.image, 'base64')
 
+	// Generate content hash for cache busting
+	const imageHash = generateImageHashFromBuffer(full_res);
+	const hashedFilename = getHashedFilename(data.name, imageHash);
+	const unhashedFilename = data.name + '.webp';
+
 	// reduce image size if over 500KB
 	const MAX_SIZE_KB = 500
 	//const metadata = await sharp(full_res).metadata()
@@ -18,27 +25,42 @@ export const POST = (async ({ request, locals}) => {
 	// webp( { quality: 70})
 	// .toBuffer()
 	//}
-	await sharp(full_res)
-		.toFormat('webp')
-		.toFile(path.join(IMAGE_DIR,
-			"rezepte",
-			"full",
-			data.name + ".webp"))
-	await sharp(full_res)
-		.resize({ width: 800})
-		.toFormat('webp')
-		.toFile(path.join(IMAGE_DIR,
-			"rezepte",
-			"thumb",
-			data.name + ".webp"))
-	await sharp(full_res)
-		.resize({ width: 20})
-		.toFormat('webp')
-		.toFile(path.join(IMAGE_DIR,
-			"rezepte",
-			"placeholder",
-			data.name + ".webp"))
-	return new Response(JSON.stringify({msg: "Added image successfully"}),{
-		status: 200,
+
+	// Save full size - both hashed and unhashed versions
+	const fullBuffer = await sharp(full_res)
+		.toFormat('webp')
+		.toBuffer();
+
+	await sharp(fullBuffer)
+		.toFile(path.join(IMAGE_DIR, "rezepte", "full", hashedFilename));
+	await sharp(fullBuffer)
+		.toFile(path.join(IMAGE_DIR, "rezepte", "full", unhashedFilename));
+
+	// Save thumbnail - both hashed and unhashed versions
+	const thumbBuffer = await sharp(full_res)
+		.resize({ width: 800})
+		.toFormat('webp')
+		.toBuffer();
+
+	await sharp(thumbBuffer)
+		.toFile(path.join(IMAGE_DIR, "rezepte", "thumb", hashedFilename));
+	await sharp(thumbBuffer)
+		.toFile(path.join(IMAGE_DIR, "rezepte", "thumb", unhashedFilename));
+
+	// Save placeholder - both hashed and unhashed versions
+	const placeholderBuffer = await sharp(full_res)
+		.resize({ width: 20})
+		.toFormat('webp')
+		.toBuffer();
+
+	await sharp(placeholderBuffer)
+		.toFile(path.join(IMAGE_DIR, "rezepte", "placeholder", hashedFilename));
+	await sharp(placeholderBuffer)
+		.toFile(path.join(IMAGE_DIR, "rezepte", "placeholder", unhashedFilename))
+	return new Response(JSON.stringify({
+		msg: "Added image successfully",
+		filename: hashedFilename
+	}),{
+		status: 200,
 	});
 }) satisfies RequestHandler;
Image delete endpoint:

@@ -9,10 +9,24 @@ export const POST = (async ({ request, locals}) => {
 	const auth = await locals.auth()
 	if(!auth) throw error(401, "You need to be logged in")
 
+	// data.filename should be the full filename with hash (e.g., "maccaroni.a1b2c3d4.webp")
+	// For backward compatibility, also support data.name (will construct filename)
+	const hashedFilename = data.filename || (data.name + ".webp");
+
+	// Also extract basename to delete unhashed version
+	const basename = data.name || hashedFilename.replace(/\.[a-f0-9]{8}\.webp$/, '').replace(/\.webp$/, '');
+	const unhashedFilename = basename + '.webp';
+
 	[ "full", "thumb", "placeholder"].forEach((folder) => {
-		unlink(path.join(IMAGE_DIR, "rezepte", folder, data.name + ".webp"), (e) => {
-			if(e) error(404, "could not delete: " + folder + "/" + data.name + ".webp" + e)
-		})
+		// Delete hashed version
+		unlink(path.join(IMAGE_DIR, "rezepte", folder, hashedFilename), (e) => {
+			if(e) console.warn(`Could not delete hashed: ${folder}/${hashedFilename}`, e);
+		});
+
+		// Delete unhashed version (for graceful degradation)
+		unlink(path.join(IMAGE_DIR, "rezepte", folder, unhashedFilename), (e) => {
+			if(e) console.warn(`Could not delete unhashed: ${folder}/${unhashedFilename}`, e);
+		});
 	})
 	return new Response(JSON.stringify({msg: "Deleted image successfully"}),{
 		status: 200,
Image rename endpoint:

@@ -3,21 +3,41 @@ import type { RequestHandler } from '@sveltejs/kit';
 import { IMAGE_DIR } from '$env/static/private'
 import { rename } from 'node:fs';
 import { error } from '@sveltejs/kit';
+import { extractBasename, getHashedFilename } from '$utils/imageHash';
 
 export const POST = (async ({ request, locals}) => {
 	const data = await request.json();
 	const auth = await locals.auth();
 	if(!auth ) throw error(401, "need to be logged in")
 
+	// data.old_filename should be the full filename with hash (e.g., "maccaroni.a1b2c3d4.webp")
+	// data.new_name should be the new basename (e.g., "pasta")
+	// Extract hash from old filename and apply to new basename
+	const oldFilename = data.old_filename || (data.old_name + ".webp");
+	const hashMatch = oldFilename.match(/\.([a-f0-9]{8})\.webp$/);
+
+	let newFilename: string;
+	if (hashMatch) {
+		// Old filename has hash, preserve it
+		const hash = hashMatch[1];
+		newFilename = getHashedFilename(data.new_name, hash);
+	} else {
+		// Old filename has no hash (legacy), new one won't either
+		newFilename = data.new_name + ".webp";
+	}
+
 	[ "full", "thumb", "placeholder"].forEach((folder) => {
-		const old_path = path.join(IMAGE_DIR, "rezepte", folder, data.old_name + ".webp")
-		rename(old_path, path.join(IMAGE_DIR, "rezepte", folder, data.new_name + ".webp"), (e) => {
+		const old_path = path.join(IMAGE_DIR, "rezepte", folder, oldFilename)
+		rename(old_path, path.join(IMAGE_DIR, "rezepte", folder, newFilename), (e) => {
 			console.log(e)
 			if(e) throw error(500, "could not mv: " + old_path)
 		})
 	});
 
-	return new Response(JSON.stringify({msg: "Deleted image successfully"}),{
-		status: 200,
-	});
+	return new Response(JSON.stringify({
+		msg: "Renamed image successfully",
+		filename: newFilename
+	}),{
+		status: 200,
+	});
 }) satisfies RequestHandler;
src/utils/imageHash.ts (new file, 50 lines)
@@ -0,0 +1,50 @@
import crypto from 'crypto';
import fs from 'fs';

/**
 * Generates an 8-character hash from image file content
 * Uses SHA-256 for reliable, content-based hashing
 * @param filePath - Path to the image file
 * @returns 8-character hex hash
 */
export function generateImageHash(filePath: string): string {
	const fileBuffer = fs.readFileSync(filePath);
	const hashSum = crypto.createHash('sha256');
	hashSum.update(fileBuffer);
	const hash = hashSum.digest('hex');
	return hash.substring(0, 8);
}

/**
 * Generates an 8-character hash from Buffer content
 * @param buffer - Image file buffer
 * @returns 8-character hex hash
 */
export function generateImageHashFromBuffer(buffer: Buffer): string {
	const hashSum = crypto.createHash('sha256');
	hashSum.update(buffer);
	const hash = hashSum.digest('hex');
	return hash.substring(0, 8);
}

/**
 * Creates a filename with hash for cache busting
 * @param basename - Base name without extension (e.g., "maccaroni")
 * @param hash - 8-character hash
 * @returns Filename with hash (e.g., "maccaroni.a1b2c3d4.webp")
 */
export function getHashedFilename(basename: string, hash: string): string {
	return `${basename}.${hash}.webp`;
}

/**
 * Extracts basename from a potentially hashed filename
 * @param filename - Filename (e.g., "maccaroni.a1b2c3d4.webp" or "maccaroni.webp")
 * @returns Basename without hash or extension (e.g., "maccaroni")
 */
export function extractBasename(filename: string): string {
	// Remove .webp extension
	const withoutExt = filename.replace(/\.webp$/, '');
	// Remove hash if present (8 hex chars preceded by a dot)
	return withoutExt.replace(/\.[a-f0-9]{8}$/, '');
}