# Caching
The storage package supports caching to improve performance and reduce API calls. You provide your own cache implementation that follows the simple Cache interface.
## LRU Cache
Use the built-in LRU cache for simple in-memory caching:
```typescript
import { LRUCache } from "lru-cache";
import { DiskStorage } from "@visulima/storage";

const cache = new LRUCache({
    max: 1000, // Maximum number of items
    ttl: 3600000, // 1 hour in milliseconds
});

const storage = new DiskStorage({
    directory: "/uploads",
    cache,
});
```

## Custom Cache Implementation
Implement the Cache interface for any cache provider:
```typescript
import { DiskStorage, type Cache } from "@visulima/storage";
import { Redis } from "ioredis";

// Custom Redis cache implementation
class RedisCache implements Cache<string, any> {
    constructor(private redis: Redis) {}

    async get(key: string): Promise<any | undefined> {
        const value = await this.redis.get(key);
        return value ? JSON.parse(value) : undefined;
    }

    async set(key: string, value: any): Promise<boolean> {
        await this.redis.set(key, JSON.stringify(value));
        return true;
    }

    async delete(key: string): Promise<boolean> {
        await this.redis.del(key);
        return true;
    }

    async clear(): Promise<void> {
        await this.redis.flushall();
    }

    async has(key: string): Promise<boolean> {
        const exists = await this.redis.exists(key);
        return exists === 1;
    }
}

const redis = new Redis();
const cache = new RedisCache(redis);

const storage = new DiskStorage({
    directory: "/uploads",
    cache,
});
```

## BentoCache Integration
For advanced multi-tier caching, use BentoCache with the adapter:
```typescript
import { BentoCache, bentostore } from "bentocache";
import { memoryDriver } from "bentocache/drivers/memory";
import { redisDriver } from "bentocache/drivers/redis";
import { BentoCacheAdapter } from "@visulima/storage/utils/cache";

const bento = new BentoCache({
    default: "storage",
    stores: {
        storage: bentostore()
            .useL1Layer(memoryDriver({ maxSize: "10mb" }))
            .useL2Layer(
                redisDriver({
                    connection: { host: "127.0.0.1", port: 6379 },
                }),
            ),
    },
});

const cache = new BentoCacheAdapter({
    bento,
    namespace: "storage",
    defaultTtl: 3600000, // 1 hour
});

const storage = new DiskStorage({
    directory: "/uploads",
    cache,
});
```

## Transformer Caching
Cache transformed media files to avoid reprocessing:
```typescript
import { MediaTransformer } from "@visulima/storage/transformer";
import ImageTransformer from "@visulima/storage/transformer/image";
import VideoTransformer from "@visulima/storage/transformer/video";
import { LRUCache } from "lru-cache";

const transformer = new MediaTransformer(storage, {
    cache: new LRUCache({ max: 100, ttl: 3600000 }),
    ImageTransformer,
    VideoTransformer,
});
```

## Cache Interface
Any cache implementation must implement this interface:
```typescript
interface Cache<K = string, V = any> {
    get(key: K): V | undefined | Promise<V | undefined>;
    set(key: K, value: V, options?: { ttl?: number }): boolean | Promise<boolean>;
    delete(key: K): boolean | Promise<boolean>;
    clear(): void | Promise<void>;
    has(key: K): boolean | Promise<boolean>;
}
```

## Cache Strategies
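A simple starting point is a dependency-free, `Map`-backed cache that satisfies the interface above. The following is a hypothetical sketch (not shipped with the package) showing per-entry TTL handling with lazy eviction:

```typescript
// Minimal Map-backed cache satisfying the Cache interface shape.
// Hypothetical sketch: expired entries are evicted lazily on read.
class MapCache<K = string, V = any> {
    private store = new Map<K, { value: V; expiresAt?: number }>();

    constructor(private defaultTtl?: number) {}

    get(key: K): V | undefined {
        const entry = this.store.get(key);
        if (!entry) return undefined;
        if (entry.expiresAt !== undefined && Date.now() > entry.expiresAt) {
            this.store.delete(key); // evict expired entry lazily
            return undefined;
        }
        return entry.value;
    }

    set(key: K, value: V, options?: { ttl?: number }): boolean {
        const ttl = options?.ttl ?? this.defaultTtl;
        this.store.set(key, {
            value,
            expiresAt: ttl !== undefined ? Date.now() + ttl : undefined,
        });
        return true;
    }

    delete(key: K): boolean {
        return this.store.delete(key);
    }

    clear(): void {
        this.store.clear();
    }

    has(key: K): boolean {
        return this.get(key) !== undefined;
    }
}
```

Unlike `LRUCache`, this sketch never bounds its size, so it is only suitable when the key space is small and known.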
### File Metadata Caching
Storage backends automatically cache file metadata to reduce API calls:
```typescript
const storage = new S3Storage({
    bucket: "my-bucket",
    cache: new LRUCache({
        max: 1000,
        ttl: 3600000, // Cache metadata for 1 hour
    }),
});
```

### Transformed Media Caching
Cache transformed images, videos, and audio to avoid reprocessing:
```typescript
const imageTransformer = new ImageTransformer(storage, {
    cache: new LRUCache({
        max: 500, // Cache up to 500 transformed images
        ttl: 86400000, // 24 hours
    }),
    cacheTtl: 86400, // Cache TTL in seconds
});
```

### Multi-Tier Caching
Use a multi-tier cache for optimal performance:
```typescript
// L1: Fast in-memory cache (small, fast)
// L2: Redis cache (larger, slower)
// L3: Storage backend (largest, slowest)
const cache = new BentoCacheAdapter({
    bento: new BentoCache({
        stores: {
            storage: bentostore()
                .useL1Layer(memoryDriver({ maxSize: "10mb" }))
                .useL2Layer(redisDriver({ connection: { host: "localhost" } })),
        },
    }),
});
```

## Cache Invalidation
Manually invalidate cache entries when needed:
```typescript
// Delete a specific cache entry
await cache.delete(fileId);

// Clear all cache entries
await cache.clear();

// Check whether an entry exists
const exists = await cache.has(fileId);
```

## Best Practices
- **Set appropriate TTLs**: balance freshness with performance.
- **Monitor cache hit rates**: tune cache size based on observed usage.
- **Use multi-tier caching**: combine fast and slow cache layers.
- **Invalidate on updates**: clear cache entries when files are modified.
- **Consider cache size limits**: bound entry counts or memory to prevent memory issues with large caches.
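Monitoring hit rates does not require any special support from the cache itself; a thin wrapper can count hits and misses on any cache-like object. The `InstrumentedCache` below is a hypothetical sketch (not part of the package), shown here over a plain `Map` for simplicity:

```typescript
// Hypothetical wrapper that counts hits and misses so the hit rate
// can be observed and the cache size tuned accordingly.
class InstrumentedCache<K, V> {
    private hits = 0;
    private misses = 0;

    constructor(private inner: Map<K, V>) {}

    get(key: K): V | undefined {
        const value = this.inner.get(key);
        if (value === undefined) {
            this.misses++;
        } else {
            this.hits++;
        }
        return value;
    }

    set(key: K, value: V): void {
        this.inner.set(key, value);
    }

    /** Fraction of lookups served from cache (0 when no lookups yet). */
    get hitRate(): number {
        const total = this.hits + this.misses;
        return total === 0 ? 0 : this.hits / total;
    }
}
```

A low hit rate usually means the `max`/`ttl` settings are too small for the working set; a near-perfect one may mean the cache is larger than it needs to be.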
For more information, see the BentoCache documentation.