I’ve spent years optimizing Rails applications under heavy traffic. Caching isn’t just a performance tweak - it’s the backbone of responsive systems. When database queries start choking response times, layered caching becomes essential. Let me share proven patterns I’ve implemented across e-commerce platforms and SaaS products.
Memory-Redis-Database Layering
The foundation starts with a three-tier cache structure. An in-process memory cache provides microsecond-level access for repeated reads, Redis serves as shared storage across instances, and the database acts as the source of truth. Here’s how I structure it:
class ReviewCache < MultiLayerCache
  def top_reviews(product_id)
    fetch("product_#{product_id}_top_reviews") do
      @model.includes(:user)
            .where(product_id: product_id, status: :approved)
            .order(rating: :desc)
            .limit(50)
            .to_a # materialize results so the cache stores records, not a lazy relation
    end
  end

  def expire_product_reviews(product_id)
    expire("product_#{product_id}_top_reviews")
  end
end
# Controller implementation
class ProductsController < ApplicationController
  after_action :clear_review_cache, only: [:update, :destroy]

  def show
    @product = Product.find(params[:id])
    @reviews = ReviewCache.new(Review).top_reviews(@product.id)
  end

  private

  def clear_review_cache
    ReviewCache.new(Review).expire_product_reviews(@product.id)
  end
end
The memory cache (typically 64MB per instance) catches repeated reads within the same process. Redis handles cross-process caching with configurable persistence, and touch: true on associations keeps data coherent: updating a review bumps the product’s updated_at, so versioned keys built from the product roll over automatically.
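The MultiLayerCache base class isn’t shown above; here’s a minimal sketch of how I’d wire the two read layers together, assuming Rails.cache is backed by Redis (the class name matches the example above, but the internals and sizes are illustrative):

# app/models/concerns/multi_layer_cache.rb (illustrative sketch)
class MultiLayerCache
  # Per-process layer, capped so a busy instance can't balloon its heap
  MEMORY = ActiveSupport::Cache::MemoryStore.new(size: 64.megabytes)

  def initialize(model)
    @model = model
  end

  # Read-through: memory first, then Redis (Rails.cache), then the block (database)
  def fetch(key, expires_in: 1.hour, &block)
    MEMORY.fetch(key, expires_in: 5.minutes) do
      Rails.cache.fetch(key, expires_in: expires_in, &block)
    end
  end

  # Expire both layers; other processes' memory copies age out via their short TTL
  def expire(key)
    MEMORY.delete(key)
    Rails.cache.delete(key)
  end
end

The short in-process TTL bounds how long a stale per-instance copy can outlive an explicit expire issued by another process.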
Fragment Caching Granularity
Partial templates benefit from surgical caching. I split views into independently cached fragments:
<% cache [@product, 'header'] do %>
  <h1><%= @product.name %></h1>
  <!-- Product header content -->
<% end %>

<% cache [@product, 'details'] do %>
  <div class="specs">
    <%= render partial: 'tech_specs', locals: { product: @product } %>
  </div>
<% end %>
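Nesting goes a level deeper with Russian-doll caching: an outer product fragment wraps cached collection partials, so editing one review re-renders only that row while the outer fragment reassembles from the inner caches. A sketch, assuming hypothetical products/_summary and reviews/_review partials:

<% cache @product do %>
  <%= render partial: 'products/summary', locals: { product: @product } %>
  <%= render partial: 'reviews/review', collection: @product.reviews, cached: true %>
<% end %>

The cached: true option on collection rendering fetches all review fragments in a single multi-read instead of one cache read per partial.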
Because each fragment key embeds the product’s cache version, fragments expire on their own whenever a product is updated. For an explicit sweep of stale entries, I add a callback:
# app/models/product.rb
after_commit :expire_fragments

def expire_fragments
  # delete_matched is supported by the Redis and memory stores, but not memcached
  Rails.cache.delete_matched("views/products/#{id}-*")
end
Strategic Low-Level Caching
For computationally heavy results, I bypass ActiveRecord:
class WeatherService
  def forecast(zip_code)
    Rails.cache.fetch("weather/#{zip_code}/v7", expires_in: 30.minutes) do
      WeatherAPI.get_forecast(zip_code).to_json # external API call
    end
  end
end
# With compression for large payloads
Rails.cache.fetch('large_dataset', compress: true) do
  Product.export_report(:annual_sales) # 5MB JSON
end
HTTP Caching Directives
Leverage browser caching with conditional requests:
class ProductsController < ApplicationController
  def show
    @product = Product.find(params[:id])

    if stale?(etag: @product, last_modified: @product.updated_at)
      respond_to do |format|
        format.html
        format.json { render json: @product }
      end
    end
  end
end
Add Cache-Control headers for static assets in production:
# config/environments/production.rb
config.public_file_server.headers = {
  'Cache-Control' => 'public, max-age=31536000',
  'Expires' => 1.year.from_now.httpdate # computed once at boot; max-age is what browsers rely on
}
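For dynamic responses, the same directives can be set per action. A minimal sketch using Rails’ expires_in and fresh_when helpers (the 10-minute window is illustrative):

def show
  @product = Product.find(params[:id])
  expires_in 10.minutes, public: true # Cache-Control: public, max-age=600
  fresh_when @product                 # ETag/Last-Modified for conditional revalidation
end

Public caching invites CDNs and shared proxies to store the response, so reserve it for pages with no per-user content.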
Query Cache Optimization
Rails automatically caches queries within requests. I enhance this with manual control:
ActiveRecord::Base.uncached do
  Order.export_analytics_report # bypass the query cache for fresh data
end

# Cache complex joins under a key built from the newest rows involved
join_key = ['products_with_categories', Product.maximum(:updated_at), Category.maximum(:updated_at)]
Rails.cache.fetch(join_key, expires_in: 1.hour) do
  Product.joins(:category).select('products.*, categories.name AS category_name').to_a
end
Stampede Protection
Prevent cache regeneration avalanches during expiration:
# config/initializers/redis_mutex.rb
class RedisMutex
  REDIS = Redis.new # one shared connection per process

  # Only one process regenerates an expired entry; the rest back off and retry
  def self.with_lock(key, timeout: 5)
    deadline = Time.now + timeout
    # SET NX EX is atomic and auto-expires, so a crashed worker can't wedge the lock
    until REDIS.set(key, '1', nx: true, ex: timeout)
      raise "Timed out waiting for #{key}" if Time.now > deadline
      sleep(rand * 0.1) # randomized backoff
    end
    begin
      yield
    ensure
      REDIS.del(key)
    end
  end
end

# Cache with lock; race_condition_ttl lets other processes serve the stale value briefly
Rails.cache.fetch('hot_products', expires_in: 10.minutes, race_condition_ttl: 30) do
  RedisMutex.with_lock('hot_products_lock') do
    Product.hot_items(force: true) # regenerate from the database
  end
end
Proactive Cache Warming
Preload caches during off-peak hours:
# lib/tasks/warm_cache.rake
task warm_cache: :environment do
  Product.featured_ids.each do |id|
    Rails.cache.fetch("product_#{id}_full") do
      ProductSerializer.new(Product.find(id)).to_json
    end
  end
end
# Schedule with cron
0 4 * * * /bin/bash -l -c 'cd /app && RAILS_ENV=production bundle exec rake warm_cache'
Cache Monitoring Essentials
Instrumentation prevents invisible failures:
# config/initializers/cache_metrics.rb
ActiveSupport::Notifications.subscribe('cache_read.active_support') do |*args|
  event = ActiveSupport::Notifications::Event.new(*args)
  StatsD.increment("cache.read.#{event.payload[:hit] ? 'hit' : 'miss'}")
end
# Redis memory check (run from a scheduled job)
mem = Redis.new.info('memory')
usage = mem['used_memory'].to_f / mem['total_system_memory'].to_f
StatsD.gauge('redis.memory_usage_ratio', usage) # alert in your monitoring tool above ~0.8
These patterns form a cohesive strategy. Start with layered caching fundamentals, then implement fragment caching for view efficiency. Low-level caching handles expensive computations while HTTP caching reduces server load. Query caching works automatically but benefits from manual oversight. Stampede protection maintains stability during cache expiration events, and proactive warming ensures cache readiness. Monitoring completes the system with actionable insights.
The key is balancing freshness and performance. I implement 10-15% shorter TTLs for volatile data and version all cache keys. For financial data, I’ll use 1-minute TTLs with synchronous expiration, while product catalogs might use 6-hour caching.
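As a sketch of how that policy can be encoded (the constant, the QuoteFeed service, and the key format are illustrative, not from a real app):

# Illustrative TTL policy keyed by data volatility
CACHE_TTLS = {
  financial_quotes: 1.minute,
  weather:          30.minutes,
  product_catalog:  6.hours
}.freeze

def cached_quote(symbol)
  Rails.cache.fetch("quote/#{symbol}/v3", expires_in: CACHE_TTLS[:financial_quotes]) do
    QuoteFeed.latest(symbol) # hypothetical upstream call
  end
end

Bumping the version suffix in the key (v3 to v4) force-refreshes everything after a format change without waiting for TTLs to lapse.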
Remember: Caching is iterative. Profile with rack-mini-profiler, analyze hit rates weekly, and adjust layers as traffic patterns evolve. Properly implemented, these patterns sustain 10X traffic spikes without database overload while maintaining sub-100ms response times.