Caching in Ruby on Rails serves as a vital performance optimization technique that can dramatically improve application response times. I’ve spent years implementing various caching strategies, and I’ll share the most effective techniques I’ve discovered.
Multi-level Cache Architecture
A multi-level cache architecture combines different caching layers to optimize data retrieval. The most common implementation pairs a fast in-memory store (Rails' MemoryStore, or a networked one such as Memcached) as the first layer with a disk-based store as the backup.
class MultiLevelCache
  # In-process memory store, backed by a disk-based FileStore.
  MEMORY_STORE = ActiveSupport::Cache::MemoryStore.new
  DISK_STORE = ActiveSupport::Cache::FileStore.new('tmp/cache')

  def fetch(key, &block)
    MEMORY_STORE.fetch(key) do
      DISK_STORE.fetch(key, &block)
    end
  end
end
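The read-through behavior can be exercised in plain Ruby, with two Hashes standing in for the memory and disk layers (a sketch of the pattern, not of the Rails stores):

```ruby
# Plain-Ruby sketch: two Hashes stand in for the memory and disk layers.
class TwoLevelCache
  def initialize
    @memory = {}
    @disk = {}
  end

  # Read-through: check memory, then disk, then compute.
  # A miss at either level populates that level on the way back up.
  def fetch(key)
    @memory[key] ||= @disk[key] ||= yield
  end
end

cache = TwoLevelCache.new
cache.fetch(:greeting) { "hello" }   # computes once, fills both layers
cache.fetch(:greeting) { "ignored" } # => "hello", served from memory
```

Note that `||=` treats a cached `false` or `nil` as a miss; a real store checks key presence instead.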
Cache key design remains crucial for effective caching. Complex keys should incorporate relevant model attributes and timestamps.
def cache_key_generator(record)
  "#{record.class.name}/#{record.id}-#{record.updated_at.to_i}"
end
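To see the resulting key format concretely, a Struct can stand in for a model (Rails models already provide `cache_key` and `cache_key_with_version`, which produce keys of a similar shape):

```ruby
# A Struct stands in for an ActiveRecord model (hypothetical data).
Record = Struct.new(:id, :updated_at)

def cache_key_generator(record)
  "#{record.class.name}/#{record.id}-#{record.updated_at.to_i}"
end

record = Record.new(42, Time.at(1_700_000_000))
cache_key_generator(record)  # => "Record/42-1700000000"
```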
Cache warming prevents the initial slowdown when populating empty caches. Background jobs can pre-populate frequently accessed data.
class CacheWarmer
  include Sidekiq::Worker

  def perform(resource_type)
    # `frequently_accessed` is an application-defined scope.
    resources = resource_type.constantize.frequently_accessed
    resources.each do |resource|
      Rails.cache.write(resource.cache_key, resource)
    end
  end
end
Version-based invalidation offers a clean approach to managing cache updates:
class VersionedCache
  def fetch_with_version(key)
    version = current_version(key)
    versioned_key = "#{key}/v#{version}"
    Rails.cache.fetch(versioned_key) do
      yield
    end
  end

  private

  def current_version(key)
    Rails.cache.fetch("#{key}/version") { 1 }
  end

  # Call when the underlying data changes; stale versioned keys
  # are simply left to expire rather than deleted.
  def increment_version(key)
    Rails.cache.increment("#{key}/version")
  end
end
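The invalidation flow can be exercised end to end with a Hash standing in for `Rails.cache` (a sketch of the pattern only):

```ruby
# Hash-backed stand-in for Rails.cache, enough to show version-based
# invalidation: bumping the version retargets reads to a fresh key.
class VersionedStore
  def initialize
    @store = {}
  end

  def fetch(key)
    @store.key?(key) ? @store[key] : @store[key] = yield
  end

  def current_version(key)
    fetch("#{key}/version") { 1 }
  end

  def fetch_with_version(key)
    fetch("#{key}/v#{current_version(key)}") { yield }
  end

  def bump(key)
    @store["#{key}/version"] = current_version(key) + 1
  end
end

store = VersionedStore.new
store.fetch_with_version("user:1") { "old" }   # => "old"
store.bump("user:1")                           # future reads use a new key
store.fetch_with_version("user:1") { "fresh" } # => "fresh"
```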
Cache store selection impacts performance significantly. Redis offers advanced features like automatic key expiration and atomic operations:
config.cache_store = :redis_cache_store, {
  url: ENV['REDIS_URL'],
  expires_in: 1.day,
  race_condition_ttl: 10,
  error_handler: ->(method:, returning:, exception:) do
    Rails.logger.error("Redis cache #{method} failed: #{exception}")
  end
}
Race condition prevention becomes essential in high-traffic applications:
class SafeCache
  def fetch_with_lock(key, &block)
    # race_condition_ttl lets one process briefly serve a stale value while
    # another recomputes; it only takes effect alongside expires_in.
    Rails.cache.fetch(key, expires_in: 1.hour, race_condition_ttl: 10.seconds) do
      with_lock(key, &block)
    end
  end

  private

  def with_lock(key)
    lock_key = "#{key}_lock"
    # unless_exist turns the write into a try-lock: it succeeds only
    # when no other process already holds the lock.
    acquired = Rails.cache.write(lock_key, true, unless_exist: true, expires_in: 30.seconds)
    if acquired
      begin
        yield
      ensure
        Rails.cache.delete(lock_key)
      end
    else
      # Another process is computing; wait for it, then read its result.
      sleep 0.1 until Rails.cache.read(lock_key).nil?
      Rails.cache.read(key)
    end
  end
end
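The `unless_exist` write is effectively a distributed try-lock. Its semantics can be sketched with a Hash plus a Mutex, where the Mutex plays the role of the cache store's atomicity (a local model, not a distributed lock):

```ruby
# Sketch of try-lock semantics: write succeeds only if the key is absent.
class HashLockStore
  def initialize
    @store = {}
    @mutex = Mutex.new
  end

  # Mimics Rails.cache.write(key, true, unless_exist: true):
  # returns true if the lock was acquired, false if already held.
  def try_lock(key)
    @mutex.synchronize do
      return false if @store.key?(key)
      @store[key] = true
    end
  end

  def unlock(key)
    @mutex.synchronize { @store.delete(key) }
  end
end

locks = HashLockStore.new
locks.try_lock(:report)  # => true, first caller wins
locks.try_lock(:report)  # => false, lock already held
```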
Cache monitoring helps identify performance bottlenecks:
class CacheMonitor
  # Note: tagging metrics with raw keys can explode tag cardinality;
  # in production, prefer tagging with a key prefix or namespace.
  def track_cache_hit(key)
    StatsD.increment('cache.hit', tags: ["key:#{key}"])
  end

  def track_cache_miss(key)
    StatsD.increment('cache.miss', tags: ["key:#{key}"])
  end

  def track_cache_write(key)
    StatsD.increment('cache.write', tags: ["key:#{key}"])
  end
end
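These counters only pay off once they are turned into a hit ratio. A small helper (hypothetical; the hit and miss totals would come from your metrics backend) shows the arithmetic:

```ruby
# Derive a cache hit ratio from accumulated hit/miss counters.
def cache_hit_rate(hits, misses)
  total = hits + misses
  return 0.0 if total.zero?
  (hits.to_f / total).round(4)
end

cache_hit_rate(900, 100)  # => 0.9
```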
Fragment caching optimizes partial view rendering:
class ProductsController < ApplicationController
  def index
    @products = Product.all
  end
end

# View template
<% cache_if user_signed_in?, @products do %>
  <% @products.each do |product| %>
    <% cache [product, current_user&.admin?] do %>
      <%= render partial: 'product', locals: { product: product } %>
    <% end %>
  <% end %>
<% end %>
Russian Doll caching nests cache fragments:
class Comment < ApplicationRecord
  belongs_to :post, touch: true
end

class Post < ApplicationRecord
  has_many :comments
end

# View template
<% cache post do %>
  <%= render post %>
  <% post.comments.each do |comment| %>
    <% cache comment do %>
      <%= render comment %>
    <% end %>
  <% end %>
<% end %>
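The `touch: true` cascade is what makes the outer fragment invalidate: saving a comment bumps its post's timestamp, which changes the post's cache key. A Struct-based sketch of that cascade (stand-ins, not ActiveRecord):

```ruby
# Structs stand in for the models; `touch` mimics the
# `belongs_to :post, touch: true` timestamp cascade.
Post = Struct.new(:id, :updated_at)

Comment = Struct.new(:id, :post, :updated_at) do
  def touch(now = Time.now)
    self.updated_at = now
    post.updated_at = now  # cascade: the outer fragment's key changes too
  end
end

post = Post.new(1, Time.at(100))
comment = Comment.new(5, post, Time.at(100))
comment.touch(Time.at(200))
post.updated_at.to_i  # => 200, so the outer post fragment now misses
```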
Caching the results of expensive queries reduces database load (this low-level result caching is distinct from Rails' built-in per-request SQL query cache):
class QueryCache
  def fetch_records
    Rails.cache.fetch('expensive_query', expires_in: 1.hour) do
      # to_a forces execution so the results, not the lazy relation, are cached.
      Product.joins(:category)
             .where(active: true)
             .includes(:variants)
             .to_a
    end
  end
end
HTTP caching implementation:
class ProductsController < ApplicationController
  def show
    @product = Product.find(params[:id])
    fresh_when(@product)
  end
end

# Alternative with ETag
class ArticlesController < ApplicationController
  def show
    @article = Article.find(params[:id])
    if stale?(etag: @article, last_modified: @article.updated_at)
      render @article
    end
  end
end
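Under the hood, `fresh_when` and `stale?` compare an ETag derived from the record against the one the client sends back. A simplified sketch of that comparison (Rails' actual scheme digests `cache_key_with_version` and differs in details):

```ruby
require 'digest/md5'

# Simplified: derive a weak ETag from a record's cache key, then compare it
# against the ETag a client sent in If-None-Match.
def weak_etag_for(cache_key)
  %(W/"#{Digest::MD5.hexdigest(cache_key)}")
end

def fresh?(request_etag, cache_key)
  request_etag == weak_etag_for(cache_key)
end

etag = weak_etag_for("article/7-1700000000")
fresh?(etag, "article/7-1700000000")  # => true, respond 304 Not Modified
fresh?(etag, "article/7-1700009999")  # => false, record changed; re-render
```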
Counter caching avoids running COUNT queries on every request:
class Post < ApplicationRecord
  belongs_to :author
  belongs_to :category, counter_cache: true
end

class AddCounterCacheToCategories < ActiveRecord::Migration[6.1]
  def change
    add_column :categories, :posts_count, :integer, default: 0
    reversible do |dir|
      dir.up { data }
    end
  end

  def data
    Category.find_each do |category|
      Category.reset_counters(category.id, :posts)
    end
  end
end
Cache sweepers (removed from Rails core in 4.0 and available via the rails-observers gem) expire caches in response to model callbacks:
class ProductSweeper < ActionController::Caching::Sweeper
  observe Product

  def after_save(product)
    expire_cache_for(product)
  end

  def after_destroy(product)
    expire_cache_for(product)
  end

  private

  def expire_cache_for(product)
    expire_page(controller: 'products', action: 'show', id: product.id)
    expire_fragment("product_#{product.id}")
  end
end
Low-level cache implementation:
class LowLevelCache
  def read_multi(*keys)
    Rails.cache.read_multi(*keys)
  end

  def write_multi(hash)
    hash.each do |key, value|
      Rails.cache.write(key, value)
    end
  end

  # Note: Rails.cache.fetch_multi provides this behavior natively,
  # yielding each missing key to its block individually.
  def fetch_multi(*keys)
    results = read_multi(*keys)
    missing = keys - results.keys
    if missing.any?
      new_values = yield(missing)
      write_multi(new_values)
      results.merge!(new_values)
    end
    results
  end
end
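The fetch_multi flow can be demonstrated with a Hash-backed store (a sketch; note that the real `Rails.cache.fetch_multi` yields each missing key one at a time, whereas this version, like the code above, yields the whole missing list and expects a hash back):

```ruby
# Hash-backed sketch of the batched fetch_multi pattern above.
class HashCache
  def initialize
    @store = {}
  end

  def fetch_multi(*keys)
    results = @store.slice(*keys)
    missing = keys - results.keys
    if missing.any?
      # The block receives all missing keys and returns {key => value}.
      new_values = yield(missing)
      @store.merge!(new_values)
      results.merge!(new_values)
    end
    results
  end
end

cache = HashCache.new
cache.fetch_multi(:a, :b) { |keys| keys.to_h { |k| [k, k.to_s.upcase] } }
# => {:a=>"A", :b=>"B"}; a second call serves both keys without the block
```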
These caching techniques significantly improve Rails application performance when implemented correctly. Regular monitoring and adjustment ensure optimal cache effectiveness as application needs evolve.