Optimizing database interactions remains critical for maintaining fast Rails applications. I’ve seen projects slow to a crawl under heavy data loads, but targeted query improvements can transform performance. Let me share practical techniques from real-world experience.
Eager loading prevents the common N+1 query problem. When fetching users along with their profiles and recent comments, loading associations individually creates excessive database roundtrips. Instead, use includes to fetch everything in a few efficient queries.
# Efficient loading
def user_profile
  @user = User.includes(:profile, recent_comments: :replies).find(params[:id])
end
I once reduced a page's loading time from 2.3 seconds to 190 milliseconds just by fixing eager loading. The includes method smartly chooses between separate queries and a single JOIN based on whether your conditions reference the association.
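For contrast, this is the N+1 shape that includes avoids. A sketch assuming the same hypothetical User model and recent_comments association as above:

```ruby
# N+1 anti-pattern: one query for the users, then one more per user
User.limit(50).each do |user|
  user.recent_comments.each { |comment| puts comment.body }
end
# 1 + 50 queries, versus 2 queries with User.includes(:recent_comments)
```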
Strategic indexing dramatically speeds up filtered sorts. Adding composite indexes on frequently queried columns creates efficient lookup paths. For an orders table filtered by priority and sorted by creation date, this migration helps:
class OptimizeOrderQueries < ActiveRecord::Migration[7.1]
  def change
    add_index :orders, [:priority, :created_at], order: { created_at: :desc }
  end
end
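A query shaped like the index can then be served by it directly. A sketch; the "high" priority value is an assumption:

```ruby
# Filter on the leading column, sort on the second: the composite
# index covers both, so the database avoids a separate sort step
Order.where(priority: "high").order(created_at: :desc).limit(20)
```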
Batch processing prevents memory overload during large operations. Processing thousands of records at once can crash servers. Instead, process in smaller chunks:
# Safe archiving
ExpiredListing.find_each(batch_size: 300) do |listing|
  listing.archive_and_notify
end
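When no per-record callbacks or notifications are needed, in_batches can update rows without instantiating models at all. A sketch using the same hypothetical ExpiredListing model:

```ruby
# Updates 500 rows per statement, entirely in SQL, skipping
# model instantiation and callbacks
ExpiredListing.in_batches(of: 500) do |batch|
  batch.update_all(archived_at: Time.current)
end
```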
Counter caches eliminate expensive counting operations. Without them, counting user posts requires scanning the entire table. Add a cache column:
# Migration
class AddArticlesCountToAuthors < ActiveRecord::Migration[7.1]
  def change
    add_column :authors, :articles_count, :integer, default: 0, null: false
  end
end
# Model: counter_cache is declared on the belongs_to side
class Article < ApplicationRecord
  belongs_to :author, counter_cache: true
end
Now author.articles_count retrieves a precalculated value instantly.
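Existing rows start with the cache column at zero, so backfill it once, for example from a data migration or console session:

```ruby
# Recalculate each author's cached count from the articles table
Author.find_each do |author|
  Author.reset_counters(author.id, :articles)
end
```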
The explain method reveals how queries execute. When a report page felt sluggish, explain showed a missing index:
SalesReport.where(quarter: 'Q3').order(total: :desc).explain
# Output showed sequential scan instead of index usage
Materialized views cache complex aggregations. For frequently accessed dashboards with expensive calculations:
# Define the view
class CreateSalesSummaries < ActiveRecord::Migration[7.1]
  def up
    execute <<-SQL
      CREATE MATERIALIZED VIEW sales_summaries AS
      SELECT product_id, SUM(quantity) AS total_sold
      FROM orders
      GROUP BY product_id
    SQL
  end

  def down
    execute "DROP MATERIALIZED VIEW sales_summaries"
  end
end
# Refresh when needed
ActiveRecord::Base.connection.execute("REFRESH MATERIALIZED VIEW sales_summaries")
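You can query the view through an ordinary model; marking it read-only guards against accidental writes. A sketch assuming the sales_summaries view defined above:

```ruby
class SalesSummary < ApplicationRecord
  self.table_name = "sales_summaries"

  # Views are not writable; fail loudly on save attempts
  def readonly?
    true
  end
end

# Example query against the cached aggregation:
# SalesSummary.order(total_sold: :desc).limit(10)
```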
Selective column retrieval minimizes data transfer. Fetch only what you need:
# Instead of entire rows
Project.select(:id, :name, :deadline).where(team_id: current_team.id)
I've reduced query times by 40% just by avoiding SELECT * on wide tables.
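When you only need raw values rather than model objects, pluck goes a step further than select. A sketch reusing the hypothetical Project query above:

```ruby
# Returns an array of [id, name] pairs directly from the database,
# skipping Active Record model instantiation entirely
Project.where(team_id: current_team.id).pluck(:id, :name)
```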
Query monitoring tools like PostgreSQL's pg_stat_statements help identify slow performers in production. Combine this with Rails' built-in logging to spot optimization opportunities.
# In development.rb
config.active_record.query_log_tags_enabled = true
config.active_record.query_log_tags = [:controller, :action]
Consistent measurement ensures optimizations actually help. Benchmark before and after changes using the Benchmark.ms method:
Benchmark.ms { Report.generate_complex_stats }
# => 2480.5 ms before, 320.7 ms after indexing
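Benchmark.ms comes from Active Support; outside Rails, the same measurement works with Ruby's standard Benchmark module, whose realtime method returns elapsed seconds:

```ruby
require "benchmark"

# Time any block and report the result in milliseconds
elapsed_ms = Benchmark.realtime { 100_000.times { Math.sqrt(42) } } * 1000
puts format("%.1f ms", elapsed_ms)
```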
These methods work together to maintain performance as applications scale. Start with the biggest bottlenecks, often N+1 queries or missing indexes, then implement caching strategies. Regular query reviews prevent new performance issues from creeping in over time.