ruby

How to Build Systems That Remember Everything: Event Sourcing in Ruby on Rails

Learn how to implement Event Sourcing in Ruby on Rails to build systems that remember every change. Perfect audit trails & time travel for complex business domains.

Let me explain how to build systems that remember everything. In traditional applications, when you update a record, the old data disappears. You’re left with only the latest state, like a chalkboard that gets erased with each change. This makes answering simple questions difficult. What was the order total before the discount? When exactly did the customer’s address change? Which user approved this modification yesterday?

There’s a different way to think about data. Instead of storing just the current state, you can store every change as a distinct, immutable record. This list of changes is the single source of truth. The current state is just a byproduct—it’s what you get when you replay all those changes from the beginning. This approach is called Event Sourcing.

I find this model powerful for complex business domains. It provides a built-in audit log, enables you to travel back in time within your data, and can make complicated business rules easier to manage. Let me show you how to start applying these ideas in a Ruby on Rails application.

We begin with the event itself. An event represents something that happened in the past. It’s a fact. “OrderCreated.” “PaymentReceived.” “AddressUpdated.” These facts are never altered or deleted. We just record new ones.

In code, an event is often a simple object with some data about what occurred. Here’s a basic structure to hold these facts.

# app/models/event.rb
class Event < ApplicationRecord
  # This event belongs to an 'aggregate'—a main entity like an Order or User
  belongs_to :aggregate, polymorphic: true

  # Store the details of the event and any extra context (like who did it)
  serialize :data, JSON
  serialize :metadata, JSON

  # Automatically sequence events for this aggregate to maintain order
  before_create :assign_sequence_number

  private

  def assign_sequence_number
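    # Note: this read-then-increment is not safe under concurrent writers; a
    # unique index on (aggregate_type, aggregate_id, sequence_number) turns a
    # race into a loud error instead of silent corruption.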
    last_event = Event.where(aggregate: aggregate).order(:sequence_number).last
    self.sequence_number = last_event ? last_event.sequence_number + 1 : 1
  end
end
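
The post's examples assume a backing table, so here is one plausible migration; the exact column set (a type column for single-table inheritance, a timestamp for when the fact occurred) is my inference from how the model is used, not canonical:

# db/migrate/XXXXXX_create_events.rb (a sketch; columns are inferred)
class CreateEvents < ActiveRecord::Migration[7.0]
  def change
    create_table :events do |t|
      t.string :type, null: false # STI column: "OrderCreatedEvent", etc.
      t.references :aggregate, polymorphic: true, null: false, index: false
      t.text :data
      t.text :metadata
      t.integer :sequence_number, null: false
      t.datetime :timestamp, null: false # when the fact occurred
      t.timestamps
    end

    # One sequence number per aggregate, enforced at the database level.
    add_index :events, %i[aggregate_type aggregate_id sequence_number], unique: true
  end
end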

Each type of event is a subclass that knows how to apply its effect. Think of it as a small, focused instruction.

# app/events/order_created_event.rb
class OrderCreatedEvent < Event
  # This method describes how to apply this fact to an Order object.
  def apply(order)
    order.status = :created
    order.customer_id = data['customer_id']
    order.total_amount_cents = data['total_amount_cents']
    order.created_at = timestamp # The event knows when it happened
  end
end

# app/events/payment_received_event.rb
class PaymentReceivedEvent < Event
  def apply(order)
    order.status = :paid
    order.paid_at = timestamp
    order.payment_method = data['payment_method']
    # The total doesn't change from the creation event; we just mark it paid.
  end
end

Now, where do these events live? They are linked to an Aggregate, a term from domain-driven design for a cluster of associated objects that we treat as a single unit for data changes. An Order and its LineItems might be one aggregate. The aggregate is responsible for ensuring it’s always in a valid, consistent state.

The aggregate’s current condition isn’t stored directly in a database row. Instead, it’s calculated by starting with a blank slate and replaying every event associated with it. Here’s a simplified Order aggregate.

# app/models/order.rb
class Order < ApplicationRecord
  has_many :events, as: :aggregate

  # Temporary list for new events before we save them
  attr_accessor :uncommitted_events

  # Rails instantiates records loaded from the database via `init_with`, not
  # `initialize`, so callbacks are needed to cover both the new and find paths.
  after_initialize { @uncommitted_events ||= [] }
  after_find :load_from_history

  # A command method: "Create this order."
  def create(customer_id, total_amount_cents)
    # Business rule: can't create an order that already exists.
    raise StandardError, 'Order already exists' if persisted?

    # Build the event fact.
    event = OrderCreatedEvent.new(
      aggregate: self,
      data: {
        'customer_id' => customer_id,
        'total_amount_cents' => total_amount_cents
      }
    )

    # Apply it to myself and stash it for later saving.
    apply_event(event)
  end

  # Another command: "Receive a payment for this order."
  def receive_payment(payment_method)
    # Business rule: you can only pay for a 'created' order.
    # (status is stored as a string, so compare strings, not symbols)
    raise StandardError, 'Order must be in created state' unless status.to_s == 'created'

    event = PaymentReceivedEvent.new(
      aggregate: self,
      data: { 'payment_method' => payment_method }
    )
    apply_event(event)
  end

  # This is the core: applying an event changes the aggregate's state.
  def apply_event(event)
    event.apply(self) # Call the event's `apply` method
    uncommitted_events << event # Remember to save this event later
  end

  # The magic trick: rebuilding state from persisted events.
  def load_from_history
    events.order(:sequence_number).each do |stored_event|
      # Find the right event class (e.g., OrderCreatedEvent) and apply it.
      event_class = stored_event.type.constantize
      event_instance = event_class.new(stored_event.attributes)
      event_instance.apply(self)
    end
    @uncommitted_events = [] # We've replayed, so nothing is newly uncommitted.
  end

  # Override save to persist both the aggregate's state AND its new events.
  def save
    transaction do
      super if changed? # Save the Order's current snapshot (optional, for convenience)
      uncommitted_events.each(&:save!) # Save all the new event records
      @uncommitted_events.clear # Clear the list
    end
  end
end

You use it like this:

# Creating an order generates and stores an event.
order = Order.new
order.create(123, 5000) # customer_id 123, $50.00
order.save
# This saves the Order record AND an OrderCreatedEvent record.

# Later, loading the order rebuilds its state from scratch.
loaded_order = Order.find(order.id)
puts loaded_order.status # => "created"
puts loaded_order.total_amount_cents # => 5000

# Receiving a payment adds a second event.
loaded_order.receive_payment('credit_card')
loaded_order.save
# Now the events table has OrderCreatedEvent and PaymentReceivedEvent for this aggregate.

This gives you a perfect history. But it has a problem. Every time you need an order, your code loads every single event and replays them. For an order with 50 updates, this is fine. For a customer account with 10,000 events, it’s painfully slow.

This is where Snapshots come in. Periodically, you save the aggregate’s entire current state. Next time you need it, you load the latest snapshot and only replay the events that happened after that snapshot was taken.

# app/models/snapshot.rb
class Snapshot < ApplicationRecord
  belongs_to :aggregate, polymorphic: true
  serialize :state, JSON # A JSON blob of the aggregate's attributes

  def self.for(aggregate)
    create!(
      aggregate: aggregate,
      sequence_number: aggregate.events.maximum(:sequence_number),
      state: aggregate.as_json
    )
  end
end

# A smarter repository to handle loading with snapshots.
class OrderRepository
  def find(id)
    order = Order.new(id: id)
    snapshot = Snapshot.where(aggregate: order).order(sequence_number: :desc).first

    events_query = Event.where(aggregate: order).order(:sequence_number)
    if snapshot
      # Start from the snapshot state
      order.assign_attributes(snapshot.state)
      # Only replay events that occurred after the snapshot
      events_query = events_query.where('sequence_number > ?', snapshot.sequence_number)
    end

    events_query.each do |event_record|
      event_record.apply(order) # STI already gives us the concrete subclass
    end

    order
  end

  def save(order)
    ApplicationRecord.transaction do
      order.save
      # Simple rule: take a snapshot every 100 events
      if order.events.count % 100 == 0
        Snapshot.for(order)
      end
    end
  end
end
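
A quick usage sketch (the id is illustrative): the repository replaces direct Order.find wherever replay cost matters.

repo = OrderRepository.new
order = repo.find(42) # loads the latest snapshot (if any) plus newer events
order.receive_payment('credit_card')
repo.save(order) # persists the new event; snapshots every 100 events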

Now, the read side. Rebuilding state for every query is inefficient. In a web app, you often need to display a list of orders with their current status and total. You don’t want to replay events for each one on every page load.

The solution is to separate the Write Model (the aggregates and events) from the Read Model (optimized views for querying). You create special database tables that are updated every time an event occurs. These are called Projections.

# app/models/order_read_model.rb
# This is a simple ActiveRecord model for fast reading.
class OrderReadModel < ApplicationRecord
  # Has columns like: order_id, customer_id, status, total_amount_cents, paid_at, version
end

# app/projectors/order_projector.rb
class OrderProjector
  # This class listens for events and updates the read model.

  def self.project(event)
    case event
    when OrderCreatedEvent
      OrderReadModel.create!(
        order_id: event.aggregate_id,
        customer_id: event.data['customer_id'],
        status: 'created',
        total_amount_cents: event.data['total_amount_cents'],
        version: event.sequence_number
      )
    when PaymentReceivedEvent
      order = OrderReadModel.find_by!(order_id: event.aggregate_id)
      order.update!(
        status: 'paid',
        paid_at: event.timestamp,
        version: event.sequence_number
      )
    end
  end
end

# You need to hook this projector to run after an event is saved. Rails no
# longer ships observers, so the simplest wiring is a callback on the Event
# model itself (a pub/sub gem like Wisper is another option).
# In app/models/event.rb
class Event < ApplicationRecord
  after_create_commit { OrderProjector.project(self) }
end

Your controllers then query the fast OrderReadModel for display, while all commands go through the Order aggregate. This is the essence of the CQRS (Command Query Responsibility Segregation) pattern that often accompanies Event Sourcing.
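
Concretely, the read side of a controller might look like this minimal sketch (the OrdersController and its status filter are my own illustration):

# app/controllers/orders_controller.rb
class OrdersController < ApplicationController
  def index
    # Plain indexed SQL against the projection--no event replay on page load.
    @orders = OrderReadModel.where(status: params.fetch(:status, 'paid'))
  end
end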

What about side effects? When an order is created, you might need to reserve inventory or send a confirmation email. These shouldn’t be part of the core event application logic. You handle them with Event Handlers.

# app/handlers/inventory_handler.rb
class InventoryHandler
  def handle(event)
    if event.is_a?(OrderCreatedEvent)
      # Extract items from event data and reserve stock
      event.data['items'].each do |item|
        product = Product.find(item['product_id'])
        product.reserve(item['quantity'])
      end
    end
    if event.is_a?(OrderCancelledEvent)
      # Release the reserved stock
      # ...
    end
  end
end

# app/handlers/email_handler.rb
class EmailHandler
  def handle(event)
    if event.is_a?(OrderCreatedEvent)
      customer = Customer.find(event.data['customer_id'])
      OrderMailer.confirmation(customer, event.aggregate_id).deliver_later
    end
  end
end

# Dispatch events to all handlers after they are successfully saved.
# This ensures side effects don't block the save and can be retried independently.
class EventDispatcher
  HANDLERS = [InventoryHandler.new, EmailHandler.new]

  def self.dispatch(event)
    HANDLERS.each { |handler| handler.handle(event) }
    # The projector exposes `project` rather than `handle`, so call it
    # directly instead of letting a respond_to? guard skip it silently.
    OrderProjector.project(event)
  end
end

# Hook it up in the Event model (superseding the projector-only hook above)
class Event < ApplicationRecord
  after_create_commit { EventDispatcher.dispatch(self) }
end

The true power of this approach becomes clear when you need to ask new questions of your data. Since you have the entire history, you can create new projections long after the events occurred, without changing your core write logic.

Need to know the average order value per customer last quarter? Build a new projection that listens to OrderCreatedEvent and PaymentReceivedEvent and populates a new CustomerQuarterlyStats table.
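
A rough sketch of such a projector; the CustomerQuarterlyStats model and its columns are hypothetical, and a fuller version would also fold in PaymentReceivedEvent:

# app/projectors/customer_stats_projector.rb (hypothetical)
class CustomerStatsProjector
  def self.project(event)
    return unless event.is_a?(OrderCreatedEvent)

    stats = CustomerQuarterlyStats.find_or_create_by!(
      customer_id: event.data['customer_id'],
      quarter: event.timestamp.beginning_of_quarter
    )
    stats.increment!(:order_count)
    stats.increment!(:total_cents, event.data['total_amount_cents'])
  end
end

# Backfill it from history--no changes to the write side required.
# (find_each batches in primary-key order, which approximates insertion order.)
Event.find_each { |event| CustomerStatsProjector.project(event) }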

Need to debug a specific customer’s strange order journey? Replay their events in sequence. You can see the exact state of their order before and after every single action.
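
With the models from earlier, that replay is a few lines (order_id stands in for the aggregate you're investigating):

order = Order.new(id: order_id)
Event.where(aggregate: order).order(:sequence_number).each do |event|
  event.apply(order)
  puts "after #{event.type} (##{event.sequence_number}): status=#{order.status}"
end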

Here is a more complete example, showing a command service that orchestrates this flow, using a dedicated event store table for better performance.

# db/migrate/XXXXXX_create_event_store_events.rb
class CreateEventStoreEvents < ActiveRecord::Migration[7.0]
  def change
    create_table :event_store_events do |t|
      t.string :stream, null: false  # e.g., "Order-#{id}"
      t.string :event_type, null: false # e.g., "OrderCreatedEvent"
      t.jsonb :data, null: false
      t.jsonb :metadata
      t.integer :version, null: false
      t.datetime :created_at, null: false

      t.index [:stream, :version], unique: true
    end
  end
end

# app/models/event_store_event.rb
class EventStoreEvent < ApplicationRecord; end

# lib/event_store.rb
class EventStore
  def self.append_to_stream(stream_name, event, expected_version: nil)
    ActiveRecord::Base.transaction do
      current_version = last_version_for(stream_name)

      # Optimistic concurrency check
      if expected_version && current_version != expected_version
        raise "Concurrency conflict on stream #{stream_name}"
      end

      new_version = current_version + 1
      EventStoreEvent.create!(
        stream: stream_name,
        event_type: event.class.name,
        data: event.data,
        metadata: event.metadata || {},
        version: new_version,
        created_at: Time.current
      )
      new_version
    end
  end

  def self.read_stream(stream_name)
    EventStoreEvent.where(stream: stream_name).order(:version).to_a
  end

  def self.last_version_for(stream_name)
    EventStoreEvent.where(stream: stream_name).maximum(:version) || 0
  end
  # `private` has no effect on methods defined with `def self.`,
  # so mark it explicitly.
  private_class_method :last_version_for
end

# app/commands/create_order_command.rb
class CreateOrderCommand
  def initialize(customer_id, items)
    @customer_id = customer_id
    @items = items
  end

  def run
    # Calculate total, etc.
    total = @items.sum { |i| i[:price_cents] * i[:quantity] }

    # Build the event. Use string keys so the data reads the same in memory
    # as it will after a JSON round-trip from the store.
    event = OrderCreatedEvent.new(
      data: {
        'customer_id' => @customer_id,
        'items' => @items,
        'total_amount_cents' => total
      },
      metadata: { 'command_id' => SecureRandom.uuid }
    )

    # Persist it to its stream
    stream_name = "Order-#{SecureRandom.uuid}"
    EventStore.append_to_stream(stream_name, event)

    # Immediately project it to the read model for quick access. (This assumes
    # a projector variant that accepts the stream id, since this event isn't
    # tied to an ActiveRecord aggregate.)
    OrderProjector.project(event, stream_id: stream_name)

    # Return the stream name, which acts as the order ID
    stream_name
  end
end

This structure is more advanced but scales well. The event store is optimized for appending and reading linear streams. The command is the single entry point for this operation.
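
A usage sketch, under the assumptions above:

stream = CreateOrderCommand.new(123, [{ price_cents: 2500, quantity: 2 }]).run
EventStore.read_stream(stream).each do |record|
  puts "v#{record.version}: #{record.event_type}"
end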

Is Event Sourcing right for your project? It’s not a free lunch. It adds complexity. Your data is no longer just “in the database”; it’s in the event stream. Debugging requires thinking in terms of events and projections.

I consider it when the business domain has complex rules that evolve over time, when an audit trail is a legal requirement, or when the ability to reconstruct past states is a core feature. Financial systems, e-commerce platforms, and scheduling applications often benefit.

For simpler applications, like a basic blog or content management system, it’s likely overkill. The overhead of managing events, projections, and eventual consistency can outweigh the benefits.

Starting small is key. Try it with a single bounded context in your application—like the Order Processing module. Use simple ActiveRecord models for events at first, as shown in the early examples. Get comfortable with the mental model of commanding an aggregate, storing events, and projecting to read models.

As the needs of that part of the system grow, you can introduce more sophisticated elements like a dedicated event store, snapshotting, or event-driven messaging between services. The fundamental pattern remains the same: capture the fact, make it permanent, and let everything else flow from that record of truth.



