I want to talk about building features that feel alive. You know the kind—where a new message pops up on your screen without you hitting refresh, or where you see a little indicator that a coworker is typing a reply. In the Rails world, this magic often starts with a tool called Action Cable. It’s the built-in way to handle WebSockets, which are persistent, two-way connections between your user’s browser and your server, so the server can push data the moment something happens instead of waiting to be asked.
Getting started is straightforward, but building something that’s solid, secure, and can handle a lot of users requires some thoughtful patterns. These aren’t secret recipes, just proven ways to structure your code. I’ll share several that I’ve found indispensable, and we’ll look at the actual code that makes them work.
First things first: who is connecting? You can’t just let anyone connect to your WebSocket server. You need to know who they are, just like you do in a regular controller. This happens in the Connection class.
module ApplicationCable
  class Connection < ActionCable::Connection::Base
    identified_by :current_user, :session_id

    def connect
      self.current_user = find_verified_user
      self.session_id = request.session.id
      reject_unauthorized_connection unless current_user
    end

    private

    def find_verified_user
      # Prefer the Warden user (e.g. from Devise), then fall back to the signed cookie.
      env['warden']&.user || User.find_by(id: cookies.encrypted[:user_id])
    end
  end
end
Here, I’m doing two important things. First, I find the user using the same session or cookie data my normal app uses. Second, the identified_by line is crucial: it says, “For this connection, remember this current_user and this session_id.” Later, in my channels, I can access current_user directly, and the session_id gives me a stable handle on each individual session (the rate limiter later in this post leans on it). If we can’t find a user, we reject the connection entirely.
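Because the connection is identified by current_user, other parts of the app can reach it too. As a quick aside, Action Cable’s remote connections API lets you force-close every connection identified by a given user, say when an account is deactivated; in this small sketch, user is assumed to be an already-loaded User record.

# Close every cable connection identified by this user (e.g. after deactivation).
ActionCable.server.remote_connections.where(current_user: user).disconnect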
Now that someone is connected, they want to subscribe to a specific stream of information. Maybe it’s a chat room. But they shouldn’t be able to listen in on just any room. You need to authorize the subscription.
class ChatChannel < ApplicationCable::Channel
  def subscribed
    chat = Chat.find_by(id: params[:room_id])

    if chat && chat.participants.include?(current_user)
      stream_from "chat_#{params[:room_id]}"
      stream_for current_user
    else
      # Unknown room or uninvited user: refuse the subscription.
      reject
    end
  end
end
In the subscribed method, I look up the chat room based on an ID sent from the frontend. I check if my current_user (from the connection) is allowed to be there. If they are, I start streaming. Notice I set up two streams. stream_from "chat_#{room_id}" is a public broadcast channel for the room. stream_for current_user is a private channel, just for that user. I can use it to send them personal notifications.
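That per-user stream pairs with broadcast_to. From anywhere else in the app, a job or a controller for instance, a call like the sketch below would push a private payload to just that user’s ChatChannel subscription; the payload shape here is made up for illustration.

# Deliver a private message on the stream created by `stream_for current_user`.
# `user` is assumed to be the same User record the subscription identified.
ChatChannel.broadcast_to(user, { type: 'notification', body: 'You were mentioned' })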
When a message comes in from the client, I need to handle it, save it, and send it out.
def receive(data)
  message = current_user.messages.create!(
    chat_id: params[:room_id],
    content: data['content']
  )

  # Broadcast to the same stream name the channel subscribed with. The payload is an
  # explicit hash: broadcast also accepts a coder: keyword, so bare key-value pairs
  # would be parsed as keyword arguments on Ruby 3.
  ActionCable.server.broadcast(
    "chat_#{params[:room_id]}",
    { message: MessageSerializer.new(message).as_json }
  )
end
I create the message in the database first. This is vital. Your real-time features should reflect the true state of your application data. Then, I broadcast it to everyone subscribed to the room’s stream. Using a serializer ensures everyone gets the data in the same, clean format.
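MessageSerializer can be anything that turns a record into a plain hash. A minimal hand-rolled version might look like the sketch below; swap in active_model_serializers, Jbuilder, or whatever your app already uses.

class MessageSerializer
  def initialize(message)
    @message = message
  end

  # Return a plain hash so every subscriber receives the same shape.
  def as_json(*)
    {
      id: @message.id,
      content: @message.content,
      author: @message.user.username,
      created_at: @message.created_at.iso8601
    }
  end
end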
But what if creating that message involves heavy work? Maybe I need to parse it for @mentions, send emails, or update complex counters. Doing all that inside the receive method will block the connection. It makes everything feel slow. The solution is to hand off the heavy lifting.
class MessageBroadcastJob < ApplicationJob
  queue_as :default

  def perform(message_id)
    message = Message.find(message_id)
    chat = message.chat

    mentions = extract_mentions(message.content)
    mentions.each do |username|
      user = User.find_by(username: username)
      notify_mentioned_user(user, message) if user
    end

    ActionCable.server.broadcast(
      "chat_#{chat.id}",
      { message: message_payload(message) }
    )
  end

  private

  # Pull "@username" tokens out of the message body.
  def extract_mentions(content)
    content.scan(/@(\w+)/).flatten.uniq
  end

  # Kept deliberately simple: push a private payload over the mentioned user's
  # personal stream. A real app might send an email or create a notification record.
  def notify_mentioned_user(user, message)
    ChatChannel.broadcast_to(user, { mention: message_payload(message) })
  end

  def message_payload(message)
    {
      id: message.id,
      content: message.content,
      author: message.user.username,
      timestamp: message.created_at.iso8601
    }
  end
end
Now, back in the channel, my receive method becomes much lighter.
def receive(data)
message = current_user.messages.create!(
chat_id: params[:room_id],
content: data['content']
)
MessageBroadcastJob.perform_later(message.id)
end
It creates the message, queues the job, and immediately frees up the WebSocket connection. The job, processed in the background, does the hard work and finally does the broadcast. The user gets a snappy response, and the complex logic doesn’t hold up the line.
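One practical note: for perform_later to actually run off the WebSocket worker in production, Active Job needs a real queue backend; the default async adapter runs jobs inside the web process and drops them on restart. Sidekiq is one common choice (an assumption here, not a requirement), and wiring it in is a single line of configuration.

# config/environments/production.rb
config.active_job.queue_adapter = :sidekiq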
A common real-time feature is showing who is online. This is called presence tracking. It requires knowing when users join and leave a channel. We can use Redis, a fast in-memory store, to keep track of this state.
class PresenceChannel < ApplicationCable::Channel
  def subscribed
    stream_from "presence_#{params[:room]}"
    redis.sadd("presence:#{params[:room]}", current_user.id)
    broadcast_presence_update('join')
  end

  def unsubscribed
    redis.srem("presence:#{params[:room]}", current_user.id)
    broadcast_presence_update('leave')
  end

  private

  def broadcast_presence_update(action)
    # Explicit hash so the payload isn't parsed as keyword arguments.
    ActionCable.server.broadcast(
      "presence_#{params[:room]}",
      {
        action: action,
        user: current_user.username,
        timestamp: Time.current.iso8601,
        count: redis.scard("presence:#{params[:room]}")
      }
    )
  end

  def redis
    @redis ||= Redis.new(url: ENV['REDIS_URL'])
  end
end
When a user subscribes, I add their ID to a Redis Set for that room. Sets are great because they automatically handle duplicates. I then broadcast a message to the room’s presence stream saying “John joined.” The payload includes the current count of users, which I get from the set. When the connection closes (unsubscribed), I remove them and broadcast the “leave” event. Every client listening can update their UI in real time.
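One thing the join and leave events alone don’t give a newly connected client is the existing roster. A small addition, sketched here against the same Redis set, is to transmit the current member IDs to just the new subscriber at the end of subscribed.

# At the end of PresenceChannel#subscribed: send the full member list
# only to the client that just joined (transmit goes to one subscriber).
transmit({ roster: redis.smembers("presence:#{params[:room]}") })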
WebSockets open a direct line to your server. It’s important to guard against abuse, like a client spamming thousands of messages a second. You need rate limiting.
class RateLimitedChannel < ApplicationCable::Channel
  RATE_LIMIT = 10 # messages per minute

  before_subscribe :check_rate_limit

  def receive(data)
    if rate_limit_exceeded?
      transmit({ error: 'Rate limit exceeded' })
    else
      increment_rate_counter
      process_message(data)
    end
  end

  private

  # Template method: subclasses implement the actual message handling.
  def process_message(data)
    raise NotImplementedError
  end

  def check_rate_limit
    reject if rate_limit_exceeded?
  end

  def rate_limit_exceeded?
    redis.get(rate_limit_key).to_i >= RATE_LIMIT
  end

  def increment_rate_counter
    key = rate_limit_key
    redis.incr(key)
    # Only set the TTL on the first increment of this minute's bucket.
    redis.expire(key, 60) if redis.ttl(key) == -1
  end

  def rate_limit_key
    "rate_limit:#{connection.session_id}:#{Time.current.to_i / 60}"
  end

  def redis
    @redis ||= Redis.new(url: ENV['REDIS_URL'])
  end
end
I use a before_subscribe callback to reject the subscription entirely if they’re already over the limit. Inside receive, I check again before processing each message. The key to this logic is the rate_limit_key. It uses the session ID and the current minute (Time.current.to_i / 60). This creates a new bucket for counting every minute. I increment the counter and, if it’s the first time for this bucket, set it to expire in 60 seconds. This automatically cleans up old counts.
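A concrete channel then inherits the limiting by subclassing and filling in process_message, which receive only calls when the sender is under the limit. The channel and model below are illustrative, not part of the app above.

class CommentsChannel < RateLimitedChannel
  def subscribed
    stream_from "comments_#{params[:post_id]}"
  end

  private

  # Invoked by RateLimitedChannel#receive for messages that pass the limit check.
  def process_message(data)
    current_user.comments.create!(post_id: params[:post_id], body: data['body'])
  end
end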
As your application grows, you might have thousands of concurrent connections. You need to manage these resources carefully. A simple connection pool can help.
class ConnectionPool
  def initialize(max_connections: 1000)
    @max_connections = max_connections
    @connections = {}
    @mutex = Mutex.new
  end

  def register(connection)
    @mutex.synchronize do
      evict_oldest_connection if @connections.size >= @max_connections

      @connections[connection.connection_id] = {
        connection: connection,
        last_activity: Time.current
      }
    end
  end

  def broadcast_to_user(user_id, data)
    user_connections(user_id).each do |connection|
      connection.transmit(data)
    rescue => e
      Rails.logger.error("Failed to transmit: #{e.message}")
      remove_connection(connection.connection_id)
    end
  end

  private

  def remove_connection(connection_id)
    @mutex.synchronize { @connections.delete(connection_id) }
  end

  def evict_oldest_connection
    # Called while already holding the mutex from #register.
    oldest = @connections.min_by { |_, data| data[:last_activity] }
    @connections.delete(oldest.first) if oldest
  end

  def user_connections(user_id)
    @connections.values.select do |data|
      data[:connection].current_user&.id == user_id
    end.map { |data| data[:connection] }
  end
end
The pool has a maximum size. When registering a new connection, if we’re at the limit, we find the connection with the oldest last_activity and remove it. A Mutex ensures two threads don’t modify the @connections hash at the same time. The broadcast_to_user method finds all connections for a specific user and sends the data. If a send fails, we log it and clean up that connection.
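Action Cable won’t call this pool for you, and its connection objects don’t expose a connection_id out of the box, so treat it as infrastructure you wire in yourself. Here is a rough sketch of that wiring, assuming a pool created in an initializer and connection objects that respond to connection_id, current_user, and transmit.

# config/initializers/cable_pool.rb (assumed location and constant name)
CABLE_POOL = ConnectionPool.new(max_connections: 1000)

# In ApplicationCable::Connection#connect, after identifying the user:
#   CABLE_POOL.register(self)  # requires the connection to define connection_id

# From a job or service object, push to every live connection a user has:
CABLE_POOL.broadcast_to_user(user.id, { type: 'alert', body: 'Deploy finished' })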
Finally, you need a way to manage subscriptions themselves. When a user is in multiple chat rooms, how do you know which ones? A subscription manager keeps track.
class SubscriptionManager
def initialize
@user_subscriptions = Hash.new { |h, k| h[k] = Set.new }
@channel_subscribers = Hash.new { |h, k| h[k] = Set.new }
end
def subscribe(user_id, channel_name)
@user_subscriptions[user_id] << channel_name
@channel_subscribers[channel_name] << user_id
end
def unsubscribe(user_id, channel_name)
@user_subscriptions[user_id].delete(channel_name)
@channel_subscribers[channel_name].delete(user_id)
@user_subscriptions.delete(user_id) if @user_subscriptions[user_id].empty?
@channel_subscribers.delete(channel_name) if @channel_subscribers[channel_name].empty?
end
def broadcast(channel_name, message)
subscribers = @channel_subscribers[channel_name]
subscribers.each do |user_id|
UserChannel.broadcast_to(user_id, message)
end
end
def user_channels(user_id)
@user_subscriptions[user_id].to_a
end
end
This class maintains two views of the same data. @user_subscriptions answers “what channels is user X in?” @channel_subscribers answers “what users are in channel Y?” The Hash.new { |h, k| h[k] = Set.new } pattern creates an empty Set automatically when a new key is accessed. This keeps the logic clean. When unsubscribing, I clean up empty sets to avoid memory leaks. The broadcast method uses the subscriber list to send a message to each user in a channel via their personal stream.
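In practice you’d keep a single instance of this per process and call it from your channels. The sketch below makes two assumptions: a SUBSCRIPTIONS constant created at boot, and a UserChannel that streams for current_user.id so the manager’s UserChannel.broadcast_to(user_id, ...) call lands on the matching stream.

SUBSCRIPTIONS = SubscriptionManager.new # e.g. in an initializer

class UserChannel < ApplicationCable::Channel
  def subscribed
    # stream_for with the bare ID pairs with broadcast_to(user_id, ...) above.
    stream_for current_user.id
    SUBSCRIPTIONS.subscribe(current_user.id, params[:room])
  end

  def unsubscribed
    SUBSCRIPTIONS.unsubscribe(current_user.id, params[:room])
  end
end

# Elsewhere in the app:
SUBSCRIPTIONS.broadcast('general', { text: 'Server maintenance at noon' })
SUBSCRIPTIONS.user_channels(current_user.id) # => every room this user is in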
These patterns—from connection authentication and channel authorization, to offloading jobs, tracking presence, limiting rates, pooling connections, and managing subscriptions—form a toolkit. They help transform the basic “it works” of Action Cable into a robust system that can power the live, engaging parts of your application. You start with a simple channel and then, piece by piece, add these layers of structure and resilience. The goal is for the real-time features to feel seamless and reliable, no matter how many users are involved.