While I was building Statusify, I stumbled across the power of caching in Ruby on Rails. Here's what I did to cut render times from 110 ms to 10 ms.

Ruby on Rails provides powerful methods for caching almost everything - database queries, view fragments, HTTP responses, and more.

To begin with, I started caching the index page - it needs to load fast, and there aren't too many options there. I began with model-level caching, so my first bit of caching was:

<% @incidents.each do |i| %>
  <% cache i do %>
    <%# render the incident's markup here %>
  <% end %>
<% end %>

This uses model-level caching, and saves you the render (and its queries) on a cache hit. The fragment's key is derived from the record, so it includes updated_at - and you can trust it to stay fresh across associations if you add touch: true to your belongs_to.
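To make the mechanism concrete, here's a minimal plain-Ruby sketch of what `cache i` relies on. FakeIncident is hypothetical (not Statusify's model), and the key format is only an approximation of ActiveRecord's real cache_key - the point is just that the key embeds updated_at, so any update (including one propagated by touch: true) changes the key and invalidates the fragment.

```ruby
# A stand-in for an ActiveRecord model's cache_key behaviour.
# (Assumption: the exact timestamp format differs in Rails; what matters
# is that the key changes whenever updated_at changes.)
class FakeIncident
  attr_accessor :id, :updated_at

  def initialize(id)
    @id = id
    @updated_at = Time.now
  end

  # Roughly "incidents/<id>-<timestamp>", like ActiveRecord's cache_key
  def cache_key
    "incidents/#{id}-#{updated_at.strftime('%Y%m%d%H%M%S%6N')}"
  end

  # What touch: true does to the parent record: bump updated_at
  def touch
    @updated_at = Time.now
  end
end

incident = FakeIncident.new(1)
old_key = incident.cache_key
sleep 0.001
incident.touch
new_key = incident.cache_key
# old_key != new_key, so <% cache i %> misses and re-renders the fragment
```

Because the old fragment is simply never looked up again, there's nothing to expire by hand - the stale entry just sits in the store until it's evicted.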

But what if I wanted to take it a step further and cache a larger portion of the view? Rails makes that easy too. Here's what I use for caching my index page:

<% cache_if !signed_in?, cache_key_for_index do %>
  <%# the index page's body %>
<% end %>

Where cache_key_for_index is a method that goes like this.

  def cache_key_for_index
    count = Incident.count
    max_updated_at = Incident.maximum(:updated_at).try(:utc).try(:to_s, :number)
    # Combine both into a key, e.g. "incidents/index-42-20150926101530"
    "incidents/index-#{count}-#{max_updated_at}"
  end

I spent a couple of minutes pondering this - would it ensure data freshness?
Let's go over the possible scenarios -

  1. Someone deletes an incident - the count decreases, a new cache key is generated, and the data stays fresh.
  2. Someone adds an incident - the count increases, so again a new key and fresh data.
  3. Someone updates an incident - we're using max_updated_at, so we get a new cache key.
  4. Someone deletes one incident and adds another - the count stays the same, but max_updated_at changes, so we still get a new cache key.
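The four scenarios above can be checked with a small plain-Ruby simulation. The key_for helper below is illustrative (it's not Statusify's code) and uses an in-memory array of hashes as a stand-in for the Incident table, but it mirrors cache_key_for_index: count plus maximum updated_at.

```ruby
# Stand-in for cache_key_for_index, over an array instead of the database.
# (Hypothetical helper: timestamps are reduced to integer seconds for brevity.)
def key_for(incidents)
  count = incidents.size
  max = incidents.map { |i| i[:updated_at] }.max
  "incidents/index-#{count}-#{max && max.to_i}"
end

t = Time.now
incidents = [{ updated_at: t }, { updated_at: t + 1 }]
base = key_for(incidents)

# Scenario 2: adding an incident changes the count, hence the key
added = key_for(incidents + [{ updated_at: t + 2 }])

# Scenario 3: updating an incident bumps max updated_at, hence the key
updated = key_for([{ updated_at: t }, { updated_at: t + 5 }])

# Scenario 4: delete one, add one - same count, but a newer max updated_at
swapped = key_for([{ updated_at: t }, { updated_at: t + 2 }])
```

In every scenario the key differs from the original, so a stale fragment is never served - it's simply never looked up again.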

This method generates a lot of cache garbage, but Memcached and the like automatically evict the least recently used entries, so that's not our worry. We are also freed from gruntwork like manual cache expiration (which everyone forgets at some point or the other, leading to stale data).

But now for the ace of caching in Statusify - dated_incidents. It's one super-heavy method that makes a ton of queries and does some heavy back-end processing - one of those things you'd never want called on every request. I used low-level Rails caching here.

  def dated_incidents(force = false)
    # Returns a hash containing dates and the incidents that happened on that date
    # Sample output:
    # {Sat, 26 Sep 2015=>#<ActiveRecord::Relation [#<Incident id: 980190979, name: "Incident Name", component: "Incident...>>}
    # A date maps to nil if there are no incidents on that day
    # This is a bit heavy, especially if Statusify has been around for some time.
    # Pass true to force a cache refresh.
    Rails.cache.fetch('dated_incidents', force: force) do
      # Don't panic if we're out of incidents
      return if Incident.count == 0
      # The range over which we operate
      begins = Incident.first.created_at.to_date
      ends = Incident.last.updated_at.to_date
      # Minor check to make sure things don't blow up
      begins, ends = ends, begins if begins > ends
      range = begins..ends
      @dated_incidents = Hash.new
      range.each do |date|
        i = Incident.where(:created_at => date.beginning_of_day..date.end_of_day)
        # nil marks days with no incidents
        @dated_incidents[date] = i.empty? ? nil : i
      end
      # Newest dates first
      @dated_incidents.sort { |a, b| b <=> a }.to_h
    end
  end

As you can see, dated_incidents is damn heavy. It accepts a parameter, force, which is passed on to Rails.cache.fetch along with the block containing the code. So when someone creates an incident, we just call dated_incidents(true) and are guaranteed fresh data.
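The force semantics are easy to demonstrate without Rails. FakeCache below is a tiny hash-backed stand-in for Rails.cache (the real store would be Memcached or similar, and the real fetch has many more options) - it only shows the behaviour dated_incidents depends on: the block runs on a miss or when force is true, and is skipped on a hit.

```ruby
# A minimal stand-in for Rails.cache, enough to show fetch(force:) behaviour.
class FakeCache
  def initialize
    @store = {}
  end

  # Run the block on a miss or when forced; otherwise serve the stored value
  def fetch(key, force: false)
    @store[key] = yield if force || !@store.key?(key)
    @store[key]
  end
end

cache = FakeCache.new
calls = 0
heavy = -> { calls += 1; "result-#{calls}" }  # pretend this is expensive

first  = cache.fetch('dated_incidents') { heavy.call }              # miss: computes
second = cache.fetch('dated_incidents') { heavy.call }              # hit: block skipped
forced = cache.fetch('dated_incidents', force: true) { heavy.call } # recomputes
# The heavy block ran only twice: once on the miss, once when forced
```

This is why calling dated_incidents(true) from the incident-creation path is enough: the next plain dated_incidents call hits the freshly written entry.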

So this is how I made Statusify fast, and how you can do the same for your Rails app too.