Serving Remote Content with Dragonfly
In the latest project I’ve been working on here, an issue came up when serving images on the site. Our app runs on Heroku, and we have a shop on Shopify as an external service; one of the features we had to implement was uploading, processing, and serving images. For this we use the Dragonfly gem, which stores the original versions of the content in a datastore that can be the filesystem, S3, or any other storage service. The catch is that we needed to upload the images in the Heroku app but serve them on the Shopify shop.
When we upload images with Dragonfly to a datastore like Amazon S3 and request the URL of a Dragonfly image, Dragonfly returns a URL relative to the site where we uploaded the image. As a result, when the browser renders the image it receives a permanent redirect (301) to the original image URL at Amazon S3; and if we want to resize the image, Dragonfly applies the transformation on the fly. Both situations lead to performance issues: the browser has to follow the redirect and resolve the original image URL, and at the same time the server spends time resizing the image. So on a web site where we render a lot of images, a significant delay will be observed.
An easy way to solve both problems is to use a couple of methods Dragonfly provides to help us remotely serve processed versions of content, such as thumbnails.
For this case, we just needed to create a Thumb table to store the jobs, with two string columns, job and uid, and then add the configuration for the before_serve and define_url methods inside the app.configure do |config| block in our ../initializers/dragonfly.rb, as follows:
```ruby
require 'dragonfly'

app = Dragonfly[:app_name]
app.configure_with(:imagemagick)
app.configure_with(:rails)

# It is up to us whether or not to set an expiration time for the thumbnails
app.cache_duration = 3600 * 24 * 365 * 3

app.configure do |config|
  config.url_host = Rails.env.production? ? 'http://myapp.herokuapp.com' : 'http://localhost:3000'

  # First we configure our before_serve method:
  # before serving, the first time it is requested, store the thumbnail in the datastore
  config.server.before_serve do |job, env|
    uid = job.store # keep track of its uid
    # job.serialize holds all the job info, e.g. fetch 'image_uid' then resize to '40x40'
    Thumb.create!(:uid => uid, :job => job.serialize)
  end

  # Next we define the url for our processed images, overriding the default .url method...
  config.define_url do |app, job, opts|
    thumb = Thumb.find_by_job(job.serialize)
    if thumb
      # If the job (fetch 'image_uid' then resize to '40x40') has been stored already,
      # serve the url from the datastore (filesystem, S3, etc.)
      app.datastore.url_for(thumb.uid)
    else
      # ...otherwise, if the job hasn't been stored, serve it from the Dragonfly server as usual
      app.server.url_for(job)
    end
  end
end
# ...
```
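For reference, the Thumb table used above could be created with a plain Rails migration along these lines (a sketch: the job and uid column names come from the setup described earlier, while the class name, index, and timestamps are assumptions):

```ruby
# Sketch of a migration for the Thumb table.
# The job and uid string columns are the ones the config above relies on;
# the index is assumed, since find_by_job runs on every image request.
class CreateThumbs < ActiveRecord::Migration
  def change
    create_table :thumbs do |t|
      t.string :job # the serialized Dragonfly job
      t.string :uid # the datastore uid returned by job.store
      t.timestamps
    end
    add_index :thumbs, :job
  end
end
```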
The first time the job is requested, this gives us the normal Dragonfly URL:

```ruby
image.thumb('40x40').url # => normal dragonfly url, e.g. /media/image...
```

Then, from the second time onwards, the datastore URL:

```ruby
image.thumb('40x40').url # => datastore url, e.g. /my-bucket.s3.amazonaws.com/2011...
```
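To see why the second request is cheaper, the define_url decision can be sketched in plain Ruby, with hypothetical stand-ins for the Dragonfly job and the Thumb table (no Rails required; all names here are illustrative, not Dragonfly's API):

```ruby
require 'digest'

# Stand-in for a serialized Dragonfly job, e.g. fetch 'image_uid' then thumb '40x40'
job = "fetch 'image_uid', thumb '40x40'"

# Plays the role of the Thumb table: serialized job => datastore uid
thumbs = {}

# Mimics define_url: serve from the datastore if the job was stored before,
# otherwise fall back to the Dragonfly server (where before_serve would store it)
url_for = lambda do |serialized_job|
  if (uid = thumbs[serialized_job])
    "http://my-bucket.s3.amazonaws.com/#{uid}" # datastore URL
  else
    # before_serve would run here: store the thumbnail and remember its uid
    uid = Digest::SHA1.hexdigest(serialized_job)[0, 8]
    thumbs[serialized_job] = uid
    "/media/#{uid}" # normal Dragonfly server URL
  end
end

first  = url_for.call(job) # first request: served (and stored) by Dragonfly
second = url_for.call(job) # second request: served straight from the datastore

puts first
puts second
```

The first call prints a `/media/...` style URL and the second a datastore URL, which is exactly the before/after behavior shown above.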
This solution allowed us to cache the images, avoid permanent redirects, and decrease the load from Dragonfly jobs, improving our site’s overall performance.
Hope you find this post useful, see you next time! XD
PS. Thanks to Mario ‘Chido’ and Mumo for helping me out with some concepts!!