Docker helps package software into reusable containers. This allows not only for a standardized development environment across your team, but also for scaling your production deployments on most major cloud platforms. This post explains how to dockerize a Ruby on Rails app with PostgreSQL, Redis and Sidekiq.
Docker has benefits not only for production, but also in development:
- Tested containers in production
- Production-like environment across development machines
- Quick bootstrap of development machines for new team members
- Version-controlled environment definition via `Dockerfile` and `docker-compose.yml`
- Support of major cloud service providers
The first decision to make is which base image to use. This choice affects the size of the resulting image and the dependencies available out of the box.
These are some of our current go-to images when we create new Docker containers for Ruby applications:
- `ruby:<version>` - the official Ruby image based on Debian, with the most common Debian packages already installed, so that our own Dockerfile does not have to install them.
- `ruby:alpine` - the official Ruby image based on Alpine Linux, which is much smaller than most Linux distributions (~5 MB). This image does not include any extra packages; we have to install them ourselves in our own `Dockerfile`.

Create a `Dockerfile` in the root directory of your Rails application:
```dockerfile
FROM ruby:2.4

# Install dependencies
RUN apt-get update -qq && \
    apt-get install -y --no-install-recommends build-essential libpq-dev nodejs && \
    rm -rf /var/lib/apt/lists/*

# Set the root of your Rails application
ENV RAILS_ROOT /app
RUN mkdir -p $RAILS_ROOT

# Set working directory to the root path of the Rails app
WORKDIR $RAILS_ROOT

# Do not install gem documentation
RUN echo 'gem: --no-ri --no-rdoc' > ~/.gemrc

# If we copied the whole app directory here, the bundle would be installed
# every time any application file changed. Copying the Gemfiles first
# avoids this and reinstalls the bundle only when the Gemfile changed.
COPY Gemfile Gemfile
COPY Gemfile.lock Gemfile.lock
RUN gem install bundler && \
    bundle install --jobs 20 --retry 5

# Now copy the application code to the application directory
COPY . /app

# This script runs `rake db:create` and `rake db:migrate` before
# running the given command
ENTRYPOINT ["lib/support/docker-entrypoint.sh"]

EXPOSE 3000

# Default command is starting the Rails server
CMD ["bin/rails", "s", "-b", "0.0.0.0"]
```
A `.dockerignore` file excludes some files and folders from the build context before the build process of the Docker image starts:
```
db/*.sqlite3
tmp
log/*
```
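Depending on the project, it can pay off to exclude more from the build context. The extra entries below (`.git`, `node_modules`, `public/packs`) are suggestions, not part of the original setup:

```
.git
db/*.sqlite3
log/*
tmp
node_modules
public/packs
```

Everything listed here is either regenerated inside the container or not needed at build time, which keeps the context small and the build cache stable.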
The entrypoint script (`lib/support/docker-entrypoint.sh`) is run for each container. It makes sure that the database exists and that migrations are up to date.
```bash
#!/bin/bash

echo "Creating database if it's not present..."
bin/rails db:create

echo "Migrating database..."
bin/rails db:migrate

# If the container has been killed, there may be a stale pid file
# preventing Rails from booting up
rm -f tmp/pids/server.pid

exec "$@"
```
Service Configuration via docker-compose
`docker-compose` manages all containers needed for your environment. You describe all services and how they interrelate in a `docker-compose.yml` file, and `docker-compose` takes care of starting and linking them in the correct order.
The definition for a service contains the image, command, environment variables, port mappings, container links and volume information.
For our Rails app we need the following services:
- Web Application Server (Rails)
- Background Worker (Sidekiq)
- Test Runner (Guard)
The actual definition looks like this:
```yaml
postgres:
  image: postgres:9.6
  ports:
    - 5432:5432
  volumes:
    - ./tmp/postgresql/9.6/data:/var/lib/postgresql/data
redis:
  image: redis
  ports:
    - 6379:6379
  volumes:
    - ./tmp/redis/data:/data
app:
  build: &build .
  command: rails s -b 0.0.0.0 -p 3000
  tty: true
  volumes: &volumes
    - .:/app
    - ./config/database.yml.dev:/app/config/database.yml
  ports:
    - 3000:3000
  environment: &environment
    DB_USERNAME: postgres
    DB_PASSWORD:
  links: &links
    - postgres
    - redis
    - worker
worker:
  build: *build
  command: bundle exec sidekiq
  volumes: *volumes
  environment: *environment
  links:
    - postgres
    - redis
test:
  build: *build
  command: guard
  tty: true
  volumes: *volumes
  environment:
    <<: *environment
    RAILS_ENV: test
  links:
    - postgres
    - redis
```
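The `app` service mounts `config/database.yml.dev` over `config/database.yml` and passes `DB_USERNAME` and `DB_PASSWORD` into the container. The original file is not shown in this post; a minimal sketch that matches the compose file above could look like this (the database names are assumptions):

```yaml
default: &default
  adapter: postgresql
  encoding: unicode
  host: postgres  # the service name from docker-compose.yml
  username: <%= ENV['DB_USERNAME'] %>
  password: <%= ENV['DB_PASSWORD'] %>

development:
  <<: *default
  database: app_development

test:
  <<: *default
  database: app_test
```

The important detail is the `host` entry: inside the containers, the database is reachable under the linked service name `postgres`, not `localhost`.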
Building the image
Whenever you change the `Gemfile` or want to update the container, run this command:

```shell
docker-compose build app test worker
```
In development you can now start the whole environment specified in the `docker-compose.yml` file via one simple command:

```shell
docker-compose run -p 3000:3000 app
```
This will start the app container (and its linked services) and map port 3000 of the container to your local port 3000.
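The same `run` pattern works for one-off tasks against the linked services; for example (assuming the standard Rails binstubs from the image above):

```shell
# Open a Rails console in a fresh app container
docker-compose run app bin/rails console

# Run a one-off task, e.g. seeding the database
docker-compose run app bin/rails db:seed
```

Each invocation starts a new container, runs the entrypoint script and then executes the given command instead of the default `rails s`.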
You might ask yourself why I do not use `docker-compose up` here. With `docker-compose up` I had problems with interactive prompts, while with `run` we get better terminal output. In development we might want to use `pry` to debug certain scenarios, or similar interactive tools, which work fine with `docker-compose run`. The only drawback is that we have to specify the port mappings manually for `run`.
Common Commands & Tasks
We use guard to watch our source files and run the respective tests automatically. When you look into the `docker-compose.yml` you will find a service called `test`. This runs the guard server and can be started like this:
```shell
docker-compose run test
```
Guard listens for file changes in your project directory, so when you change a source file, the respective test is executed instead of the whole test suite.
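The `test` service assumes a `Guardfile` in the project root, which this post does not show. With guard-rspec, a minimal sketch could look like this (the watch patterns are assumptions, adjust them to your project layout):

```ruby
# Guardfile: map changed files to the specs that should run
guard :rspec, cmd: "bundle exec rspec" do
  watch(%r{^spec/.+_spec\.rb$})                            # a spec changed: run it
  watch(%r{^app/(.+)\.rb$}) { |m| "spec/#{m[1]}_spec.rb" } # an app file changed: run its spec
  watch("spec/spec_helper.rb") { "spec" }                  # the helper changed: run everything
end
```

Since the project directory is mounted into the container via the shared `volumes` definition, edits on your host machine trigger Guard inside the container.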
Webpacker With Docker-Compose
For our latest Rails 5.1 projects, we have been using webpacker quite successfully. In development you have to start the `webpack-dev-server`. It must be reachable not only by the Rails development server but also from outside, so that the browser can load the assets. For that we had to adjust the dev server host for the Rails server in `config/webpacker.yml`:
```yaml
# ...
development:
  <<: *default
  compile: true
  dev_server:
    host: webpacker
    port: 3035
    hmr: true
    https: false
# ...
```
And add a linked service to the `docker-compose.yml`:
```yaml
webpacker:
  build: *build
  command: bin/webpack-dev-server --host localhost
  entrypoint: ""
  ports:
    - 3035:3035
  volumes: *volumes
  environment:
    RAILS_ENV: development
```
The `--host localhost` option makes sure that the assets are loaded from localhost and that the hot-module replacement connects to the correct host (in this case the forwarded port 3035). Remember to add the `webpacker` service to the `app → links` definitions so that the `webpack-dev-server` is started automatically with `docker-compose run -p 3000:3000 app`.
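Applied to the compose file from above, the `links` section of the `app` service then reads (sketch):

```yaml
app:
  # ...
  links: &links
    - postgres
    - redis
    - worker
    - webpacker
```

With this in place, starting the `app` service also brings up the `webpacker` container.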
Normally we integrate the deployment into our continuous integration system. Whenever we push to the main repository, a job builds the image and runs all tests; when everything is green, it pushes the image to the registry. Depending on the branch and tag of the commit, it then gets deployed to the respective environment.
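Reduced to its essence, such a pipeline is just a few shell steps. The registry host and image name below are placeholders, and the exact test command depends on your setup:

```shell
# Build the image, tagged with the current commit
docker build -t registry.example.com/myapp:$COMMIT_SHA .

# Run the test suite inside the freshly built image
docker run --rm registry.example.com/myapp:$COMMIT_SHA bin/rails test

# Only when the tests pass: push the image for deployment
docker push registry.example.com/myapp:$COMMIT_SHA
```

Tagging with the commit SHA makes every deployment traceable back to the exact source revision.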
You could deploy to a variety of servers and cloud platforms:
Single Host - docker-compose
The simplest scenario would be to deploy to a dedicated server which has Docker and `docker-compose` installed.
In this scenario you have to create a `docker-compose.yml` file, for example `/srv/docker-compose.yml`, adjust some port mappings and maybe add an nginx reverse proxy.
With the `restart: always` directive, `docker-compose` takes care of restarts for your services.
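A production `docker-compose.yml` on such a host could then differ from the development one roughly like this; the nginx part is a suggestion, not from the original setup:

```yaml
app:
  # ... same build/command/volumes as before ...
  restart: always
nginx:
  image: nginx
  restart: always
  ports:
    - 80:80
  links:
    - app
```

The nginx container terminates HTTP on port 80 and proxies requests to the linked `app` service, so the Rails port no longer needs to be exposed publicly.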
Further reading: docs.docker.com
Dokku
Dokku is a lightweight PaaS solution. You can install it on your own server and let it manage your deployments.
Further reading: dokku.viewdocs.io
Amazon AWS Elastic Beanstalk
Amazon Web Services provide a large toolbelt to host, scale and manage cloud infrastructure at Amazon data centers. Elastic Beanstalk is our favourite tool to quickly deploy docker environments to AWS EC2 instances.
It provides a simple command-line interface (`awsebcli`) on top of the basic AWS CLI tool (`awscli`). Everything you can do with the AWS Management Console, you can do with the command-line interface as well.
We have written a blog post “Rails in Docker via AWS Elastic Beanstalk”, which outlines how you can deploy a dockerized Rails app to AWS with auto-scaling and load-balancing in a few minutes.
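With `awsebcli` installed, the core workflow consists of three commands (the environment name is an example):

```shell
eb init                      # configure application, region and platform once
eb create myapp-production   # create and provision a new environment
eb deploy                    # deploy the current commit to that environment
```

After the initial setup, `eb deploy` is usually the only command you need in day-to-day work.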
Further reading: aws.amazon.com
Heroku
Heroku is a cloud service provider with very easy command-line utilities to quickly deploy applications to production environments.
Further reading: heroku.com
Docker helps package applications into easily deployable containers, not only for production but for development environments as well.
I hope this article helped you gain a good understanding of the benefits and some of the pitfalls of running Rails inside Docker, and that you can now evaluate whether Docker is worth looking into for your project, too.