After using Docker for a while, you quickly realize that you spend a lot of time downloading or distributing images. This may not be a problem for everyone, but if you scale your infrastructure, every Docker host has to store a copy of every image it runs. One solution for making your images lean is to use Alpine Linux, a security-oriented, lightweight Linux distribution.
Lately I’ve been working with our Docker images for Java and Node.js microservices, and when a stack consists of over twenty services, one thing to consider is how we build our Docker images and which distributions to use. Building images on Debian-based distributions like Ubuntu works nicely, but it brings along packages and services we don’t need. That’s why developers aim to create the thinnest, most usable image possible, either by stripping down conventional distributions or by using minimal distributions like Alpine Linux.

Choosing a Linux Distribution for Your Docker Containers

What’s a good choice of Linux distribution to use with Docker containers? There was a good discussion on Hacker News about small Docker images, with good points in the comment section to consider when choosing a container operating system.
For some, size is only a minor concern, and far more important concerns are, for example:

  • All the packages in the base system are well maintained and updated with security fixes.
  • It will still be maintained a few years from now.
  • It handles all the special corner cases with Docker.

In the end the choice depends on your needs and how you want to run your services. Some like to use the quite large Phusion Ubuntu base image which is modified for Docker-friendliness, whereas others like to keep things simple and minimal with Alpine Linux.

Divide and Conquer?

One question to ask yourself is: do you need a full operating system? If you dump an operating system into a container, you are treating it like a lightweight virtual machine, and that might be fine in some cases. If, however, you restrict it to exactly what you need and its runtime dependencies, plus absolutely nothing more, then suddenly it’s something else entirely – it’s process isolation, or better yet, portable process isolation.
Another thing to think about is whether you should combine multiple processes in a single container. For example, if you care about logging, you shouldn’t run a logger daemon or logrotate in the container; you probably want to store the logs externally – in a volume or a mounted host directory. An SSH server in a container could be useful for diagnosing problems in production, but if you have to log in to a container running in production, you’re doing something wrong (and there’s docker exec anyway). As for cron, run it in a separate container and give it access to exactly the things your cronjob needs.
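For instance, instead of baking a logger daemon or an SSH server into the image, logs can be written to a mounted host directory and one-off debugging can be done with docker exec. A minimal sketch, where myorg/myapp and the log paths are hypothetical names:

# write logs to a host directory instead of running a logger inside the container
docker run -d --name myapp -v /var/log/myapp:/var/log/app myorg/myapp
# get a one-off shell for debugging instead of baking in an SSH server
docker exec -it myapp sh
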
There are a couple of different schools of thought about how to use Docker containers: as a way to distribute and run a single process, or as a lighter form of a virtual machine. It depends on what you’re doing with Docker and how you manage your containers and applications. It can make sense to combine some services, but on the other hand you should still separate everything. It’s preferable to isolate every single process and explicitly tell it how to communicate with other processes. That is sane from many perspectives: security, maintainability, flexibility and speed. But again, where you draw the line is almost always a personal, aesthetic choice. In my opinion it could make sense to combine nginx and php-fpm in a single container.

Minimal Approach

Lately there has been some movement towards minimal distributions like Alpine Linux, and it has received a lot of positive attention from the Docker community. Alpine Linux is a security-oriented, lightweight Linux distribution based on musl libc and BusyBox, using a grsecurity/PaX-patched Linux kernel and OpenRC as its init system. In its x86_64 ISO flavor it weighs in at 82 MB, and a container requires no more than 8 MB. Alpine provides a wealth of packages via its apk package manager. As it uses musl, you may run into issues with environments expecting glibc-like behaviour (for example Kubernetes, or when compiling some npm modules), but for most use cases it should work just fine. And with minimal base images it’s more convenient to divide your processes into many small containers.
Some advantages of using Alpine Linux are:

  • Speed: the image is downloaded, installed and up and running on your Docker host faster.
  • Improved security: a smaller footprint means a smaller attack surface.
  • Faster migration between hosts, which is especially helpful in high availability and disaster recovery configurations.
  • Your system admin won’t complain as much, as you use less disk space.

For my purposes, I needed to run Spring Boot and Node.js applications in Docker containers, and they were easily switched from Debian-based images to Alpine Linux without any changes. There are official Docker images for OpenJDK/OpenJRE on Alpine and Dockerfiles for running Oracle Java on Alpine. Although there isn’t an official Node.js image built on Alpine, you can easily write your own Dockerfile or use community-provided ones. While the official Java Docker image is 642 MB, Alpine Linux with OpenJDK 8 is 150 MB and with Oracle JDK 382 MB (which can be stripped down to 172 MB). The official Node.js image is 651 MB (211 MB for the slim variant), while the Alpine-based one is 36 MB. That’s quite a reduction in size.
Examples of minimal containers based on Alpine Linux:
For Node.js:

FROM alpine:edge
ENV NODE_ALPINE_VERSION=6.2.0-r0
RUN apk update && apk upgrade \
    && apk add nodejs="$NODE_ALPINE_VERSION"
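
A complete application image could continue from that base, for example as follows. This is only a sketch: server.js, package.json and port 3000 are placeholders for your own application.

FROM alpine:edge
ENV NODE_ALPINE_VERSION=6.2.0-r0
# note: on this Alpine version the nodejs package includes npm; if yours splits them, install npm separately
RUN apk update && apk upgrade \
    && apk add nodejs="$NODE_ALPINE_VERSION"
WORKDIR /usr/src/app
# install dependencies first so this layer can be cached between builds
COPY package.json .
RUN npm install --production
# copy the rest of the application sources
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]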

For Java applications with OpenJDK:

FROM alpine:edge
ENV LANG C.UTF-8
# generate a small helper script that prints the detected JAVA_HOME directory
RUN { \
      echo '#!/bin/sh'; \
      echo 'set -e'; \
      echo; \
      echo 'dirname "$(dirname "$(readlink -f "$(which javac || which java)")")"'; \
   } > /usr/local/bin/docker-java-home \
   && chmod +x /usr/local/bin/docker-java-home
ENV JAVA_HOME /usr/lib/jvm/java-1.8-openjdk
ENV PATH $PATH:$JAVA_HOME/bin
ENV JAVA_VERSION 8u92
ENV JAVA_ALPINE_VERSION 8.92.14-r0
RUN set -x \
    && apk update && apk upgrade \
    && apk add --no-cache bash \
    && apk add --no-cache \
      openjdk8="$JAVA_ALPINE_VERSION" \
    && [ "$JAVA_HOME" = "$(docker-java-home)" ]
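
Running a Spring Boot application on such a base is then just a matter of adding the fat jar and an entry point. A sketch, assuming the official Alpine-based openjdk:8-jre-alpine image mentioned above and a hypothetical target/app.jar build artifact:

FROM openjdk:8-jre-alpine
# copy in the Spring Boot fat jar produced by the build
COPY target/app.jar /app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "/app.jar"]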

If you want to read more about running services on Alpine Linux, check out the nice article by Atlassian’s Nicola Paolucci about his experiences of running Java apps on Alpine.

Go Small or Go Home?

So, should you use Alpine Linux for running your application on Docker? As Docker’s official images are also moving to Alpine Linux, it seems to make perfect sense from both a performance and a security perspective to switch to Alpine. And if you don’t want to take the leap from Debian or Ubuntu, or you want support from a downstream vendor, you should consider stripping unneeded files from your images to make them smaller.
This article was originally published on Rule of Tech, author’s personal blog about technology and software development.


Marko Wallin

Marko works as a full stack software engineer and creates a better world through digitalization. He writes a blog about technology and software development and develops open source applications, e.g. for mobile phones. He also likes mountain biking.


We have a diverse range of people working at Leadin – meet three of our newest dudes in the software team. Jaakko, Olli and Ville all joined within the last year.

Your career path is your choice not defined by your degree


Hi, I’m Jaakko and I currently work as a front end developer at Leadin. I am also studying a major in Industrial Engineering and Management and a minor in Software Engineering at Tampere University of Technology (TUT). Whilst working at Leadin I will complete my final course and write my master’s thesis within the next year.
I work as a front end developer and I was attracted to the company by its user experience approach to software solutions, which focuses on the end users rather than simply on coding. I consider myself a highly visual person and this way of doing things really suits me.
One of the best things about working at Leadin is the size of the company. We are currently just over 50 people distributed in several locations including Finland, Germany and the UK. This means that I get to know everyone in the office and we have a great atmosphere. We have a flat structure with very little hierarchy so we all work as one team.
Where do I see myself in three years’ time? Wow, I’d use the cliché that ‘anything can happen’. If I keep learning new things at the current pace then I really don’t know what could happen! For me this is a big positive.
If I could give advice to students still in university, I would say ‘follow your own interests’. Sometimes you need to change your focus if you want to achieve your desires. This is what happened to me in my first year at TUT when I realised that I wanted to work in the software industry.
Your degree does not define what you can do – your career path is your choice, not something defined by your university degree. For me, a series of conscious decisions turned me into more of a software guy than an economic management one. These decisions were some of the best I have ever made!

I get to work on a wide range of projects


Cheers, I’m Olli and I also work in Leadin’s software team – I spent a few years majoring in natural sciences at TUT before studying software engineering. I complemented my studies with industrial engineering and management as minor studies. I am still studying and trying to decide on the title of my master’s thesis. At Leadin our software team works on everything from back end to front end design and user experience. My technical interests are more on the back end, but I also enjoy the opportunity to work across other aspects of software design and development.
During my studies I participated in a project on Virtual Reality and innovative market design that was run by Leadin. This was really interesting and piqued my interest – we were actually nominated as the best project group by our peers. The guys from Leadin were great and I realised that this was a forward-looking and innovative company and it looked like a great place to work. I was right! At Leadin I get to work on a wide range of projects and I get support from colleagues with expertise in different areas of software engineering and UX. Recently I have been given responsibility for a major project, working as a full stack developer with a really interesting customer and product.
At Leadin I am gaining experience and learning to manage major projects with multiple new technologies. I love this and I hope that I can be part of development far into the future. Looking back, I wish I had picked my minor subjects earlier as I would have liked to have had some cross-professional insights during my software related studies.
My advice to students graduating this year would be to make contact with as many companies as possible and not just the obvious ones that directly relate to your studies. Maintain these contacts even if the company doesn’t currently have open positions as it will benefit you if you have a contact at the company when a position related to your experience opens.

Remember that it is people who do business and not companies


Hi, I’m Ville and I studied software engineering at Helsinki Polytechnic Stadia, which is now known as Helsinki Metropolia University of Applied Sciences, or Metropolia for short. I’m a programmer and my passion is developing high end, high quality software using the latest tools and techniques. This is exactly what the team at Leadin does, in a growth company with a fresh and warm atmosphere and great people. I really feel part of the team and that I can make a difference. I have the support of strong and skilful colleagues who are pushing through and solving challenges.
I’m looking forward to continuing to learn new software techniques and always developing high quality and expert software solutions. My advice to students graduating this year is to remember that it is people who do business and not companies; build your networks and always remember the who alongside the what.
If you are graduating this year and would like to join Jaakko, Olli and Ville then build your network of contacts by getting in touch or contacting any one of our team in Finland, Germany or the UK.


Gofore <3 Leadin

Gofore and Leadin announced their plans to merge in May 2017


Using a technology in a beta stage is often considered to be nerve-racking and time consuming as there are constant backward-incompatible changes and lots of bugs. Here’s what I observed while using Angular 2 during its beta phase. I also share some advice on how to get through the beta successfully.

Angular 2

Angular 2 has now gone through its alpha and beta stages, as the first release candidate was published on the 2nd of May, right on the rumored schedule in order to get it out before ng-conf (the annual Angular conference), which will take place on the 4th to 6th of the same month.
This blog post is a wrap-up of our experiences of using Angular 2 for four months now with an application built with our client, Netum Oy, who was ready to take a shot with Angular 2. We are developing a system for applying for and managing the European Regional Development Fund (ERDF) and the European Social Fund (ESF), regulated by the Ministry of Employment and the Economy in Finland. This is an enterprise Java application, implemented mostly with Apache Wicket until the beginning of this year, when we decided to implement the new parts of the application with RESTful resources and Angular 2 as the frontend. Angular 2 had just reached its first beta at that point. To get familiar with Angular 2, you can read the official tutorial.
The choice was made based on the traction the alpha had gained in online forums and on the actual experiences of another team working with it. This, combined with the success of Angular 1 and the backing of Google, gave us the confidence to build on top of it.

The Experiences

So how has it been then? Great! At least most of the time, that is. I took responsibility for keeping Angular and the other dependencies up to date, relieving the rest of the team of a lot of the stress, as it hasn’t been entirely trivial or straightforward in all cases to keep up with the changes.
For the most part we have been satisfied with Angular 2. The framework has been stable enough and the documentation has steadily become better. There have of course been some glitches along the way, as you could expect. Here’s an unordered list of them:

  • New release almost once a week
  • Angular 2 dependent libraries lagging behind
  • Level of documentation
  • Outdated tutorials
  • No Stack Overflow coverage

New Release Almost Once a Week

There were 16 beta releases of Angular 2 between December 15, 2015 (beta 0) and April 28, 2016 (beta 17). The two “missing” beta releases are beta 4 and beta 5, which were declared “incorrect” by the Angular team straight after they were published and were replaced by beta 6. Out of these 16 releases, 11 had breaking changes. This means that on average there were approximately 0.84 releases per week (16 releases in 19 weeks) during the beta phase. If you leave out the first three weeks before beta 1 (the first beta with any changes) was released, you get an average of one release per week.
This is both a good and a bad thing from our perspective. Having the changes brought to you constantly in small batches usually made it easy and painless to adapt to them. On the other hand, it meant you had to adapt constantly, taking time away from the actual development.

Angular 2 Dependent Libraries Lagging Behind

We use a few libraries that depend on Angular 2, such as angular2-modal, ng2-file-upload and ng2-select. These libraries rely on lower-level parts of the framework than our application does, which made them break more regularly when breaking changes were published. Even though their authors and other contributors have done an excellent job of keeping them up to date, they have inevitably lagged behind, preventing us from upgrading to the newest version before they had made the changes required to upgrade.

Level of Documentation

The level of documentation has in general been okay-ish. It has been evolving all the time and has become better. It can clearly be seen that the most important parts are covered better than the lower-level parts. Yet if you open up the documentation of, for example, the ngIf directive, you can see that there are a couple of “Not yet documented” sections left, even though it is one of the very basic directives used in templates.
If you look at some lower-level class, you can see that it basically lacks documentation completely. The Connection class is a good example.

Outdated Tutorials

The Web is full of tutorials from both the Angular core team and the community in general. These tutorials provide a great number of code snippets and examples. Unfortunately, not all of them are maintained, leading to a lot of confusion for newcomers.

No Stack Overflow Coverage

Stack Overflow is recognized nowadays as one of the most important resources for a programmer, and many of us visit it several times a day. Even though the number of questions and answers is constantly rising, the coverage is still relatively low and many questions are left forgotten without an answer. As with tutorials, the problem of outdated snippets applies to SO questions and answers as well.

Conclusions

Living on the edge always comes with a cost, but there are a few strategies that can be adopted to minimize the impact. The first and most important advice is to let others do the heavy lifting by accepting a minor delay. In practice this means just waiting a few days for the community to validate a new release. If there’s something seriously wrong with it, a new version will be published fast enough. You will also find answers to common issues with the new release by then. This also helps because the libraries are usually updated quite fast; but as they basically can’t be updated before the release, there will always be some delay. If you do adopt the absolute edge versions, though, it is always a good idea to draft issues for updating the dependencies and even create pull requests for the libraries if possible.
The second strategy is to keep up with the new versions in spite of the constant releases. This will make it less of a pain in the long run and allow you to always use the newest functionality available.
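Whichever strategy you pick, it helps to pin exact versions and bump them deliberately, so that upgrades are explicit and repeatable. A minimal sketch with npm; the library names come from our stack above and the Angular version number is just an example from the beta era:

# pin the framework and the Angular 2 dependent libraries to exact versions
npm install --save --save-exact angular2@2.0.0-beta.17
npm install --save --save-exact ng2-file-upload ng2-select
# lock the whole dependency tree so every build installs the same versions
npm shrinkwrap
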
In general, Angular 2 seems really promising and our team has been extremely happy with the huge improvements, on the one hand in daily development and on the other hand in user experience and performance compared to the earlier technologies. We have enough confidence in it to be running it in production already at this point.
P.S. Interested in Angular 2? Check out our brand new Angular 2 & TypeScript training!


Roope Hakulinen

As a lead software developer, Roope works as a team lead & software architect in projects where failure is not an option. Currently Roope is leading a project for one of the world's largest furniture retailers.
