When something fancy and new rears its head in tech, people tend to rant and rave about all of the new benefits you’ll see. I’m guilty of this myself. When designing systems, however, we’re not concerned solely with the benefits of a given approach, but its drawbacks, too. “Tradeoffs” are not merely positives, but negatives, too. Being aware of what you gain and what you lose is critically important for making reasonable, informed decisions.
And so we come to Docker: tremendous gains in productivity, repeatability, what have you. So, what’s the catch?
Where I personally find Docker lacking is in how it inhibits maintainability: the ease with which a system is monitored, serviced, deployed, and updated. Now don’t get me wrong, there are plenty of tools vying for attention in this space; it’s just that they’re all too complicated (right now). Extra containers. Dynamic, distributed discovery. Wild-west micro-tools. All ultimately leading to systems that are harder to understand, more difficult to change, and liable to break without warning.
Coming back around to tradeoffs: it’s not that the people building these groundbreaking tools are wrong. Far from it. Rather, these tools trade stability for innovation. In my case, I’m working in the touchier realm of financial transactions. This domain holds robustness and resilience in high regard, and as such I will have to carefully consider whether new Docker deployment techniques are worth the risk.
That’s the point of this micro-rant, really: consider your risks and expectations, and how they align with the tradeoffs of the technologies under consideration. In the specific case of Docker, be aware of both the benefits and the drawbacks when it comes to your deployment environments.