Like most years, 2018 was full of exciting technology changes for developers. Whether you are looking to bring in more operational efficiency or to scale up your operations, we review the prominent changes that gathered steam last year and help you get ahead of the curve on the trends for 2019.
We look at technologies that you should consider for packaging and deployment of your application.
Docker is the leader in the containerization market, combining an enterprise-grade container platform with world-class services to give developers and IT alike the freedom to build, manage and secure applications without the fear of technology or infrastructure lock-in.
Docker's adoption stats are a testament to its popularity and the value it brings to an organization:
50B Container downloads
32,000+ GitHub Stars
200+ Meetups Around the Globe
550+ Commercial Customers
2M Dockerized Applications in Hub
100K+ Third-party projects using Docker
The Sales pitch
Docker promises to bring great benefits ranging from faster time to market to increased productivity!
Faster Time to Market
To keep your competitive edge, you need to deliver new applications and services. With Docker, organizations are able to increase their speed to deliver new services with development and operational agility enabled by containerization.
Take the frustration out of setting up development environments with Docker and empower your developers to be productive on day one. Docker removes the friction of “dependency hell” to make getting started and shipping new code faster and easier.
IT Infrastructure Reduction
Optimize your costs by increasing your application workload density, getting better utilization of your server compute capacity, and reducing software licensing costs.
So, what is Docker anyway?
Docker is a computer program that performs operating-system-level virtualization, also known as "containerization" - Wikipedia.
In layman's terms, Docker is a convenient way to package up your application, including both the application layer and operating system dependencies.
Previously, the industry standard for application deployability and isolation was the Virtual Machine (VM). VMs run applications inside a guest operating system, which runs on virtual hardware powered by the server’s host OS. VMs encapsulate the entire OS and any applications inside them, which makes them extremely resource heavy: they consume a lot of RAM and CPU cycles and come with a significant performance overhead.
In the Docker world, “Containers” provide lightweight virtual environments that group certain processes and resources. They virtualize the OS rather than the hardware. This makes containerization extremely resource efficient, lightweight and portable.
Image source: docker.com
A container is a standard unit of software that packages up the code and all its dependencies so the application runs quickly and reliably from one computing environment to another. A Docker container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries, and settings.
Container images become containers at runtime and in the case of Docker containers - images become containers when they run on Docker Engine.
These images can be provisioned as containers any number of times. They can even be shared with other developers and organizations, making it easy to recreate an environment exactly. This saves developers from having to set up and configure multiple development environments each time they test or deploy.
It also comes in handy for continuous integration and DevOps practices, because the images can be used as temporary testbeds.
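As a sketch of what such an image definition looks like, here is a minimal Dockerfile for a hypothetical Node.js service (the base image tag, port, and file names are illustrative, not prescriptive):

```dockerfile
# Start from an official base image that bundles the OS dependencies and runtime
FROM node:10-alpine

# Copy the application layer into the image and install its dependencies
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .

# Document the port the app listens on and define the startup command
EXPOSE 3000
CMD ["node", "server.js"]
```

Building this file with `docker build` produces an image that can be run as identical containers on any machine with Docker Engine, which is exactly what makes sharing and recreating environments so easy.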
Cloud computing has been around since the mid-2000s, so it is by no means a shiny new technology. It still amazes me how many companies are only beginning to adopt it now!
Cloud computing provides a simple way to access servers, storage, databases and a broad set of application services over the Internet. A Cloud services platform such as Amazon Web Services (AWS) or Azure from Microsoft owns and maintains the network-connected hardware required for these application services. These services are delivered as a utility: on-demand, available in seconds, with pay-as-you-go pricing.
It relieves you from maintaining the hardware resources required to scale your operations. For instance, AWS spans 55 Availability Zones within 18 geographic Regions, essentially covering the entire world. You can set up your application in any of these geographic areas, giving you instant access to computing power.
By using cloud computing, you can achieve a lower variable cost than you can get on your own. Because usage from hundreds of thousands of customers is aggregated in the cloud, providers such as Amazon Web Services can achieve higher economies of scale, which translates into lower pay-as-you-go prices. This lets you focus on the projects that differentiate your business, and on your own customers, rather than on the heavy lifting of racking, stacking, and powering servers.
All major Cloud providers offer a broad set of global infrastructures such as compute, storage, database, analytics, application, and deployment services. A couple of prominent ones are listed below, for the full list see https://aws.amazon.com/products
Amazon Elastic Compute Cloud (Amazon EC2) is a flexible service that provides resizable cloud-based compute capacity in the form of EC2 instances, which are equivalent to virtual servers. You can commission one or thousands of instances simultaneously, and pay only for what you use, making web-scale cloud computing easy.
Amazon Elastic Container Service
Amazon Elastic Container Service (ECS) is a highly scalable, high-performance container management service that supports Docker containers and allows you to easily run applications on a managed cluster of Amazon EC2 instances. Amazon ECS eliminates the need for you to install, operate, and scale your own cluster management infrastructure.
Amazon Simple Storage Service (Amazon S3) is object storage designed to store and access any type of data over the Internet.
Serverless - the next big thing! The latest buzzword!
What ‘serverless’ really means is that as a developer you don’t have to think about servers and hardware infrastructure. Serverless applications don't require you to provision, scale, and manage any servers. Serverless is already used in production by companies like Netflix, Reuters, etc.
Serverless computing is a type of cloud computing where the customer does not provision servers for their back-end code to run on. Instead, they rely on an event-driven, cloud-based architecture where application development is based on the remote execution of “Functions as a Service”.
By composing and combining different services together in a loose orchestration developers can now build complex systems very quickly and spend most of their time focusing on their core business problem.
“Functions as a Service” or FaaS is a relatively new concept in cloud computing and is now implemented in services such as AWS Lambda, Google Cloud Functions, etc.
It provides a means to achieve the serverless architecture allowing developers to execute code in response to events without maintaining servers. What this means is that you can simply upload modular chunks of functionality into the cloud that are executed independently.
Execution time is limited: most vendors impose a timeout of around 300 seconds per invocation, and the functions are billed on a pay-per-execution basis.
FaaS offerings do not require coding to a specific framework or library. So you could have each small piece of your functionality executed in different technologies.
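To make the model concrete, here is a minimal sketch of an AWS Lambda-style handler in Python. The function name, event shape, and response format below follow the common API Gateway proxy convention, but treat the specifics as illustrative; the key point is that your unit of deployment is a single function invoked per event, with no server loop to manage.

```python
# A minimal FaaS handler: one function, invoked once per event.
# The platform (not your code) decides when and where it runs.
import json

def handler(event, context):
    """Respond to an API Gateway-style event carrying an optional 'name' field."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally we can exercise the function directly, just as the platform would:
if __name__ == "__main__":
    print(handler({"queryStringParameters": {"name": "FaaS"}}, None))
```

Because the handler is just a plain function, it is trivially testable on a laptop before being uploaded to the cloud.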
Amazon is a leader in this space with AWS Lambda. With AWS Lambda, you are charged in 100 ms increments of execution time, plus a fee based on the number of times your code is triggered. You pay nothing when your code isn't running.
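A quick back-of-the-envelope model shows how this pricing works. The rates below are the publicly listed 2018 Lambda prices ($0.20 per million requests, $0.00001667 per GB-second, billed in 100 ms steps), but treat them as illustrative assumptions; check the current pricing page before budgeting, and note this sketch ignores the free tier.

```python
# Rough pay-per-use cost model for a Lambda-style function.
# Rates are illustrative 2018 figures, not a pricing reference.
import math

PRICE_PER_REQUEST = 0.20 / 1_000_000   # dollars per invocation
PRICE_PER_GB_SECOND = 0.00001667       # dollars per GB-second of compute

def monthly_cost(invocations, avg_ms, memory_mb):
    """Estimate a month's bill: per-request fee plus metered compute time."""
    billed_seconds = math.ceil(avg_ms / 100) * 0.1   # rounded up to 100 ms
    gb_seconds = invocations * billed_seconds * (memory_mb / 1024)
    return invocations * PRICE_PER_REQUEST + gb_seconds * PRICE_PER_GB_SECOND

# Example: 3M invocations/month, 120 ms average runtime, 512 MB of memory
print(f"${monthly_cost(3_000_000, 120, 512):.2f}")
```

The striking part is the shape of the bill: a function that is never invoked costs exactly zero, which is the "you don't pay when your code isn't running" promise in arithmetic form.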
Image source: amazon.com
Believe it or not - there is a movement called NoOps already gaining momentum after DevOps! Serverless is the main driver behind this idea because we can deploy code without provisioning anything beforehand, or managing anything afterward. There is no concept of a server, OS or even a container. No more bothering the Ops department.
The three deployment options we discussed should be your primary focus going forward: Docker, Cloud, and Serverless. They can transform your business completely with their built-in ability to scale at a global level.
Docker and Cloud are by no means new technologies. There may be business or budget reasons why you haven't adopted them already; for instance, you may have a reliable product built on older deployment technologies that is the cash cow for your company. But be aware that your competitors may gain an edge over you in the coming years if you don't start the transformation to a modern tech stack.