Microservices are component parts of an application that are designed to run independently. A microservices-based application is a collection of loosely coupled services that are lightweight and independently deployable and scalable. Because each individual microservice is modular and runs its own processes, it can be modified without affecting the entire application. Many hundreds or thousands of microservices can make up a single application.
It is not necessary for all microservices in an application to be written in the same programming language or by the same development team. When building microservices-based applications, many development teams, or DevOps teams, choose to use open source tools, which are published by their creators to publicly available repositories such as GitHub. Still other development teams prefer a mix of open source tools and commercial off-the-shelf software.
The use of microservices is sometimes referred to as cloud native. However, the cloud native approach to application development more broadly encompasses modern software development practices and the use of containers along with container orchestrators and other tools in addition to microservices. A commonly accepted definition of cloud native applies to applications composed of microservices that are packaged and deployed in containers and managed on scalable cloud infrastructure using agile DevOps processes and continuous delivery workflows.
Organizations are moving to microservices to take advantage of the agility benefits they provide. Because each microservice can be developed, deployed, and scaled independently, IT teams can quickly make changes to any part of an application to get products and services to market faster and stay ahead of their competitors.
And since microservices are device and platform agnostic, organizations can develop applications that provide consistent user experiences regardless of the underlying infrastructure.
When adopting microservices and modernizing their application architectures, organizations also need to modernize their application delivery. An application delivery controller (ADC) is key for improving the availability, performance, and security of microservices-based applications.
Most companies that adopt cloud native architectures are building microservices in cloud environments. By deploying their microservices-based applications in public cloud, companies can take advantage of the on-demand scalability that public clouds provide. Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform are the most popular public clouds.
As organizations increasingly choose multi-cloud strategies for application deployment, they are relying on containerized microservices for application portability across on-premises and public cloud. Because each individual microservice can be developed, deployed, and scaled independently, IT teams can quickly make changes to any part of a production application without affecting the application's end users. While microservices benefits are many, the primary benefit is to enable more rapid software development to get products and services to market faster and into the hands of customers.
The use of microservices benefits organizations by helping them realize their business objectives, whether that's an overarching focus like digital transformation or a specific need like refactoring an on-premises legacy application to run in a highly scalable cloud environment.
Because of the portability that containers provide and the agility that microservices and cloud afford, many companies are adopting a cloud native approach to application deployment to gain or maintain a competitive advantage in the market.
By developing applications using microservices that can be independently tested and deployed, organizations don’t need to rewrite their entire codebases to add or modify a feature, which enables them to deliver their products and services to market faster.
Microservices are modular, loosely coupled services that communicate over lightweight protocols, making them easier for developers to develop, test, and deploy, and also easier to change and maintain. Additionally, microservices give developers the opportunity to focus on specific functions performed by the application rather than on the entire application itself.
The use of microservices paired with such DevOps practices as assembling small teams rather than large teams enables an agile approach to software development. By narrowing the scope of what they are responsible for developing while also owning the entire software development life cycle for a particular function, DevOps teams can continually release new features and functionality at a faster pace.
When a monolithic application must be rebuilt or redeployed for any reason including to release a major update or to fix a minor bug, the application end-user experience can suffer. With microservices-based applications, developers can quickly make updates to only the affected microservices and release them quickly. For customers and other application end users, there should be no discernible difference in their application experience when microservices are updated in production.
Companies need agility and flexibility in order to accelerate innovation. To achieve that, many organizations are moving their applications to public clouds. But simply shifting their monolithic applications to the cloud does not allow them to take full advantage of the agile, scalable, and resilient features that public cloud infrastructure provides.
Because microservices are device and platform agnostic, they enable organizations to develop microservices-based applications that provide a consistent user experience regardless of the underlying infrastructure.
A microservice is a lightweight modular component, or service, that performs a unique function within an application. Microservices work together as independent parts of a whole to deliver the complete functionality for an application. By isolating software functionality into multiple modules that are developed and maintained independently, development teams can improve and evolve applications and services more quickly. Individual microservices communicate with each other using APIs, often via the HTTP protocol using a REST API or messaging queue.
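To make the API-based communication concrete, here is a minimal, self-contained sketch in Python of one service exposing a REST-style endpoint over HTTP and another service consuming it. The "inventory" service, its endpoint path, and the SKU are all hypothetical, chosen purely for illustration.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical "inventory" microservice exposing one REST endpoint.
class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/stock/sku-42":
            body = json.dumps({"sku": "sku-42", "in_stock": 17}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the example's output quiet

server = HTTPServer(("127.0.0.1", 0), InventoryHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Another service (e.g. an "orders" service) consumes the API over HTTP.
url = f"http://127.0.0.1:{server.server_port}/stock/sku-42"
stock = json.loads(urlopen(url).read())
print(stock["in_stock"])  # the caller depends only on the JSON contract
server.shutdown()
```

The key point is that the consuming service knows nothing about the inventory service's implementation, only its HTTP endpoint and the JSON shape it returns.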
Microservices are a modern approach to software development. Yet for all the benefits they provide, microservices development presents challenges. Because they are distributed systems, microservices have additional requirements beyond what monolithic applications demand. Development teams must plan how to handle service discovery, messaging protocols between the client and services and between microservices, API performance monitoring, traffic management, and many other aspects of microservices development that are made more complex by the greater number of endpoints.
Microservices integration is a major consideration when designing a microservices-based application. A best practice is to develop the business logic code as part of the service and to offload the network communication logic to a type of infrastructure called a service mesh. The service mesh manages communications between the individual microservices, but should not contain business logic.
A microservices architecture follows a pattern often referred to as smart endpoints and dumb pipes: the microservices themselves contain the logic that manages integration for the entire application, while the communication channels between them remain simple.
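A minimal sketch of the smart endpoints, dumb pipes idea, with an in-memory queue standing in for the transport; the "billing" service and its 20% surcharge rule are illustrative assumptions, not part of any real system.

```python
import json

# "Dumb pipe": the transport just moves opaque bytes; it has no business logic.
class Pipe:
    def __init__(self):
        self._queue = []

    def send(self, raw: bytes):
        self._queue.append(raw)

    def receive(self) -> bytes:
        return self._queue.pop(0)

# "Smart endpoint": the service itself validates, deserializes, and applies rules.
class BillingService:
    def __init__(self, pipe: Pipe):
        self.pipe = pipe

    def handle_next(self) -> int:
        raw = self.pipe.receive()
        msg = json.loads(raw)                    # endpoint owns (de)serialization
        if msg.get("type") != "invoice":
            raise ValueError("unexpected message type")
        return msg["amount"] + msg["amount"] // 5  # endpoint applies a 20% surcharge

pipe = Pipe()
pipe.send(json.dumps({"type": "invoice", "amount": 100}).encode())
total = BillingService(pipe).handle_next()
print(total)  # 120
```

Notice that the pipe could be swapped for a real message queue without touching the business rules, which live entirely in the endpoint.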
Microservices deployment follows an agile, scalable, and repeatable process called continuous integration and continuous delivery, or CI/CD for short. A primary benefit of CI/CD is that it effectively merges application development and operations to reduce microservices deployment times.
DevOps teams can make near-instantaneous changes to applications, but with that agility comes more responsibility: they must plan for and manage performance, high availability, security, and resiliency for their part of the application themselves rather than rely on other, more specialized teams within IT, as they once did when the waterfall style of application development was popular.
Microservices best practices typically involve automation. With the rapid pace of continuous integration and continuous delivery (CI/CD), development teams can be challenged by the frequent release cycles of a microservices deployment. To efficiently manage large clusters of containers in which microservices run, Kubernetes (pronounced koo-ber-net-ees) is the container orchestrator of choice for the majority of organizations. Kubernetes controls how and where the containers will run.
However, Kubernetes environments can be inherently difficult to deploy and troubleshoot depending on developers' skill sets, so many organizations struggle to deploy microservices-based applications quickly and reliably. By addressing open questions and challenges in the architectural planning phase of development, DevOps teams can effectively address common microservices best practices concerns before they become production problems.
Microservices are not a new take on application development: Microservices architecture has roots in the design principles of Unix-based operating systems and in the popular service-oriented architecture (SOA) model. SOA introduced the concept of independent services that can access software, perform business functions, and enable general modularity and reuse across IT.
Microservices architecture extends the SOA concept by enabling complex applications to be broken down into many autonomous components that can be wholly managed by small teams of developers. This is a preferred approach for IT organizations looking to make complex applications more agile, scalable, and resilient as well as easier to manage with fewer developer resources.
A microservices architecture that uses container-based microservices is a common architectural style. Kubernetes is an open-source platform for managing containerized workloads and services. As a container orchestrator, Kubernetes automates the management, deployment, and scaling of containers across multiple servers by abstracting the underlying infrastructure. Because it is extensible, Kubernetes makes it easier for developers and operators to automate much of the manual work of container management by using their preferred open source and commercial software tools.
Another architectural style choice to be determined is how to expose the microservices within containers when they receive a request from an external client. A common method is to use an ingress controller, which works as a reverse proxy or load balancer. All external traffic is routed to the ingress controller and then on to the appropriate services.
A microservices-based application may also use an API gateway to provide a single point of entry for requests made to a defined group of microservices. An API gateway helps minimize the complexity of API management by enabling DevOps teams to automate their continuous integration and continuous delivery (CI/CD) workflows.
Each microservice has its own API, which manages requests over a protocol such as HTTP to communicate with other microservices and with the application as a whole. A microservices-based application typically contains numerous microservices and APIs. An API gateway ensures better performance for application traffic by reducing the latency associated with multiple network hops and repeated TLS decryption.
The primary types of microservices are stateful and stateless.
A stateful microservice records the state of data after an action for use in a subsequent session. For example, online transaction processing such as bank account withdrawals or changes to an account’s settings are stateful because they must be saved so that they persist across sessions. These stateful components are somewhat complex to manage, since they need stateful load balancing and can only be replaced by other components that have saved the same state.
A stateless microservice doesn't store anything; its outputs depend solely on events rather than on any saved data. In cloud environments, stateless microservices are usually preferable because they can be spun up only as needed and are fully interchangeable with one another. That reduces complexity and avoids the need to pre-commit compute, storage, and networking resources.
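The contrast between the two types can be sketched in a few lines of Python. The currency converter and account service below are hypothetical examples: the stateless function gives the same answer from any replica, while the stateful class carries a balance between calls and so needs its state preserved and matched across sessions.

```python
# Stateless: the output depends only on the input event, so any replica
# can serve the request and replicas are fully interchangeable.
def convert_currency(amount_usd: float, rate: float) -> float:
    return round(amount_usd * rate, 2)

# Stateful: the balance must persist across requests, so calls for one
# account must reach a replica that holds (or can load) that saved state.
class AccountService:
    def __init__(self, opening_balance: int):
        self.balance = opening_balance   # state carried between requests

    def withdraw(self, amount: int) -> int:
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance

print(convert_currency(10.0, 0.9))   # same answer from any replica
acct = AccountService(100)
print(acct.withdraw(30))             # 70 -- result depends on saved state
```

This is why stateful services need sticky, state-aware load balancing, while stateless ones can sit behind any round-robin balancer.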
Microservices security and authentication starts with adopting a zero-trust approach where all requests to all resources must be authenticated and authorized. Enforcing security within a Kubernetes cluster and securing ingress and egress are key to protecting containerized applications, but it can be challenging to correctly apply role-based access control (RBAC) permissions and security policies in Kubernetes.
From an identity and access management perspective, microservices access control at the user level requires all users to be uniquely identified. By using a central directory service as a single source of identity and authentication, DevOps teams can abstract the function of global authentication and authorization away from individual microservices.
A microservices-based architecture for an application that runs in a containerized environment must provide secure access to dynamic services whose locations change. An API gateway acts as the single point of entry and ensures secure and reliable access to the APIs and microservices within the application. Rather than calling services directly, the API web client calls the API gateway, which forwards the call to the appropriate services on the back end.
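The pattern above can be sketched as a gateway that keeps a registry of current service locations and forwards calls, so clients only ever know the gateway's address. The class, registry format, and IP addresses below are illustrative assumptions, not a real gateway product's API.

```python
class ApiGateway:
    """Toy single point of entry: clients name a service; the gateway
    resolves its current location and forwards the request."""

    def __init__(self):
        self._registry = {}   # service name -> current backend address

    def register(self, name: str, address: str):
        self._registry[name] = address   # locations can change at any time

    def forward(self, name: str, request: dict) -> dict:
        address = self._registry[name]   # discovery happens here, not in clients
        # A real gateway would proxy the HTTP call; we simulate the response.
        return {"handled_by": address, "echo": request}

gw = ApiGateway()
gw.register("payments", "10.0.3.7:8443")
print(gw.forward("payments", {"order": 42})["handled_by"])  # 10.0.3.7:8443

# If the service is rescheduled to a new pod, only the registry entry changes;
# clients keep calling the same gateway, unaware the backend moved.
gw.register("payments", "10.0.9.2:8443")
print(gw.forward("payments", {"order": 42})["handled_by"])  # 10.0.9.2:8443
```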
Security policies provide the ability to declare security parameters for pods and containers that are applied at runtime. Applying security at this level enables a fine-grained approach, ensuring that the security applied to individual application components is appropriate.
Security at the individual service level starts with limiting its access to only necessary resources so that a vulnerability in a single microservice won’t expose the rest of the system to an attack. The use of a distributed firewall that is centrally managed adds fine-grained access control between services, which further serves to minimize the attack area.
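One way to picture a centrally managed, fine-grained access policy between services is a default-deny allowlist keyed on source and destination. The policy table and service names below are invented for illustration; real distributed firewalls and Kubernetes network policies express the same idea declaratively.

```python
# Central policy: (source service, destination service) pairs that are allowed.
POLICY = {
    ("orders",   "payments"):  True,
    ("orders",   "inventory"): True,
    ("frontend", "orders"):    True,
}

def is_allowed(src: str, dst: str) -> bool:
    # Default-deny: anything not explicitly allowed is blocked, so a
    # compromised service cannot reach arbitrary parts of the system.
    return POLICY.get((src, dst), False)

print(is_allowed("orders", "payments"))    # True
print(is_allowed("frontend", "payments"))  # False -- not in the allowlist
```

Default-deny is what limits the blast radius: a vulnerability in `frontend` does not grant a path to `payments` because that edge was never declared.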
Microservices monitoring is one of the three pillars of observability; together with distributed tracing and logging, it gives site reliability engineers (SREs) and developers a unified view of their environments. Monitoring the status of a large number of microservices is not trivial, and a large number of endpoints presents an increased attack surface for bad actors to target. When a security vulnerability is suspected, the ability to pinpoint the affected microservice and quickly resolve the issue is paramount.
It can often be difficult to identify and troubleshoot root causes of issues in dynamic applications. Teams must be able to troubleshoot individual microservices to see which service is failing, when it failed, why it failed, and which users are being affected. Monitoring key metrics with automated alerts plays a crucial role in helping to pinpoint issues with infrastructure, containers and their contents, and APIs and endpoints.
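A minimal sketch of the "key metrics with automated alerts" idea: count requests and errors per service, derive an error rate, and flag when it crosses a threshold. The metric names, the `checkout` service, and the 5% threshold are all illustrative assumptions.

```python
from collections import Counter

requests = Counter()

def record(service: str, ok: bool):
    """Record one request outcome for a service."""
    requests[(service, "total")] += 1
    if not ok:
        requests[(service, "errors")] += 1

def error_rate(service: str) -> float:
    total = requests[(service, "total")]
    return requests[(service, "errors")] / total if total else 0.0

def check_alert(service: str, threshold: float = 0.05) -> bool:
    """True when the service's error rate crosses the alert threshold."""
    return error_rate(service) > threshold

for ok in [True] * 18 + [False] * 2:   # simulate 2 failures in 20 calls
    record("checkout", ok)

print(error_rate("checkout"))          # 0.1
print(check_alert("checkout"))         # True -- 10% exceeds the 5% threshold
```

Production systems do the same thing with time windows, percentiles, and per-endpoint labels, but the pattern of instrument, aggregate, and alert is identical.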
One challenge of microservices management is maintaining the speed of development without sacrificing security. In microservices-based applications, there are two types of traffic that must be protected: traffic into the application (north-south traffic) and traffic between the individual microservices (east-west traffic).
Many of the same security challenges that are present with monolithic applications also exist in microservices-based applications, especially with regard to north-south traffic, which requires authentication, encryption, and inspection.
Communications between microservices (east-west traffic) typically require similar protections with regard to authentication, encryption, and inspection.
Microservices orchestration requires managing the individual containers that microservices are packaged in. Each container includes a full runtime environment with everything an application needs to function, namely its libraries, configuration files, and dependencies. All of these essential pieces are abstracted from the underlying infrastructure, allowing the containerized software to be fully portable across disparate environments.
Container orchestrators like Kubernetes help development teams to deploy the same application across different environments without needing to redesign it.
Building and deploying microservices with Kubernetes is the most common way to manage the containers in which the microservices are packaged and run. Kubernetes (abbreviated K8s) is an open source platform for container orchestration that was developed by Google. Kubernetes automates the management of containers, which includes the provisioning, scaling, networking, and monitoring of containers. The automation provided by Kubernetes can significantly reduce the time it takes to deploy microservices-based applications, relieving developers of much of the manual work that comes with container management.
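To show what "declarative" management looks like, here is the shape of a Kubernetes Deployment spec expressed as a Python dictionary; a real manifest is usually written in YAML and applied with `kubectl apply -f`, and the service name, image, and replica count below are hypothetical.

```python
import json

# Sketch of a Deployment: you declare the desired state (3 replicas of this
# container image), and Kubernetes continuously reconciles reality to match.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "orders-service"},
    "spec": {
        "replicas": 3,  # Kubernetes keeps 3 pods running, replacing failures
        "selector": {"matchLabels": {"app": "orders"}},
        "template": {
            "metadata": {"labels": {"app": "orders"}},
            "spec": {
                "containers": [{
                    "name": "orders",
                    "image": "registry.example.com/orders:1.4.2",
                    "ports": [{"containerPort": 8080}],
                }],
            },
        },
    },
}

print(deployment["kind"], deployment["spec"]["replicas"])  # Deployment 3
print(json.dumps(deployment["metadata"]))
```

Scaling or upgrading means editing this declaration (for example, changing `replicas` or the image tag) and reapplying it; the orchestrator handles the rollout.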
Containerization is a form of operating system virtualization. Individual containers include a full runtime environment with everything an application needs to function, including libraries, configuration files, and dependencies. These components are abstracted from the underlying infrastructure, allowing the containerized microservices application to be fully portable across public clouds like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).
For many developers, Kubernetes comes with a steep learning curve. To minimize complexity, many organizations choose to use managed Kubernetes providers. Popular managed Kubernetes services and platforms for deploying enterprise applications at scale include Amazon Elastic Kubernetes Service (Amazon EKS), Azure Kubernetes Service (AKS), Google Kubernetes Engine (GKE), Red Hat OpenShift, and Rancher.
Using a managed, production-ready environment for running containerized applications can provide a consistent environment from development through production, along with other operational benefits.
Many companies are still running monolithic applications on-premises while at the same time planning to deploy new microservices-based applications in public cloud. Citrix ADC supports an organization's transition to microservices-based applications by providing operational consistency for application delivery across multi-cloud environments to ensure an optimal experience for the application end user.
Citrix offers production-grade, fully supported application delivery and security solutions that provide the most comprehensive integration with Kubernetes platforms and open source tools, greater scale and lower latency, consistent application and API security, and a holistic observability stack.