How to Use NGINX Service Mesh for Traffic Splitting

Traffic splitting – dividing traffic between different versions of an app running in the same environment – is a valuable tool for app development because it reduces the risk of bad customer experiences when a new app or version is released. In Kubernetes environments, the Ingress controller is often the default choice for splitting traffic, but a service mesh is a good alternative in more complex environments because it lets you compare versions of individual microservices. For example, you might run a canary deployment behind your mobile frontend, splitting traffic between two versions of your geo‑location microservice API, as sketched below.
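NGINX Service Mesh configures traffic splitting with the Service Mesh Interface (SMI) TrafficSplit resource. As a minimal sketch of the canary scenario above – the namespace and the geo-location, geo-location-v1, and geo-location-v2 service names are hypothetical – a 90/10 split might look like this:

```yaml
apiVersion: split.smi-spec.io/v1alpha3
kind: TrafficSplit
metadata:
  name: geo-location-canary
  namespace: demo
spec:
  # Root service that clients (here, the mobile frontend) address
  service: geo-location
  backends:
    # Stable version keeps most of the traffic
    - service: geo-location-v1
      weight: 90
    # Canary version receives a small share for validation
    - service: geo-location-v2
      weight: 10
```

Adjusting the weights and re-applying the resource with kubectl shifts progressively more traffic to the canary until it handles 100% of requests.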

Setting up traffic splitting with NGINX Service Mesh is a simple task that you can complete in less than 10 minutes. In this demo, we show how to use NGINX Service Mesh to implement both a blue‑green and a canary deployment of a new application version.
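The same TrafficSplit resource can drive a blue‑green cutover: all of the weight stays on the current (blue) version while the new (green) version is deployed and verified, and then the weight moves in a single step. A minimal sketch, again with hypothetical service names:

```yaml
apiVersion: split.smi-spec.io/v1alpha3
kind: TrafficSplit
metadata:
  name: geo-location-blue-green
  namespace: demo
spec:
  service: geo-location
  backends:
    # Before cutover: all traffic stays on the blue (current) version
    - service: geo-location-blue
      weight: 100
    # After verifying the green version, swap the two weights and
    # re-apply the resource to shift all traffic at once
    - service: geo-location-green
      weight: 0
```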

To help you easily implement these rollout strategies in your own NGINX Service Mesh environment, we also provide step-by-step tutorials for blue‑green and canary deployments.

For more information on traffic splitting and other traffic management patterns, read How to Improve Resilience in Kubernetes with Advanced Traffic Management on our blog.

Get Started with NGINX Service Mesh Today

NGINX Service Mesh is completely free, available for immediate download, and can be deployed in less than 10 minutes! To get started, check out our docs and let us know how it goes via GitHub.
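As a rough sketch of that flow – exact flags and namespaces can vary by release, so treat this as an assumption and confirm against the docs for your version – deployment boils down to downloading the nginx-meshctl CLI and pointing it at your cluster:

```bash
# Deploy the mesh control plane into the cluster in your current kubeconfig context
nginx-meshctl deploy

# Confirm the control-plane pods are running (the mesh installs into the nginx-mesh namespace)
kubectl get pods -n nginx-mesh
```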
