The introduction of the Edge server allows your system to scale and makes it easier to manage and implement cross-cutting concerns such as security. Primarily, it provides the following features:
Routing and canarying: This is the main feature. The Edge server matches a pattern in each incoming request and, based on that predicate, routes the request to the appropriate server. For example, calls with /api/v1/restaurant in the path would be served by restaurant-service, and calls with /api/v1/booking would be served by booking-service. Canarying lets you route a subset of requests to a different instance (typically a newer version) of the same application based on header information, users' credentials, the time of the request, and so on, and it can be configured easily on an Edge server. If your application is transitioning towards microservices, you could use monolithic strangling (the strangler pattern) to route a few requests to your monolithic application until the migration is complete.
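The routing and canary behaviors described above can be sketched in plain Java. This is an illustrative sketch only, not the Edge server's actual API: the service names come from the text, while the `routeFor` method and the `X-Canary` header convention are assumptions made for the example.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of path-prefix routing with a header-based canary split.
public class EdgeRoutingSketch {

    // Ordered map of path prefixes to downstream service names (from the text).
    private static final Map<String, String> ROUTES = new LinkedHashMap<>();
    static {
        ROUTES.put("/api/v1/restaurant", "restaurant-service");
        ROUTES.put("/api/v1/booking", "booking-service");
    }

    // Pick a downstream service by matching a declared path prefix.
    static String routeFor(String path) {
        for (Map.Entry<String, String> e : ROUTES.entrySet()) {
            if (path.startsWith(e.getKey())) {
                return e.getValue();
            }
        }
        return "default-service";
    }

    // Canary: requests carrying the (hypothetical) X-Canary header value "true"
    // go to the new instance of the same service; all others stay on stable.
    static String canaryInstance(String service, String canaryHeader) {
        return "true".equalsIgnoreCase(canaryHeader)
                ? service + ":canary"
                : service + ":stable";
    }

    public static void main(String[] args) {
        System.out.println(routeFor("/api/v1/restaurant/1"));   // restaurant-service
        System.out.println(canaryInstance(routeFor("/api/v1/booking"), "true"));
    }
}
```

In a real Edge server the predicate and the canary split would be declared in configuration rather than hand-coded, but the decision logic is essentially this.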
Handling of cross-cutting concerns:
Security: Since an Edge server provides a single entry point to all resources, you can handle security in one place. We'll do that in the next chapter.
Monitoring: You can easily monitor all incoming requests and add analytics to identify usage patterns, then use that analytics data to fine-tune and improve your system. You can also add rate limiting to throttle calls on a per-API basis.
Many other concerns, such as logging. You will want traceable logs for each business call, such as booking a table, that spans multiple services: booking-service, billing-service, payment-service, security-service, and so on.
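The per-API rate limiting mentioned under monitoring can be sketched as a simple token bucket applied at the Edge before a request is forwarded. This is a minimal sketch under assumed names; a production limiter would refill tokens on a timer and return an HTTP 429 when the bucket is empty.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal per-API token-bucket rate limiter, as an Edge server might apply
// before forwarding a request. All names here are illustrative.
public class RateLimiterSketch {

    private final int capacity;                        // burst size per API
    private final Map<String, Integer> tokens = new HashMap<>();

    public RateLimiterSketch(int capacity) {
        this.capacity = capacity;
    }

    // Returns true if the call is allowed, false once the API is over its limit.
    public synchronized boolean tryAcquire(String apiPath) {
        int available = tokens.getOrDefault(apiPath, capacity);
        if (available <= 0) {
            return false;                              // Edge would answer 429 here
        }
        tokens.put(apiPath, available - 1);
        return true;
    }

    // A real limiter refills on a schedule; here we expose refill directly.
    public synchronized void refill(String apiPath) {
        tokens.put(apiPath, capacity);
    }

    public static void main(String[] args) {
        RateLimiterSketch limiter = new RateLimiterSketch(2);
        System.out.println(limiter.tryAcquire("/api/v1/booking")); // true
        System.out.println(limiter.tryAcquire("/api/v1/booking")); // true
        System.out.println(limiter.tryAcquire("/api/v1/booking")); // false
    }
}
```

Because the limiter keys on the API path, each route gets its own budget, which matches the "limit the calls based on APIs" idea above.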
Resiliency: Failures are bound to happen, and an Edge server can insulate users from failures in downstream services, for example by returning a fallback or cached response instead of surfacing the error.
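One common way an Edge server achieves this insulation is a fallback wrapped around the downstream call, as circuit breakers do. The following is a minimal, self-contained sketch of that idea; the method name and fallback text are assumptions for illustration, not a specific library's API.

```java
import java.util.function.Supplier;

// Sketch of how an Edge server can shield clients from a failing downstream
// service by returning a fallback response. Names are illustrative.
public class FallbackSketch {

    // Invoke the downstream call; on failure, serve a safe default instead
    // of exposing the raw error to the user.
    static String callWithFallback(Supplier<String> downstreamCall, String fallback) {
        try {
            return downstreamCall.get();
        } catch (RuntimeException e) {
            return fallback;           // e.g. cached data or a friendly message
        }
    }

    public static void main(String[] args) {
        // Healthy downstream service:
        System.out.println(callWithFallback(() -> "bookings: [...]", "service busy"));
        // Failing downstream service:
        System.out.println(callWithFallback(
                () -> { throw new RuntimeException("timeout"); },
                "service busy, please retry"));
    }
}
```

A production Edge server would typically combine this with a circuit breaker so that repeated failures stop hitting the unhealthy service at all, but the user-facing effect is the same: a controlled response rather than a cascading error.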