Kubernetes Support
 
This section covers the following topics:
*Deploying API Gateway Pod with API Gateway and Elasticsearch Containers
*Deploying API Gateway Pod with API Gateway Container connected to an Elasticsearch Kubernetes Service
*API Gateway Clustering on Kubernetes
*Kubernetes Sample Files
*Helm Chart
*Using Helm to Start the API Gateway Service
*OpenShift Support
API Gateway can be run within a Kubernetes (k8s) environment. Kubernetes provides a platform for automating the deployment, scaling, and operation of services. The basic scheduling unit in Kubernetes is a pod. A pod adds a higher level of abstraction by grouping containerized components; it consists of one or more containers that are co-located on the same host machine and can share resources. A Kubernetes service is a set of pods that work together, such as one tier of a multi-tier application.
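For example, a minimal Kubernetes Service that fronts a set of API Gateway pods could look like the following sketch. The service name, the label selector, and port 5555 are illustrative assumptions; adjust them to match your own deployment and the Kubernetes sample files delivered with API Gateway.

    apiVersion: v1
    kind: Service
    metadata:
      name: apigateway            # assumed service name
    spec:
      selector:
        app: apigateway           # must match the label on the API Gateway pods
      ports:
        - name: http
          port: 5555              # assumed API Gateway HTTP port
          targetPort: 5555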
The API Gateway Kubernetes support provides the following:
*Liveness check to support the Kubernetes pod lifecycle.
This helps in verifying that the API Gateway container is up and responding (see the pod specification sketch after this list).
*Readiness check to support the Kubernetes pod lifecycle.
This helps in verifying that the API Gateway container is ready to serve requests. For details on the pod lifecycle, see the Kubernetes documentation.
*Prometheus metrics to support the monitoring of API Gateway pods.
The API Gateway support is based on the Microservices Runtime Prometheus support. You use the IS metrics endpoint /metrics to gather the required metrics. When the metrics endpoint is called, Microservices Runtime gathers metrics and returns the data in Prometheus format. Prometheus is an open source monitoring and alerting toolkit, which is frequently used for monitoring containers. For details on the Prometheus metrics, see Developing Microservices with webMethods Microservices Runtime.
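The following pod specification fragment is a minimal sketch of how the liveness check, the readiness check, and the metrics endpoint can be wired into a pod definition. The probe path, port 5555, the image reference, and the annotation-based Prometheus scrape configuration are illustrative assumptions; the Kubernetes sample files delivered with API Gateway show the exact values for your release.

    apiVersion: v1
    kind: Pod
    metadata:
      name: apigateway
      labels:
        app: apigateway
      annotations:
        prometheus.io/scrape: "true"          # assumed annotation-based scrape setup
        prometheus.io/path: "/metrics"        # IS metrics endpoint described above
        prometheus.io/port: "5555"
    spec:
      containers:
        - name: apigateway
          image: <your-registry>/apigateway:10.11   # placeholder image reference
          ports:
            - containerPort: 5555
          livenessProbe:                      # is the container up and responding?
            httpGet:
              path: /rest/apigateway/health   # assumed health resource
              port: 5555
            initialDelaySeconds: 120
            periodSeconds: 30
          readinessProbe:                     # is the container ready to serve requests?
            httpGet:
              path: /rest/apigateway/health   # assumed health resource
              port: 5555
            initialDelaySeconds: 60
            periodSeconds: 10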
The following sections describe in detail the different deployment models for API Gateway as a Kubernetes service. Each of the deployment models described requires an existing Kubernetes environment. For details on setting up a Kubernetes environment, see the Kubernetes documentation.
With the API Gateway Kubernetes support, you can deploy API Gateway in one of the following ways:
*A pod with an API Gateway container and an Elasticsearch container
*A pod with an API Gateway container connected to an Elasticsearch Kubernetes service (a configuration fragment follows this list)
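As an illustration of the second model, the API Gateway container can be pointed at an Elasticsearch Kubernetes service through the service's cluster-internal DNS name. The service name elasticsearch-service, port 9200, and the environment variable apigw_elasticsearch_hosts used below are assumptions for this sketch; the Kubernetes sample files show the configuration that applies to your installation.

    spec:
      containers:
        - name: apigateway
          image: <your-registry>/apigateway:10.11      # placeholder image reference
          env:
            - name: apigw_elasticsearch_hosts          # assumed configuration variable
              value: "elasticsearch-service:9200"      # DNS name of the Elasticsearch Kubernetes service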
API Gateway also supports the Red Hat OpenShift Container Platform, which you can use to build and scale containerized applications. For details and special considerations, see the following sections:
*Building the Docker Image for an API Gateway Instance, in particular the --target.configuration and --os.image parameters
*OpenShift Support
For details about OpenShift in general, see the OpenShift documentation.