T4K UI Native Authentication
T4K UI supports authentication through kubeconfig files, which contain a token, a certificate, an auth-provider entry, and so on. For some Kubernetes cluster distributions, however, the kubeconfig may contain a cloud-specific exec action or auth-provider configuration that fetches the authentication token with the help of a local credentials file, and this is not supported by default.
When such a kubeconfig is used on the local system, the cloud-specific action/config in its user section looks for the credentials file at a specific location, which lets kubectl or the client-go library generate the token used for authentication. But because the T4K backend is deployed in a Pod inside the cluster, that credentials file is not available in the Pod to generate the token. T4K therefore provides cloud-distribution-specific support to handle these credentials files and generate the token from them.
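As an illustration of the pattern described above, a cloud-specific user entry in a kubeconfig typically delegates token generation to an external command. The names below are placeholders, not any particular cloud's real plugin:

```yaml
# Illustrative kubeconfig "user" entry that delegates authentication to a
# cloud-specific external command. Command name and args are placeholders.
users:
- name: example-user
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      # The CLI below reads a local credentials file to mint a token --
      # that file is exactly what is missing inside the T4K backend Pod.
      command: cloud-cli
      args: ["get-token"]
```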
Google Kubernetes Engine (GKE)
For a GKE cluster, the local gcloud binary reads a SQLite credentials file named credentials.db under $HOME/.config/gcloud to generate an authentication token. All the parameters required to generate the token exist in that same credentials.db file. While logging into a T4K UI deployed in a GKE cluster, a user is expected to provide the credentials.db file from $HOME/.config/gcloud to pass authentication.
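For reference, a kubeconfig generated by `gcloud container clusters get-credentials` traditionally carries a `gcp` auth-provider entry like the following (project, cluster, and path values are illustrative). The `cmd-path` points at the local gcloud binary, which in turn reads credentials.db:

```yaml
users:
- name: gke_my-project_us-central1_my-cluster
  user:
    auth-provider:
      name: gcp
      config:
        # gcloud reads $HOME/.config/gcloud/credentials.db to mint the token
        cmd-path: /usr/lib/google-cloud-sdk/bin/gcloud
        cmd-args: config config-helper --format=json
        token-key: '{.credential.access_token}'
        expiry-key: '{.credential.token_expiry}'
```

Newer GKE versions replace this auth-provider with the gke-gcloud-auth-plugin exec plugin, but in both cases the token is ultimately derived from the local gcloud credentials.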
Amazon Elastic Kubernetes Service (EKS)
For an EKS cluster, the local aws binary (aws-cli) reads a credentials file named credentials under $HOME/.aws to generate an authentication token. One extra parameter, the EKS cluster name, is needed to generate the token; it is asked for only once, at login. While logging into a T4K UI deployed in an EKS cluster, a user is expected to provide the credentials file from $HOME/.aws to pass authentication.
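For reference, an EKS kubeconfig user entry typically invokes `aws eks get-token` with the cluster name, which is why T4K asks for that name at login. The account and cluster values below are illustrative; the aws CLI resolves its signing keys from $HOME/.aws/credentials:

```yaml
users:
- name: arn:aws:eks:us-east-1:111122223333:cluster/my-cluster
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      command: aws
      # aws-cli signs this request with the key pair read from
      # $HOME/.aws/credentials (a standard INI file with profiles).
      args: ["eks", "get-token", "--cluster-name", "my-cluster"]
```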