T4K UI Native Authentication

T4K UI supports authentication through kubeconfig files, which contain a token, certificate, auth-provider, or similar credentials. For some Kubernetes cluster distributions, however, the kubeconfig may contain a cloud-specific exec action or auth-provider configuration that fetches the authentication token with the help of a credentials file, which the T4K UI does not support by default.

When a kubeconfig is used on the local system, the cloud-specific action/config in its user section looks for a credentials file at a specific location, which lets kubectl or the client-go library generate the authentication token. Because the T4K backend is deployed as a Pod inside the cluster, that credentials file is not available in the Pod, so the token cannot be generated there. T4K therefore provides cloud-distribution-specific support to handle these credentials files and generate the token from them.
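As an illustration of the local flow, the sketch below loads a kubeconfig and lists nodes using the official Kubernetes Python client (an assumption for illustration; T4K itself may use client-go). The call succeeds on a workstation because the exec/auth-provider entry can run the cloud CLI, which reads the local credentials file; inside the T4K backend Pod the same call would fail because neither the CLI nor the credentials file is present.

import os
from kubernetes import client, config

# Loading the kubeconfig resolves the user entry; for GKE/EKS this shells
# out to gcloud/aws, which in turn read $HOME/.config/gcloud or $HOME/.aws
# to mint the bearer token.
config.load_kube_config(config_file=os.path.expanduser("~/.kube/config"))

v1 = client.CoreV1Api()
print([node.metadata.name for node in v1.list_node().items])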

Google Kubernetes Engine (GKE)

Default kubeconfig

apiVersion: v1
clusters:
  - cluster:
      certificate-authority-data: LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUVMRENDQXBTZ0F3SUJBZ0lRU3B5cVp4QzU4NFFEbVFYdz
      server: https://34.138.168.200
    name: gke_amazing-chalice-243510_us-east1-b_dev-cluster
contexts:
  - context:
      cluster: gke_amazing-chalice-243510_us-east1-b_dev-cluster
      user: gke_amazing-chalice-243510_us-east1-b_dev-cluster
    name: gke_amazing-chalice-243510_us-east1-b_dev-cluster
current-context: gke_amazing-chalice-243510_us-east1-b_dev-cluster
kind: Config
preferences: {}
users:
  - name: gke_amazing-chalice-243510_us-east1-b_dev-cluster
    user:
      auth-provider:
        config:
          cmd-args: config config-helper --format=json
          cmd-path: /home/trilio/google-cloud-sdk/bin/gcloud
          expiry-key: '{.credential.token_expiry}'
          token-key: '{.credential.access_token}'
        name: gcp  
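The auth-provider block above tells the client to run gcloud and extract the token and expiry from its JSON output using the token-key and expiry-key JSONPaths. A rough Python equivalent of that step (a sketch only, assuming gcloud is installed at the cmd-path shown above):

import json
import subprocess

# Run the command configured in cmd-path / cmd-args.
out = subprocess.check_output(
    ["/home/trilio/google-cloud-sdk/bin/gcloud", "config", "config-helper", "--format=json"]
)
credential = json.loads(out)["credential"]

token = credential["access_token"]    # token-key: {.credential.access_token}
expiry = credential["token_expiry"]   # expiry-key: {.credential.token_expiry}
print("token expires at", expiry)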

Credentials

Using credentials for login

For a GKE cluster, the local gcloud binary reads an SQLite credentials file named credentials.db under $HOME/.config/gcloud to generate an authentication token. All the parameters required to generate the token exist in that credentials.db file. While logging in to the T4K UI deployed in the GKE cluster, the user is expected to provide the credentials.db file from $HOME/.config/gcloud to pass authentication.
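The sketch below shows what that means in practice: credentials.db is an ordinary SQLite database, so its contents can be inspected before uploading it. The credentials table name and its account_id/value columns are assumptions about the gcloud SDK's internal layout and may vary between SDK versions.

import os
import sqlite3

db_path = os.path.expanduser("~/.config/gcloud/credentials.db")
conn = sqlite3.connect(db_path)

# List the tables gcloud keeps in this file.
tables = [row[0] for row in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")]
print("tables:", tables)

# Assumed schema: one row per authorized account, with value holding a JSON
# blob containing the refresh token and client details used to mint tokens.
for account_id, _value in conn.execute("SELECT account_id, value FROM credentials"):
    print("account:", account_id)

conn.close()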

Amazon Elastic Kubernetes Service (EKS)

Default kubeconfig

apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUN5RENDQWJDZ0F3SUJBZ0lCQURBTkJna3Foa2lHOXcwQkFR
    server: https://6C74ACD3CA40CFCB719CF3464423ADA9.gr7.us-east-1.eks.amazonaws.com
  name: vinod-eks.us-east-1.eksctl.io
contexts:
- context:
    cluster: vinod-eks.us-east-1.eksctl.io
    user: vinod.patil@trilio.io@vinod-eks.us-east-1.eksctl.io
  name: vinod.patil@trilio.io@vinod-eks.us-east-1.eksctl.io
current-context: vinod.patil@trilio.io@vinod-eks.us-east-1.eksctl.io
kind: Config
preferences: {}
users:
- name: vinod.patil@trilio.io@vinod-eks.us-east-1.eksctl.io
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1alpha1
      args:
      - eks
      - get-token
      - --cluster-name
      - vinod-eks
      - --region
      - us-east-1
      command: aws
      env:
      - name: AWS_STS_REGIONAL_ENDPOINTS
        value: regional
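The exec block above makes the client run the aws CLI and read back an ExecCredential object whose status.token is presented to the API server as a bearer token. A rough Python equivalent of that step, using the command, arguments, and environment from the kubeconfig above (a sketch, not the client-go implementation):

import json
import os
import subprocess

# Same command, args, and env as the exec entry above.
env = dict(os.environ, AWS_STS_REGIONAL_ENDPOINTS="regional")
out = subprocess.check_output(
    ["aws", "eks", "get-token", "--cluster-name", "vinod-eks", "--region", "us-east-1"],
    env=env,
)

exec_cred = json.loads(out)           # client.authentication.k8s.io ExecCredential
token = exec_cred["status"]["token"]  # bearer token sent to the API server
print("token prefix:", token[:16], "...")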

Credentials

Using credentials for login

For an EKS cluster, the local aws binary (aws-cli) reads a credentials file named credentials under $HOME/.aws to generate an authentication token. One extra parameter, the EKS cluster name, is needed to generate the token; it is asked for only once, at login. While logging in to the T4K UI deployed in the EKS cluster, the user is expected to provide the credentials file from $HOME/.aws to pass authentication.
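For reference, the sketch below reads the standard ~/.aws/credentials file (an INI-style file of profiles with aws_access_key_id / aws_secret_access_key entries) to show what the UI expects to be uploaded. The EKS cluster name is not stored in this file, which is why the UI asks for it separately at login.

import configparser
import os

creds = configparser.ConfigParser()
creds.read(os.path.expanduser("~/.aws/credentials"))

# Each section is a named profile, e.g. [default].
for profile in creds.sections():
    key_id = creds[profile].get("aws_access_key_id", "<missing>")
    print(f"profile {profile}: aws_access_key_id={key_id}")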

