Add a new dynamic backup target on RHOSO

Follow the steps below to add a new backup target on RHOSO 18.

1] Mount Backup Target on Trilio Control Plane

Navigate to the Trilio ctlplane-scripts directory

cd /PATH/TO/triliovault-cfg-scripts/redhat-director-scripts/rhosp18/ctlplane-scripts

Plan which type of backup target you want to add. T4O supports two types of backup targets.

  1. nfs

  2. s3

To add a backup target of type 'nfs', edit the following YAML file and create the TVOBackupTarget resource:

vi tvo-backup-target-cr-nfs.yaml
oc -n trilio-openstack apply -f tvo-backup-target-cr-nfs.yaml

To add a backup target of type 'S3', follow the steps below:

Copy the template YAML file

Navigate to the scripts directory and copy the existing trilio-s3-backup-target-secret.yaml file to a new one named according to your backup target:

cd /PATH/TO/triliovault-cfg-scripts/redhat-director-scripts/rhosp18/ctlplane-scripts
cp trilio-s3-backup-target-secret.yaml trilio-s3-backup-target-secret-<BACKUP_TARGET_NAME>.yaml
Example:
cp trilio-s3-backup-target-secret.yaml trilio-s3-backup-target-secret-s3-bt8.yaml

Base64 encode the S3 access and secret keys.

You will need to provide base64-encoded values for both the S3 access key and secret key. Use the following command to encode them:
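For example, using the standard base64 utility (a minimal sketch; the -n flag prevents a trailing newline from being encoded):

echo -n "s3_key_string" | base64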

Replace "s3_key_string" with your actual S3 access key or secret key. The output is the base64-encoded string to be used in the new s3 yaml file.

Edit the copied S3 YAML file

Open the newly created file with your preferred editor (e.g., vi):
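For example:

vi trilio-s3-backup-target-secret-<BACKUP_TARGET_NAME>.yaml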

Update the following fields with the backup target name and the corresponding base64-encoded values:
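As a sketch only, the manifest generally looks like a standard Kubernetes Secret; the key names below are illustrative placeholders, so keep the names that are already present in the copied template:

apiVersion: v1
kind: Secret
metadata:
  name: trilio-s3-backup-target-secret-<BACKUP_TARGET_NAME>   # keep the naming used by the template
  namespace: trilio-openstack
data:
  access-key: <BASE64_ENCODED_S3_ACCESS_KEY>   # illustrative key name
  secret-key: <BASE64_ENCODED_S3_SECRET_KEY>   # illustrative key name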

Apply the updated secret to the cluster

After saving the file, apply it to the trilio-openstack namespace:
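For example:

oc -n trilio-openstack apply -f trilio-s3-backup-target-secret-<BACKUP_TARGET_NAME>.yaml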

If your S3 bucket is an Amazon S3 bucket, edit the corresponding YAML file and create the TVOBackupTarget resource.

If your S3 is of any other type, edit the corresponding YAML file and create the TVOBackupTarget resource.
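The exact S3 CR template file names (Amazon and non-Amazon variants) are not reproduced here; check the ctlplane-scripts directory listing. Following the same pattern as the NFS example above, the flow is:

vi <TVO_BACKUP_TARGET_CR_S3_YAML>
oc -n trilio-openstack apply -f <TVO_BACKUP_TARGET_CR_S3_YAML>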

Verify that the newly added dynamic backup target was successfully deployed for the T4O control plane services.
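One way to check, assuming the TVOBackupTarget resources and the related pods live in the trilio-openstack namespace as above:

oc -n trilio-openstack get tvobackuptarget
oc -n trilio-openstack get pods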

2] Mount Backup Target on Trilio Data Plane

Navigate to the data plane scripts directory
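Assuming the data plane scripts sit next to the control plane scripts in the same repository (verify the exact directory name in your checkout):

cd /PATH/TO/triliovault-cfg-scripts/redhat-director-scripts/rhosp18/dataplane-scripts   # directory name assumed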

Create the templates needed for adding the backup target. Please note: use a unique backup target name for the parameter <BACKUP_TARGET_NAME>; you should not have used this name earlier for any other Trilio backup target. For the parameter <BACKUP_TARGET_TYPE>, valid choices are 's3' and 'nfs'.
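The helper used to generate the templates is not reproduced here; as an illustration only, the invocation takes the two parameters described above:

./<TEMPLATE_GENERATION_SCRIPT> <BACKUP_TARGET_NAME> <BACKUP_TARGET_TYPE>   # script name is a placeholder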

A new directory named after the backup target is created, containing the necessary templates. You can list these templates.
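For example:

ls <BACKUP_TARGET_NAME>/   # directory name is the backup target name, lowercased with '_' replaced by '-'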

The backup target name is set in all the YAML files created in this directory. BACKUP_TARGET_NAME is converted to lowercase, and any underscore '_' character in it is replaced with a hyphen '-'.

Add backup target details to template

Change to the <BACKUP_TARGET_NAME> directory if not done already
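For example:

cd <BACKUP_TARGET_NAME>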

Edit the config map file

Create the config map
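For example, assuming the generated ConfigMap manifest is one of the YAML files listed in this directory (substitute the actual file name):

oc apply -f <CONFIG_MAP_YAML_FILE>   # add -n <NAMESPACE> if the manifest does not set one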

You need to create the following secret only if you are adding an 's3' type backup target. All the details are already filled in the YAML file; you just need to apply it.
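For example, substituting the generated secret manifest name:

oc apply -f <S3_SECRET_YAML_FILE>   # add -n <NAMESPACE> if the manifest does not set one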

Set the Trilio Ansible Runner container image URL and tag in the 'openStackAnsibleEERunnerImage:' parameter of trilio-add-backup-target-service.yaml. You do not need to change any other parameter.
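A sketch of the relevant line in trilio-add-backup-target-service.yaml; the registry path and tag are placeholders for your environment:

openStackAnsibleEERunnerImage: <REGISTRY>/<TRILIO_ANSIBLE_RUNNER_IMAGE>:<TAG>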

Create the custom data plane service for this backup target
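For example, assuming the data plane service is created in the openstack namespace:

oc -n openstack apply -f trilio-add-backup-target-service.yaml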

Edit the trilio-add-backup-target-deployment.yaml file and set the 'nodeSets' parameter to the correct value for your environment. You do not need to change any other parameter.

Get OpenStackDataPlaneNodeSet name
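For example, assuming the node sets live in the openstack namespace:

oc -n openstack get openstackdataplanenodesets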

Set the 'nodeSets' parameter
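A sketch of the relevant section of trilio-add-backup-target-deployment.yaml, using the node set name returned by the previous command:

nodeSets:
  - <OPENSTACK_DATA_PLANE_NODESET_NAME>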

Trigger the deployment of this backup target
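For example, again assuming the openstack namespace:

oc -n openstack apply -f trilio-add-backup-target-deployment.yaml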

Check logs

Replace <DEPLOYMENT_NAME> below with the name taken from the deployment YAML above.
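One way to locate the Ansible execution pod for this deployment and follow its logs (pod naming can vary by environment):

oc -n openstack get pods | grep <DEPLOYMENT_NAME>
oc -n openstack logs -f <POD_NAME_FROM_ABOVE>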

If this pod does not get created, it means you have not used a unique backup target name or some other issue has occurred. Please verify this.

Verify that the newly added dynamic backup target was successfully deployed for the T4O data plane services.
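One way to check is to confirm the data plane deployment completed and that the backup target is mounted on the compute nodes (the grep pattern is a rough filter, not the exact mount path):

oc -n openstack get openstackdataplanedeployment
ssh <COMPUTE_NODE> "mount | grep -i trilio"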

3] Add Backup Target Records

Log in to the triliovault-wlm-api pod and run the appropriate CLI command below to create the backup target in the Trilio DB.
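For example, to open a shell in the pod (the running pod name carries a generated suffix):

oc -n trilio-openstack get pods | grep triliovault-wlm-api
oc -n trilio-openstack rsh <TRILIOVAULT_WLM_API_POD_NAME>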

For NFS Backup Target:

Sample command:

For Object lock enabled S3 Backup Target:

Sample command:

For non-object lock S3 Backup Target:

Sample command:
