Trilio 4.1 HF13 Release

Release Artifacts

1. Introduction

This document provides information on the TVO-4.1.HF13 release.

Important Info:

To use this hotfix (4.1.HF13):

  1. Customers (except Canonical OpenStack) running OpenStack Ussuri need an already deployed and working TVO-4.1 GA or any hotfix from TVO-4.1.HF1 through TVO-4.1.HF12.

  2. Customers (except Canonical OpenStack) running OpenStack Victoria or TripleO Train need to follow the TVO-4.1 GA deployment process and upgrade directly to the 4.1.HF13 containers/packages. The high-level flow is below:

    1. Deploy the T4O-4.1 GA appliance.

    2. Upgrade to the 4.1.HF13 packages on the appliance.

    3. Kolla & TripleO

      1. Deploy the Trilio components via 4.1.HF13 containers/packages on OpenStack Victoria/TripleO Train.

    4. Openstack Ansible

      1. Deploy the Trilio components on OpenStack Victoria [this deploys the 4.1 GA packages].

      2. Upgrade the TrilioVault packages to 4.1.HF13 on OpenStack Victoria.

    5. Configure the Trilio appliance.

  3. Canonical users running OpenStack Ussuri can either upgrade (on top of 4.1 GA) using the Trilio upgrade documents or do a fresh deployment using the 4.1 deployment documents.

  4. Canonical users running OpenStack Victoria can either upgrade (on top of 4.1.HF4) using the Trilio upgrade documents or do a fresh deployment using the 4.1 deployment documents.

The deployment/upgrade documentation provides the detailed steps to deploy or upgrade to this hotfix.

2. Release Scope

The current hotfix release targets the following:

  1. Verification of Jira issues targeted for the 4.1.HF13 release.

  2. As part of the new process, delivery is via packages; end users need to perform a rolling upgrade on top of 4.1 GA or any hotfix from 4.1.HF1 through 4.1.HF12.
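The rolling-upgrade precondition above (a working 4.1 GA or later hotfix) can be checked from the installed package versions. A minimal sketch, assuming deb/rpm version strings of the form listed in the deliverables section of this document; the function names and baseline values are illustrative, not part of any Trilio tooling:

```python
# Sketch: decide whether a node's Trilio packages are already at the
# 4.1.HF13 baseline. Version strings follow the deliverables table
# (e.g. '4.1.94.10' for deb, '4.1.94.10-4.1' for rpm).

def parse_version(v):
    """Parse '4.1.94.10-4.1' or '4.1.94.10' into a comparable tuple.

    The '-4.1' rpm release suffix is ignored for the comparison.
    """
    base = v.split("-")[0]
    return tuple(int(part) for part in base.split("."))

def needs_upgrade(installed, hf13_baseline):
    """True if the installed version is older than the HF13 baseline."""
    return parse_version(installed) < parse_version(hf13_baseline)

# HF13 versions taken from the deliverables table in this document.
HF13 = {"tvault-contego": "4.1.94.10", "dmapi": "4.1.94.3", "s3fuse": "4.1.94.7"}

# Example: a node still on an older package needs the rolling upgrade.
print(needs_upgrade("4.1.94.1-4.1", HF13["tvault-contego"]))  # True
```

The comparison is purely numeric on the dotted base version, which matches how these package versions increment across hotfixes.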

3. Tag References for Rolling Upgrades

| # | Tag Reference in Upgrade Docs | Value | Comments |
|---|---|---|---|
| 1 | 4.1-HOTFIX-LABEL | hotfix-13-TVO/4.1 | Label on the Trilio repositories from which the required code is pulled for upgrades. |
| 2 | 4.1-RHOSP13-CONTAINER | 4.1.94-hotfix-16-rhosp13 | RHOSP13 container tag for 4.1.HF13 |
| 3 | 4.1-RHOSP16.1-CONTAINER | 4.1.94-hotfix-16-rhosp16.1 | RHOSP16.1 container tag for 4.1.HF13 |
| 4 | 4.1-RHOSP16.2-CONTAINER | 4.1.94-hotfix-16-rhosp16.2 | RHOSP16.2 container tag for 4.1.HF13 |
| 5 | 4.1-KOLLA-CONTAINER | 4.1.94-hotfix-13-ussuri (Ussuri), 4.1.94-hotfix-12-victoria (Victoria) | Kolla container tags for 4.1.HF13 |
| 6 | 4.1-TRIPLEO-CONTAINER | 4.1.94-hotfix-12-tripleo | TripleO container tag for 4.1.HF13 |
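Each tag above is combined with a registry and repository path at deployment time. A minimal sketch of assembling a full image reference from one of these tags; the registry and repository names below are hypothetical placeholders, not Trilio's actual registry layout:

```python
# Sketch: build a pullable image reference from a tag in the table above.
# The registry host and repository path are made-up placeholders;
# substitute the values used by your own deployment.

def image_ref(registry, repository, tag):
    """Join registry/repository:tag into a full image reference."""
    return f"{registry}/{repository}:{tag}"

# Tag value taken from the table above; registry/repo are hypothetical.
ref = image_ref("registry.example.com", "trilio/datamover",
                "4.1.94-hotfix-16-rhosp16.2")
print(ref)  # registry.example.com/trilio/datamover:4.1.94-hotfix-16-rhosp16.2
```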

4. Resolved Issues

Issues logged by customers are documented in this section.

| # | Summary |
|---|---|
| 1 | Horizon logs are getting flooded with errors |
| 2 | T4O 4.1 vulnerability reported by Fortinet |
| 3 | All network ports of a project are deleted if Restore Network Topology fails |

5. Deliverables against 4.1.HF13

| # | Package/Container Name | Package Kind | Version/Tag |
|---|---|---|---|
| 1 | dmapi | deb | 4.1.94.3 |
| 2 | dmapi | rpm | 4.1.94.3-4.1 |
| 3 | python3-dmapi | rpm | 4.1.94.3-4.1 |
| 4 | python3-dmapi | deb | 4.1.94.3 |
| 5 | tvault-contego | rpm | 4.1.94.10-4.1 |
| 6 | tvault-contego | deb | 4.1.94.10 |
| 7 | python3-tvault-contego | deb | 4.1.94.10 |
| 8 | python3-tvault-contego | rpm | 4.1.94.10-4.1 |
| 9 | s3fuse | python | 4.1.94.7 |
| 10 | s3-fuse-plugin | deb | 4.1.94.7 |
| 11 | python3-s3-fuse-plugin | deb | 4.1.94.7 |
| 12 | python3-s3fuse-plugin | rpm | 4.1.94.7-4.1 |
| 13 | python-s3fuse-plugin-cent7 | rpm | 4.1.94.7-4.1 |

The following packages were changed or added in the current release:

| # | Package/Container Name | Package Kind | Version/Tag |
|---|---|---|---|
| 1 | workloadmgr | deb | 4.1.95.22 |
| 2 | workloadmgr | python | 4.1.94.23 |
| 3 | tvault_configurator | python | 4.1.94.17 |
| 4 | tvault-horizon-plugin | rpm | 4.1.94.7-4.1 |
| 5 | tvault-horizon-plugin | deb | 4.1.94.7 |
| 6 | python3-tvault-horizon-plugin | deb | 4.1.94.7 |
| 7 | python3-tvault-horizon-plugin-el8 | rpm | 4.1.94.7-4.1 |
| 8 | RHOSP16.1 Containers | Containers | 4.1.94-hotfix-16-rhosp16.1 |
| 9 | RHOSP16.2 Containers | Containers | 4.1.94-hotfix-16-rhosp16.2 |
| 10 | RHOSP13 Containers | Containers | 4.1.94-hotfix-16-rhosp13 |
| 11 | Kolla Containers | Containers | 4.1.94-hotfix-13-ussuri (Ussuri), 4.1.94-hotfix-12-victoria (Victoria) |
| 12 | TripleO Containers | Containers | 4.1.94-hotfix-12-tripleo |

6. T4O Deployment Coverage

The following table gives an overview of the coverage of Trilio deployment methods with OpenStack:

| # | TVault Deployment Tool | Covered? | Comments |
|---|---|---|---|
| 1 | Shell Script | NO | Scoped out since TVO-4.1 |
| 2 | Ansible (OpenStack native) | YES | For Kolla & OpenStack-Ansible |
| 3 | Debian Packages | YES | Used on Ubuntu-based distros via all TVault deployment methods |
| 4 | RPM Packages | YES | Used on RH-based distros via all TVault deployment methods |
| 5 | RH Director | YES | For RHOSP |
| 6 | TripleO | YES | For TripleO |
| 7 | Juju/Charms | YES | For Canonical OpenStack |

7. Backup Store Coverage

The following table gives an overview of the backup stores covered as part of the development and testing of the 4.1.HF13 release.

| # | Backup Storage | Covered? |
|---|---|---|
| 1 | AWS S3 | NO |
| 2 | NFS | YES |
| 3 | RH Ceph S3 | YES |
| 4 | Wasabi S3 | NO |

8. Known Issues

  1. Restore fails for SRIOV networks. (Fixed in 4.1.HF7; documenting a single scenario.)

     Workaround: If port_security_enabled=False on the network, the restore will pass, and the user can attach a security group to the restored VM's network port after the restore completes.

  2. [Intermittent] All API calls get stuck.

     Workaround: Pin the oslo.messaging package to version 12.1.6 on all T4O nodes, then restart all the WLM services:

     /home/stack/myansible/bin/pip install oslo.messaging==12.1.6

     Note: the respective steps were added to the common T4O upgrade document.

  3. Contego package installation fails on HF5 OSA with S3.

     Workaround: Unmount the /var/triliovault-mounts path before the contego package upgrade.

     Note: the respective steps were added to the common T4O components upgrade document for the OSA distro.

  4. In-place restore does not work properly with multi-attach volumes.

     Workaround: Select all of the VMs' boot disks as well as the Cinder multi-attach disk in the in-place restore window; the restore will then work for all the VMs.

  5. Snapshot mount only shows the volume group/LVM for one VM when two or more VMs have volume groups with the same name.

     Workaround: NA

  6. Snapshot disk integrity check disabled since the 4.1.HF1 release.

     Impact:

       1. If any snapshot disk or the disk chain gets corrupted, T4O will identify it and log a warning message; however, the snapshot will not be marked as failed, and no workload reset will happen.

       2. Restore of such a snapshot will fail.

     Workaround: None

  7. Backup and restore should not break for instances with multi-attach volumes.

     Comment: After an upgrade from 4.1 GA to 4.1.HF1, the first snapshot triggered after the upgrade for workloads with multi-attach volumes will be of "mixed" type; all subsequent snapshots will be incremental.

  8. [FRM] Snapshot mount not working.

     Workaround: Update the permissions of the NFS mount point to 755 on the NFS server and retry the snapshot mount operation:

     chmod 755 /mnt/tvault/tvm4

  9. [Intermittent] [RHOSP 16.1] [Horizon] After the overcloud deployment, the OpenStack UI is broken.

     Workaround: Log in to the Horizon container and run the following commands:

       1. podman exec -it -u root horizon /bin/bash

       2. /usr/bin/python3 /usr/share/openstack-dashboard/manage.py collectstatic --clear --noinput

       3. /usr/bin/python3 /usr/share/openstack-dashboard/manage.py compress --force

       4. Restart the Horizon container: podman restart horizon

  10. [DR] Selective restore fails if the original image was deleted in a Canonical focal-victoria environment.

      Workaround: None
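The NFS snapshot-mount workaround above amounts to making the export root world-readable and executable (mode 755). A sketch against a scratch directory; /tmp/tvault-demo is a stand-in for the real NFS export path (the document's example is /mnt/tvault/tvm4 on the NFS server):

```shell
# Sketch of the snapshot-mount workaround: the NFS export backing the
# backup target must have mode 755 for the snapshot mount to work.
# /tmp/tvault-demo is a placeholder for your actual NFS export path.
mkdir -p /tmp/tvault-demo
chmod 755 /tmp/tvault-demo
stat -c '%a' /tmp/tvault-demo   # prints 755
```

On the real NFS server, run the chmod against the export path itself and then retry the snapshot mount operation from T4O.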
