QNAP Hyper Data Protector best practice

QNAP Hyper Data Protector as a VM backup: best practices, Part 2

QNAP Hyper Data Protector: best practices, features, troubleshooting. This is Part 2 of my tutorial; you can find Part 1 HERE

//You can find this article in GERMAN HERE

QNAP’s own tool “Hyper Data Protector” can be used (a QNAP NAS is required) for additional backups of virtual machines (VMs). However, I can’t recommend it as a standalone backup tool. The tool is described by QNAP HERE, so I will limit myself purely to its practical application and use, as well as potential errors. It is a free tool that, in combination with a QNAP, offers the admin an additional sense of security.

Hyper Data Protector (HDP) setup and functionality

It is simply installed on a QNAP from the AppStore. The program also requires the Container Station* tool, which is installed along with it if it is not already present. After installation, an additional page is available that can be opened directly in the browser. With HDP, virtual machines running on VMware and Hyper-V can be backed up. For this purpose, snapshots* of the VMs are created when a job is executed and transferred to the previously configured repo of the HDP, then deleted on the hypervisor.

The repo is kept small on the QNAP with data deduplication* and compression. The interface is very intuitive, and QNAP’s own tutorial is well written.

Creating a backup job in HDP

You may ask, if everything is so easy and great…

Why don’t you recommend QNAP Hyper Data Protector as a standalone backup solution for VMs?

Did you notice the * in the sentences above? 😊

I marked the sources of error with an *, which, after several years of using the tool, led to its complete failure as a solo solution. The tool is good, no question about it, but don’t give yourself sleepless nights; use it wisely.

Sources of error known to me

  • QNAP Container Station – a tool that not only enables Docker containers on your QNAP. It also handles the management of HDP’s repo, which itself is essentially a container. The problem is that when one tool depends on another, errors can occur. It happened to me, and I was able to reproduce the error as well. When Container Station updates, HDP may still run the jobs, BUT your job lists will be empty and no new jobs can be run, neither restore nor backup. Solution: an HBS 3 backup, a second HDP on another QNAP, or no data deduplication.
  • SNAPSHOTS: cannot be taken from all virtual machines. Naturally, machines with secured RAM applications (e.g. SUSE Linux Enterprise Secured Core or ASL) or active RDP memory (e.g. Cisco ISE/ASA) are not backed up. Creating and transferring the snapshots is quick; deleting the snapshot on the HOST takes an extremely long time and happens at 99% of the backup progress. Sometimes that remaining 1% takes longer than the 99% before it. If the snapshot is not successfully deleted, the entire backup on the HDP is gone, even though the data has been transferred. On the host, many leftover snapshots can cause problems.
  • Data deduplication: It’s a great thing. However, when backing up the HDP repo via HBS 3 or exporting it to another QNAP to restore the repo, the process is sometimes aborted (even when the “stop deduplication” option is checked when creating the job).
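Because leftover snapshots on the host are the most painful failure mode above, it helps to check for them regularly. Below is a minimal sketch assuming you run it on the ESXi host itself (e.g. via SSH), where `vim-cmd` is available; the parsed output format can differ between ESXi builds, so treat `count_snapshots` as a starting point, not a finished tool.

```shell
#!/bin/sh
# Sketch: spot snapshots that HDP failed to delete after a backup job.
# Assumption: `vim-cmd vmsvc/snapshot.get <vmid>` prints one
# "--Snapshot Name : <name>" line per snapshot (typical ESXi output).

count_snapshots() {
  # Count snapshot entries in the output piped into this function.
  grep -c 'Snapshot Name'
}

# Usage on the host (vmid values come from `vim-cmd vmsvc/getallvms`):
#   vim-cmd vmsvc/snapshot.get 12 | count_snapshots
```

If a VM still shows snapshots long after its HDP job finished, clean them up in the vSphere client before the chain grows and causes the host-side problems described above.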

HDP best practices

  • Do not set up HDP on the QNAP that serves as NFS storage for your VMs
  • Use HDP only as an additional backup, not as your main backup
  • If you have plenty of free storage, don’t use data deduplication
  • Export your HDP repo before updates
  • If deduplication is used, buy a QNAP with a good CPU (at least 2 cores) and corresponding RAM
  • Set up notifications for alarms and warnings from HDP jobs, otherwise you won’t be notified automatically if they fail
  • Back up VMware-based VMs either via ESXi (faster, more independent, but less stable) or via vCenter (more stable and more accurate), not both mixed
  • Backup jobs are deactivated when your vCenter or hypervisor is not reachable, and they will not be reactivated automatically

