Seth Goldin

Adventures in configuring HP ZCentral Remote Boost

Some thoughts after gluing together a couple of endpoints through a VPN in AWS Lightsail

To help support remote post-production workflows, I’ve been playing around with HP ZCentral Remote Boost (ZCRB) [formerly known as Remote Graphics Software (RGS)]. What follows is not a detailed guide on how to set all this up — it’s just some scattered thoughts from my experience, intended for some IT professionals who might be curious. So far I’ve only played with a CentOS 7.7 sender and macOS Catalina 10.15.4 receiver.

I had seen some chatter on Twitter about HP ZCentral Remote Boost, chastising Apple for not having equivalent software for macOS. For HP Z workstations, HP offers it for no additional charge! I presume that the sender software somehow validates that it’s actually running on a Z workstation by checking something in the BIOS, similar to how Windows 10 does. I myself obtained the sender and receiver downloads just by logging into the HP site with my HP account, without entering any kind of payment information.

Supposedly, you can also get a 90-day demo copy of the software and run the sender even on machines that aren’t HP Z workstations, but I’m not sure what the pricing might look like for that. It looks like you might have to contact a salesperson to get pricing.

The sender apps run on both Windows and Linux, specifically RHEL. Since I use both Windows 10 and CentOS for Premiere and Resolve, this looked especially attractive. I popped into the installation shell scripts on the Linux side and saw that the script checks the contents of /etc/redhat-release. Since CentOS is functionally compatible with RHEL, it ships /etc/redhat-release in addition to /etc/centos-release, so the installation script worked perfectly with bash.

There are receiver apps for Windows, Linux, and macOS, which is great.

There’s a bit of a quirk when installing the macOS receiver app. Just double-clicking it in the GUI prompts a Gatekeeper warning: “HP RGS Receiver.pkg” can’t be opened because Apple cannot check it for malicious software. This software needs to be updated. Contact the developer for more information. However, right-clicking and then selecting the Open With option, or hopping into Terminal and running $ sudo installer -pkg "HP RGS Receiver.pkg" -target /Applications, works just fine.

Though the marketing material on HP’s landing page for ZCRB touts it as a solution for global collaboration, I was surprised to discover that it’s not quite as “out-of-the-box” as something like TeamViewer or AnyDesk.

When I popped open the receiver app, I was a little confused because I only saw a prompt for a hostname or IP address. Then I searched for a bit in the documentation, and saw, “IMPORTANT: The sender and receiver must be on the same network for a Remote Boost connection to be established between them.”

So I put in a support ticket to HP, and wound up chatting with a great and friendly Tier 2 rep, who confirmed for me that the typical way to do this is to make sure that all the endpoints are linked together by VPN.

While I have a personal subscription to one of those “out-of-the-box” VPN services so as to better protect myself on public WiFi, I didn’t have any kind of experience rolling my own VPN. Coincidentally though, I had recently watched a great talk from Ruben Rubio Rey about how sketchy a lot of the VPN providers are, which included a recommendation for some tools to roll your own VPN in the cloud, specifically Streisand and Algo. From a cursory look, Algo seemed a bit better engineered, so I figured I’d give Algo a shot.

I’ve recently started playing around with S3 a bit, so I figured I’d try hosting the VPN server in Lightsail, since I had already set up the AWS infrastructure.

I wasn’t quite sure how many vCPUs or how much RAM to include in the VM, but I found a closed issue on the GitHub repository that gave instructions to use the extremely cost-effective $3.50/month option. That option is for a VM with 512 MB RAM, 1 vCPU, and 20 GB of SSD space.

I’m a bit more familiar with the Red Hat-derived distros, so at first I attempted to create a CentOS VM, but Algo’s Ansible scripts seem to refer to some Ubuntu-specific tools and paths, so running them on CentOS failed. I destroyed the CentOS VM, started from scratch with an Ubuntu VM, and that worked perfectly.

One important step here in the config.cfg file was to make sure that the “road warrior” configuration was set: BetweenClients_DROP: false. That lets the different endpoints connect to each other.
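For reference, the relevant line in Algo’s config.cfg (a YAML file) looks like this — by default Algo drops traffic between clients, so the value has to be flipped to false:

```
# config.cfg — allow VPN clients ("road warriors") to reach each other
BetweenClients_DROP: false
```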

So, once the VPN server was set up, I then had to set up the two different endpoints as VPN clients, both the sender CentOS Z8 and the receiver macOS laptop. I used Wireguard for both.

From the SSH terminal in the Lightsail console in my browser, I grabbed the Wireguard .conf file for each “user,” called them up in vim, and then just copied and pasted the contents into matching .conf files on the VPN client machines.

For the CentOS endpoint, Wireguard is available through ELRepo:

$ sudo yum install epel-release
$ sudo yum install https://www.elrepo.org/elrepo-release-7.el7.elrepo.noarch.rpm
$ sudo yum install kmod-wireguard wireguard-tools

I had to make the /etc/wireguard directory:

$ sudo mkdir /etc/wireguard

Then I installed the .conf file that had been generated from the server, and started the systemd units:

$ sudo install -o root -g root -m 600 <username>.conf /etc/wireguard/wg0.conf
$ sudo systemctl start wg-quick@wg0

I configured the machine to connect automatically upon booting:

$ sudo systemctl enable wg-quick@wg0

On CentOS, it’s easy to toggle the VPN on or off, if necessary:

  • Turn it on: $ sudo wg-quick up wg0
  • Turn it off: $ sudo wg-quick down wg0

Once the CentOS endpoint was set up, it was time to configure Wireguard on macOS, which is a bit easier: the App Store has a build with a nice GUI, and you can just import the .conf file with it.

The .conf file for the Z8 lists what its internal IP address is within the VPN. So once both endpoints are connected within the VPN, I entered the “internal” IP address of the Z8 from the ZCRB receiver app on the Mac, and was able to log into the Z8 with the Z8’s own username and password.

So far, I’ve only tested the macOS receiver app, and there are a few wonky things about the macOS app in particular.

In the documentation, there’s an option for “Advanced Video Compression:”

Advanced Video Compression is an HP ZCentral Remote Boost Advanced Feature that enables the use of a modern video codec to greatly reduce the network bandwidth needed for high-quality video streams.

Advanced Video Compression is ideal for video or 3D applications in textured mode. It is not recommended for use with wireframes or fine lines, as screen artifacts might appear when in motion. Advanced Video Compression can be enabled in the Performance panel of the HP ZCentral Remote Boost Receiver settings.

It’s somewhat odd that they don’t name the codec explicitly, but given the wording, it may be H.264, since Advanced Video Coding is another name for H.264. Regardless, the feature isn’t available in the macOS receiver app.

Another disappointing limitation of the macOS receiver app is that it doesn’t handle dual screens very well. Showing two screens from the sending Z8 on two screens of the receiving macOS machine requires disabling separate Spaces for the monitors (the “Displays have separate Spaces” setting in Mission Control), forgoing macOS’s built-in Full Screen functionality.

Despite all these quirks, the performance through the VPN is actually pretty great. Whether testing with “multi” or “single” connections on Ookla’s Speedtest, Internet bandwidth is about the same as not using the VPN.

In ZCRB, playing through some DNxHR LB proxies with embedded 48 kHz audio on the Z8, I’m actually able to send 44.1 kHz audio through the VPN and everything plays great, with frame-accurate audio sync.

This is with dedicated business fiber on the sending side, residential Verizon FiOS on the receiving side, and the VPN server VM in a pretty close AWS data center. So these conditions are about as good as it gets.

I’m not sure how many concurrent connections might necessitate a beefier VM with more vCPUs and more RAM, but so far, I’m just testing. If and when I put this into production with many actual users, I’ll be keeping an eye on performance across all the different endpoints, and will be ready to create and deploy a more powerful VM.

HP Z Workstations
ZCRB
Algo
Lightsail
Remote Working