Set up a High Availability Cluster

This topic describes how to set up a highly available server based on Windows failover clustering and iSCSI technologies.

Important: GE Vernova does not provide support for the Microsoft Cluster infrastructure. Our support is limited to the Proficy products installed on the cluster.

The information in this topic is an example Windows clustering configuration, intended as guidance only. You may need to adjust it for your specific requirements and environment.

In a clustered setup, all servers must have the same hardware, operating system, and software. Before setting up a failover cluster, make sure you have gathered the minimum requirements for your environment. This topic uses the following example machines to set up a two-node failover cluster:
Node     Description
Node1    iSCSI initiator (primary machine); connects to the target to utilize storage.
           • Microsoft® Windows Server 2019/2022 virtual machine
           • MURHAOPSHUB1VM
Node2    iSCSI initiator (secondary machine); connects to the target to utilize storage.
           • Microsoft® Windows Server 2019/2022 virtual machine
           • MURHAOPSHUB2VM
NodeX    iSCSI target machine; provides access to shared storage.
           • Microsoft® Windows Server 2019/2022 virtual machine
           • MUROPSHUBLBVM

Task Roadmap

The following roadmap outlines the key milestones involved in setting up high availability for Operations Hub.
Step 1: Set up the iSCSI target on NodeX. Complete the following steps:
  1. Configure iSCSI Target
  2. Create iSCSI Virtual Disk
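
For reference, the target-side configuration above can also be scripted. The following is a minimal PowerShell sketch, assuming the iSCSI Target Server role is available on NodeX; the target name, virtual disk path, size, and initiator IQNs are placeholders only.

  # Run on NodeX (MUROPSHUBLBVM). Names, paths, size, and IQNs are examples only.
  Install-WindowsFeature FS-iSCSITarget-Server

  # Create the target and allow both initiator nodes to connect to it.
  New-IscsiServerTarget -TargetName "OpsHubTarget" -InitiatorIds `
      "IQN:iqn.1991-05.com.microsoft:murhaopshub1vm",
      "IQN:iqn.1991-05.com.microsoft:murhaopshub2vm"

  # Create the virtual disk that backs the shared storage and map it to the target.
  New-IscsiVirtualDisk -Path "C:\iSCSIVirtualDisks\OpsHubShared.vhdx" -SizeBytes 100GB
  Add-IscsiVirtualDiskTargetMapping -TargetName "OpsHubTarget" `
      -Path "C:\iSCSIVirtualDisks\OpsHubShared.vhdx"
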
Step 2: Set up the iSCSI initiators on Node1 and Node2. Complete the following steps:
  1. Configure iSCSI Initiator
  2. Initialize iSCSI Volume
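
Similarly, the initiator-side configuration above can be sketched in PowerShell as follows; the portal address and volume label are placeholders, and the disk only needs to be initialized and formatted from one node.

  # Run on Node1 and Node2. The portal address and volume label are examples only.
  Set-Service -Name MSiSCSI -StartupType Automatic
  Start-Service -Name MSiSCSI

  # Register the target portal (NodeX) and connect to the target persistently.
  New-IscsiTargetPortal -TargetPortalAddress "MUROPSHUBLBVM"
  $target = Get-IscsiTarget
  Connect-IscsiTarget -NodeAddress $target.NodeAddress -IsPersistent $true

  # From one node only: initialize, partition, and format the new shared disk.
  Get-Disk | Where-Object PartitionStyle -eq 'RAW' |
      Initialize-Disk -PartitionStyle GPT -PassThru |
      New-Partition -AssignDriveLetter -UseMaximumSize |
      Format-Volume -FileSystem NTFS -NewFileSystemLabel "OpsHubShared"
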
Step 3: Configure a failover cluster for cluster nodes. Complete the following steps:
  1. Configure Failover Cluster Manager
  2. Configure Role
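
The cluster itself can also be validated and created from PowerShell, as in the minimal sketch below; the cluster name and static IP address are placeholders for values appropriate to your network.

  # Install the feature on both nodes; run the remaining commands from one node.
  Install-WindowsFeature Failover-Clustering -IncludeManagementTools

  # Validate the configuration, then create the two-node cluster.
  Test-Cluster -Node MURHAOPSHUB1VM, MURHAOPSHUB2VM
  New-Cluster -Name OPSHUBCLUSTER -Node MURHAOPSHUB1VM, MURHAOPSHUB2VM -StaticAddress 192.168.1.50

  # Add the shared iSCSI disk to the cluster's available storage.
  Get-ClusterAvailableDisk | Add-ClusterDisk
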
Step 4: Deploy Proficy Authentication and Configuration Hub on cluster nodes. Complete the following steps:
  1. Log in to the Node1 server.
    Note: In Failover Cluster Manager, ensure that:
    • Node1 is the active node.
    • Node2 is paused.
    1. Go to the shared drive.
    2. Create a folder named Postgres.
    3. Inside the Postgres folder, create another folder named uaa.
    4. Install Proficy Authentication on Cluster Nodes
    5. Install Configuration Hub on Cluster Nodes
    6. Open the Windows Services Management Console and configure the services as follows:
      Status > Stop:
      • Proficy Authentication PostgreSQL Database
      • Proficy Authentication Tomcat Web Server
  2. Log in to the Node2 server.
    Note: In Failover Cluster Manager, ensure that:
    • Node2 is the active node.
    • Node1 is paused.
    1. Install Proficy Authentication on Cluster Nodes
    2. Install Configuration Hub on Cluster Nodes
    3. Add Proficy Authentication and Configuration Hub generic services
    4. Set dependencies for Proficy Authentication and Configuration Hub generic services
    5. To bring all the added services online, right-click Opshub Role and select Start Role. (A PowerShell sketch of items 3 through 5 follows this step.)
  3. Replicate Cluster Nodes for Proficy Authentication and Configuration Hub
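
As an illustration of items 3 through 5 above (adding the generic services, setting their dependencies, and starting the role), the following PowerShell sketch adds the Proficy Authentication services to the existing Opshub role. The cluster resource names and the Windows service names are placeholders; use the service names actually registered by your installation.

  # Run on the active node. Resource and service names below are examples only.
  Add-ClusterResource -Name "Proficy Auth Postgres" -ResourceType "Generic Service" -Group "Opshub Role"
  Get-ClusterResource -Name "Proficy Auth Postgres" |
      Set-ClusterParameter -Name ServiceName -Value "ProficyAuthPostgres"   # placeholder service name

  Add-ClusterResource -Name "Proficy Auth Tomcat" -ResourceType "Generic Service" -Group "Opshub Role"
  Get-ClusterResource -Name "Proficy Auth Tomcat" |
      Set-ClusterParameter -Name ServiceName -Value "ProficyAuthTomcat"     # placeholder service name

  # Start the web server only after its database is online.
  Add-ClusterResourceDependency -Resource "Proficy Auth Tomcat" -Provider "Proficy Auth Postgres"

  # Bring the role and all of its resources online.
  Start-ClusterGroup -Name "Opshub Role"
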
Step 5: Restart both Node1 and Node2 servers. To apply the Proficy Authentication and Configuration Hub updates, restart the machines that represent the nodes in the cluster.

Test on both nodes: Set up authentication and log in to Configuration Hub to verify the application's functionality.

Step 6: Deploy Operations Hub on cluster nodes.
  1. Log in to the Node1 server.
    Note: In Failover Cluster Manager, ensure that:
    • Node1 is the active node.
    • Node2 is paused.
    1. Set up Junction Links for Shared Folders
    2. Install Operations Hub
      Attention: Make sure to save a copy of the temp_windows_timestamp.env file while the Operations Hub installation is in progress.
    3. Open the Windows Services Management Console and configure the services as follows (a PowerShell sketch of this configuration follows this step):

      Startup Type > Disabled:

      • Proficy Operations Hub Master Control
      • Proficy Operations Hub IQP Provisioner
      • Proficy Operations Hub UAA Provisioner Service

      Startup Type > Automatic:

      • Proficy Operations Hub Httpd Reverse Proxy
      • Proficy Operations Hub OPC UA Browse Service

      Status > Stop:

      • Proficy Operations Hub Master Control
      • Proficy Operations Hub IQP PostgreSQL Database
      • Proficy Operations Hub WebHMI PostgreSQL Database
  2. Log in to the Node2 server.
    Note: In Failover Cluster Manager, ensure that:
    • Node2 is the active node.
    • Node1 is paused.
    1. Set up Junction Links for Shared Folders
    2. Launch a Command Prompt and run the Operations Hub executable file along with the password variables (extracted from the temporary .env file saved earlier). Pass the variables separated by spaces; a hypothetical example follows this step.
    3. Complete the installation wizard to run the installation. Refer to the Install Operations Hub topic, starting from step 2.
    4. Repeat step 1-c.
    5. Add Operations Hub generic services
    6. Set dependencies for Operations Hub generic services
    7. To bring all the added services online, right-click Opshub Role and select Start Role.
  3. Replicate Cluster Nodes for Operations Hub
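
A scripted equivalent of the service configuration in step 1-c above might look like the following sketch. The services are selected by the display names listed in the roadmap, which can vary between product versions; adjust them to match your installation.

  # Run on the node being configured. Display names are taken from the list above
  # and may vary by product version; adjust them to match your installation.

  # Startup Type > Disabled
  'Proficy Operations Hub Master Control',
  'Proficy Operations Hub IQP Provisioner',
  'Proficy Operations Hub UAA Provisioner Service' |
      ForEach-Object { Get-Service -DisplayName $_ | Set-Service -StartupType Disabled }

  # Startup Type > Automatic
  'Proficy Operations Hub Httpd Reverse Proxy',
  'Proficy Operations Hub OPC UA Browse Service' |
      ForEach-Object { Get-Service -DisplayName $_ | Set-Service -StartupType Automatic }

  # Status > Stop
  'Proficy Operations Hub Master Control',
  'Proficy Operations Hub IQP PostgreSQL Database',
  'Proficy Operations Hub WebHMI PostgreSQL Database' |
      ForEach-Object { Get-Service -DisplayName $_ | Stop-Service }
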
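The Node2 sequence in steps 2-a and 2-b above (junction links followed by a command-line install) is illustrated below. Every path, the executable name, and the variable names are hypothetical placeholders; the real variable names and values come from the temp_windows_timestamp.env file saved during the Node1 installation.

  # Run on Node2. All paths and names below are placeholders; substitute your own.

  # Create a junction so the local folder resolves to the shared cluster disk.
  New-Item -ItemType Junction -Path "C:\ProgramData\OpsHub\data" -Target "E:\OpsHub\data"

  # Run the Operations Hub installer, passing the password variables recorded in the
  # saved temp_windows_timestamp.env file as space-separated name=value pairs.
  .\OperationsHubInstaller.exe PASSWORD_VARIABLE_1=value1 PASSWORD_VARIABLE_2=value2
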
Step 7: Restart both Node1 and Node2 servers. To apply the Operations Hub updates, restart the machines that represent the nodes in the cluster.

Test on both nodes: Set up authentication and log in to Configuration Hub to verify the application's functionality.