How to Sync a Hyperscale Job

The new endpoint is useful when you have a database masking job set up on a Continuous Compliance Engine and need to use the same masking inventory in a Hyperscale job. You can export the masking job details from the Continuous Compliance Engine and import them into the Hyperscale Compliance Engine using the following steps.

  1. From the Delphix Continuous Compliance Engine, export the masking job that needs to be imported on the Hyperscale Engine for dataset preparation. For more information about exporting a job, refer to Export the job.

  2. After the job is exported, make a request to the new /import API endpoint on the Hyperscale Engine to upload the response blob, along with mount_filesystem_id (required) and data_info_settings (optional) for the source and target datasets. The data_info_settings apply to all the data_info objects in the dataset. For more information, refer to the /import API page.

    The following is an example of the request blob.

    {
      "exportResponseMetadata": {
        "exportHost": "1.1.1.1",
        "exportDate": "Tue Sep 13 12:55:31 UTC 2022",
        "requestedObjectList": [
          {
            "objectIdentifier": {
              "id": 3
            },
            "objectType": "MASKING_JOB",
            "revisionHash": "2873bd283bd"
          }
        ],
        "exportedObjectList": [
          {
            "objectIdentifier": {
              "id": 2
            },
            "objectType": "SOURCE_DATABASE_CONNECTOR",
            "revisionHash": "8723bd8273b"
          },
          {
            "objectIdentifier": {
              "id": 4
            },
            "objectType": "DATABASE_CONNECTOR",
            "revisionHash": "273db2738vd"
          },
          {
            "objectIdentifier": {
              "id": 4
            },
            "objectType": "DATABASE_RULESET",
            "revisionHash": "f8c0997c804c"
          }
        ]
      },
      "blob": "983nd0239nd923ndf023nfd2p3nd923dn239dn293fn293fnb2",
      "signature": "923nd023nd02",
      "publicKey": "f203fn23fn203[fn230[f",
      "mount_filesystem_id": 1,
      "data_info_settings": [
        {
          "prop_key": "unload_split",
          "prop_value": "2",
          "apply_to": "SOURCE"
        },
        {
          "prop_key": "stream_size",
          "prop_value": "65536",
          "apply_to": "TARGET"
        }
      ]
    }
    
  3. The Hyperscale Engine then processes the required data objects from the sync bundle and prepares the connector and data objects required for Hyperscale job creation.

  4. The Hyperscale Engine returns a data object identifier that can be used as-is (after updating the passwords of the associated connectors) to create a Hyperscale job or, if needed, can be updated before configuring a job.
    The following is an example of the response.

    {
     "data_set_id": id
    }
    

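The request in step 2 can be sketched in Python. The engine host, API path, and auth header below are assumptions for illustration, not documented values; only the body fields (exportResponseMetadata, blob, signature, publicKey, mount_filesystem_id, data_info_settings) come from the example above.

```python
import json
import urllib.request  # used by the commented-out POST below


def build_import_request(bundle: dict, mount_filesystem_id: int,
                         data_info_settings: list) -> dict:
    """Combine the exported sync bundle with the Hyperscale-specific fields."""
    request = dict(bundle)  # exportResponseMetadata, blob, signature, publicKey
    request["mount_filesystem_id"] = mount_filesystem_id
    request["data_info_settings"] = data_info_settings
    return request


# Bundle as returned by the Compliance Engine export (metadata elided).
bundle = {
    "exportResponseMetadata": {"...": "..."},
    "blob": "983nd0239nd923ndf023nfd2p3nd923dn239dn293fn293fnb2",
    "signature": "923nd023nd02",
    "publicKey": "f203fn23fn203[fn230[f",
}

payload = build_import_request(
    bundle,
    mount_filesystem_id=1,
    data_info_settings=[
        {"prop_key": "unload_split", "prop_value": "2", "apply_to": "SOURCE"},
        {"prop_key": "stream_size", "prop_value": "65536", "apply_to": "TARGET"},
    ],
)

# POST the payload to the /import endpoint (uncomment to run against a real engine;
# the URL and Authorization header are hypothetical):
# req = urllib.request.Request(
#     "https://hyperscale.example.com/api/import",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json",
#              "Authorization": "<api-key>",
#              "passphrase": "<bundle-passphrase>"},  # only if the bundle is protected
# )
# print(urllib.request.urlopen(req).read())
```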
Note

After a successful import, you must provide the connector passwords manually. To do so, perform the following steps:

  1. Get the newly created data-set using GET /data-sets/{dataSetId} to find the newly created connector-info ID.
  2. Copy the connector ID, call GET /connector-info/{connectorInfoId}, and copy the response.
  3. Call PUT /connector-info/{connectorInfoId}; in the request body, paste the GET response and add a new password field with the password value to both the source and target sections to update the connector passwords.
  4. If the bundle is passphrase protected, the passphrase must be provided in the passphrase API header while importing the bundle. For more information about exporting a passphrase-encrypted bundle, refer to the Export the object section.
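Steps 2 and 3 above amount to taking the GET /connector-info response and re-submitting it with password fields added. A minimal sketch, assuming the response body contains source and target sections (the exact field layout is an assumption):

```python
import copy


def add_connector_passwords(connector_info: dict,
                            source_password: str,
                            target_password: str) -> dict:
    """Return a copy of the GET /connector-info response with a password
    field added to the source and target sections, ready to send as the
    PUT /connector-info/{connectorInfoId} body."""
    body = copy.deepcopy(connector_info)  # leave the GET response untouched
    body["source"]["password"] = source_password
    body["target"]["password"] = target_password
    return body


# Hypothetical GET /connector-info/{connectorInfoId} response (shape assumed):
get_response = {
    "connector_id": 1,
    "source": {"jdbc_url": "jdbc:oracle:thin:@src:1521/ORCL", "user": "src_user"},
    "target": {"jdbc_url": "jdbc:oracle:thin:@tgt:1521/ORCL", "user": "tgt_user"},
}

put_body = add_connector_passwords(get_response, "src_secret", "tgt_secret")
```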

How to Sync Global Settings from a Delphix Continuous Compliance Engine

The new endpoint is useful when you have global objects set up on a Continuous Compliance Engine and need to use the same global objects, such as algorithms, in a Hyperscale job. You can export the global object details from a Continuous Compliance Engine and import them into the Hyperscale Compliance Engine using the following steps.

  1. From the Delphix Continuous Compliance Engine, export the global settings that need to be imported on the Hyperscale Clustered Continuous Compliance Engines. For more information about exporting global settings, refer to Syncing all Global Objects.

  2. Once the bundle is exported, make a request to the new /sync-compliance-engines endpoint on the Hyperscale Engine to upload the response blob along with a list of Hyperscale Clustered Continuous Compliance Engine IDs. For more information, refer to the /sync-compliance-engines API page.
    The following is an example of the request blob.

    {
      "exportResponseMetadata": {
        "exportHost": "1.1.1.1",
        "exportDate": "Tue Sep 13 12:55:31 UTC 2022",
        "requestedObjectList": [
          {
            "objectIdentifier": {
              "id": "global"
            },
            "objectType": "GLOBAL_OBJECT",
            "revisionHash": "897weqwj76"
          }
        ],
        "exportedObjectList": [
          {
            "objectIdentifier": {
              "id": 12
            },
            "objectType": "PROFILE_EXPRESSION",
            "revisionHash": "7dc67asch8a"
          },
          {
            "objectIdentifier": {
              "id": "BIOMETRIC"
            },
            "objectType": "DOMAIN",
            "revisionHash": "7edb8ewbd8w"
          },
          {
            "objectIdentifier": {
              "algorithmName": "dlpx-core:Email SL"
            },
            "objectType": "USER_ALGORITHM",
            "revisionHash": "87h823d23d23"
          }
        ]
      },
      "blob": "39fdn23d9834fn3948f348fbw3pd9234nf9p4hf89",
      "signature": "7823hd823bd8",
      "publicKey": "892d3un293dn2p39db8283",
      "compliance_engine_ids": [
        1,
        2
      ]
    }
    

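The sync request above is the exported bundle plus the list of engine IDs; a minimal sketch of assembling it (field names taken from the example, everything else hypothetical):

```python
def build_sync_request(bundle: dict, compliance_engine_ids: list) -> dict:
    """Attach the clustered Compliance Engine IDs to the exported bundle to
    form the /sync-compliance-engines request body."""
    request = dict(bundle)  # exportResponseMetadata, blob, signature, publicKey
    request["compliance_engine_ids"] = compliance_engine_ids
    return request


# Bundle as returned by Syncing all Global Objects (metadata elided).
bundle = {
    "exportResponseMetadata": {"...": "..."},
    "blob": "39fdn23d9834fn3948f348fbw3pd9234nf9p4hf89",
    "signature": "7823hd823bd8",
    "publicKey": "892d3un293dn2p39db8283",
}

payload = build_sync_request(bundle, compliance_engine_ids=[1, 2])
# If the bundle is passphrase protected, send the passphrase in the
# "passphrase" API header alongside this body, as described in the note below.
```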
Note

  1. After import, if the Hyperscale Clustered Continuous Compliance Engines already have objects with the same ID or properties, those objects will be overwritten.
  2. If the bundle is passphrase protected, the passphrase must be provided in the passphrase API header while importing the bundle. For more information about exporting a passphrase-encrypted bundle, refer to the Export the object section.

Limitations

The Hyperscale Job Sync feature has the following limitations:

  1. Pre- and post-script import from the Continuous Compliance Engine to the Hyperscale Engine is not supported.
  2. Import of masking jobs based on Kerberos or custom JDBC driver connectors is not supported.
  3. The Hyperscale Compliance Engine does not support jobs with multi-column (MC) algorithms. Therefore, inventory details must be modified before the export from the Continuous Compliance Engine, or the dataset info must be updated after the import into the Hyperscale Engine.