
DQG 15.2.0 Upgrade Notes

Azure Key Vault

In previous versions, if you wanted to integrate with Azure Key Vault you had to do so at the connection level, that is, provide the Key Vault credentials for every connection authenticated with Key Vault.

  • For JDBC database connections (for example, MSSQL and other relational databases), this was done via driver properties, as described in Add Driver Properties and Azure Active Directory Key Vault Authentication.

    (Figure: Add driver properties)
  • For Power BI and Azure Data Lake Storage Gen2 connections, there was a dedicated section in the Add connection > Add credentials interface, but you still had to provide the Key Vault credentials manually for every new connection.

    (Figure: Key Vault old connection UI)
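As an illustration of the old connection-level approach, Key Vault credentials for a JDBC connection were passed as driver properties. For the Microsoft JDBC Driver for SQL Server, such properties might look like the following (values are placeholders, and the exact property names depend on the driver and version, so treat this as a sketch rather than a reference):

```
# Example Key Vault driver properties for the Microsoft JDBC Driver for SQL Server.
# Values are placeholders; verify the exact keys against your driver's documentation.
keyVaultProviderClientId=<application-client-id>
keyVaultProviderClientKey=<application-client-secret>
```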

From version 15.2, we have added Secret Management Services. You can now create a central connection to Azure Key Vault and then provide access to this central storage location, and retrieve secrets from it, when connecting to a data source. This is now the recommended method for integrating with Azure Key Vault.

What happens when I upgrade?

Vault credentials provided via driver properties (such as SQL Server, Azure SQL, Azure Synapse)

No automatic migration to the new service is performed, and your existing connections using Key Vault remain valid. However, as driver properties might become obsolete in the future, we highly recommend configuring Key Vault in the Secret Management Service and then editing the connections to use this service, according to the instructions found in Secret Management Services and Relational Database Connection respectively.

Vault credentials provided in Power BI or Azure Data Lake connections

Your Key Vault configurations are automatically migrated to the new Secret Management Services.

However, a new Secret Management Service is added for every set of Key Vault credentials previously provided on your connections. It might therefore be necessary to consolidate these into fewer instances, or even a single instance, and then edit the connections to use the newly consolidated services, according to the instructions found in Secret Management Services and Azure Data Lake Storage Gen2 Connection or Power BI Connection respectively.
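Conceptually, consolidating the automatically created services means merging entries that point to the same vault with the same credentials. A minimal sketch of that grouping logic (the field names here are illustrative, not the product's actual data model):

```python
# Illustrative only: group migrated Secret Management Service entries that share
# the same vault URL and client ID, so duplicates can be merged into one service.
from collections import defaultdict

def consolidate(services):
    """Group service names by (vault_url, client_id); each group can become one service."""
    groups = defaultdict(list)
    for svc in services:
        groups[(svc["vault_url"], svc["client_id"])].append(svc["name"])
    return dict(groups)

migrated = [
    {"name": "kv-powerbi-1", "vault_url": "https://corp.vault.azure.net", "client_id": "app-1"},
    {"name": "kv-adls-1", "vault_url": "https://corp.vault.azure.net", "client_id": "app-1"},
    {"name": "kv-other", "vault_url": "https://other.vault.azure.net", "client_id": "app-2"},
]
print(consolidate(migrated))
```

Each group of names corresponds to connections that can share a single consolidated service after the edit.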

DQ Firewall and MinIO bucket communication

A new configuration option has been added to DQ Firewalls (DQF) that allows you to define which MinIO buckets are accessible in component rules. By default, DQF cannot communicate with any MinIO buckets.

Add the following configuration to /opt/ataccama/one/dqf-${DQF_VERSION}/etc/:

```
# Resources definition
# Component rules can reference external resource via resource://<name>/<objectName>
# This configuration provides mapping between URLs and actual files

# Name of the resource used in URL
# What storage should be used for this resource
# Has to reference existing[*].storage-id
# Bucket name
```
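Putting the commented fields together, a complete resource definition might look like the following. The property keys and values below are assumptions for illustration only; check the properties file shipped with your DQF version for the exact keys:

```
# Hypothetical keys and values for illustration only.
resources[0].name=lookups            # name used in resource://lookups/<objectName>
resources[0].storage-id=minio-main   # has to reference an existing [*].storage-id
resources[0].bucket=dqf-lookups      # bucket name
```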

Multiple resources can be added: in the previous example, you could define a second resource by repeating the same set of properties with the next index.

The index sequence must be continuous (no gaps).
This configuration is very similar to what you configure in the DPM runtime configuration. In almost all cases, the two configurations should be kept in sync so that the same resources can be used in DQF and in DPE processing.

Lineage permissions

To be able to work with lineage metadata, you need to modify the View metadata access level for the catalogConfiguration entity.

  1. Go to Global Settings > Metadata model.

  2. Find and select the catalogConfiguration entity and switch to the Access levels tab.

  3. Select View metadata access and then Edit.

  4. Under Lineage, select the following permission: Lineage: Verify Jwt Token For Lineage Permissions.

  5. Save your changes.
