
Ataccama 17.0.0 Release Notes

Products

17.0.0:

  • ONE Data Quality & Catalog, including ONE Data and Data Stories

  • ONE RDM

  • ONE MDM

Release date

February 13, 2026

Downloads

Support Service Desk

Security updates

Ataccama Security Advisories

ONE

For upgrade details, see DQ&C 17.0.0 Upgrade Notes.

Asset Promotion via Import and Export

Asset promotion allows automated migration of data quality assets across environments (for example, DEV → TEST → PROD) with minimal manual intervention. Export monitoring projects, DQ rules, transformation plans, catalog items, and other assets with complete dependency resolution and automatic environment remapping.

Key capabilities include:

  • Export and import of monitoring projects, DQ rules, transformation plans, DQ firewalls, terms, lookup items, and catalog items.

  • Automatic environment remapping for connections, schemas, and database names.

  • Comprehensive audit trails for compliance and traceability.

  • Incremental updates — promote individual assets without re-exporting entire projects.

  • Encrypted, password-protected archives for secure transfer between environments.

  • Validation-only mode to preview import results before applying changes.

Asset promotion
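The automatic environment remapping above can be pictured as a simple substitution applied during import. The following sketch is illustrative only: the mapping keys and the asset shape are hypothetical, not Ataccama's export format.

```python
# Hypothetical remapping table: DEV identifiers are rewritten to their
# TEST equivalents when an exported asset is imported.
remapping = {
    "connection": {"postgres_dev": "postgres_test"},
    "schema": {"dq_dev": "dq_test"},
}

def remap(asset):
    """Return a copy of the asset with environment-specific references
    replaced according to the remapping table (unknown values pass through)."""
    return {
        **asset,
        "connection": remapping["connection"].get(asset["connection"], asset["connection"]),
        "schema": remapping["schema"].get(asset["schema"], asset["schema"]),
    }

exported = {"name": "customer_monitoring", "connection": "postgres_dev", "schema": "dq_dev"}
imported = remap(exported)
print(imported["connection"])  # postgres_test
```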

Export Profiling Results to a Microsoft Excel File

Export profiling results for a catalog item as an Excel file.

Use this to share profiling statistics for company audits or to demonstrate data quality insights to teams evaluating the platform. The export includes links to detailed attribute-level profiles for ease of access.

Export profiling results to Excel

Export Invalid Records from Monitoring Projects

Available for BigQuery and Databricks sources used in pushdown mode.

Export records that fail DQ evaluation rules to an external table directly from your monitoring projects. The export includes information about failing rules and selected attribute values, making it easier to share data quality issues with stakeholders who don’t have access to Ataccama ONE or to integrate with external remediation workflows.

Records are exported to a table in the same data source (write permissions required). This replaces post-processing plans, which are not compatible with SQL pushdown processing.

Export invalid records from monitoring projects
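The target table described above holds one row per failing record together with the rule that failed. As a minimal sketch of that shape, here is an SQLite example; the table and column names are illustrative, not Ataccama's actual export format.

```python
import sqlite3

# Hypothetical side table: one row per failing record, with the rule that
# failed and the attribute values selected for export.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE dq_invalid_records (
           record_id   TEXT,
           failed_rule TEXT,
           attribute   TEXT,
           value       TEXT
       )"""
)

failing = [
    ("cust-001", "email_format", "email", "not-an-email"),
    ("cust-007", "age_range", "age", "-3"),
]
conn.executemany("INSERT INTO dq_invalid_records VALUES (?, ?, ?, ?)", failing)
conn.commit()

# Stakeholders without ONE access can now query the failures directly.
rows = conn.execute(
    "SELECT record_id, failed_rule FROM dq_invalid_records ORDER BY record_id"
).fetchall()
print(rows)
```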

Schedule Removal of Obsolete Catalog Items

Schedule the Sweep documentation flow to automatically delete obsolete catalog items without manual review.

Keep your catalog up to date by removing items that no longer exist in the original data source: set the schedule once and let it run. For organizations that prefer more control, a manual workflow is also available: identify, review, and bulk delete obsolete catalog items on demand.

Sweep documentation flow

Cancel Documentation Flow

Cancel a running documentation flow directly from the data source header.

Cancel documentation flow

Apply Data Slices to Catalog Items

Apply data slices directly to catalog items, extending their use beyond monitoring projects.

When applied, a data slice filters the data used across all catalog item operations, including profiling, DQ evaluation, Data Observability checks, exports, and imports to ONE Data. The slice name is displayed next to the catalog item name, making it immediately visible that you are working with filtered data.

Create data slice on catalog item

IOMETE Pushdown Processing

IOMETE pushdown processing is now available.

Profile and monitor data quality of large-scale tables directly in the IOMETE lakehouse using SQL pushdown. Apply DQ checks and profiling without moving data out of IOMETE.

Key capabilities include:

  • Simple configuration through a single JDBC connection.

  • Full DQ rule designer support — the advanced expression editor checks expression compatibility with pushdown.

  • Pushdown can be applied at the connection or monitoring project level.

Secret Management Services: CyberArk and AWS Secrets Manager Support

ONE now supports two additional secret management providers: CyberArk Secrets Manager and AWS Secrets Manager. These join the existing Azure Key Vault and HashiCorp Vault integrations, giving you more flexibility in how you retrieve credentials when connecting to data sources.

Reuse a shared secret management service across many connections, governed by the existing user access management model.

  • AWS Secrets Manager: Retrieve secrets stored as key/value pairs using AWS access key credentials, with optional IAM assume role support for cross-account setups.

  • CyberArk Secrets Manager: Retrieve secrets from CyberArk Central Credential Provider (CCP). Authenticate the CCP connection with a client certificate: either a PFX file (certificate and private key bundled) or a separate Cert/Key file pair.

Configure integrations in Global Settings > Application Settings > Secret Management, then use them when adding credentials to your data source connections.

New Geospatial ONE Expressions

Use new geospatial functions to work with geographic data in ONE expressions. The functions accept geometries in common formats (GeoJSON, WKT, WKB, EWKT, EWKB) and assume the WGS84 model.

Use these expressions to build spatial data quality rules, for example:

  • Verifying that insured properties fall within policy coverage zones.

  • Detecting overlapping territories that should be mutually exclusive.

  • Ensuring clinical trial sites are within approved geographic regions.

  • Checking that branch locations are within regulatory jurisdictions.

For the full list of functions, see ONE Expressions Reference > Geospatial operations.
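The exact function names are listed in the reference above. As a language-agnostic illustration of the kind of check these rules perform (for example, a property falling within a coverage zone), here is a pure-Python ray-casting point-in-polygon test; the coordinates are made up, and ONE's geospatial expressions accept full geometry formats such as GeoJSON and WKT rather than vertex lists.

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: is the point (lon, lat) inside the polygon?

    polygon is a list of (lon, lat) vertices in WGS84 coordinates.
    Illustrative only, not Ataccama's implementation.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a ray cast east from the point cross this edge?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# A rough rectangular coverage zone (illustrative coordinates).
zone = [(14.3, 50.0), (14.6, 50.0), (14.6, 50.2), (14.3, 50.2)]
print(point_in_polygon(14.42, 50.09, zone))  # True: property inside the zone
print(point_in_polygon(13.00, 50.09, zone))  # False: property outside the zone
```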

MDM

For upgrade details, see MDM 17.0.0 Upgrade Notes.

Streaming Event Handler: Batching and Performance

The Streaming Event Handler now supports collecting smaller transactions into a single publishing batch, controlled by batch size and time thresholds (whichever is reached first). This increases throughput for high-volume scenarios where many small changes occur in quick succession.
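The flush condition described above (whichever threshold is reached first) can be sketched as follows. This is an illustration of the batching behavior, not the MDM implementation; class and parameter names are hypothetical.

```python
import time

class Batcher:
    """Collect items and flush when either the size threshold or the
    age threshold is reached, whichever comes first."""

    def __init__(self, max_size, max_age_seconds, publish):
        self.max_size = max_size
        self.max_age = max_age_seconds
        self.publish = publish          # callback receiving a full batch
        self.items = []
        self.opened_at = None

    def add(self, item):
        if not self.items:
            self.opened_at = time.monotonic()
        self.items.append(item)
        self._maybe_flush()

    def _maybe_flush(self):
        age = time.monotonic() - self.opened_at
        if len(self.items) >= self.max_size or age >= self.max_age:
            self.publish(self.items)
            self.items = []
            self.opened_at = None

published = []
b = Batcher(max_size=3, max_age_seconds=60, publish=published.append)
for tx in ["tx1", "tx2", "tx3", "tx4"]:
    b.add(tx)
print(published)  # the first three transactions flushed as one batch
```

A production handler would also flush on a background timer so the age threshold can fire without new events arriving; the sketch checks age only when an item is added.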

Monitoring

Monitor event handler activity directly in the MDM Web App Admin Center. View publishing details for both single and traversing publishers, including information about individual publishing phases and streaming progress.

Event Handler monitoring

State Change Notifications

Configure event listeners that fire when an event handler changes state. Use these to trigger shell commands, execute SQL statements, or call external services when a handler starts, pauses, or encounters an error, enabling faster response to publishing pipeline issues.
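The listener mechanism above follows a familiar observer pattern: callbacks registered per state fire when the handler transitions into that state. A minimal sketch with hypothetical state names (the actual MDM states and listener API may differ):

```python
from enum import Enum

class HandlerState(Enum):
    # Illustrative states; actual MDM event handler states may differ.
    STARTED = "started"
    PAUSED = "paused"
    ERROR = "error"

class EventHandler:
    """Minimal observer sketch: listeners registered per state fire
    whenever the handler transitions into that state."""

    def __init__(self):
        self._listeners = {}
        self.state = None

    def on_state(self, state, callback):
        self._listeners.setdefault(state, []).append(callback)

    def set_state(self, state):
        self.state = state
        for callback in self._listeners.get(state, []):
            callback(state)

fired = []
handler = EventHandler()
# In practice the callback could run a shell command or SQL statement.
handler.on_state(HandlerState.ERROR, lambda s: fired.append(f"alert: {s.value}"))
handler.set_state(HandlerState.STARTED)
handler.set_state(HandlerState.ERROR)
print(fired)  # ["alert: error"]
```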

PostgreSQL as Sole Supported MDM Storage

As announced in 15.4.0 and reiterated in 16.3.0, MS SQL and Oracle are no longer supported for MDM storage. PostgreSQL is now the only supported database engine for MDM.

If you have not yet migrated, see MDM 17.0.0 Upgrade Notes for guidance.

Improvements to Admin Center

The MDM Web App Admin Center includes several usability improvements:

  • Environment banner: Display a colored banner with the environment name (for example, DEV, TEST, PROD) to help distinguish between environments at a glance. See Environment banner.

  • Lookup management screen: View the list of file system locations configured within the Versioned File System Component. See Lookup management.

  • Runtime parameter export: Download the current runtime parameters as a runtime.parameters file, making it easier to migrate configuration between environments or keep it under version control. See Runtime parameters.

  • License information: View license details in the Orchestration section. See Licenses.

ONE Desktop

JSON Assigner Step

Create JSON payloads in ONE plans using the new JSON Assigner step. The step works identically to the existing XML Assigner step, providing a familiar interface for building structured output.

For details, refer to the in-product documentation in ONE Desktop.

Fixes

ONE

  • Monitoring projects

    • Anomaly detection on the Profile & Insights tab displays the expected values correctly.

    • Anomaly detection pop-up notifications now display correctly.

    • Profile inspector no longer shows mixed results when monitoring projects target the same catalog item with and without data slices. Results now correctly reflect whether the project uses the full dataset or a specific data slice.

    • Monitoring projects with post-processing plans deactivated no longer remain in the post-processing status indefinitely due to duplicate filters. A validation error now prevents publishing duplicate filters on the same attribute.

    • Manual runs of monitoring projects no longer fail after rule modifications.

    • Monitoring projects no longer fail after DQ dimension rename.

    • When importing monitoring project configuration containing filters, filter references to catalog items and attributes are correctly updated after remapping. Previously, filters were not visible after remapping but were still submitted to DQ evaluation as duplicates, causing exponential growth in filter combinations and processing failures.

    • When reimporting the same monitoring project configuration but with a post-processing plan attached, related catalog items are no longer duplicated and the project runs as expected.

    • Long path names on the monitoring project Notifications tab do not overflow.

    • DQ filters in monitoring projects now work correctly with pushdown enabled.

    • Long rule explanations on the monitoring project Report tab do not overflow.

    • Monitoring project results in email notifications are now rounded to two decimal places.

    • Updating individual data quality checks using the Update rule option in the DQ check modal works as expected. Previously, the update would fail to be propagated.

  • Rules

    • Rule suggestions now correctly respect your access permissions when using the Check for rule suggestions option. You’ll see suggestions for catalog items you have access to (even without term access), while rules requiring higher permissions are excluded.

    • Rule suggestions load correctly in monitoring projects for catalog items with assigned rules or terms you don’t have access to.

    • Removed the non-functional Apply the following rules automatically on all attributes with this term option from the term Settings tab. Rules are always applied when a term is assigned to an attribute.

    • A rule detail page loads correctly when opened from the Rule suggestions screen.

    • Rule suggestions on the attribute Profile & DQ insights tab do not overflow.

  • Transformation plans

    • When using transformation plans to export monitoring project results to a database or a CSV file, invalid and valid columns are correctly populated for failing rules instead of containing only null values.

    • Transformation plan previews are generated for all plans with at least one valid step, even if they contain invalid steps.

    • In transformation plans, Stewardship can now be edited for governance roles with the Full access level.

  • Data slices

    • Data slices in monitoring projects now include null values.

    • The Create data slice option is correctly hidden for users with view-only access to the catalog item.

  • Catalog & data processing

    • Power BI report previews load reliably without intermittent errors.

    • Setting a custom preview for a Power BI report no longer deactivates the Open in reporting tool option.

    • The Schemas screen reflects term updates more quickly after modifications.

    • Attribute IDs on SQL catalog items imported from content packs are now preserved when editing the query.

    • BigQuery pushdown processing now displays the correct source name in error messages.

    • Canceling pushdown data quality jobs now immediately stops running database queries.

    • A warning is displayed when attempting to delete a catalog item that is referenced by a monitoring project.

  • Documentation flows

    • After upgrading, the Sweep documentation flow now appears in the documentation flow menu.

    • When a documentation flow is discarded, the corresponding jobs are now canceled.

    • The Delete all option in the Sweep documentation flow deletes all obsolete catalog items, not just those on the current page.

    • Users with the Full access level to a source can now run the Sweep documentation flow.

  • Business glossary

    • Long term names do not overflow the Relationships widget.

    • The term Occurrence tab displays correctly regardless of metadata model property configurations.

    • Term Suggestions services no longer overload application logs with zero-confidence term suggestions.

    • Term Suggestions metrics are no longer collected when the services are not configured.

  • Workflows & tasks

    • Fixed an issue where an error was shown instead of creating a Review task when non-admin users requested reviewing and publishing of changes.

  • ONE Data

    • Loading to ONE Data works correctly for data with null or empty DateTime values.

ONE Runtime Server


ONE Desktop

  • Time zones are synchronized between ONE Desktop and the processing node for DQ operations.

  • Data quality processing steps in ONE Desktop now correctly preserve time zones from input timestamps. The steps respect the time zone specified in input data or use the local time zone setting of the execution environment.
