17.0.0 Patch Releases
Improvements and fixes released for Ataccama ONE 17.0.0 since the initial release in February 2026.
For new features and enhancements, see 17.0.0 Release Notes.
April 2026
Patch 5
ONE
ONE Web Application, Data Processing Engine, Task Service, Audit Module, Lineage Scanning Service, Metadata Extraction Service, Data Stories, AI Evolution
- Lineage
  - When selecting an attribute on the home datastore, the diagram now filters to show only datastores connected to that attribute. This option is auto-enabled for large diagrams; you can manage it in the view settings.
  - Faster attribute display and transformation context loading on large lineage diagrams.
  - Lineage scan evidence download links are now generated on demand and remain valid as long as the scan result is available, instead of expiring after a day.
  - Reduced memory consumption during catalog matching on large datasets.
- Catalog & data processing
  - Transformation plans with a Database writer step now let you select both the catalog and the schema when creating catalog items on Unity Catalog-enabled Databricks sources.
  - SAP metadata imports no longer fail with a `Columns must have unique orders` error when source tables have attributes with duplicate order values.
- Data source connections
  - Username/password credentials migrated from version 15.4 can now be edited in the web application.
- Data Stories
  - Data Stories visualizations now work correctly with ONE Data catalog items whose underlying tables are re-created during postprocessing exports.
- Usability & display
  - Faster loading of user and role selection dropdowns in project notifications, observability settings, and DQ firewall configuration, especially in environments with many users.
  - File imports during asset promotion now show upload progress, preventing accidental interruption of long uploads.
- AI
  - AI rule indexing no longer fails for rules without input variables.
- Audit
  - Improved audit query performance in high-activity environments through autovacuum tuning on the audit assets table.
  - New environments now use the intended 90-day audit database retention period by default, instead of 365 days.
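The autovacuum tuning mentioned above follows a standard PostgreSQL pattern: lowering the per-table scale factors so that a high-churn table is vacuumed and analyzed more often than the global defaults allow. A minimal sketch of the technique; the table name and values here are illustrative, not Ataccama's actual settings:

```sql
-- Vacuum after ~1% of rows change (the global default is 20%) and
-- re-analyze after ~0.5%, keeping dead tuples and stale statistics
-- low on a frequently written audit table.
-- Table name and values are illustrative only.
ALTER TABLE audit_assets SET (
    autovacuum_vacuum_scale_factor  = 0.01,
    autovacuum_analyze_scale_factor = 0.005
);
```

Per-table storage parameters like these override the instance-wide autovacuum settings only for the table in question, so the rest of the database keeps its defaults.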
- Security
  - Upgraded third-party dependencies in the Metadata Extraction Service for enhanced security.
March 2026
Patch 4
ONE
ONE Web Application, Metadata Management Module, Data Processing Engine, Metadata Extraction Service, AI Evolution service
- Performance and stability
  - Optimized system validations to reduce resource usage in large environments.
  - Fixed long-running database transactions that could slow down other operations.
  - Improved Task Service stability by correcting memory allocation settings.
- Catalog & data processing
  - Snowflake pushdown profiling no longer fails with a SQL compilation error when frequency analysis is turned off.
  - DPE now correctly creates the job folder when a custom `TEMP_ROOT` is specified in the configuration, preventing temporary file errors during profiling.
- Data quality
  - When creating a DQ firewall from a monitoring project, attributes with unsupported data types are now skipped instead of blocking firewall creation entirely.
  - Monitoring project notifications no longer display an incorrect dash before DQ result percentages.
- ONE AI
  - Resolved failures in chat with documentation and rule suggestions in Ataccama Cloud environments, caused by an incompatible PostgreSQL version.
- Usability & display
  - Job duration is now displayed correctly in the web application, even for jobs running longer than 24 hours.
  - The Insert Term function in rich text editors now correctly lists available terms instead of showing term types.
  - Tableau reports embedded in catalog items now load correctly.
- Upgradability
  - Username and password credentials can now be edited after migrating from 15.4.1 to 16.3. Previously, the edit page did not display these fields for migrated credentials.
- Lineage
  - During lineage import, the Expand and Overwrite options now remain unavailable until the import finishes, preventing accidental re-triggers.
  - Canceling a lineage import no longer gets stuck in environments with large or complex database schemas.
Patch 3
ONE
ONE Web Application, Metadata Management Module, Data Processing Module, Data Processing Engine
- Performance and stability
  - Faster metadata processing in large environments through optimized history change queries.
  - Improved metadata import performance and stability in large environments by optimizing memory usage and permission processing.
  - Fixed long-running database transactions during data observability runs that could slow down other operations.
  - Reduced server load when many jobs are running simultaneously.
- Monitoring projects
  - Removing all additional attributes from pushdown invalid record export settings now works as expected.
  - Pushdown invalid record export now includes additional columns and record identifiers even when invalid samples are turned off.
  - Monitoring projects with multiple custom report sections no longer fail during evaluation.
- Data quality
  - Attributes no longer show unrelated DQ dimensions after pushdown DQ evaluation.
  - Validation component templates now include `DQD_REC_ID` on integration inputs, enabling DQ evaluation with complex component rule plans.
  - Filtering by attributes and applied rules on the catalog item Data Quality tab now returns all matching results.
- Data observability
  - Saving the Data Observability configuration no longer times out or takes over a minute in environments with large data volumes.
- Catalog & data processing
  - Job submission no longer fails with an "Engine not available" error when DPE is temporarily reconnecting.
  - Metadata import jobs can now be assigned a custom priority in the processing queue.
  - Search index event processing no longer accumulates partly failed events, which could lead to stale or incomplete search results.
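Custom job priorities of this kind are typically implemented with a priority queue. A minimal sketch of the idea; the `JobQueue` class and its behavior are hypothetical, not DPM's actual scheduler. A lower number means higher priority, and a monotonic counter keeps equal-priority jobs in FIFO order:

```python
import heapq
import itertools


class JobQueue:
    """Minimal priority-queue sketch (illustrative, not DPM's code)."""

    def __init__(self) -> None:
        self._heap: list[tuple[int, int, str]] = []
        # Monotonic counter breaks ties so equal priorities stay FIFO.
        self._seq = itertools.count()

    def submit(self, job_id: str, priority: int = 10) -> None:
        # Lower number = higher priority.
        heapq.heappush(self._heap, (priority, next(self._seq), job_id))

    def next_job(self) -> str:
        return heapq.heappop(self._heap)[2]
```

With this shape, a metadata import submitted with `priority=1` is picked up before default-priority profiling jobs that were queued earlier.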
- Workflows & tasks
  - Moving tasks on the tasks board no longer freezes or fails to persist the change.
- Lineage
  - Import no longer fails when a previously mapped connection has been deleted from the catalog.
Patch 2
ONE
ONE Web Application, Metadata Management Module, Data Processing Module, Data Processing Engine
- Monitoring projects
  - Specific notifications in monitoring projects are no longer sent when data quality evaluation does not complete. Previously, a "Project run failed" email was incorrectly sent to recipients.
  - When importing a monitoring project configuration with an applied data slice into another monitoring project, the data slice reference is no longer carried over if a different catalog item is selected. This prevents monitoring project run failures with the "Data slice was not found" reason.
  - The catalog item Data Quality tab loads significantly faster, especially for catalog items with a large number of attributes.
- Transformation plans & job processing
  - Transformation plans that use SAP RFC catalog items as input and write results to a database no longer fail with a `ClassNotFoundException` error.
  - Job submission no longer fails with `JobInvalidArgumentException` when a job with the same ID already exists on the Data Processing Engine.
  - Added an option to improve the reliability of profiling jobs for catalog items with many applied rules or attributes.
- Usability & display
  - The source detail screen now loads correctly when reopening a previously viewed source.
  - ONE Data tables open correctly without `TypeError` error notifications.
- Other fixes
  - Databricks Spark pushdown jobs now work correctly when ADLS credentials are stored in Azure Key Vault.
  - When a scheduled sweep flow is started with missing or invalid configuration, an error notification with the specific reason is now displayed, instead of failing silently.
February 2026
Patch 1
ONE
ONE Web Application, Metadata Management Module, Data Processing Module, Data Processing Engine, AI Evolution
- Monitoring projects
  - Re-importing a monitoring project configuration with SQL catalog items after editing the project configuration no longer fails due to duplicate catalog item entries in the export file.
  - Overall data quality results for sections in the monitoring project Report tab now correctly exclude dimensions that do not contribute to overall data quality.
- Catalog & data processing
  - When creating a DQ firewall from a catalog item, attributes with unsupported data types are now skipped instead of blocking firewall creation entirely. A warning lists any skipped attributes.
  - Fixed intermittent `ConcurrentModificationException` failures in JDBC Reader and SQL Execute steps, sometimes surfacing as "Cannot connect to the database" errors despite the database being healthy. The issue was caused by a race condition when multiple jobs ran concurrently.
  - When editing rule configuration, conditions with functions or expressions not supported by Snowflake pushdown are now correctly marked. Rules with such conditions are excluded from evaluation in Snowflake pushdown.
  - IOMETE pushdown processing now uses the Arrow Flight JDBC driver instead of the deprecated Hive JDBC driver.
  - IOMETE pushdown processing can be selected as an experimental feature for version 17.0.0 in Ataccama Cloud environments.
- Generative AI
  - Chat with documentation now links to the correct documentation version.
  - Increased internal token limits for AI-assisted generation of SQL queries and rule logic, reducing the likelihood of failures when generating longer outputs.
- Performance improvements
  - Faster loading on multiple screens:
    - Edit screens.
    - Catalog Items tab on source details.
    - Data Quality tab on catalog items.
    - Pushdown connection details, especially in environments with many connections.
    - Entity listing pages.
  - Improved DPM database query performance, reducing unnecessary load on the DPM database.
  - Improved audit module write performance by switching to time-ordered identifiers for audit records.
  - Fixed an issue where DPE slots were incorrectly reported as fully utilized, causing queued jobs to stall indefinitely.
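Time-ordered identifiers (in the style of UUIDv7) improve write performance on index-backed tables because new keys always land at the right edge of the B-tree instead of on random pages. A minimal sketch of the idea; this is an illustration of the technique, not the audit module's actual implementation:

```python
import os
import time


def time_ordered_id() -> bytes:
    """Return a 16-byte id: 48-bit millisecond timestamp + 80 random bits.

    Byte order follows creation time (UUIDv7-style), so index inserts
    are append-mostly instead of random. Illustrative sketch only.
    """
    ts_ms = int(time.time() * 1000)
    return ts_ms.to_bytes(6, "big") + os.urandom(10)
```

Because the timestamp occupies the most significant bytes, ids generated later compare greater than earlier ones, which is exactly the property that keeps B-tree page splits localized.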
- Other fixes
  - Thread dumps now include additional per-thread diagnostics, such as memory allocation and class loading statistics, for improved troubleshooting.
  - In Ataccama Cloud environments, the PostgreSQL monitoring exporter now correctly loads custom metric queries from the configuration, ensuring all configured database metrics are collected as expected.
  - Security fixes for the ONE Web Application.