
Orchestrator Connections

Orchestrator connections enable Ataccama ONE Agentic to receive and monitor pipeline execution metadata from your data orchestration tools using the OpenLineage standard. This integration provides visibility into your data pipeline runs directly within the Data Observability module.

What are orchestrator connections?

Orchestrator connections establish a link between your pipeline orchestration tools (such as Apache Airflow or dbt Core) and Ataccama ONE Agentic. Using the OpenLineage standard for collecting and exchanging metadata, these connections allow you to track job executions, monitor pipeline health, and identify failures across your data ecosystem.

Currently supported orchestrators:

  • Apache Airflow with OpenLineage

  • dbt Core with OpenLineage

Prerequisites

Before creating an orchestrator connection, ensure you have administrative access to your orchestration tool (Airflow or dbt) to configure the OpenLineage endpoint URL and API key.

Create an orchestrator connection

To set up a new orchestrator connection:

  1. Navigate to Data Observability > Connections.

  2. Select Add connection.

  3. In the dialog that appears, provide:

    • Display name: A meaningful name for your connection (for example, "Production Airflow").

    • Connection type: Select either:

      • Airflow with OpenLineage for Apache Airflow integrations

      • dbt Core with OpenLineage for dbt Core integrations

  4. Select Add connection.

You are automatically redirected to the connection’s Settings tab to complete the configuration. Keep this page open for the next steps.

Configure the connection

Gather connection credentials

You’ll need two values from Ataccama ONE Agentic when configuring your orchestrator in the next section: an API key and the OpenLineage endpoint URL.

Generate an API key

  1. In the Settings tab of your new connection, locate the API keys section.

  2. Select Generate API key.

  3. Provide a Key name to identify this key (for example, production-key).

    The API key is displayed only once immediately after generation. It is securely hashed and cannot be viewed again. Make sure to copy it before leaving or refreshing the page.

  4. Select Generate.

  5. Copy the displayed API key immediately using the copy button. You’ll use this when configuring your orchestrator below.

Copy the OpenLineage endpoint URL

In the Settings tab, you’ll find the OpenLineage API Endpoint URL. This URL follows the pattern:

https://<YOUR_INSTANCE>.ataccama.one/gateway/openlineage/<CONNECTION_ID>

Return here to copy this URL when you configure your orchestrator below.

Configure your orchestrator

Use the API key and endpoint URL from the previous steps to configure your orchestration tool. You can return to the connection’s Settings tab to copy these values at any time during configuration.

Configuration parameters (endpoint URL and API key) remain the same regardless of deployment method. Whether you configure directly or through deployment tooling (Helm charts, Terraform, Docker Compose), these values must ultimately be set in your orchestrator.
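
As a point of reference, the sketch below (Python) shows the general shape these two values take in an OpenLineage HTTP transport configuration. It is a minimal illustration, not Ataccama-specific or orchestrator-specific syntax; the actual option names (for example, an openlineage.yml file, the Airflow OpenLineage provider’s transport setting, or the OPENLINEAGE_URL and OPENLINEAGE_API_KEY environment variables commonly read by OpenLineage integrations) are described in the configuration guide for your orchestrator.

# Minimal sketch of the OpenLineage HTTP transport settings your orchestrator
# ultimately needs. Field names follow the standard OpenLineage HTTP transport
# configuration; the URL and key are placeholders for your own values.
import json

transport = {
    "type": "http",
    # OpenLineage API Endpoint URL copied from the connection's Settings tab
    "url": "https://<YOUR_INSTANCE>.ataccama.one/gateway/openlineage/<CONNECTION_ID>",
    # API key generated for this connection
    "auth": {"type": "api_key", "apiKey": "<YOUR_API_KEY>"},
}

# Transport-style options, such as the Airflow OpenLineage provider's
# `transport` setting, typically accept this structure as a JSON string.
print(json.dumps(transport))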

Follow the configuration guide for your orchestrator: Apache Airflow configuration or dbt Core configuration.

Advanced: Airflow orchestrating dbt Core

When Airflow is responsible for running dbt Core, you can decide which integration should emit OpenLineage events for the run. Pick the option that matches the level of detail you need in lineage.

Whichever option you choose, make sure the Airflow environment is configured with the same OpenLineage endpoint URL and API key gathered earlier.

Option 1: Airflow provider only (simplest)

Airflow emits lineage events for the DAG and task lifecycle. Use this option when you only need to see Airflow task boundaries and don’t require model-level visibility inside dbt.

Configure the OpenLineage transport once in Airflow (see Apache Airflow configuration). Airflow tasks that shell out to dbt run will still send run-level events, but dbt-specific metadata is not present.

Option 2: Call dbt with dbt-ol inside Airflow tasks

Each task that executes dbt uses the dbt-ol wrapper described in dbt Core configuration. Airflow emits its own task events, and the dbt integration emits dataset and model-level events. You will see two related runs: one from the Airflow provider and one from dbt.

This option is recommended when you want detailed lineage without restructuring your DAG. Disable duplicate emission only if it causes noise for your observability tooling; otherwise the two perspectives are useful together.
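
As a rough sketch, an Airflow task wrapping dbt with dbt-ol might look like the following. The DAG id, dbt project path, and profiles directory are hypothetical placeholders, and the OpenLineage endpoint URL and API key are assumed to be available to both Airflow and dbt-ol through your environment configuration.

# Hypothetical Airflow DAG: each dbt command is wrapped with dbt-ol so that
# model-level OpenLineage events are emitted in addition to Airflow's own
# task-level events. Paths and names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_run",            # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        # dbt-ol wraps the regular dbt CLI and emits OpenLineage events;
        # it reads OPENLINEAGE_URL / OPENLINEAGE_API_KEY from the environment.
        bash_command=(
            "dbt-ol run "
            "--project-dir /opt/dbt/my_project "   # hypothetical path
            "--profiles-dir /opt/dbt/profiles"
        ),
    )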

Option 3: Use Astronomer Cosmos for model-level tasks

Cosmos converts dbt models into individual Airflow tasks, so the Airflow provider emits detailed events per model without requiring dbt-ol. This is useful when you prefer native Airflow task observability (retries, SLAs) mapped to dbt resources.
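
As a rough illustration, a Cosmos-based DAG might look like the sketch below. It assumes the astronomer-cosmos package; paths and names are placeholders, and argument names can vary between Cosmos versions, so check the Cosmos documentation for exact syntax.

# Hypothetical DAG using Astronomer Cosmos: each dbt model becomes its own
# Airflow task, so the Airflow OpenLineage provider emits per-model events
# without requiring dbt-ol. Paths and names below are placeholders.
from datetime import datetime

from cosmos import DbtDag, ProfileConfig, ProjectConfig

dbt_cosmos_dag = DbtDag(
    dag_id="dbt_cosmos_daily",                            # hypothetical DAG id
    project_config=ProjectConfig("/opt/dbt/my_project"),  # hypothetical path
    profile_config=ProfileConfig(
        profile_name="my_project",                        # matches profiles.yml
        target_name="prod",
        profiles_yml_filepath="/opt/dbt/profiles/profiles.yml",
    ),
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
)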

Monitor pipeline executions

Once configured as described above, your orchestrator begins sending execution metadata to Ataccama ONE Agentic.

View connection status

  1. Navigate to Data Observability > Connections.

  2. Your connections are listed with their current status:

    • Connected (green): Actively receiving events

    • Setup incomplete (gray): Configuration pending

  3. Each connection also shows summary details:

    • Total jobs: Number of unique jobs tracked

    • Running jobs: Currently executing jobs

    • Last event: Time since last received event

View job details

  1. Select View details on a connection or click the connection name.

  2. The Overview tab displays:

    • Connection summary information

    • A table of jobs with:

      • Job name/identifier

      • Last run start and end times

      • Duration of last run

      • Last run status (Completed, Failed, Running)

    • A refresh button to update the displayed data

    • Timestamp showing when data was last updated

If no events have been received yet, you’ll see a Complete setup to start receiving monitoring events message with instructions for configuring your orchestrator. Return to the configuration guide for your orchestrator to complete setup.

Manage API keys

You can generate multiple API keys for different environments or rotate keys for security:

Generate additional keys

  1. Go to the Settings tab of your connection.

  2. Select Generate API key.

  3. Provide a unique name for the new key.

  4. Copy the generated key immediately.

Delete API keys

  1. In the Settings tab, view the list of stored API keys.

  2. Click the delete icon (trash can) in the Actions column for the key you want to remove.

Deleting an API key will immediately prevent any orchestrator using that key from sending events. Ensure you’ve updated your orchestrator configuration with a new key before deleting the old one.

Manage connections

Rename a connection

  1. Open the connection details.

  2. Click the three-dot menu in the top right.

  3. Select Rename connection.

  4. Enter the new name and confirm.

Remove a connection

  1. Open the connection details.

  2. Click the three-dot menu in the top right.

  3. Select Remove connection.

  4. Confirm the deletion.

Removing a connection will delete all associated historical job execution data. This action cannot be undone.
