
Airflow Integration

note

If you're looking to schedule DataHub ingestion using Airflow, see the guide on scheduling ingestion with Airflow.

The DataHub Airflow plugin supports:

  • Automatic column-level lineage extraction from various operators, e.g. SQL operators (including MySqlOperator, PostgresOperator, SnowflakeOperator, and more), S3FileTransformOperator, and a few others.
  • Airflow DAG and task metadata, including properties, ownership, and tags.
  • Task run information, including task successes and failures.
  • Manual lineage annotations using inlets and outlets on Airflow operators.

There are two actively supported implementations of the plugin, each supporting different Airflow versions.

| Approach  | Airflow Version | Notes |
|-----------|-----------------|-------|
| Plugin v2 | 2.3+            | Recommended. Requires Python 3.8+. |
| Plugin v1 | 2.1+            | No automatic lineage extraction; may not extract lineage if the task fails. |

If you're using Airflow older than 2.1, it's possible to use the v1 plugin with older versions of acryl-datahub-airflow-plugin. See the compatibility section for more details.

DataHub Plugin v2

Installation

The v2 plugin requires Airflow 2.3+ and Python 3.8+. If you don't meet these requirements, use the v1 plugin instead.

pip install 'acryl-datahub-airflow-plugin[plugin-v2]'

Configuration

Set up a DataHub connection in Airflow.

airflow connections add  --conn-type 'datahub-rest' 'datahub_rest_default' --conn-host 'http://datahub-gms:8080' --conn-password '<optional datahub auth token>'

No additional configuration is required to use the plugin. However, there are some optional configuration parameters that can be set in the airflow.cfg file.

airflow.cfg
[datahub]
# Optional - additional config here.
enabled = True # default
| Name | Default value | Description |
|------|---------------|-------------|
| enabled | true | If the plugin should be enabled. |
| conn_id | datahub_rest_default | The name of the DataHub REST connection. |
| cluster | prod | Name of the Airflow cluster. |
| capture_ownership_info | true | Extract DAG ownership. |
| capture_tags_info | true | Extract DAG tags. |
| capture_executions | true | Extract task runs and success/failure statuses. These will show up in the DataHub "Runs" tab. |
| enable_extractors | true | Enable automatic lineage extraction. |
| disable_openlineage_plugin | true | Disable the OpenLineage plugin to avoid duplicative processing. |
| log_level | no change | [debug] Set the log level for the plugin. |
| debug_emitter | false | [debug] If true, the plugin will log the emitted events. |

Automatic lineage extraction

To automatically extract lineage information, the v2 plugin builds on top of Airflow's built-in OpenLineage extractors.

The SQL-related extractors have been updated to use DataHub's SQL parser, which is more robust than the built-in one and uses DataHub's metadata information to generate column-level lineage. We discussed the DataHub SQL parser, including why schema-aware parsing works better and how it performs on benchmarks, during the June 2023 community town hall.

DataHub Plugin v1

Installation

The v1 plugin requires Airflow 2.1+ and Python 3.8+. If you're on older versions, it's still possible to use an older version of the plugin. See the compatibility section for more details.

If you're using Airflow 2.3+, we recommend using the v2 plugin instead. If you need to use the v1 plugin with Airflow 2.3+, you must also set the environment variable DATAHUB_AIRFLOW_PLUGIN_USE_V1_PLUGIN=true.

pip install 'acryl-datahub-airflow-plugin[plugin-v1]'

# The DataHub rest connection type is included by default.
# To use the DataHub Kafka connection type, install the plugin with the kafka extras.
pip install 'acryl-datahub-airflow-plugin[plugin-v1,datahub-kafka]'

Configuration

Disable lazy plugin loading

airflow.cfg
[core]
lazy_load_plugins = False

On MWAA you should add this config to your Apache Airflow configuration options.

Set up a DataHub connection

You must configure an Airflow connection for DataHub. Both DataHub REST and Kafka-based connections are supported, but you only need one.

# For REST-based:
airflow connections add --conn-type 'datahub_rest' 'datahub_rest_default' --conn-host 'http://datahub-gms:8080' --conn-password '<optional datahub auth token>'
# For Kafka-based (standard Kafka sink config can be passed via extras):
airflow connections add --conn-type 'datahub_kafka' 'datahub_kafka_default' --conn-host 'broker:9092' --conn-extra '{}'

Configure the plugin

If your config doesn't align with the default values, you can configure the plugin in your airflow.cfg file.

airflow.cfg
[datahub]
enabled = true
conn_id = datahub_rest_default # or datahub_kafka_default
# etc.
| Name | Default value | Description |
|------|---------------|-------------|
| enabled | true | If the plugin should be enabled. |
| conn_id | datahub_rest_default | The name of the DataHub connection you set in step 1. |
| cluster | prod | Name of the Airflow cluster. |
| capture_ownership_info | true | If true, the owners field of the DAG will be captured as a DataHub corpuser. |
| capture_tags_info | true | If true, the tags field of the DAG will be captured as DataHub tags. |
| capture_executions | true | If true, we'll capture task runs in DataHub in addition to DAG definitions. |
| graceful_exceptions | true | If true, most runtime errors in the lineage backend will be suppressed and will not cause the overall task to fail. Note that configuration issues will still throw exceptions. |

Validate that the plugin is working

  1. In Airflow, go to the Admin -> Plugins menu and check that the DataHub plugin is listed.
  2. Run an Airflow DAG. In the task logs, you should see DataHub-related log messages like:
Emitting DataHub ...

Manual Lineage Annotation

Using inlets and outlets

You can manually annotate lineage by setting inlets and outlets on your Airflow operators. This is useful if you're using an operator that doesn't support automatic lineage extraction, or if you want to override the automatic lineage extraction.

We have a few code samples that demonstrate how to use inlets and outlets.

For more information, take a look at the Airflow lineage docs.

Custom Operators

If you have created a custom Airflow operator that inherits from the BaseOperator class, when overriding the execute function, set inlets and outlets via context['ti'].task.inlets and context['ti'].task.outlets. The DataHub Airflow plugin will then pick up those inlets and outlets after the task runs.

class DbtOperator(BaseOperator):
    ...

    def execute(self, context):
        # do something
        inlets, outlets = self._get_lineage()
        # inlets/outlets are lists of either datahub_airflow_plugin.entities.Dataset
        # or datahub_airflow_plugin.entities.Urn
        context['ti'].task.inlets = inlets
        context['ti'].task.outlets = outlets

    def _get_lineage(self):
        # Do some processing to get inlets/outlets
        return inlets, outlets

If you override the pre_execute and post_execute function, ensure they include the @prepare_lineage and @apply_lineage decorators respectively. Reference the Airflow docs for more details.

Emit Lineage Directly

If you can't use the plugin or annotate inlets/outlets, you can also emit lineage using the DatahubEmitterOperator.

Reference lineage_emission_dag.py for a full example.

In order to use this example, you must first configure the DataHub hook. As with ingestion, we support both a DataHub REST hook and a Kafka-based hook. See the plugin configuration for examples.

Debugging

Missing lineage

If you're not seeing lineage in DataHub, check the following:

  • Validate that the plugin is loaded in Airflow. Go to Admin -> Plugins and check that the DataHub plugin is listed.
  • With the v2 plugin, it should also print a log line like INFO [datahub_airflow_plugin.datahub_listener] DataHub plugin v2 using DataHubRestEmitter: configured to talk to <datahub_url> during Airflow startup, and the airflow plugins command should list datahub_plugin with a listener enabled.
  • If using the v2 plugin's automatic lineage, ensure that the enable_extractors config is set to true and that automatic lineage is supported for your operator.
  • If using manual lineage annotation, ensure that you're using the datahub_airflow_plugin.entities.Dataset or datahub_airflow_plugin.entities.Urn classes for your inlets and outlets.

Incorrect URLs

If your URLs aren't being generated correctly (usually they'll start with http://localhost:8080 instead of the correct hostname), you may need to set the webserver base_url config.

airflow.cfg
[webserver]
base_url = http://airflow.mycorp.example.com

Compatibility

We no longer officially support Airflow <2.1. However, you can use older versions of acryl-datahub-airflow-plugin with older versions of Airflow. Both of these options support Python 3.7+.

  • Airflow 1.10.x, use DataHub plugin v1 with acryl-datahub-airflow-plugin <= 0.9.1.0.
  • Airflow 2.0.x, use DataHub plugin v1 with acryl-datahub-airflow-plugin <= 0.11.0.1.

DataHub also previously supported an Airflow lineage backend implementation. While the implementation is still in our codebase, it is deprecated and will be removed in a future release. Note that the lineage backend did not support automatic lineage extraction, did not capture task failures, and did not work in AWS MWAA. The documentation for the lineage backend has already been archived.

Additional references

Related DataHub videos: