How to fetch job_id of BigQueryOperator in downstream task

Solution 1:

I don't know what version you're running, but I checked the code on the Airflow main branch: BigQueryExecuteQueryOperator (the name of BigQueryOperator since Airflow 2.0) pushes the job_id to XCom under the key "job_id". This means the downstream task can pull the job id from XCom:

bq_operator = BigQueryExecuteQueryOperator(task_id="bq_task", ...)

def do_something(**context):
    # Pull the value the BigQuery operator pushed, by task id and XCom key
    job_id = context["task_instance"].xcom_pull(task_ids="bq_task", key="job_id")
    # and do something with job_id...

run_task = PythonOperator(task_id="run", python_callable=do_something, ...)

bq_operator >> run_task
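To make the push/pull mechanics concrete without a running Airflow instance, here is a minimal sketch in which a plain dict stands in for Airflow's XCom backend (the function names and the job id value are illustrative, not Airflow's actual internals):

```python
# Illustrative stand-in for Airflow's XCom backend: a dict keyed by (task_id, key).
xcom_store = {}

def xcom_push(task_id, key, value):
    # Roughly what the BigQuery operator does after submitting the query:
    # it stores the job id under its own task_id with the key "job_id".
    xcom_store[(task_id, key)] = value

def xcom_pull(task_ids, key):
    # Roughly what context["task_instance"].xcom_pull(...) does downstream.
    return xcom_store.get((task_ids, key))

# Upstream task pushes the job id (value here is made up for the example).
xcom_push("bq_task", "job_id", "job_abc123")

# Downstream task pulls it back by task id and key.
job_id = xcom_pull(task_ids="bq_task", key="job_id")
print(job_id)  # job_abc123
```

The important detail is that xcom_pull addresses values by the pushing task's task_id plus the key, so the downstream task only needs to know those two strings.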