Where do you view the output from Airflow jobs?

If your task runs in Airflow, here is how to find its logs in the web UI:

  1. Click on the name of the task's DAG.
  2. Click on the task run.
  3. Click on the "View Log" button in the pop-up that opens.
  4. The logs page will open up (one needs to keep refreshing it to see logs in real time).


Like @tobi6 said, you can view the output from your DAG runs in your webserver or in your console, depending on the environment.

To do so in your webserver:

  1. Select the DAG you just ran and enter into the Graph View.
  2. Select the task in that DAG that you want to view the output of.
  3. In the following popup, click View Log.
  4. In the log that opens, you can now see the output, or it will give you a link to a page where you can view the output (if you were using Databricks, for example, the last line might be "INFO - View run status, Spark UI, and logs at domain.cloud.databricks.com#job/jobid/run/1").
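If you just want something concrete to look at, here is a minimal sketch of a DAG whose task prints a line to stdout; that line is exactly what View Log will show. The DAG id log_output_demo and task id my_task are made-up names, and the PythonOperator import path shown is the Airflow 2 one (in Airflow 1.x it lives in airflow.operators.python_operator):

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def print_hello():
    # Anything the task writes to stdout (or via the logging module)
    # ends up in the task's log file, which is what "View Log" displays.
    print("Hello from my_task -- this line will appear in the task log")

with DAG(
    dag_id="log_output_demo",        # hypothetical name
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,          # trigger it manually for a quick test
    catchup=False,
) as dag:
    PythonOperator(task_id="my_task", python_callable=print_hello)

Trigger the DAG, open its Graph View, click the my_task box, choose View Log, and you will see the printed line in the log output.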

If you want to view the log files from a run directly, you can find them in your AIRFLOW_HOME directory.

  • Information from the official Airflow documentation on logs is below:

Users can specify a logs folder in airflow.cfg. By default, it is in the AIRFLOW_HOME directory.

In addition, users can supply a remote location for storing logs and log backups in cloud storage. At this time, Amazon S3 and Google Cloud Storage are supported. To enable this feature, airflow.cfg must be configured as in this example:

[core]
# Airflow can store logs remotely in AWS S3 or Google Cloud Storage. Users
# must supply a remote location URL (starting with either 's3://...' or
# 'gs://...') and an Airflow connection id that provides access to the storage
# location.
remote_base_log_folder = s3://my-bucket/path/to/logs
remote_log_conn_id = MyS3Conn
# Use server-side encryption for logs stored in S3
encrypt_s3_logs = False

Remote logging uses an existing Airflow connection to read/write logs. If you don't have a connection properly set up, this will fail.

In the above example, Airflow will try to use S3Hook('MyS3Conn').

In the Airflow Web UI, local logs take precedence over remote logs. If local logs cannot be found or accessed, the remote logs will be displayed. Note that logs are only sent to remote storage once a task completes (including failure). In other words, remote logs for running tasks are unavailable. Logs are stored in the log folder as {dag_id}/{task_id}/{execution_date}/{try_number}.log.
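Given that layout, you can also read a task's log straight from disk. A small illustrative sketch (the DAG id, task id, execution date, and try number below are placeholders, and the base folder is assumed to be the default $AIRFLOW_HOME/logs):

import os
from pathlib import Path

# base_log_folder from airflow.cfg; by default it lives under AIRFLOW_HOME
airflow_home = Path(os.environ.get("AIRFLOW_HOME", str(Path.home() / "airflow")))
base_log_folder = airflow_home / "logs"

# Placeholder values for a hypothetical task run
dag_id = "log_output_demo"
task_id = "my_task"
execution_date = "2021-01-01T00:00:00+00:00"
try_number = 1

# Mirrors the {dag_id}/{task_id}/{execution_date}/{try_number}.log pattern above
log_path = base_log_folder / dag_id / task_id / execution_date / f"{try_number}.log"
print(log_path.read_text() if log_path.exists() else f"No log at {log_path}")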


In Airflow 2, to view the logs:

  1. Open a DAG.
  2. Click on the square shown against a task (the square represents a single task run; it is red if the run failed, green if it succeeded). This opens details to the right.
  3. Click on the "Log" button in the details tab on the right.

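If you want a quick task that writes something to that Log tab, here is a minimal Airflow 2 sketch using the TaskFlow API and the standard logging module (the DAG and task names are made up):

import logging
from datetime import datetime

from airflow.decorators import dag, task

log = logging.getLogger(__name__)

@dag(schedule_interval=None, start_date=datetime(2022, 1, 1), catchup=False)
def log_tab_demo():
    @task
    def say_hello():
        # Both logging calls and print() end up in the task's log,
        # which is what the "Log" tab displays.
        log.info("This line shows up under the task's Log tab")
        print("So does this one")

    say_hello()

log_tab_demo()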

Tags:

Airflow