■Initial setup
Running the commands from the previous steps makes Airflow create the $AIRFLOW_HOME folder and place a default "airflow.cfg" file in it. You can inspect the file at $AIRFLOW_HOME/airflow.cfg, or through the UI via the Admin -> Configuration menu. The webserver's PID file is stored at $AIRFLOW_HOME/airflow-webserver.pid, or in /run/airflow/webserver.pid if it was started by systemd. (See the Quick Start.)
→ Because airflow was run as root this time, the $AIRFLOW_HOME directory ended up being created under ~root.
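To avoid that, AIRFLOW_HOME can be set explicitly (and airflow run as a dedicated non-root user) before the first command is issued. A minimal sketch; the /opt/airflow path and the non-root user are assumptions, not what was actually done here:

  # as a dedicated non-root user, pick the home directory explicitly
  export AIRFLOW_HOME=/opt/airflow
  airflow initdb    # Airflow 1.10.x: creates airflow.cfg and the SQLite metadata DB under $AIRFLOW_HOME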
Below are a few commands that will trigger a few task instances. As you run them, you should be able to see the status of the jobs change in the example_bash_operator DAG. (See the Quick Start.)
・ run your first task instance
[centos7copy]$ /root/.local/lib/python3.6/site-packages/airflow/bin/airflow run example_bash_operator runme_0 2015-01-01
[2020-11-04 11:21:20,589] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 11:21:20,589] {dagbag.py:417} INFO - Filling up the DagBag from /root/airflow/dags
/root/.local/lib/python3.6/site-packages/airflow/models/dag.py:1342: PendingDeprecationWarning: The requested task could not be added to the DAG because a task with task_id create_tag_template_field_result is already in the DAG. Starting in Airflow 2.0, trying to overwrite a task will raise an exception.
category=PendingDeprecationWarning)
Running %s on host %s <TaskInstance: example_bash_operator.runme_0 2015-01-01T00:00:00+00:00 [None]> centos7copy
Traceback (most recent call last):
File "/root/.local/lib/python3.6/site-packages/airflow/bin/airflow", line 37, in <module>
args.func(args)
File "/root/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 76, in wrapper
return f(*args, **kwargs)
File "/root/.local/lib/python3.6/site-packages/airflow/bin/cli.py", line 579, in run
_run(args, dag, ti)
File "/root/.local/lib/python3.6/site-packages/airflow/bin/cli.py", line 511, in _run
executor.heartbeat()
File "/root/.local/lib/python3.6/site-packages/airflow/executors/base_executor.py", line 134, in heartbeat
self.sync()
File "/root/.local/lib/python3.6/site-packages/airflow/executors/sequential_executor.py", line 57, in sync
subprocess.check_call(command, close_fds=True)
File "/usr/lib64/python3.6/subprocess.py", line 306, in check_call
retcode = call(*popenargs, **kwargs)
File "/usr/lib64/python3.6/subprocess.py", line 287, in call
with Popen(*popenargs, **kwargs) as p:
File "/usr/lib64/python3.6/subprocess.py", line 729, in __init__
restore_signals, start_new_session)
File "/usr/lib64/python3.6/subprocess.py", line 1364, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'airflow': 'airflow'
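The FileNotFoundError is not a problem with the task itself: SequentialExecutor re-invokes the airflow executable in a subprocess (the queued command starts with 'airflow', as the task logs further below also show), and that executable is not on PATH here. Putting the directory that contains the airflow script on PATH, as done next, resolves it; with a typical pip install --user layout the console script would normally live under ~/.local/bin instead (an assumed path, not verified on this machine):

  export PATH="$HOME/.local/bin:$PATH"
  which airflow    # should now print the resolved path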
[centos7copy]$ export PATH="/root/.local/lib/python3.6/site-packages/airflow/bin:$PATH"
[centos7copy]$ /root/.local/lib/python3.6/site-packages/airflow/bin/airflow run example_bash_operator runme_0 2015-01-01
[2020-11-04 11:25:24,386] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 11:25:24,387] {dagbag.py:417} INFO - Filling up the DagBag from /root/airflow/dags
/root/.local/lib/python3.6/site-packages/airflow/models/dag.py:1342: PendingDeprecationWarning: The requested task could not be added to the DAG because a task with task_id create_tag_template_field_result is already in the DAG. Starting in Airflow 2.0, trying to overwrite a task will raise an exception.
category=PendingDeprecationWarning)
Running %s on host %s <TaskInstance: example_bash_operator.runme_0 2015-01-01T00:00:00+00:00 [None]> centos7copy
[2020-11-04 11:25:31,403] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 11:25:31,404] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.runme_0 2015-01-01T00:00:00+00:00 [None]> centos7copy
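With PATH fixed, the run completes. The example DAG and its task ids can also be listed from the 1.10 CLI before triggering anything; a small sketch:

  airflow list_dags                                 # example_bash_operator appears among the example DAGs
  airflow list_tasks example_bash_operator --tree   # runme_0, runme_1, ... and their dependencies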
[centos7copy]$ ll /root/airflow/
total 148
-rw-r--r--. 1 root root 38611 Nov 4 07:22 airflow.cfg
-rw-r--r--. 1 root root 92160 Nov 4 11:32 airflow.db
-rw-r--r--. 1 root root 7 Nov 4 08:12 airflow-webserver.pid
drwxr-xr-x. 6 root root 4096 Nov 4 11:20 logs
-rw-r--r--. 1 root root 2533 Nov 4 07:22 unittests.cfg
[centos7copy]$ ll /root/airflow/logs
total 12
drwxr-xr-x. 2 root root 4096 Nov 4 08:24 dag_processor_manager
drwxrwxrwx. 3 root root 4096 Nov 4 11:20 example_bash_operator
drwxr-xr-x. 4 root root 4096 Nov 4 09:00 scheduler
[centos7copy]$ ll /root/airflow/logs/dag_processor_manager
total 1960
-rw-r--r--. 1 root root 2001257 Nov 4 11:33 dag_processor_manager.log
[centos7copy]$ wc -l /root/airflow/logs/dag_processor_manager/dag_processor_manager.log
12976 /root/airflow/logs/dag_processor_manager/dag_processor_manager.log
[centos7copy]$ tail /root/airflow/logs/dag_processor_manager/dag_processor_manager.log
/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_external_task_marker_dag.py 0 0 6.47s 2020-11-04T02:35:19
/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_trigger_target_dag.py 0 0 6.46s 2020-11-04T02:37:29
/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_skip_dag.py 0 0 6.46s 2020-11-04T02:37:22
/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_pig_operator.py 0 0 6.47s 2020-11-04T02:36:05
/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_python_operator.py 0 0 6.47s 2020-11-04T02:36:56
/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_branch_python_dop_operator_3.py 0 0 6.46s 2020-11-04T02:37:03
/root/.local/lib/python3.6/site-packages/airflow/example_dags/subdags/subdag.py 0 0 6.46s 2020-11-04T02:36:11
================================================================================
[2020-11-04 11:37:35,811] {dag_processing.py:1312} INFO - Finding 'running' jobs without a recent heartbeat
[2020-11-04 11:37:35,811] {dag_processing.py:1316} INFO - Failing jobs without heartbeat after 2020-11-04 02:32:35.811714+00:00
[centos7copy]$ ll /root/airflow/logs/example_bash_operator
total 4
drwxrwxrwx. 3 root root 4096 Nov 4 11:20 runme_0
[centos7copy]$ ll /root/airflow/logs/example_bash_operator/runme_0/
total 4
drwxrwxrwx. 2 root root 4096 Nov 4 11:20 2015-01-01T00:00:00+00:00
[centos7copy]$ ll /root/airflow/logs/example_bash_operator/runme_0/2015-01-01T00\:00\:00+00\:00/
total 8
-rw-rw-rw-. 1 root root 5039 Nov 4 11:26 1.log
[centos7copy]$ wc -l /root/airflow/logs/example_bash_operator/runme_0/2015-01-01T00\:00\:00+00\:00/1.log
33 /root/airflow/logs/example_bash_operator/runme_0/2015-01-01T00:00:00+00:00/1.log
[centos7copy]$ cat /root/airflow/logs/example_bash_operator/runme_0/2015-01-01T00\:00\:00+00\:00/1.log
[2020-11-04 11:20:26,219] {logging_mixin.py:112} INFO - Sending to executor.
[2020-11-04 11:20:26,220] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py']
[2020-11-04 11:20:26,221] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py']
[2020-11-04 11:21:11,199] {logging_mixin.py:112} INFO - Sending to executor.
[2020-11-04 11:21:11,200] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py']
[2020-11-04 11:21:11,201] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py']
[2020-11-04 11:21:26,990] {logging_mixin.py:112} INFO - Sending to executor.
[2020-11-04 11:21:26,990] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py']
[2020-11-04 11:21:26,991] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py']
[2020-11-04 11:25:29,180] {logging_mixin.py:112} INFO - Sending to executor.
[2020-11-04 11:25:29,181] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py']
[2020-11-04 11:25:29,182] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py']
[2020-11-04 11:25:47,482] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: example_bash_operator.runme_0 2015-01-01T00:00:00+00:00 [None]>
[2020-11-04 11:25:47,489] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: example_bash_operator.runme_0 2015-01-01T00:00:00+00:00 [None]>
[2020-11-04 11:25:47,489] {taskinstance.py:880} INFO -
--------------------------------------------------------------------------------
[2020-11-04 11:25:47,489] {taskinstance.py:881} INFO - Starting attempt 1 of 1
[2020-11-04 11:25:47,489] {taskinstance.py:882} INFO -
--------------------------------------------------------------------------------
[2020-11-04 11:25:47,497] {taskinstance.py:901} INFO - Executing <Task(BashOperator): runme_0> on 2015-01-01T00:00:00+00:00
[2020-11-04 11:25:47,499] {standard_task_runner.py:54} INFO - Started process 119493 to run task
[2020-11-04 11:25:47,518] {standard_task_runner.py:77} INFO - Running: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-01T00:00:00+00:00', '--job_id', '3', '--pool', 'default_pool', '--raw', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpotct5i3g']
[2020-11-04 11:25:47,519] {standard_task_runner.py:78} INFO - Job 3: Subtask runme_0
[2020-11-04 11:25:53,583] {logging_mixin.py:112} INFO - Running %s on host %s <TaskInstance: example_bash_operator.runme_0 2015-01-01T00:00:00+00:00 [running]> centos7copy
[2020-11-04 11:25:59,684] {bash_operator.py:113} INFO - Tmp dir root location:
/tmp
[2020-11-04 11:25:59,685] {bash_operator.py:136} INFO - Temporary script location: /tmp/airflowtmpkpo23qoz/runme_0_3w_kkxh
[2020-11-04 11:25:59,685] {bash_operator.py:146} INFO - Running command: echo "example_bash_operator__runme_0__20150101" && sleep 1
[2020-11-04 11:25:59,690] {bash_operator.py:153} INFO - Output:
[2020-11-04 11:25:59,692] {bash_operator.py:157} INFO - example_bash_operator__runme_0__20150101
[2020-11-04 11:26:00,695] {bash_operator.py:161} INFO - Command exited with return code 0
[2020-11-04 11:26:00,704] {taskinstance.py:1070} INFO - Marking task as SUCCESS.dag_id=example_bash_operator, task_id=runme_0, execution_date=20150101T000000, start_date=20201104T022547, end_date=20201104T022600
[2020-11-04 11:26:05,759] {local_task_job.py:102} INFO - Task exited with return code 0
[centos7copy]$ ls -lR /root/airflow/logs/scheduler/
/root/airflow/logs/scheduler/:
total 8
drwxr-xr-x. 2 root root 4096 Nov 4 07:22 2020-11-03
drwxr-xr-x. 2 root root 4096 Nov 4 09:00 2020-11-04
lrwxrwxrwx. 1 root root 39 Nov 4 09:00 latest -> /root/airflow/logs/scheduler/2020-11-04
/root/airflow/logs/scheduler/2020-11-03:
total 0
/root/airflow/logs/scheduler/2020-11-04:
total 0
[centos7copy]$ file /root/airflow/airflow.db
/root/airflow/airflow.db: SQLite 3.x database
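Since airflow.db is a plain SQLite file, the metadata recorded by the runs above can also be queried directly; a minimal sketch, assuming the sqlite3 client is installed:

  sqlite3 /root/airflow/airflow.db '.tables'
  sqlite3 /root/airflow/airflow.db \
    "select task_id, execution_date, state from task_instance where dag_id='example_bash_operator';"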
[centos7copy]$ wc -l /root/airflow/airflow.cfg
1073 /root/airflow/airflow.cfg
・Check the logging-related settings (the per-task log path seen above is built from log_filename_template below)
[centos7copy]$ cat -n /root/airflow/airflow.cfg
1 [core]
----(略)----
6 # The folder where airflow should store its log files
7 # This path must be absolute
8 base_log_folder = /root/airflow/logs
9
10 # Airflow can store logs remotely in AWS S3, Google Cloud Storage or Elastic Search.
11 # Set this to True if you want to enable remote logging.
12 remote_logging = False
13
14 # Users must supply an Airflow connection id that provides access to the storage
15 # location.
16 remote_log_conn_id =
17 remote_base_log_folder =
18 encrypt_s3_logs = False
19
20 # Logging level
21 logging_level = INFO
22
23 # Logging level for Flask-appbuilder UI
24 fab_logging_level = WARN
25
26 # Logging class
27 # Specify the class that will specify the logging configuration
28 # This class has to be on the python classpath
29 # Example: logging_config_class = my.path.default_local_settings.LOGGING_CONFIG
30 logging_config_class =
31
32 # Flag to enable/disable Colored logs in Console
33 # Colour the logs when the controlling terminal is a TTY.
34 colored_console_log = True
35
36 # Log format for when Colored logs is enabled
37 colored_log_format = [%%(blue)s%%(asctime)s%%(reset)s] {%%(blue)s%%(filename)s:%%(reset)s%%(lineno)d} %%(log_color)s%%(levelname)s%%(reset)s - %%(log_color)s%%(message)s%%(reset)s
38 colored_formatter_class = airflow.utils.log.colored_log.CustomTTYColoredFormatter
39
40 # Format of Log line
41 log_format = [%%(asctime)s] {%%(filename)s:%%(lineno)d} %%(levelname)s - %%(message)s
42 simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s
43
44 # Log filename format
45 log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log
46 log_processor_filename_template = {{ filename }}.log
47 dag_processor_manager_log_location = /root/airflow/logs/dag_processor_manager/dag_processor_manager.log
48
49 # Name of handler to read task instance logs.
50 # Default to use task handler.
51 task_log_reader = task
----(略)----
344 # Log files for the gunicorn webserver. '-' means log to stderr.
345 access_logfile = -
346
347 # Log files for the gunicorn webserver. '-' means log to stderr.
348 error_logfile = -
----(略)----
385 # The amount of time (in secs) webserver will wait for initial handshake
386 # while fetching logs from other worker machine
387 log_fetch_timeout_sec = 5
388
389 # Time interval (in secs) to wait before next log fetching.
390 log_fetch_delay_sec = 2
391
392 # Distance away from page bottom to enable auto tailing.
393 log_auto_tailing_offset = 30
394
395 # Animation speed for auto tailing log display.
396 log_animation_speed = 1000
----(略)----
439 # Default setting for wrap toggle on DAG code and TI log views.
440 default_wrap = False
----(略)----
456 # Minutes of non-activity before logged out from UI
457 # 0 means never get forcibly logged out
458 force_log_out_after = 0
----(略)----
508 # When you start an airflow worker, airflow starts a tiny web server
509 # subprocess to serve the workers local log files to the airflow main
510 # web server, who then builds pages and sends them to users. This defines
511 # the port on which the logs are served. It needs to be unused, and open
512 # visible from the main web server to connect into the workers.
513 worker_log_server_port = 8793
----(略)----
627 # How often should stats be printed to the logs. Setting to 0 will disable printing stats
628 print_stats_interval = 30
629
630 # If the last scheduler heartbeat happened more than scheduler_health_check_threshold
631 # ago (in seconds), scheduler is considered unhealthy.
632 # This is used by the health check in the "/health" endpoint
633 scheduler_health_check_threshold = 30
634 child_process_log_directory = /root/airflow/logs/scheduler
----(略)----
767 # Format of the log_id, which is used to query for a given tasks logs
768 log_id_template = {dag_id}-{task_id}-{execution_date}-{try_number}
769
770 # Used to mark the end of a log stream for a task
771 end_of_log_mark = end_of_log
772
773 # Qualified URL for an elasticsearch frontend (like Kibana) with a template argument for log_id
774 # Code will construct log_id using the log_id template from the argument above.
775 # NOTE: The code will prefix the https:// automatically, don't include that here.
776 frontend =
777
778 # Write the task logs to the stdout of the worker, rather than the default files
779 write_stdout = False
780
781 # Instead of the default log formatter, write the log lines as JSON
782 json_format = False
----(略)----
860 # For volume mounted logs, the worker will look in this subpath for logs
861 logs_volume_subpath =
862
863 # A shared volume claim for the logs
864 logs_volume_claim =
----(略)----
870 # A hostPath volume for the logs
871 # Useful in local environment, discouraged in production
872 logs_volume_host =
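Any of these values can also be overridden without editing airflow.cfg, using environment variables of the form AIRFLOW__{SECTION}__{KEY}, which take precedence over the file. A minimal sketch; the values are illustrative assumptions, not what is configured here:

  export AIRFLOW__CORE__LOGGING_LEVEL=DEBUG
  export AIRFLOW__CORE__BASE_LOG_FOLDER=/var/log/airflow
  airflow scheduler    # the overrides apply to whatever process is started with this environment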
・run a backfill over 2 days
[centos7copy]$ /root/.local/lib/python3.6/site-packages/airflow/bin/airflow backfill example_bash_operator -s 2015-01-01 -e 2015-01-02
[2020-11-04 12:11:16,997] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:11:16,998] {dagbag.py:417} INFO - Filling up the DagBag from /root/airflow/dags
/root/.local/lib/python3.6/site-packages/airflow/models/dag.py:1342: PendingDeprecationWarning: The requested task could not be added to the DAG because a task with task_id create_tag_template_field_result is already in the DAG. Starting in Airflow 2.0, trying to overwrite a task will raise an exception.
category=PendingDeprecationWarning)
[2020-11-04 12:11:23,337] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpn52f1n_f']
[2020-11-04 12:11:23,351] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'runme_1', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmp5k2oixew']
[2020-11-04 12:11:23,366] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'runme_1', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpc4xyk0wj']
[2020-11-04 12:11:23,381] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'runme_2', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpvepb8m5d']
[2020-11-04 12:11:23,396] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'runme_2', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpzopwz3b6']
[2020-11-04 12:11:23,412] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'also_run_this', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpep8bmzvt']
[2020-11-04 12:11:23,427] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'also_run_this', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpa9ioy2s1']
[2020-11-04 12:11:28,218] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'runme_0', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpn52f1n_f']
[2020-11-04 12:11:29,256] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:11:29,256] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.runme_0 2015-01-02T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:12:03,547] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'runme_1', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmp5k2oixew']
[2020-11-04 12:12:04,467] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:12:04,468] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.runme_1 2015-01-01T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:12:40,450] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'runme_1', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpc4xyk0wj']
[2020-11-04 12:12:41,366] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:12:41,367] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.runme_1 2015-01-02T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:13:17,262] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'runme_2', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpvepb8m5d']
[2020-11-04 12:13:18,211] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:13:18,211] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.runme_2 2015-01-01T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:13:53,825] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'runme_2', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpzopwz3b6']
[2020-11-04 12:13:54,740] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:13:54,741] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.runme_2 2015-01-02T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:14:30,501] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'also_run_this', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpep8bmzvt']
[2020-11-04 12:14:31,464] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:14:31,464] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.also_run_this 2015-01-01T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:15:07,109] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'also_run_this', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpa9ioy2s1']
[2020-11-04 12:15:08,014] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:15:08,015] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.also_run_this 2015-01-02T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:15:43,910] {backfill_job.py:364} INFO - [backfill progress] | finished run 0 of 2 | tasks waiting: 4 | succeeded: 8 | running: 0 | failed: 0 | skipped: 0 | deadlocked: 0 | not ready: 4
[2020-11-04 12:15:43,924] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'run_after_loop', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmp9_8g1fjk']
[2020-11-04 12:15:43,943] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'run_after_loop', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpd_m5815p']
[2020-11-04 12:15:43,980] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'run_after_loop', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmp9_8g1fjk']
[2020-11-04 12:15:44,897] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:15:44,898] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.run_after_loop 2015-01-01T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:16:20,398] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'run_after_loop', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpd_m5815p']
[2020-11-04 12:16:21,302] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:16:21,303] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.run_after_loop 2015-01-02T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:16:57,384] {backfill_job.py:364} INFO - [backfill progress] | finished run 0 of 2 | tasks waiting: 2 | succeeded: 10 | running: 0 | failed: 0 | skipped: 0 | deadlocked: 0 | not ready: 2
[2020-11-04 12:16:57,397] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'run_this_last', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpmu_bn0q5']
[2020-11-04 12:16:57,415] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'example_bash_operator', 'run_this_last', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmplauu48v2']
[2020-11-04 12:16:57,427] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'run_this_last', '2015-01-01T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmpmu_bn0q5']
[2020-11-04 12:16:58,329] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:16:58,329] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.run_this_last 2015-01-01T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:17:33,991] {sequential_executor.py:54} INFO - Executing command: ['airflow', 'run', 'example_bash_operator', 'run_this_last', '2015-01-02T00:00:00+00:00', '--local', '--pool', 'default_pool', '-sd', '/root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py', '--cfg_path', '/tmp/tmplauu48v2']
[2020-11-04 12:17:34,952] {__init__.py:50} INFO - Using executor SequentialExecutor
[2020-11-04 12:17:34,953] {dagbag.py:417} INFO - Filling up the DagBag from /root/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py
Running %s on host %s <TaskInstance: example_bash_operator.run_this_last 2015-01-02T00:00:00+00:00 [queued]> centos7copy
[2020-11-04 12:18:10,572] {dagrun.py:320} INFO - Marking run <DagRun example_bash_operator @ 2015-01-01T00:00:00+00:00: backfill_2015-01-01T00:00:00+00:00, externally triggered: False> successful
[2020-11-04 12:18:10,582] {dagrun.py:320} INFO - Marking run <DagRun example_bash_operator @ 2015-01-02 00:00:00+00:00: backfill_2015-01-02T00:00:00+00:00, externally triggered: False> successful
[2020-11-04 12:18:10,588] {backfill_job.py:364} INFO - [backfill progress] | finished run 2 of 2 | tasks waiting: 0 | succeeded: 12 | running: 0 | failed: 0 | skipped: 0 | deadlocked: 0 | not ready: 0
[2020-11-04 12:18:10,589] {backfill_job.py:813} INFO - Backfill done. Exiting.
[centos7copy]$ ll /root/airflow/
total 176
-rw-r--r--. 1 root root 38611 Nov 4 07:22 airflow.cfg
-rw-r--r--. 1 root root 122880 Nov 4 12:19 airflow.db
-rw-r--r--. 1 root root 7 Nov 4 08:12 airflow-webserver.pid
drwxr-xr-x. 6 root root 4096 Nov 4 11:20 logs
-rw-r--r--. 1 root root 2533 Nov 4 07:22 unittests.cfg
[centos7copy]$ ll /root/airflow/logs/
total 12
drwxr-xr-x. 2 root root 4096 Nov 4 08:24 dag_processor_manager
drwxrwxrwx. 8 root root 4096 Nov 4 12:16 example_bash_operator
drwxr-xr-x. 4 root root 4096 Nov 4 09:00 scheduler
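Whether the backfilled runs actually succeeded can also be checked from the CLI; a small sketch using the 1.10 commands and the dates from the backfill above:

  airflow dag_state example_bash_operator 2015-01-01             # expected: success
  airflow task_state example_bash_operator runme_0 2015-01-02    # expected: success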
■Accessing the web server
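The webserver serves the UI on port 8080 by default (it was already started earlier here, judging from airflow-webserver.pid above); if it is not running, a minimal sketch:

  airflow webserver -p 8080    # UI at http://<host>:8080
  airflow scheduler            # needed for scheduled runs to be picked up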
・Click the "example_bash_operator" DAG