ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-q4K
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Aug 14 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_mount.yml ******************************************************
1 plays in /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml

PLAY [Basic mount snapshot test] ***********************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:2
Friday 10 October 2025  11:12:53 -0400 (0:00:00.020)       0:00:00.020 ********
[WARNING]: Platform linux on host managed-node3 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node3]

TASK [Setup] *******************************************************************
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:46
Friday 10 October 2025  11:12:54 -0400 (0:00:01.095)       0:00:01.116 ********
included: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml for managed-node3

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml:10
Friday 10 October 2025  11:12:54 -0400 (0:00:00.032)       0:00:01.148 ********
ok: [managed-node3] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [Set mount parent] ********************************************************
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml:15
Friday 10 October 2025  11:12:55 -0400 (0:00:00.520)       0:00:01.668 ********
ok: [managed-node3] => {
    "ansible_facts": {
        "test_mnt_parent": "/mnt"
    },
    "changed": false
}

TASK [Run the storage role install base packages] ******************************
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml:19
Friday 10 October 2025  11:12:55 -0400 (0:00:00.061)       0:00:01.730 ********
included: fedora.linux_system_roles.storage for managed-node3

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 10 October 2025  11:12:55 -0400 (0:00:00.044)       0:00:01.775 ********
included: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 10 October 2025  11:12:55 -0400 (0:00:00.035)       0:00:01.810 ********
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 10 October 2025  11:12:55 -0400 (0:00:00.059)       0:00:01.870 ********
skipping: [managed-node3] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node3] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node3] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs",
            "stratisd",
            "stratis-cli",
            "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}
ok: [managed-node3] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs",
            "stratisd",
            "stratis-cli",
            "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 10 October 2025  11:12:55 -0400 (0:00:00.068)       0:00:01.939 ********
ok: [managed-node3] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 10 October 2025  11:12:55 -0400 (0:00:00.397)       0:00:02.336 ********
ok: [managed-node3] => {
    "ansible_facts": {
        "__storage_is_ostree": false
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 10 October 2025  11:12:55 -0400 (0:00:00.072)       0:00:02.409 ********
ok: [managed-node3] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 10 October 2025  11:12:55 -0400 (0:00:00.045)       0:00:02.454 ********
ok: [managed-node3] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 10 October 2025  11:12:55 -0400 (0:00:00.046)       0:00:02.502 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 10 October 2025  11:12:56 -0400 (0:00:00.129)       0:00:02.632 ********
fatal: [managed-node3]: FAILED! => {
    "changed": false,
    "rc": 1,
    "results": []
}

MSG:

Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried

TASK [Cleanup] *****************************************************************
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:204
Friday 10 October 2025  11:13:02 -0400 (0:00:06.460)       0:00:09.093 ********
included: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/cleanup.yml for managed-node3

TASK [Remove storage volumes] **************************************************
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/cleanup.yml:7
Friday 10 October 2025  11:13:02 -0400 (0:00:00.049)       0:00:09.142 ********
included: fedora.linux_system_roles.storage for managed-node3

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 10 October 2025  11:13:02 -0400 (0:00:00.062)       0:00:09.205 ********
included: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 10 October 2025  11:13:02 -0400 (0:00:00.065)       0:00:09.270 ********
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 10 October 2025  11:13:02 -0400 (0:00:00.104)       0:00:09.375 ********
skipping: [managed-node3] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node3] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node3] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs",
            "stratisd",
            "stratis-cli",
            "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}
ok: [managed-node3] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "blivet_package_list": [
            "python3-blivet",
            "libblockdev-crypto",
            "libblockdev-dm",
            "libblockdev-lvm",
            "libblockdev-mdraid",
            "libblockdev-swap",
            "vdo",
            "kmod-kvdo",
            "xfsprogs",
            "stratisd",
            "stratis-cli",
            "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
        ]
    },
    "ansible_included_var_files": [
        "/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 10 October 2025  11:13:02 -0400 (0:00:00.120)       0:00:09.495 ********
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __storage_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 10 October 2025  11:13:03 -0400 (0:00:00.061)       0:00:09.557 ********
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "not __storage_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 10 October 2025  11:13:03 -0400 (0:00:00.057)       0:00:09.614 ********
ok: [managed-node3] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 10 October 2025  11:13:03 -0400 (0:00:00.029)       0:00:09.643 ********
ok: [managed-node3] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 10 October 2025  11:13:03 -0400 (0:00:00.024)       0:00:09.668 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 10 October 2025  11:13:03 -0400 (0:00:00.183)       0:00:09.852 ********
fatal: [managed-node3]: FAILED! => {
    "changed": false,
    "rc": 1,
    "results": []
}

MSG:

Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried

PLAY RECAP *********************************************************************
managed-node3              : ok=19   changed=0    unreachable=0    failed=2    skipped=4    rescued=0    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
    {
        "ansible_version": "2.17.14",
        "end_time": "2025-10-10T15:13:02.524388+00:00Z",
        "host": "managed-node3",
        "message": "Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried",
        "rc": 1,
        "start_time": "2025-10-10T15:12:56.079349+00:00Z",
        "task_name": "Make sure blivet is available",
        "task_path": "/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2"
    },
    {
        "ansible_version": "2.17.14",
        "end_time": "2025-10-10T15:13:07.484723+00:00Z",
        "host": "managed-node3",
        "message": "Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried",
        "rc": 1,
        "start_time": "2025-10-10T15:13:03.301704+00:00Z",
        "task_name": "Make sure blivet is available",
        "task_path": "/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2"
    }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Friday 10 October 2025  11:13:07 -0400 (0:00:04.189)       0:00:14.041 ********
===============================================================================
fedora.linux_system_roles.storage : Make sure blivet is available ------- 6.46s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
fedora.linux_system_roles.storage : Make sure blivet is available ------- 4.19s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:2
Check if system is ostree ----------------------------------------------- 0.52s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml:10
fedora.linux_system_roles.storage : Check if system is ostree ----------- 0.40s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
fedora.linux_system_roles.storage : Include the appropriate provider tasks --- 0.18s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
fedora.linux_system_roles.storage : Include the appropriate provider tasks --- 0.13s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
fedora.linux_system_roles.storage : Set platform/version specific variables --- 0.12s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
fedora.linux_system_roles.storage : Ensure ansible_facts used by role --- 0.10s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
fedora.linux_system_roles.storage : Set flag to indicate system is ostree --- 0.07s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
fedora.linux_system_roles.storage : Set platform/version specific variables --- 0.07s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
fedora.linux_system_roles.storage : Set platform/version specific variables --- 0.07s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Remove storage volumes -------------------------------------------------- 0.06s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/cleanup.yml:7
fedora.linux_system_roles.storage : Check if system is ostree ----------- 0.06s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Set mount parent -------------------------------------------------------- 0.06s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tasks/setup.yml:15
fedora.linux_system_roles.storage : Ensure ansible_facts used by role --- 0.06s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
fedora.linux_system_roles.storage : Set flag to indicate system is ostree --- 0.06s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Cleanup ----------------------------------------------------------------- 0.05s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/tests/snapshot/tests_mount.yml:204
fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing --- 0.05s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing --- 0.05s
/tmp/collections-q4K/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5

Oct 10 11:12:52 managed-node3 sshd-session[9721]: Accepted publickey for root from 10.31.40.254 port 42876 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 10 11:12:52 managed-node3 systemd-logind[610]: New session 15 of user root.
░░ Subject: A new session 15 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 15 has been created for the user root.
░░
░░ The leading process of the session is 9721.
Oct 10 11:12:52 managed-node3 systemd[1]: Started Session 15 of User root.
░░ Subject: A start job for unit session-15.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-15.scope has finished successfully.
░░
░░ The job identifier is 1591.
Oct 10 11:12:52 managed-node3 sshd-session[9721]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 10 11:12:52 managed-node3 sshd-session[9724]: Received disconnect from 10.31.40.254 port 42876:11: disconnected by user
Oct 10 11:12:52 managed-node3 sshd-session[9724]: Disconnected from user root 10.31.40.254 port 42876
Oct 10 11:12:52 managed-node3 sshd-session[9721]: pam_unix(sshd:session): session closed for user root
Oct 10 11:12:52 managed-node3 systemd[1]: session-15.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-15.scope has successfully entered the 'dead' state.
Oct 10 11:12:52 managed-node3 systemd-logind[610]: Session 15 logged out. Waiting for processes to exit.
Oct 10 11:12:52 managed-node3 systemd-logind[610]: Removed session 15.
░░ Subject: Session 15 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 15 has been terminated.
Oct 10 11:12:52 managed-node3 sshd-session[9749]: Accepted publickey for root from 10.31.40.254 port 42888 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 10 11:12:52 managed-node3 systemd-logind[610]: New session 16 of user root.
░░ Subject: A new session 16 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 16 has been created for the user root.
░░
░░ The leading process of the session is 9749.
Oct 10 11:12:52 managed-node3 systemd[1]: Started Session 16 of User root.
░░ Subject: A start job for unit session-16.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-16.scope has finished successfully.
░░
░░ The job identifier is 1660.
Oct 10 11:12:52 managed-node3 sshd-session[9749]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 10 11:12:52 managed-node3 sshd-session[9752]: Received disconnect from 10.31.40.254 port 42888:11: disconnected by user
Oct 10 11:12:52 managed-node3 sshd-session[9752]: Disconnected from user root 10.31.40.254 port 42888
Oct 10 11:12:52 managed-node3 sshd-session[9749]: pam_unix(sshd:session): session closed for user root
Oct 10 11:12:52 managed-node3 systemd-logind[610]: Session 16 logged out. Waiting for processes to exit.
Oct 10 11:12:52 managed-node3 systemd[1]: session-16.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-16.scope has successfully entered the 'dead' state.
Oct 10 11:12:52 managed-node3 systemd-logind[610]: Removed session 16.
░░ Subject: Session 16 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 16 has been terminated.
Oct 10 11:12:54 managed-node3 python3.9[9950]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 11:12:55 managed-node3 python3.9[10125]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 11:12:55 managed-node3 python3.9[10274]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 11:12:56 managed-node3 python3.9[10423]: ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 11:13:03 managed-node3 python3.9[10605]: ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 11:13:07 managed-node3 sshd-session[10663]: Accepted publickey for root from 10.31.40.254 port 35130 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 10 11:13:07 managed-node3 systemd-logind[610]: New session 17 of user root.
░░ Subject: A new session 17 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 17 has been created for the user root.
░░
░░ The leading process of the session is 10663.
Oct 10 11:13:07 managed-node3 systemd[1]: Started Session 17 of User root.
░░ Subject: A start job for unit session-17.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-17.scope has finished successfully.
░░
░░ The job identifier is 1729.
Oct 10 11:13:07 managed-node3 sshd-session[10663]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 10 11:13:07 managed-node3 sshd-session[10666]: Received disconnect from 10.31.40.254 port 35130:11: disconnected by user
Oct 10 11:13:07 managed-node3 sshd-session[10666]: Disconnected from user root 10.31.40.254 port 35130
Oct 10 11:13:07 managed-node3 sshd-session[10663]: pam_unix(sshd:session): session closed for user root
Oct 10 11:13:07 managed-node3 systemd[1]: session-17.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-17.scope has successfully entered the 'dead' state.
Oct 10 11:13:07 managed-node3 systemd-logind[610]: Session 17 logged out. Waiting for processes to exit.
Oct 10 11:13:07 managed-node3 systemd-logind[610]: Removed session 17.
░░ Subject: Session 17 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 17 has been terminated.
Oct 10 11:13:07 managed-node3 sshd-session[10691]: Accepted publickey for root from 10.31.40.254 port 35138 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 10 11:13:07 managed-node3 systemd-logind[610]: New session 18 of user root.
░░ Subject: A new session 18 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 18 has been created for the user root.
░░
░░ The leading process of the session is 10691.
Oct 10 11:13:07 managed-node3 systemd[1]: Started Session 18 of User root.
░░ Subject: A start job for unit session-18.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-18.scope has finished successfully.
░░
░░ The job identifier is 1798.
Oct 10 11:13:07 managed-node3 sshd-session[10691]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)