ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-OPq
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Aug 14 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ssh.yml ********************************************************
1 plays in /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml

PLAY [Ensure that the rule runs with ssh] **************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:2
Saturday 11 October 2025  03:16:33 -0400 (0:00:00.018)       0:00:00.018 ******
[WARNING]: Platform linux on host managed-node2 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-
core/2.17/reference_appendices/interpreter_discovery.html for more information.
ok: [managed-node2]

TASK [Gather facts from managed-node2] *****************************************
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:56
Saturday 11 October 2025  03:16:34 -0400 (0:00:01.048)       0:00:01.066 ******
ok: [managed-node2]

TASK [Print message that this test is skipped on EL 6] *************************
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:61
Saturday 11 October 2025  03:16:35 -0400 (0:00:00.787)       0:00:01.854 ******
skipping: [managed-node2] => {
    "false_condition": "ansible_distribution_major_version == '6'"
}

TASK [Skip the test on EL 6 when control node == managed node] *****************
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:71
Saturday 11 October 2025  03:16:35 -0400 (0:00:00.013)       0:00:01.867 ******
META: end_host conditional evaluated to False, continuing execution for managed-node2
skipping: [managed-node2] => {
    "skip_reason": "end_host conditional evaluated to False, continuing execution for managed-node2"
}

MSG:

end_host conditional evaluated to false, continuing execution for managed-node2

TASK [Run the role. If reboot is not required - the play succeeds.] ************
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:82
Saturday 11 October 2025  03:16:35 -0400 (0:00:00.006)       0:00:01.874 ******
included: fedora.linux_system_roles.kdump for managed-node2

TASK [fedora.linux_system_roles.kdump : Ensure ansible_facts used by role] *****
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/main.yml:2
Saturday 11 October 2025  03:16:35 -0400 (0:00:00.035)       0:00:01.909 ******
included: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.kdump : Ensure ansible_facts used by role] *****
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/set_vars.yml:2
Saturday 11 October 2025  03:16:35 -0400 (0:00:00.032)       0:00:01.941 ******
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__kdump_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.kdump : Check if system is ostree] *************
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/set_vars.yml:10
Saturday 11 October 2025  03:16:35 -0400 (0:00:00.050)       0:00:01.992 ******
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.kdump : Set flag to indicate system is ostree] ***
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/set_vars.yml:15
Saturday 11 October 2025  03:16:35 -0400 (0:00:00.493)       0:00:02.485 ******
ok: [managed-node2] => {
    "ansible_facts": {
        "__kdump_is_ostree": false
    },
    "changed": false
}

TASK [fedora.linux_system_roles.kdump : Set platform/version specific variables] ***
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/set_vars.yml:19
Saturday 11 October 2025  03:16:35 -0400 (0:00:00.022)       0:00:02.507 ******
skipping: [managed-node2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node2] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node2] => (item=CentOS_9.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS_9.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node2] => (item=CentOS_9.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS_9.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node2] => {
    "changed": false
}

MSG:

All items skipped

TASK [fedora.linux_system_roles.kdump : Install required packages] *************
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/main.yml:5
Saturday 11 October 2025  03:16:35 -0400 (0:00:00.034)       0:00:02.542 ******
fatal: [managed-node2]: FAILED! => {
    "changed": false,
    "rc": 1,
    "results": []
}

MSG:

Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried

TASK [If reboot is required - assert the expected fail message] ****************
task path: /tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:86
Saturday 11 October 2025  03:16:39 -0400 (0:00:04.195)       0:00:06.738 ******
fatal: [managed-node2]: FAILED! => {
    "assertion": "'Reboot is required to apply changes.' in ansible_failed_result.msg",
    "changed": false,
    "evaluated_to": false
}

MSG:

Assertion failed

PLAY RECAP *********************************************************************
managed-node2              : ok=6    changed=0    unreachable=0    failed=1    skipped=3    rescued=1    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
  {
    "ansible_version": "2.17.14",
    "end_time": "2025-10-11T07:16:39.993037+00:00Z",
    "host": "managed-node2",
    "message": "Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried",
    "rc": 1,
    "start_time": "2025-10-11T07:16:35.803411+00:00Z",
    "task_name": "Install required packages",
    "task_path": "/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/main.yml:5"
  },
  {
    "ansible_version": "2.17.14",
    "end_time": "2025-10-11T07:16:40.011395+00:00Z",
    "host": "managed-node2",
    "message": "Assertion failed",
    "start_time": "2025-10-11T07:16:39.998704+00:00Z",
    "task_name": "If reboot is required - assert the expected fail message",
    "task_path": "/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:86"
  }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Saturday 11 October 2025  03:16:40 -0400 (0:00:00.014)       0:00:06.752 ******
===============================================================================
fedora.linux_system_roles.kdump : Install required packages ------------- 4.20s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/main.yml:5
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:2
Gather facts from managed-node2 ----------------------------------------- 0.79s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:56
fedora.linux_system_roles.kdump : Check if system is ostree ------------- 0.49s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/set_vars.yml:10
fedora.linux_system_roles.kdump : Ensure ansible_facts used by role ----- 0.05s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/set_vars.yml:2
Run the role. If reboot is not required - the play succeeds. ------------ 0.04s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:82
fedora.linux_system_roles.kdump : Set platform/version specific variables --- 0.03s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/set_vars.yml:19
fedora.linux_system_roles.kdump : Ensure ansible_facts used by role ----- 0.03s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/main.yml:2
fedora.linux_system_roles.kdump : Set flag to indicate system is ostree --- 0.02s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/roles/kdump/tasks/set_vars.yml:15
If reboot is required - assert the expected fail message ---------------- 0.01s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:86
Print message that this test is skipped on EL 6 ------------------------- 0.01s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:61
Skip the test on EL 6 when control node == managed node ----------------- 0.01s
/tmp/collections-OPq/ansible_collections/fedora/linux_system_roles/tests/kdump/tests_ssh.yml:71

Oct 11 03:16:32 managed-node2 sshd-session[9135]: Accepted publickey for root from 10.31.11.139 port 47698 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 03:16:32 managed-node2 systemd-logind[607]: New session 10 of user root.
░░ Subject: A new session 10 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A new session with the ID 10 has been created for the user root.
░░ 
░░ The leading process of the session is 9135.
Oct 11 03:16:32 managed-node2 systemd[1]: Started Session 10 of User root.
░░ Subject: A start job for unit session-10.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit session-10.scope has finished successfully.
░░ 
░░ The job identifier is 1246.
Oct 11 03:16:32 managed-node2 sshd-session[9135]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 03:16:32 managed-node2 sshd-session[9138]: Received disconnect from 10.31.11.139 port 47698:11: disconnected by user
Oct 11 03:16:32 managed-node2 sshd-session[9138]: Disconnected from user root 10.31.11.139 port 47698
Oct 11 03:16:32 managed-node2 sshd-session[9135]: pam_unix(sshd:session): session closed for user root
Oct 11 03:16:32 managed-node2 systemd[1]: session-10.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit session-10.scope has successfully entered the 'dead' state.
Oct 11 03:16:32 managed-node2 systemd-logind[607]: Session 10 logged out. Waiting for processes to exit.
Oct 11 03:16:32 managed-node2 systemd-logind[607]: Removed session 10.
░░ Subject: Session 10 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A session with the ID 10 has been terminated.
Oct 11 03:16:32 managed-node2 sshd-session[9163]: Accepted publickey for root from 10.31.11.139 port 47700 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 03:16:32 managed-node2 systemd-logind[607]: New session 11 of user root.
░░ Subject: A new session 11 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A new session with the ID 11 has been created for the user root.
░░ 
░░ The leading process of the session is 9163.
Oct 11 03:16:32 managed-node2 systemd[1]: Started Session 11 of User root.
░░ Subject: A start job for unit session-11.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit session-11.scope has finished successfully.
░░ 
░░ The job identifier is 1315.
Oct 11 03:16:32 managed-node2 sshd-session[9163]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 03:16:32 managed-node2 sshd-session[9166]: Received disconnect from 10.31.11.139 port 47700:11: disconnected by user
Oct 11 03:16:32 managed-node2 sshd-session[9166]: Disconnected from user root 10.31.11.139 port 47700
Oct 11 03:16:32 managed-node2 sshd-session[9163]: pam_unix(sshd:session): session closed for user root
Oct 11 03:16:32 managed-node2 systemd[1]: session-11.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit session-11.scope has successfully entered the 'dead' state.
Oct 11 03:16:32 managed-node2 systemd-logind[607]: Session 11 logged out. Waiting for processes to exit.
Oct 11 03:16:32 managed-node2 systemd-logind[607]: Removed session 11.
░░ Subject: Session 11 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A session with the ID 11 has been terminated.
Oct 11 03:16:33 managed-node2 python3.9[9364]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 11 03:16:34 managed-node2 python3.9[9539]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 11 03:16:35 managed-node2 python3.9[9714]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 11 03:16:36 managed-node2 python3.9[9863]: ansible-ansible.legacy.dnf Invoked with name=['grubby', 'iproute', 'kexec-tools', 'openssh-clients'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 11 03:16:40 managed-node2 sshd-session[9920]: Accepted publickey for root from 10.31.11.139 port 47706 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 03:16:40 managed-node2 systemd-logind[607]: New session 12 of user root.
░░ Subject: A new session 12 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A new session with the ID 12 has been created for the user root.
░░ 
░░ The leading process of the session is 9920.
Oct 11 03:16:40 managed-node2 systemd[1]: Started Session 12 of User root.
░░ Subject: A start job for unit session-12.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit session-12.scope has finished successfully.
░░ 
░░ The job identifier is 1384.
Oct 11 03:16:40 managed-node2 sshd-session[9920]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 03:16:40 managed-node2 sshd-session[9923]: Received disconnect from 10.31.11.139 port 47706:11: disconnected by user
Oct 11 03:16:40 managed-node2 sshd-session[9923]: Disconnected from user root 10.31.11.139 port 47706
Oct 11 03:16:40 managed-node2 sshd-session[9920]: pam_unix(sshd:session): session closed for user root
Oct 11 03:16:40 managed-node2 systemd[1]: session-12.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ The unit session-12.scope has successfully entered the 'dead' state.
Oct 11 03:16:40 managed-node2 systemd-logind[607]: Session 12 logged out. Waiting for processes to exit.
Oct 11 03:16:40 managed-node2 systemd-logind[607]: Removed session 12.
░░ Subject: Session 12 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A session with the ID 12 has been terminated.
Oct 11 03:16:40 managed-node2 sshd-session[9948]: Accepted publickey for root from 10.31.11.139 port 47708 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 03:16:40 managed-node2 systemd-logind[607]: New session 13 of user root.
░░ Subject: A new session 13 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░ 
░░ A new session with the ID 13 has been created for the user root.
░░ 
░░ The leading process of the session is 9948.
Oct 11 03:16:40 managed-node2 systemd[1]: Started Session 13 of User root.
░░ Subject: A start job for unit session-13.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ 
░░ A start job for unit session-13.scope has finished successfully.
░░ 
░░ The job identifier is 1453.
Oct 11 03:16:40 managed-node2 sshd-session[9948]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)