[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.25 (main, Nov 10 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml

PLAY [Test] ********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:2
ok: [managed-node01]

TASK [Test | Run role analysis] ************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:9
[WARNING]: Collection community.general does not support Ansible version 2.14.18

TASK [infra.leapp.common : Log directory exists] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:3
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/ripu", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 6, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Check for existing log file] ************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:11
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : Fail if log file already exists] ********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:16
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : Create new log file] ********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:23
NOTIFIED HANDLER infra.leapp.common : Check for log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Add end time to log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Slurp ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Decode ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Rename log file for managed-node01
changed: [managed-node01] => {"changed": true, "checksum": "51d3d3132e7ee23785cb943964ac2ea1806b9c57", "dest": "/var/log/ripu/ripu.log", "gid": 0, "group": "root", "md5sum": "514f5e21b113862c18fc98c1f4520e76", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 61, "src": "/root/.ansible/tmp/ansible-tmp-1764684548.5892422-5765-71523638202369/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : /etc/ansible/facts.d directory exists] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:35
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 57, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:43
changed: [managed-node01] => {"changed": true, "checksum": "646ddb7ff0a735fdb07d25e224e2d52f6747b24c", "dest": "/etc/ansible/facts.d/pre_ripu.fact", "gid": 0, "group": "root", "md5sum": "e898deb976e7a8eb64b76d49bae6f8d7", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 11997, "src": "/root/.ansible/tmp/ansible-tmp-1764684549.4504232-5793-171275072128967/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : Capture a list of non-rhel versioned packages] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:51
ok: [managed-node01] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el7' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.375342", "end": "2025-12-02 09:09:10.732151", "failed_when_result": false, "msg": "", "rc": 0, "start": "2025-12-02 09:09:10.356809", "stderr": "", "stderr_lines": [], "stdout": "epel-release-7-14.noarch\ntps-devel-2.44.50-1.noarch", "stdout_lines": ["epel-release-7-14.noarch", "tps-devel-2.44.50-1.noarch"]}

TASK [infra.leapp.common : Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:65
ok: [managed-node01] => {"ansible_facts": {"non_rhel_packages": ["epel-release-7-14.noarch", "tps-devel-2.44.50-1.noarch"]}, "changed": false}

TASK [infra.leapp.common : Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:69
ok: [managed-node01] => {"changed": false, "checksum": "6d36b22d9c2b2f366fc090edfbac427c77d524a5", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "path": "/etc/ansible/facts.d/non_rhel_packages.fact", "secontext": "system_u:object_r:etc_t:s0", "size": 58, "state": "file", "uid": 0}
"skip_reason": "Conditional result was False"} TASK [analysis-leapp | Include custom_local_repos for local_repos_pre_leapp] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:14 skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"} TASK [infra.leapp.analysis : analysis-leapp | Install packages for preupgrade analysis on RHEL 7] *** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tasks/analysis-leapp.yml:22 fatal: [managed-node01]: FAILED! => {"changed": false, "msg": "\n\n One of the configured repositories failed (Unknown),\n and yum doesn't have enough cached data to continue. At this point the only\n safe thing yum can do is fail. There are a few ways to work \"fix\" this:\n\n 1. Contact the upstream for the repository and get them to fix the problem.\n\n 2. Reconfigure the baseurl/etc. for the repository, to point to a working\n upstream. This is most often useful if you are using a newer\n distribution release than is supported by the repository (and the\n packages for the previous distribution release still work).\n\n 3. Run the command with the repository temporarily disabled\n yum --disablerepo= ...\n\n 4. Disable the repository permanently, so yum won't use it by default. Yum\n will then just ignore the repository until you permanently enable it\n again or use --enablerepo for temporary usage:\n\n yum-config-manager --disable \n or\n subscription-manager repos --disable=\n\n 5. Configure the failing repository to be skipped, if it is unavailable.\n Note that yum will try to contact the repo. when it runs most commands,\n so will have to try and fail each time (and thus. yum will be be much\n slower). If it is a very temporary problem though, this is often a nice\n compromise:\n\n yum-config-manager --save --setopt=.skip_if_unavailable=true\n\nCannot find a valid baseurl for repo: rhel\n", "rc": 1, "results": []} TASK [Cleanup | Remove log files] ********************************************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/analysis/tests/tests_default.yml:13 changed: [managed-node01] => {"changed": true, "cmd": "set -euxo pipefail\nrm -f /var/log/leapp/leapp-upgrade.log\nrm -f /var/log/ripu/ripu.log*\n", "delta": "0:00:00.004110", "end": "2025-12-02 09:09:12.358997", "msg": "", "rc": 0, "start": "2025-12-02 09:09:12.354887", "stderr": "+ rm -f /var/log/leapp/leapp-upgrade.log\n+ rm -f /var/log/ripu/ripu.log", "stderr_lines": ["+ rm -f /var/log/leapp/leapp-upgrade.log", "+ rm -f /var/log/ripu/ripu.log"], "stdout": "", "stdout_lines": []} PLAY RECAP ********************************************************************* managed-node01 : ok=11 changed=3 unreachable=0 failed=1 skipped=4 rescued=0 ignored=0 -- Logs begin at Tue 2025-12-02 09:04:53 EST, end at Tue 2025-12-02 09:09:12 EST. 

-- Logs begin at Tue 2025-12-02 09:04:53 EST, end at Tue 2025-12-02 09:09:12 EST. --
Dec 02 09:09:07 managed-node01 ansible-ansible.legacy.setup[3974]: Invoked with filter=[] gather_subset=['all'] fact_path=/etc/ansible/facts.d gather_timeout=10
Dec 02 09:09:08 managed-node01 ansible-ansible.builtin.file[4064]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=directory access_time_format=%Y%m%d%H%M.%S group=root modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/var/log/ripu owner=root follow=True attributes=None mode=0755
Dec 02 09:09:08 managed-node01 ansible-ansible.builtin.stat[4125]: Invoked with checksum_algorithm=sha1 get_checksum=True follow=False path=/var/log/ripu/ripu.log get_md5=False get_mime=True get_attributes=True
Dec 02 09:09:08 managed-node01 ansible-ansible.legacy.stat[4186]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/var/log/ripu/ripu.log follow=False get_md5=False get_mime=True get_attributes=True
Dec 02 09:09:09 managed-node01 ansible-ansible.legacy.copy[4232]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1764684548.5892422-5765-71523638202369/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/var/log/ripu/ripu.log seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmpg5dpceyn serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=51d3d3132e7ee23785cb943964ac2ea1806b9c57 backup=False local_follow=None
Dec 02 09:09:09 managed-node01 ansible-ansible.builtin.file[4293]: Invoked with src=None selevel=None force=False setype=None _original_basename=None unsafe_writes=False access_time=None seuser=None recurse=False state=directory access_time_format=%Y%m%d%H%M.%S group=root modification_time=None serole=None _diff_peek=None modification_time_format=%Y%m%d%H%M.%S path=/etc/ansible/facts.d owner=root follow=True attributes=None mode=0755
Dec 02 09:09:09 managed-node01 ansible-ansible.legacy.stat[4354]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/etc/ansible/facts.d/pre_ripu.fact follow=False get_md5=False get_mime=True get_attributes=True
Dec 02 09:09:09 managed-node01 ansible-ansible.legacy.copy[4402]: Invoked with src=/root/.ansible/tmp/ansible-tmp-1764684549.4504232-5793-171275072128967/source directory_mode=None force=True attributes=None remote_src=None unsafe_writes=False dest=/etc/ansible/facts.d/pre_ripu.fact seuser=None setype=None group=root content=NOT_LOGGING_PARAMETER _original_basename=tmp0s0bc743 serole=None mode=0644 selevel=None owner=root follow=False validate=None checksum=646ddb7ff0a735fdb07d25e224e2d52f6747b24c backup=False local_follow=None
Dec 02 09:09:10 managed-node01 ansible-ansible.legacy.command[4463]: Invoked with executable=None _uses_shell=True strip_empty_ends=True _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el7' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Dec 02 09:09:11 managed-node01 ansible-ansible.legacy.stat[4529]: Invoked with checksum_algorithm=sha1 get_checksum=True path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_md5=False get_mime=True get_attributes=True
Dec 02 09:09:11 managed-node01 ansible-ansible.legacy.file[4561]: Invoked with force=False _original_basename=tmph5_olux0 owner=root follow=True group=root unsafe_writes=False serole=None state=file selevel=None setype=None dest=/etc/ansible/facts.d/non_rhel_packages.fact access_time=None access_time_format=%Y%m%d%H%M.%S modification_time=None path=/etc/ansible/facts.d/non_rhel_packages.fact src=None seuser=None recurse=False _diff_peek=None mode=0644 modification_time_format=%Y%m%d%H%M.%S attributes=None
Dec 02 09:09:11 managed-node01 ansible-ansible.legacy.yum[4622]: Invoked with lock_timeout=30 update_cache=False conf_file=None exclude=[] allow_downgrade=False sslverify=True disable_gpg_check=False disable_excludes=None use_backend=auto validate_certs=True state=latest disablerepo=[] releasever=None skip_broken=False cacheonly=False autoremove=False download_dir=None installroot=/ install_weak_deps=True name=['leapp-upgrade'] download_only=False bugfix=False list=None install_repoquery=True update_only=False disable_plugin=[] enablerepo=['rhel-7-server-extras-rpms'] security=False enable_plugin=[]
Dec 02 09:09:12 managed-node01 ansible-ansible.legacy.command[4691]: Invoked with executable=/bin/bash _uses_shell=True strip_empty_ends=True _raw_params=set -euxo pipefail rm -f /var/log/leapp/leapp-upgrade.log rm -f /var/log/ripu/ripu.log* removes=None argv=None creates=None chdir=None stdin_add_newline=True stdin=None
Dec 02 09:09:12 managed-node01 sshd[4704]: Accepted publickey for root from 10.31.44.30 port 57456 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Dec 02 09:09:12 managed-node01 systemd[1]: Started Session 9 of user root.
-- Subject: Unit session-9.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-9.scope has finished starting up.
--
-- The start-up result is done.
Dec 02 09:09:12 managed-node01 systemd-logind[503]: New session 9 of user root.
-- Subject: A new session 9 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 9 has been created for the user root.
--
-- The leading process of the session is 4704.
Dec 02 09:09:12 managed-node01 sshd[4704]: pam_unix(sshd:session): session opened for user root by (uid=0)
Dec 02 09:09:12 managed-node01 sshd[4704]: Received disconnect from 10.31.44.30 port 57456:11: disconnected by user
Dec 02 09:09:12 managed-node01 sshd[4704]: Disconnected from 10.31.44.30 port 57456
Dec 02 09:09:12 managed-node01 sshd[4704]: pam_unix(sshd:session): session closed for user root
Dec 02 09:09:12 managed-node01 systemd-logind[503]: Removed session 9.
-- Subject: Session 9 has been terminated
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A session with the ID 9 has been terminated.
Dec 02 09:09:12 managed-node01 sshd[4716]: Accepted publickey for root from 10.31.44.30 port 57470 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Dec 02 09:09:12 managed-node01 systemd[1]: Started Session 10 of user root.
-- Subject: Unit session-10.scope has finished start-up
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
--
-- Unit session-10.scope has finished starting up.
--
-- The start-up result is done.
Dec 02 09:09:12 managed-node01 systemd-logind[503]: New session 10 of user root.
-- Subject: A new session 10 has been created for user root
-- Defined-By: systemd
-- Support: http://lists.freedesktop.org/mailman/listinfo/systemd-devel
-- Documentation: http://www.freedesktop.org/wiki/Software/systemd/multiseat
--
-- A new session with the ID 10 has been created for the user root.
--
-- The leading process of the session is 4716.
Dec 02 09:09:12 managed-node01 sshd[4716]: pam_unix(sshd:session): session opened for user root by (uid=0)
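
[NOTE]: The journal excerpt above spans the test window announced in its "Logs begin/end" header. The exact command the harness used to capture it is not recorded, so the following is only a plausible way to pull the same slice on the managed node:

  # Sketch: fetch the same journal window with explanatory (-x) messages.
  # The --since/--until bounds are copied from the header above; the flags
  # are an assumption, not the harness's recorded invocation.
  journalctl -x --since "2025-12-02 09:04:53" --until "2025-12-02 09:09:12"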