[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.23 (main, Aug 19 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tests/tests_default.yml

PLAY [Test] ********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tests/tests_default.yml:2
ok: [managed-node01]

TASK [infra.leapp.common : Log directory exists] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:3
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/ripu", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 70, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Check for existing log file] ************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:11
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : Fail if log file already exists] ********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:16
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : Create new log file] ********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:21
NOTIFIED HANDLER infra.leapp.common : Add end time to log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Slurp ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Decode ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Rename log file for managed-node01
changed: [managed-node01] => {"changed": true, "checksum": "6c22fce1c9bde0c9557aa4887ed1089504b88c63", "dest": "/var/log/ripu/ripu.log", "gid": 0, "group": "root", "md5sum": "cf26238e9930d75120f130d53b130807", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 61, "src": "/root/.ansible/tmp/ansible-tmp-1762956758.4072073-8798-163312043651075/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : /etc/ansible/facts.d directory exists] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:33
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 57, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:41
changed: [managed-node01] => {"changed": true, "checksum": "1137dc2924cbcaa07c98c41035c06fc09443abf2", "dest": "/etc/ansible/facts.d/pre_ripu.fact", "gid": 0, "group": "root", "md5sum": "58930721772d1b7ab618a3fea87ac62a", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 14278, "src": "/root/.ansible/tmp/ansible-tmp-1762956759.5976355-8826-117335524402429/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : Capture a list of non-rhel versioned packages] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:49
ok: [managed-node01] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el9' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.218199", "end": "2025-11-12 09:12:40.886653", "failed_when_result": false, "msg": "non-zero return code", "rc": 1, "start": "2025-11-12 09:12:40.668454", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:63
ok: [managed-node01] => {"ansible_facts": {"non_rhel_packages": []}, "changed": false}

TASK [infra.leapp.common : Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:67
ok: [managed-node01] => {"changed": false, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "path": "/etc/ansible/facts.d/non_rhel_packages.fact", "secontext": "system_u:object_r:etc_t:s0", "size": 2, "state": "file", "uid": 0}
false, "content": "UmlzayBGYWN0b3I6IGhpZ2ggKGVycm9yKQpUaXRsZTogQSBzdWJzY3JpcHRpb24tbWFuYWdlciBjb21tYW5kIGZhaWxlZCB0byBleGVjdXRlClN1bW1hcnk6IHsiZGV0YWlscyI6ICJDb21tYW5kIFsnc3Vic2NyaXB0aW9uLW1hbmFnZXInLCAncmVsZWFzZSddIGZhaWxlZCB3aXRoIGV4aXQgY29kZSAxLiIsICJzdGRlcnIiOiAiVGhpcyBzeXN0ZW0gaXMgbm90IHlldCByZWdpc3RlcmVkLiBUcnkgJ3N1YnNjcmlwdGlvbi1tYW5hZ2VyIHJlZ2lzdGVyIC0taGVscCcgZm9yIG1vcmUgaW5mb3JtYXRpb24uXG4iLCAiaGludCI6ICJQbGVhc2UgZW5zdXJlIHlvdSBoYXZlIGEgdmFsaWQgUkhFTCBzdWJzY3JpcHRpb24gYW5kIHlvdXIgbmV0d29yayBpcyB1cC4gSWYgeW91IGFyZSB1c2luZyBwcm94eSBmb3IgUmVkIEhhdCBzdWJzY3JpcHRpb24tbWFuYWdlciwgcGxlYXNlIG1ha2Ugc3VyZSBpdCBpcyBzcGVjaWZpZWQgaW5zaWRlIHRoZSAvZXRjL3Joc20vcmhzbS5jb25mIGZpbGUuIE9yIHVzZSB0aGUgLS1uby1yaHNtIG9wdGlvbiB3aGVuIHJ1bm5pbmcgbGVhcHAsIGlmIHlvdSBkbyBub3Qgd2FudCB0byB1c2Ugc3Vic2NyaXB0aW9uLW1hbmFnZXIgZm9yIHRoZSBpbi1wbGFjZSB1cGdyYWRlIGFuZCB5b3Ugd2FudCB0byBkZWxpdmVyIGFsbCB0YXJnZXQgcmVwb3NpdG9yaWVzIGJ5IHlvdXJzZWxmIG9yIHVzaW5nIFJIVUkgb24gcHVibGljIGNsb3VkLiIsICJsaW5rIjogImh0dHBzOi8vYWNjZXNzLnJlZGhhdC5jb20vc29sdXRpb25zLzYxMzgzNzIifQpLZXk6IDdlYzgyNjk3ODRkYjFiYmEyYWM1NGFlNDM4Njg5ZWYzOTdlMTY4MzMKLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLQpSaXNrIEZhY3RvcjogaW5mbyAKVGl0bGU6IEV4Y2x1ZGVkIHRhcmdldCBzeXN0ZW0gcmVwb3NpdG9yaWVzClN1bW1hcnk6IFRoZSBmb2xsb3dpbmcgcmVwb3NpdG9yaWVzIGFyZSBub3Qgc3VwcG9ydGVkIGJ5IFJlZCBIYXQgYW5kIGFyZSBleGNsdWRlZCBmcm9tIHRoZSBsaXN0IG9mIHJlcG9zaXRvcmllcyB1c2VkIGR1cmluZyB0aGUgdXBncmFkZS4KLSBjb2RlcmVhZHktYnVpbGRlci1mb3ItcmhlbC0xMC14ODZfNjQtcnBtcwotIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLWFhcmNoNjQtcnBtcwotIGNvZGVyZWFkeS1idWlsZGVyLWJldGEtZm9yLXJoZWwtMTAtYWFyY2g2NC1ycG1zCi0gY29kZXJlYWR5LWJ1aWxkZXItYmV0YS1mb3ItcmhlbC0xMC1zMzkweC1ycG1zCi0gY29kZXJlYWR5LWJ1aWxkZXItZm9yLXJoZWwtMTAtcHBjNjRsZS1ycG1zCi0gY29kZXJlYWR5LWJ1aWxkZXItYmV0YS1mb3ItcmhlbC0xMC1wcGM2NGxlLXJwbXMKLSBjb2RlcmVhZHktYnVpbGRlci1mb3ItcmhlbC0xMC1zMzkweC1ycG1zCi0gY29kZXJlYWR5LWJ1aWxkZXItYmV0YS1mb3ItcmhlbC0xMC14ODZfNjQtcnBtcwpSZW1lZGlhdGlvbjogW2hpbnRdIElmIHNvbWUgb2YgZXhjbHVkZWQgcmVwb3NpdG9yaWVzIGFyZSBzdGlsbCByZXF1aXJlZCB0byBiZSB1c2VkIGR1cmluZyB0aGUgdXBncmFkZSwgZXhlY3V0ZSBsZWFwcCB3aXRoIHRoZSAtLWVuYWJsZXJlcG8gb3B0aW9uIHdpdGggdGhlIHJlcG9pZCBvZiB0aGUgcmVwb3NpdG9yeSByZXF1aXJlZCB0byBiZSBlbmFibGVkIGFzIGFuIGFyZ3VtZW50ICh0aGUgb3B0aW9uIGNhbiBiZSB1c2VkIG11bHRpcGxlIHRpbWVzKS4KS2V5OiAxYjkxMzJjYjIzNjJhZTc4MzBlNDhlZWU3ODExYmU5NTI3NzQ3ZGU4Ci0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0K", "encoding": "base64", "source": "/var/log/leapp/leapp-report.txt"} TASK [infra.leapp.parse_leapp_report : Collect JSON report results] ************ task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:11 ok: [managed-node01] => {"changed": false, "content": 
"ewogICJlbnRyaWVzIjogWwogICAgewogICAgICAiYXVkaWVuY2UiOiAic3lzYWRtaW4iLAogICAgICAiZ3JvdXBzIjogWwogICAgICAgICJlcnJvciIKICAgICAgXSwKICAgICAgImtleSI6ICI3ZWM4MjY5Nzg0ZGIxYmJhMmFjNTRhZTQzODY4OWVmMzk3ZTE2ODMzIiwKICAgICAgInNldmVyaXR5IjogImhpZ2giLAogICAgICAic3VtbWFyeSI6ICJ7XCJkZXRhaWxzXCI6IFwiQ29tbWFuZCBbJ3N1YnNjcmlwdGlvbi1tYW5hZ2VyJywgJ3JlbGVhc2UnXSBmYWlsZWQgd2l0aCBleGl0IGNvZGUgMS5cIiwgXCJzdGRlcnJcIjogXCJUaGlzIHN5c3RlbSBpcyBub3QgeWV0IHJlZ2lzdGVyZWQuIFRyeSAnc3Vic2NyaXB0aW9uLW1hbmFnZXIgcmVnaXN0ZXIgLS1oZWxwJyBmb3IgbW9yZSBpbmZvcm1hdGlvbi5cXG5cIiwgXCJoaW50XCI6IFwiUGxlYXNlIGVuc3VyZSB5b3UgaGF2ZSBhIHZhbGlkIFJIRUwgc3Vic2NyaXB0aW9uIGFuZCB5b3VyIG5ldHdvcmsgaXMgdXAuIElmIHlvdSBhcmUgdXNpbmcgcHJveHkgZm9yIFJlZCBIYXQgc3Vic2NyaXB0aW9uLW1hbmFnZXIsIHBsZWFzZSBtYWtlIHN1cmUgaXQgaXMgc3BlY2lmaWVkIGluc2lkZSB0aGUgL2V0Yy9yaHNtL3Joc20uY29uZiBmaWxlLiBPciB1c2UgdGhlIC0tbm8tcmhzbSBvcHRpb24gd2hlbiBydW5uaW5nIGxlYXBwLCBpZiB5b3UgZG8gbm90IHdhbnQgdG8gdXNlIHN1YnNjcmlwdGlvbi1tYW5hZ2VyIGZvciB0aGUgaW4tcGxhY2UgdXBncmFkZSBhbmQgeW91IHdhbnQgdG8gZGVsaXZlciBhbGwgdGFyZ2V0IHJlcG9zaXRvcmllcyBieSB5b3Vyc2VsZiBvciB1c2luZyBSSFVJIG9uIHB1YmxpYyBjbG91ZC5cIiwgXCJsaW5rXCI6IFwiaHR0cHM6Ly9hY2Nlc3MucmVkaGF0LmNvbS9zb2x1dGlvbnMvNjEzODM3MlwifSIsCiAgICAgICJ0aXRsZSI6ICJBIHN1YnNjcmlwdGlvbi1tYW5hZ2VyIGNvbW1hbmQgZmFpbGVkIHRvIGV4ZWN1dGUiLAogICAgICAidGltZVN0YW1wIjogIjIwMjUtMTEtMTJUMTQ6MTE6MjcuNDI4NTgxWiIsCiAgICAgICJob3N0bmFtZSI6ICJtYW5hZ2VkLW5vZGUwMSIsCiAgICAgICJhY3RvciI6ICJzY2FuX3N1YnNjcmlwdGlvbl9tYW5hZ2VyX2luZm8iLAogICAgICAiaWQiOiAiNmM1NGJiNDJiZjI0MTlmYzk5NWM0MzRlMWYyMTE0M2Y3MDY5NTk2OGIwOWEyMTNhNDY4ZjQzYzEwOTUzNGVhYiIKICAgIH0sCiAgICB7CiAgICAgICJhdWRpZW5jZSI6ICJzeXNhZG1pbiIsCiAgICAgICJkZXRhaWwiOiB7CiAgICAgICAgInJlbWVkaWF0aW9ucyI6IFsKICAgICAgICAgIHsKICAgICAgICAgICAgImNvbnRleHQiOiAiSWYgc29tZSBvZiBleGNsdWRlZCByZXBvc2l0b3JpZXMgYXJlIHN0aWxsIHJlcXVpcmVkIHRvIGJlIHVzZWQgZHVyaW5nIHRoZSB1cGdyYWRlLCBleGVjdXRlIGxlYXBwIHdpdGggdGhlIC0tZW5hYmxlcmVwbyBvcHRpb24gd2l0aCB0aGUgcmVwb2lkIG9mIHRoZSByZXBvc2l0b3J5IHJlcXVpcmVkIHRvIGJlIGVuYWJsZWQgYXMgYW4gYXJndW1lbnQgKHRoZSBvcHRpb24gY2FuIGJlIHVzZWQgbXVsdGlwbGUgdGltZXMpLiIsCiAgICAgICAgICAgICJ0eXBlIjogImhpbnQiCiAgICAgICAgICB9CiAgICAgICAgXQogICAgICB9LAogICAgICAiZ3JvdXBzIjogWwogICAgICAgICJyZXBvc2l0b3J5IiwKICAgICAgICAiZmFpbHVyZSIKICAgICAgXSwKICAgICAgImtleSI6ICIxYjkxMzJjYjIzNjJhZTc4MzBlNDhlZWU3ODExYmU5NTI3NzQ3ZGU4IiwKICAgICAgInNldmVyaXR5IjogImluZm8iLAogICAgICAic3VtbWFyeSI6ICJUaGUgZm9sbG93aW5nIHJlcG9zaXRvcmllcyBhcmUgbm90IHN1cHBvcnRlZCBieSBSZWQgSGF0IGFuZCBhcmUgZXhjbHVkZWQgZnJvbSB0aGUgbGlzdCBvZiByZXBvc2l0b3JpZXMgdXNlZCBkdXJpbmcgdGhlIHVwZ3JhZGUuXG4tIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLXg4Nl82NC1ycG1zXG4tIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLWFhcmNoNjQtcnBtc1xuLSBjb2RlcmVhZHktYnVpbGRlci1iZXRhLWZvci1yaGVsLTEwLWFhcmNoNjQtcnBtc1xuLSBjb2RlcmVhZHktYnVpbGRlci1iZXRhLWZvci1yaGVsLTEwLXMzOTB4LXJwbXNcbi0gY29kZXJlYWR5LWJ1aWxkZXItZm9yLXJoZWwtMTAtcHBjNjRsZS1ycG1zXG4tIGNvZGVyZWFkeS1idWlsZGVyLWJldGEtZm9yLXJoZWwtMTAtcHBjNjRsZS1ycG1zXG4tIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLXMzOTB4LXJwbXNcbi0gY29kZXJlYWR5LWJ1aWxkZXItYmV0YS1mb3ItcmhlbC0xMC14ODZfNjQtcnBtcyIsCiAgICAgICJ0aXRsZSI6ICJFeGNsdWRlZCB0YXJnZXQgc3lzdGVtIHJlcG9zaXRvcmllcyIsCiAgICAgICJ0aW1lU3RhbXAiOiAiMjAyNS0xMS0xMlQxNDoxMToyNy42MzE2NTBaIiwKICAgICAgImhvc3RuYW1lIjogIm1hbmFnZWQtbm9kZTAxIiwKICAgICAgImFjdG9yIjogInJlcG9zaXRvcmllc19ibGFja2xpc3QiLAogICAgICAiaWQiOiAiMjk3ZTMzZTI4ZDY2NmFmNWI5YzRkNmVlZjA5OTZjN2RkYzgyODNiN2ZkMTI5OTY5OGMzOTEyMGQ5OWZlOTM2MiIKICAgIH0KICBdLAogICJsZWFwcF9ydW5faWQiOiAiOGYzNmY1YjgtYWI0Yy00NWJjLWI2Y2UtNGM0OTlkOGU4NzM4Igp9Cg==", "encoding": "base64", "source": "/var/log/leapp/leapp-report.json"} TASK 

TASK [infra.leapp.parse_leapp_report : Parse report results] *******************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:16
ok: [managed-node01] => {"ansible_facts": {"leapp_report_json": {"entries": [{"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "6c54bb42bf2419fc995c434e1f21143f70695968b09a213a468f43c109534eab", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "timeStamp": "2025-11-12T14:11:27.428581Z", "title": "A subscription-manager command failed to execute"}, {"actor": "repositories_blacklist", "audience": "sysadmin", "detail": {"remediations": [{"context": "If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).", "type": "hint"}]}, "groups": ["repository", "failure"], "hostname": "managed-node01", "id": "297e33e28d666af5b9c4d6eef0996c7ddc8283b7fd1299698c39120d99fe9362", "key": "1b9132cb2362ae7830e48eee7811be9527747de8", "severity": "info", "summary": "The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- codeready-builder-for-rhel-10-x86_64-rpms\n- codeready-builder-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-s390x-rpms\n- codeready-builder-for-rhel-10-ppc64le-rpms\n- codeready-builder-beta-for-rhel-10-ppc64le-rpms\n- codeready-builder-for-rhel-10-s390x-rpms\n- codeready-builder-beta-for-rhel-10-x86_64-rpms", "timeStamp": "2025-11-12T14:11:27.631650Z", "title": "Excluded target system repositories"}], "leapp_run_id": "8f36f5b8-ab4c-45bc-b6ce-4c499d8e8738"}, "leapp_report_txt": ["Risk Factor: high (error)", "Title: A subscription-manager command failed to execute", "Summary: {\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "Key: 7ec8269784db1bba2ac54ae438689ef397e16833", "----------------------------------------", "Risk Factor: info ", "Title: Excluded target system repositories", "Summary: The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.", "- codeready-builder-for-rhel-10-x86_64-rpms", "- codeready-builder-for-rhel-10-aarch64-rpms", "- codeready-builder-beta-for-rhel-10-aarch64-rpms", "- codeready-builder-beta-for-rhel-10-s390x-rpms", "- codeready-builder-for-rhel-10-ppc64le-rpms", "- codeready-builder-beta-for-rhel-10-ppc64le-rpms", "- codeready-builder-for-rhel-10-s390x-rpms", "- codeready-builder-beta-for-rhel-10-x86_64-rpms", "Remediation: [hint] If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).", "Key: 1b9132cb2362ae7830e48eee7811be9527747de8", "----------------------------------------", ""]}, "changed": false}

TASK [infra.leapp.parse_leapp_report : Check for inhibitors] *******************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:22
ok: [managed-node01] => (item={'audience': 'sysadmin', 'groups': ['error'], 'key': '7ec8269784db1bba2ac54ae438689ef397e16833', 'severity': 'high', 'summary': '{"details": "Command [\'subscription-manager\', \'release\'] failed with exit code 1.", "stderr": "This system is not yet registered. Try \'subscription-manager register --help\' for more information.\\n", "hint": "Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.", "link": "https://access.redhat.com/solutions/6138372"}', 'title': 'A subscription-manager command failed to execute', 'timeStamp': '2025-11-12T14:11:27.428581Z', 'hostname': 'managed-node01', 'actor': 'scan_subscription_manager_info', 'id': '6c54bb42bf2419fc995c434e1f21143f70695968b09a213a468f43c109534eab'}) => {"ansible_facts": {"leapp_inhibitors": [{"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "6c54bb42bf2419fc995c434e1f21143f70695968b09a213a468f43c109534eab", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "timeStamp": "2025-11-12T14:11:27.428581Z", "title": "A subscription-manager command failed to execute"}], "upgrade_inhibited": true}, "ansible_loop_var": "item", "changed": false, "item": {"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "6c54bb42bf2419fc995c434e1f21143f70695968b09a213a468f43c109534eab", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "timeStamp": "2025-11-12T14:11:27.428581Z", "title": "A subscription-manager command failed to execute"}}
skipping: [managed-node01] => (item={'audience': 'sysadmin', 'detail': {'remediations': [{'context': 'If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).', 'type': 'hint'}]}, 'groups': ['repository', 'failure'], 'key': '1b9132cb2362ae7830e48eee7811be9527747de8', 'severity': 'info', 'summary': 'The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- codeready-builder-for-rhel-10-x86_64-rpms\n- codeready-builder-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-s390x-rpms\n- codeready-builder-for-rhel-10-ppc64le-rpms\n- codeready-builder-beta-for-rhel-10-ppc64le-rpms\n- codeready-builder-for-rhel-10-s390x-rpms\n- codeready-builder-beta-for-rhel-10-x86_64-rpms', 'title': 'Excluded target system repositories', 'timeStamp': '2025-11-12T14:11:27.631650Z', 'hostname': 'managed-node01', 'actor': 'repositories_blacklist', 'id': '297e33e28d666af5b9c4d6eef0996c7ddc8283b7fd1299698c39120d99fe9362'}) => {"ansible_loop_var": "item", "changed": false, "item": {"actor": "repositories_blacklist", "audience": "sysadmin", "detail": {"remediations": [{"context": "If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).", "type": "hint"}]}, "groups": ["repository", "failure"], "hostname": "managed-node01", "id": "297e33e28d666af5b9c4d6eef0996c7ddc8283b7fd1299698c39120d99fe9362", "key": "1b9132cb2362ae7830e48eee7811be9527747de8", "severity": "info", "summary": "The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- codeready-builder-for-rhel-10-x86_64-rpms\n- codeready-builder-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-s390x-rpms\n- codeready-builder-for-rhel-10-ppc64le-rpms\n- codeready-builder-beta-for-rhel-10-ppc64le-rpms\n- codeready-builder-for-rhel-10-s390x-rpms\n- codeready-builder-beta-for-rhel-10-x86_64-rpms", "timeStamp": "2025-11-12T14:11:27.631650Z", "title": "Excluded target system repositories"}, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.parse_leapp_report : Collect inhibitors] *********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:34
ok: [managed-node01] => {"changed": false, "cmd": ["awk", "/\\(inhibitor\\)/,/^-------/", "/var/log/leapp/leapp-report.txt"], "delta": "0:00:00.004438", "end": "2025-11-12 09:12:43.013543", "failed_when_result": false, "msg": "", "rc": 0, "start": "2025-11-12 09:12:43.009105", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
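
This task and the "Collect high errors" task that follows use the same awk range idiom: /pattern/,/^-------/ prints each block from a line matching the pattern through the report's dashed separator. Here no line contains "(inhibitor)", so stdout is empty, while "high (error)" below matches the subscription-manager entry. A rough Python equivalent of that range selection, assuming the decoded leapp-report.txt is on disk:

    # Rough Python equivalent of awk '/high \(error\)/,/^-------/' used here.
    import re

    def report_blocks(path, pattern):
        """Yield lines from each block starting at a line matching `pattern`
        and ending at the dashed separator, mimicking awk's /a/,/b/ ranges."""
        in_block = False
        with open(path) as f:
            for line in f:
                if not in_block and re.search(pattern, line):
                    in_block = True
                if in_block:
                    yield line.rstrip("\n")
                    if line.startswith("-------"):
                        in_block = False

    for line in report_blocks("/var/log/leapp/leapp-report.txt", r"high \(error\)"):
        print(line)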

TASK [infra.leapp.parse_leapp_report : Collect high errors] ********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:43
ok: [managed-node01] => {"changed": false, "cmd": ["awk", "/high \\(error\\)/,/^-------/", "/var/log/leapp/leapp-report.txt"], "delta": "0:00:00.004026", "end": "2025-11-12 09:12:43.377209", "failed_when_result": false, "msg": "", "rc": 0, "start": "2025-11-12 09:12:43.373183", "stderr": "", "stderr_lines": [], "stdout": "Risk Factor: high (error)\nTitle: A subscription-manager command failed to execute\nSummary: {\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}\nKey: 7ec8269784db1bba2ac54ae438689ef397e16833\n----------------------------------------", "stdout_lines": ["Risk Factor: high (error)", "Title: A subscription-manager command failed to execute", "Summary: {\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "Key: 7ec8269784db1bba2ac54ae438689ef397e16833", "----------------------------------------"]}

TASK [infra.leapp.upgrade : leapp-upgrade | Verify no inhibitor results found during preupgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:7
fatal: [managed-node01]: FAILED! => {
    "assertion": "not upgrade_inhibited",
    "changed": false,
    "evaluated_to": false,
    "msg": "Inhibitors found, please investigate and rerun analysis."
}

PLAY RECAP *********************************************************************
managed-node01 : ok=17 changed=2 unreachable=0 failed=1 skipped=2 rescued=0 ignored=0
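
The run stops here by design: after parsing the report, the role asserts not upgrade_inhibited, and the high-severity subscription-manager entry (groups containing "error") was collected into leapp_inhibitors, while the info-level repository entry was skipped. A minimal sketch of that triage over leapp-report.json, assuming an entry counts as an inhibitor when its groups include "error" or "inhibitor" (inferred from the task results above, not from the role source):

    # Sketch of the inhibitor triage visible in the tasks above, assuming an
    # entry is an inhibitor when its groups include "error" or "inhibitor".
    import json

    with open("/var/log/leapp/leapp-report.json") as f:
        report = json.load(f)

    inhibitors = [e for e in report["entries"]
                  if {"error", "inhibitor"} & set(e.get("groups", []))]

    for e in inhibitors:
        print(f'{e["severity"]}: {e["title"]}')

    # Mirrors the failing assert task: stop when any inhibitor is present.
    assert not inhibitors, "Inhibitors found, please investigate and rerun analysis."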

Nov 12 09:12:36 managed-node01 python3[23770]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 12 09:12:37 managed-node01 python3[23923]: ansible-ansible.builtin.file Invoked with path=/var/log/ripu state=directory owner=root group=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 12 09:12:38 managed-node01 python3[24048]: ansible-ansible.builtin.stat Invoked with path=/var/log/ripu/ripu.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 12 09:12:38 managed-node01 python3[24173]: ansible-ansible.legacy.stat Invoked with path=/var/log/ripu/ripu.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 12 09:12:39 managed-node01 python3[24273]: ansible-ansible.legacy.copy Invoked with dest=/var/log/ripu/ripu.log owner=root group=root mode=0644 src=/root/.ansible/tmp/ansible-tmp-1762956758.4072073-8798-163312043651075/source _original_basename=tmpoocnnw51 follow=False checksum=6c22fce1c9bde0c9557aa4887ed1089504b88c63 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 12 09:12:39 managed-node01 python3[24398]: ansible-ansible.builtin.file Invoked with path=/etc/ansible/facts.d state=directory mode=0755 owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 12 09:12:39 managed-node01 python3[24523]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/pre_ripu.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 12 09:12:40 managed-node01 python3[24625]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/pre_ripu.fact mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1762956759.5976355-8826-117335524402429/source _original_basename=tmp7fiub27g follow=False checksum=1137dc2924cbcaa07c98c41035c06fc09443abf2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 12 09:12:40 managed-node01 python3[24750]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el9' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 12 09:12:41 managed-node01 python3[24880]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 12 09:12:41 managed-node01 python3[24943]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/etc/ansible/facts.d/non_rhel_packages.fact _original_basename=tmpgpuba729 recurse=False state=file path=/etc/ansible/facts.d/non_rhel_packages.fact force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 12 09:12:42 managed-node01 python3[25068]: ansible-ansible.builtin.slurp Invoked with src=/var/log/leapp/leapp-report.txt
Nov 12 09:12:42 managed-node01 python3[25193]: ansible-ansible.builtin.slurp Invoked with src=/var/log/leapp/leapp-report.json
Nov 12 09:12:43 managed-node01 python3[25318]: ansible-ansible.legacy.command Invoked with _raw_params=awk '/\(inhibitor\)/,/^-------/' /var/log/leapp/leapp-report.txt _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 12 09:12:43 managed-node01 python3[25444]: ansible-ansible.legacy.command Invoked with _raw_params=awk '/high \(error\)/,/^-------/' /var/log/leapp/leapp-report.txt _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 12 09:12:43 managed-node01 sshd[25466]: Accepted publickey for root from 10.31.40.223 port 56406 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 12 09:12:43 managed-node01 systemd-logind[604]: New session 21 of user root.
░░ Subject: A new session 21 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 21 has been created for the user root.
░░
░░ The leading process of the session is 25466.
Nov 12 09:12:43 managed-node01 systemd[1]: Started Session 21 of User root.
░░ Subject: A start job for unit session-21.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-21.scope has finished successfully.
░░
░░ The job identifier is 2632.
Nov 12 09:12:43 managed-node01 sshd[25466]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Nov 12 09:12:43 managed-node01 sshd[25469]: Received disconnect from 10.31.40.223 port 56406:11: disconnected by user
Nov 12 09:12:43 managed-node01 sshd[25469]: Disconnected from user root 10.31.40.223 port 56406
Nov 12 09:12:43 managed-node01 sshd[25466]: pam_unix(sshd:session): session closed for user root
Nov 12 09:12:43 managed-node01 systemd[1]: session-21.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-21.scope has successfully entered the 'dead' state.
Nov 12 09:12:43 managed-node01 systemd-logind[604]: Session 21 logged out. Waiting for processes to exit.
Nov 12 09:12:43 managed-node01 systemd-logind[604]: Removed session 21.
░░ Subject: Session 21 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 21 has been terminated.
Nov 12 09:12:43 managed-node01 sshd[25490]: Accepted publickey for root from 10.31.40.223 port 56422 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 12 09:12:43 managed-node01 systemd-logind[604]: New session 22 of user root.
░░ Subject: A new session 22 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 22 has been created for the user root.
░░
░░ The leading process of the session is 25490.
Nov 12 09:12:43 managed-node01 systemd[1]: Started Session 22 of User root.
░░ Subject: A start job for unit session-22.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-22.scope has finished successfully.
░░
░░ The job identifier is 2717.
Nov 12 09:12:43 managed-node01 sshd[25490]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)