[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.23 (main, Aug 19 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tests/tests_default.yml

PLAY [Test] ********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tests/tests_default.yml:2
ok: [managed-node01]

TASK [infra.leapp.common : Log directory exists] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:3
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/ripu", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 70, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Check for existing log file] ************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:11
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : Fail if log file already exists] ********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:16
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : Create new log file] ********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:23
NOTIFIED HANDLER infra.leapp.common : Add end time to log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Slurp ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Decode ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Rename log file for managed-node01
changed: [managed-node01] => {"changed": true, "checksum": "10de7262f2b426fb0988d9dbe039dc88c32230b8", "dest": "/var/log/ripu/ripu.log", "gid": 0, "group": "root", "md5sum": "e4de4a3d3d553604ad59b722f8d52147", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 61, "src": "/root/.ansible/tmp/ansible-tmp-1762937124.4902751-8779-152821623607786/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : /etc/ansible/facts.d directory exists] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:35
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 57, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:43
changed: [managed-node01] => {"changed": true, "checksum": "41d0c9182bde61145b6f28512b1df07c09893d85", "dest": "/etc/ansible/facts.d/pre_ripu.fact", "gid": 0, "group": "root", "md5sum": "45faa5c02adea4c551dc665aa1bd5472", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 14669, "src": "/root/.ansible/tmp/ansible-tmp-1762937125.5994902-8807-118245584112958/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : Capture a list of non-rhel versioned packages] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:51
ok: [managed-node01] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el9' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.212864", "end": "2025-11-12 03:45:26.823412", "failed_when_result": false, "msg": "non-zero return code", "rc": 1, "start": "2025-11-12 03:45:26.610548", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:65
ok: [managed-node01] => {"ansible_facts": {"non_rhel_packages": []}, "changed": false}

TASK [infra.leapp.common : Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:69
ok: [managed-node01] => {"changed": false, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "path": "/etc/ansible/facts.d/non_rhel_packages.fact", "secontext": "system_u:object_r:etc_t:s0", "size": 2, "state": "file", "uid": 0}

TASK [infra.leapp.upgrade : Include tasks for upgrade using redhat-upgrade-tool] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/main.yml:9
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.upgrade : Include tasks for leapp upgrade] *******************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/main.yml:13
[WARNING]: Collection community.general does not support Ansible version 2.14.18
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml for managed-node01

TASK [leapp-upgrade | Include the parse_leapp_report role to check for inhibitors] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:2

TASK [infra.leapp.parse_leapp_report : Default upgrade_inhibited to false] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:2
ok: [managed-node01] => {"ansible_facts": {"upgrade_inhibited": false}, "changed": false}

TASK [infra.leapp.parse_leapp_report : Collect human readable report results] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:6
ok: [managed-node01] => {"changed":
false, "content": "UmlzayBGYWN0b3I6IGhpZ2ggKGVycm9yKQpUaXRsZTogQSBzdWJzY3JpcHRpb24tbWFuYWdlciBjb21tYW5kIGZhaWxlZCB0byBleGVjdXRlClN1bW1hcnk6IHsiZGV0YWlscyI6ICJDb21tYW5kIFsnc3Vic2NyaXB0aW9uLW1hbmFnZXInLCAncmVsZWFzZSddIGZhaWxlZCB3aXRoIGV4aXQgY29kZSAxLiIsICJzdGRlcnIiOiAiVGhpcyBzeXN0ZW0gaXMgbm90IHlldCByZWdpc3RlcmVkLiBUcnkgJ3N1YnNjcmlwdGlvbi1tYW5hZ2VyIHJlZ2lzdGVyIC0taGVscCcgZm9yIG1vcmUgaW5mb3JtYXRpb24uXG4iLCAiaGludCI6ICJQbGVhc2UgZW5zdXJlIHlvdSBoYXZlIGEgdmFsaWQgUkhFTCBzdWJzY3JpcHRpb24gYW5kIHlvdXIgbmV0d29yayBpcyB1cC4gSWYgeW91IGFyZSB1c2luZyBwcm94eSBmb3IgUmVkIEhhdCBzdWJzY3JpcHRpb24tbWFuYWdlciwgcGxlYXNlIG1ha2Ugc3VyZSBpdCBpcyBzcGVjaWZpZWQgaW5zaWRlIHRoZSAvZXRjL3Joc20vcmhzbS5jb25mIGZpbGUuIE9yIHVzZSB0aGUgLS1uby1yaHNtIG9wdGlvbiB3aGVuIHJ1bm5pbmcgbGVhcHAsIGlmIHlvdSBkbyBub3Qgd2FudCB0byB1c2Ugc3Vic2NyaXB0aW9uLW1hbmFnZXIgZm9yIHRoZSBpbi1wbGFjZSB1cGdyYWRlIGFuZCB5b3Ugd2FudCB0byBkZWxpdmVyIGFsbCB0YXJnZXQgcmVwb3NpdG9yaWVzIGJ5IHlvdXJzZWxmIG9yIHVzaW5nIFJIVUkgb24gcHVibGljIGNsb3VkLiIsICJsaW5rIjogImh0dHBzOi8vYWNjZXNzLnJlZGhhdC5jb20vc29sdXRpb25zLzYxMzgzNzIifQpLZXk6IDdlYzgyNjk3ODRkYjFiYmEyYWM1NGFlNDM4Njg5ZWYzOTdlMTY4MzMKLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLQpSaXNrIEZhY3RvcjogaW5mbyAKVGl0bGU6IEV4Y2x1ZGVkIHRhcmdldCBzeXN0ZW0gcmVwb3NpdG9yaWVzClN1bW1hcnk6IFRoZSBmb2xsb3dpbmcgcmVwb3NpdG9yaWVzIGFyZSBub3Qgc3VwcG9ydGVkIGJ5IFJlZCBIYXQgYW5kIGFyZSBleGNsdWRlZCBmcm9tIHRoZSBsaXN0IG9mIHJlcG9zaXRvcmllcyB1c2VkIGR1cmluZyB0aGUgdXBncmFkZS4KLSBjb2RlcmVhZHktYnVpbGRlci1mb3ItcmhlbC0xMC1zMzkweC1ycG1zCi0gY29kZXJlYWR5LWJ1aWxkZXItZm9yLXJoZWwtMTAteDg2XzY0LXJwbXMKLSBjb2RlcmVhZHktYnVpbGRlci1mb3ItcmhlbC0xMC1wcGM2NGxlLXJwbXMKLSBjb2RlcmVhZHktYnVpbGRlci1iZXRhLWZvci1yaGVsLTEwLXBwYzY0bGUtcnBtcwotIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLWFhcmNoNjQtcnBtcwotIGNvZGVyZWFkeS1idWlsZGVyLWJldGEtZm9yLXJoZWwtMTAtYWFyY2g2NC1ycG1zCi0gY29kZXJlYWR5LWJ1aWxkZXItYmV0YS1mb3ItcmhlbC0xMC1zMzkweC1ycG1zCi0gY29kZXJlYWR5LWJ1aWxkZXItYmV0YS1mb3ItcmhlbC0xMC14ODZfNjQtcnBtcwpSZW1lZGlhdGlvbjogW2hpbnRdIElmIHNvbWUgb2YgZXhjbHVkZWQgcmVwb3NpdG9yaWVzIGFyZSBzdGlsbCByZXF1aXJlZCB0byBiZSB1c2VkIGR1cmluZyB0aGUgdXBncmFkZSwgZXhlY3V0ZSBsZWFwcCB3aXRoIHRoZSAtLWVuYWJsZXJlcG8gb3B0aW9uIHdpdGggdGhlIHJlcG9pZCBvZiB0aGUgcmVwb3NpdG9yeSByZXF1aXJlZCB0byBiZSBlbmFibGVkIGFzIGFuIGFyZ3VtZW50ICh0aGUgb3B0aW9uIGNhbiBiZSB1c2VkIG11bHRpcGxlIHRpbWVzKS4KS2V5OiAxYjkxMzJjYjIzNjJhZTc4MzBlNDhlZWU3ODExYmU5NTI3NzQ3ZGU4Ci0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0K", "encoding": "base64", "source": "/var/log/leapp/leapp-report.txt"} TASK [infra.leapp.parse_leapp_report : Collect JSON report results] ************ task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:11 ok: [managed-node01] => {"changed": false, "content": 
"ewogICJlbnRyaWVzIjogWwogICAgewogICAgICAiYXVkaWVuY2UiOiAic3lzYWRtaW4iLAogICAgICAiZ3JvdXBzIjogWwogICAgICAgICJlcnJvciIKICAgICAgXSwKICAgICAgImtleSI6ICI3ZWM4MjY5Nzg0ZGIxYmJhMmFjNTRhZTQzODY4OWVmMzk3ZTE2ODMzIiwKICAgICAgInNldmVyaXR5IjogImhpZ2giLAogICAgICAic3VtbWFyeSI6ICJ7XCJkZXRhaWxzXCI6IFwiQ29tbWFuZCBbJ3N1YnNjcmlwdGlvbi1tYW5hZ2VyJywgJ3JlbGVhc2UnXSBmYWlsZWQgd2l0aCBleGl0IGNvZGUgMS5cIiwgXCJzdGRlcnJcIjogXCJUaGlzIHN5c3RlbSBpcyBub3QgeWV0IHJlZ2lzdGVyZWQuIFRyeSAnc3Vic2NyaXB0aW9uLW1hbmFnZXIgcmVnaXN0ZXIgLS1oZWxwJyBmb3IgbW9yZSBpbmZvcm1hdGlvbi5cXG5cIiwgXCJoaW50XCI6IFwiUGxlYXNlIGVuc3VyZSB5b3UgaGF2ZSBhIHZhbGlkIFJIRUwgc3Vic2NyaXB0aW9uIGFuZCB5b3VyIG5ldHdvcmsgaXMgdXAuIElmIHlvdSBhcmUgdXNpbmcgcHJveHkgZm9yIFJlZCBIYXQgc3Vic2NyaXB0aW9uLW1hbmFnZXIsIHBsZWFzZSBtYWtlIHN1cmUgaXQgaXMgc3BlY2lmaWVkIGluc2lkZSB0aGUgL2V0Yy9yaHNtL3Joc20uY29uZiBmaWxlLiBPciB1c2UgdGhlIC0tbm8tcmhzbSBvcHRpb24gd2hlbiBydW5uaW5nIGxlYXBwLCBpZiB5b3UgZG8gbm90IHdhbnQgdG8gdXNlIHN1YnNjcmlwdGlvbi1tYW5hZ2VyIGZvciB0aGUgaW4tcGxhY2UgdXBncmFkZSBhbmQgeW91IHdhbnQgdG8gZGVsaXZlciBhbGwgdGFyZ2V0IHJlcG9zaXRvcmllcyBieSB5b3Vyc2VsZiBvciB1c2luZyBSSFVJIG9uIHB1YmxpYyBjbG91ZC5cIiwgXCJsaW5rXCI6IFwiaHR0cHM6Ly9hY2Nlc3MucmVkaGF0LmNvbS9zb2x1dGlvbnMvNjEzODM3MlwifSIsCiAgICAgICJ0aXRsZSI6ICJBIHN1YnNjcmlwdGlvbi1tYW5hZ2VyIGNvbW1hbmQgZmFpbGVkIHRvIGV4ZWN1dGUiLAogICAgICAidGltZVN0YW1wIjogIjIwMjUtMTEtMTJUMDg6NDQ6MTQuMTY4MDg4WiIsCiAgICAgICJob3N0bmFtZSI6ICJtYW5hZ2VkLW5vZGUwMSIsCiAgICAgICJhY3RvciI6ICJzY2FuX3N1YnNjcmlwdGlvbl9tYW5hZ2VyX2luZm8iLAogICAgICAiaWQiOiAiZTA3MzNhYzU0ZDMwZmMwMGRkYzI0OGM5YzFjMjVjMDkzZWM4MjYzMzlhNGFmMjViZTg0M2NhNGM4MWY5ZWU0OCIKICAgIH0sCiAgICB7CiAgICAgICJhdWRpZW5jZSI6ICJzeXNhZG1pbiIsCiAgICAgICJkZXRhaWwiOiB7CiAgICAgICAgInJlbWVkaWF0aW9ucyI6IFsKICAgICAgICAgIHsKICAgICAgICAgICAgImNvbnRleHQiOiAiSWYgc29tZSBvZiBleGNsdWRlZCByZXBvc2l0b3JpZXMgYXJlIHN0aWxsIHJlcXVpcmVkIHRvIGJlIHVzZWQgZHVyaW5nIHRoZSB1cGdyYWRlLCBleGVjdXRlIGxlYXBwIHdpdGggdGhlIC0tZW5hYmxlcmVwbyBvcHRpb24gd2l0aCB0aGUgcmVwb2lkIG9mIHRoZSByZXBvc2l0b3J5IHJlcXVpcmVkIHRvIGJlIGVuYWJsZWQgYXMgYW4gYXJndW1lbnQgKHRoZSBvcHRpb24gY2FuIGJlIHVzZWQgbXVsdGlwbGUgdGltZXMpLiIsCiAgICAgICAgICAgICJ0eXBlIjogImhpbnQiCiAgICAgICAgICB9CiAgICAgICAgXQogICAgICB9LAogICAgICAiZ3JvdXBzIjogWwogICAgICAgICJyZXBvc2l0b3J5IiwKICAgICAgICAiZmFpbHVyZSIKICAgICAgXSwKICAgICAgImtleSI6ICIxYjkxMzJjYjIzNjJhZTc4MzBlNDhlZWU3ODExYmU5NTI3NzQ3ZGU4IiwKICAgICAgInNldmVyaXR5IjogImluZm8iLAogICAgICAic3VtbWFyeSI6ICJUaGUgZm9sbG93aW5nIHJlcG9zaXRvcmllcyBhcmUgbm90IHN1cHBvcnRlZCBieSBSZWQgSGF0IGFuZCBhcmUgZXhjbHVkZWQgZnJvbSB0aGUgbGlzdCBvZiByZXBvc2l0b3JpZXMgdXNlZCBkdXJpbmcgdGhlIHVwZ3JhZGUuXG4tIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLXMzOTB4LXJwbXNcbi0gY29kZXJlYWR5LWJ1aWxkZXItZm9yLXJoZWwtMTAteDg2XzY0LXJwbXNcbi0gY29kZXJlYWR5LWJ1aWxkZXItZm9yLXJoZWwtMTAtcHBjNjRsZS1ycG1zXG4tIGNvZGVyZWFkeS1idWlsZGVyLWJldGEtZm9yLXJoZWwtMTAtcHBjNjRsZS1ycG1zXG4tIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLWFhcmNoNjQtcnBtc1xuLSBjb2RlcmVhZHktYnVpbGRlci1iZXRhLWZvci1yaGVsLTEwLWFhcmNoNjQtcnBtc1xuLSBjb2RlcmVhZHktYnVpbGRlci1iZXRhLWZvci1yaGVsLTEwLXMzOTB4LXJwbXNcbi0gY29kZXJlYWR5LWJ1aWxkZXItYmV0YS1mb3ItcmhlbC0xMC14ODZfNjQtcnBtcyIsCiAgICAgICJ0aXRsZSI6ICJFeGNsdWRlZCB0YXJnZXQgc3lzdGVtIHJlcG9zaXRvcmllcyIsCiAgICAgICJ0aW1lU3RhbXAiOiAiMjAyNS0xMS0xMlQwODo0NDoxNC4zODgxNzFaIiwKICAgICAgImhvc3RuYW1lIjogIm1hbmFnZWQtbm9kZTAxIiwKICAgICAgImFjdG9yIjogInJlcG9zaXRvcmllc19ibGFja2xpc3QiLAogICAgICAiaWQiOiAiZjQ3YmU0MzUxOGY2MzE5NWZkMjIxYzAwOTNjMzJmMDk3ZjFlNDY5MTU5YmE4YzA3MGM2ZTAwMGFhOWE1MmQzOCIKICAgIH0KICBdLAogICJsZWFwcF9ydW5faWQiOiAiZTY1M2I4MGItNmFlNS00NTM3LThlYTktZDNjOWM3MzRlNDU0Igp9Cg==", "encoding": "base64", "source": "/var/log/leapp/leapp-report.json"} TASK 
[infra.leapp.parse_leapp_report : Parse report results] ******************* task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:16 ok: [managed-node01] => {"ansible_facts": {"leapp_report_json": {"entries": [{"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "e0733ac54d30fc00ddc248c9c1c25c093ec826339a4af25be843ca4c81f9ee48", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "timeStamp": "2025-11-12T08:44:14.168088Z", "title": "A subscription-manager command failed to execute"}, {"actor": "repositories_blacklist", "audience": "sysadmin", "detail": {"remediations": [{"context": "If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).", "type": "hint"}]}, "groups": ["repository", "failure"], "hostname": "managed-node01", "id": "f47be43518f63195fd221c0093c32f097f1e469159ba8c070c6e000aa9a52d38", "key": "1b9132cb2362ae7830e48eee7811be9527747de8", "severity": "info", "summary": "The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- codeready-builder-for-rhel-10-s390x-rpms\n- codeready-builder-for-rhel-10-x86_64-rpms\n- codeready-builder-for-rhel-10-ppc64le-rpms\n- codeready-builder-beta-for-rhel-10-ppc64le-rpms\n- codeready-builder-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-s390x-rpms\n- codeready-builder-beta-for-rhel-10-x86_64-rpms", "timeStamp": "2025-11-12T08:44:14.388171Z", "title": "Excluded target system repositories"}], "leapp_run_id": "e653b80b-6ae5-4537-8ea9-d3c9c734e454"}, "leapp_report_txt": ["Risk Factor: high (error)", "Title: A subscription-manager command failed to execute", "Summary: {\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. 
Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "Key: 7ec8269784db1bba2ac54ae438689ef397e16833", "----------------------------------------", "Risk Factor: info ", "Title: Excluded target system repositories", "Summary: The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.", "- codeready-builder-for-rhel-10-s390x-rpms", "- codeready-builder-for-rhel-10-x86_64-rpms", "- codeready-builder-for-rhel-10-ppc64le-rpms", "- codeready-builder-beta-for-rhel-10-ppc64le-rpms", "- codeready-builder-for-rhel-10-aarch64-rpms", "- codeready-builder-beta-for-rhel-10-aarch64-rpms", "- codeready-builder-beta-for-rhel-10-s390x-rpms", "- codeready-builder-beta-for-rhel-10-x86_64-rpms", "Remediation: [hint] If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).", "Key: 1b9132cb2362ae7830e48eee7811be9527747de8", "----------------------------------------", ""]}, "changed": false} TASK [infra.leapp.parse_leapp_report : Check for inhibitors] ******************* task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:22 ok: [managed-node01] => (item={'audience': 'sysadmin', 'groups': ['error'], 'key': '7ec8269784db1bba2ac54ae438689ef397e16833', 'severity': 'high', 'summary': '{"details": "Command [\'subscription-manager\', \'release\'] failed with exit code 1.", "stderr": "This system is not yet registered. Try \'subscription-manager register --help\' for more information.\\n", "hint": "Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.", "link": "https://access.redhat.com/solutions/6138372"}', 'title': 'A subscription-manager command failed to execute', 'timeStamp': '2025-11-12T08:44:14.168088Z', 'hostname': 'managed-node01', 'actor': 'scan_subscription_manager_info', 'id': 'e0733ac54d30fc00ddc248c9c1c25c093ec826339a4af25be843ca4c81f9ee48'}) => {"ansible_facts": {"leapp_inhibitors": [{"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "e0733ac54d30fc00ddc248c9c1c25c093ec826339a4af25be843ca4c81f9ee48", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. 
Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "timeStamp": "2025-11-12T08:44:14.168088Z", "title": "A subscription-manager command failed to execute"}], "upgrade_inhibited": true}, "ansible_loop_var": "item", "changed": false, "item": {"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "e0733ac54d30fc00ddc248c9c1c25c093ec826339a4af25be843ca4c81f9ee48", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "timeStamp": "2025-11-12T08:44:14.168088Z", "title": "A subscription-manager command failed to execute"}} skipping: [managed-node01] => (item={'audience': 'sysadmin', 'detail': {'remediations': [{'context': 'If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).', 'type': 'hint'}]}, 'groups': ['repository', 'failure'], 'key': '1b9132cb2362ae7830e48eee7811be9527747de8', 'severity': 'info', 'summary': 'The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- codeready-builder-for-rhel-10-s390x-rpms\n- codeready-builder-for-rhel-10-x86_64-rpms\n- codeready-builder-for-rhel-10-ppc64le-rpms\n- codeready-builder-beta-for-rhel-10-ppc64le-rpms\n- codeready-builder-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-s390x-rpms\n- codeready-builder-beta-for-rhel-10-x86_64-rpms', 'title': 'Excluded target system repositories', 'timeStamp': '2025-11-12T08:44:14.388171Z', 'hostname': 'managed-node01', 'actor': 'repositories_blacklist', 'id': 'f47be43518f63195fd221c0093c32f097f1e469159ba8c070c6e000aa9a52d38'}) => {"ansible_loop_var": "item", "changed": false, "item": {"actor": "repositories_blacklist", "audience": "sysadmin", "detail": {"remediations": [{"context": "If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).", "type": "hint"}]}, "groups": ["repository", "failure"], "hostname": "managed-node01", "id": "f47be43518f63195fd221c0093c32f097f1e469159ba8c070c6e000aa9a52d38", "key": "1b9132cb2362ae7830e48eee7811be9527747de8", "severity": "info", "summary": "The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- 
codeready-builder-for-rhel-10-s390x-rpms\n- codeready-builder-for-rhel-10-x86_64-rpms\n- codeready-builder-for-rhel-10-ppc64le-rpms\n- codeready-builder-beta-for-rhel-10-ppc64le-rpms\n- codeready-builder-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-s390x-rpms\n- codeready-builder-beta-for-rhel-10-x86_64-rpms", "timeStamp": "2025-11-12T08:44:14.388171Z", "title": "Excluded target system repositories"}, "skip_reason": "Conditional result was False"} TASK [infra.leapp.parse_leapp_report : Collect inhibitors] ********************* task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:34 ok: [managed-node01] => {"changed": false, "cmd": ["awk", "/\\(inhibitor\\)/,/^-------/", "/var/log/leapp/leapp-report.txt"], "delta": "0:00:00.004508", "end": "2025-11-12 03:45:28.825316", "failed_when_result": false, "msg": "", "rc": 0, "start": "2025-11-12 03:45:28.820808", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [infra.leapp.parse_leapp_report : Collect high errors] ******************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/parse_leapp_report/tasks/main.yml:43 ok: [managed-node01] => {"changed": false, "cmd": ["awk", "/high \\(error\\)/,/^-------/", "/var/log/leapp/leapp-report.txt"], "delta": "0:00:00.004531", "end": "2025-11-12 03:45:29.170958", "failed_when_result": false, "msg": "", "rc": 0, "start": "2025-11-12 03:45:29.166427", "stderr": "", "stderr_lines": [], "stdout": "Risk Factor: high (error)\nTitle: A subscription-manager command failed to execute\nSummary: {\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}\nKey: 7ec8269784db1bba2ac54ae438689ef397e16833\n----------------------------------------", "stdout_lines": ["Risk Factor: high (error)", "Title: A subscription-manager command failed to execute", "Summary: {\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. 
Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "Key: 7ec8269784db1bba2ac54ae438689ef397e16833", "----------------------------------------"]}

TASK [infra.leapp.upgrade : leapp-upgrade | Verify no inhibitor results found during preupgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:7
fatal: [managed-node01]: FAILED! => {
    "assertion": "not upgrade_inhibited",
    "changed": false,
    "evaluated_to": false,
    "msg": "Inhibitors found, please investigate and rerun analysis."
}

PLAY RECAP *********************************************************************
managed-node01 : ok=17 changed=2 unreachable=0 failed=1 skipped=2 rescued=0 ignored=0
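
The failed assertion traces back to the high (error) finding above: the host is not registered with subscription-manager, so leapp's scan_subscription_manager_info actor could not run 'subscription-manager release'. A minimal sketch of a remediation task that could be run before repeating the analysis, assuming activation-key registration is used; rhsm_org_id and rhsm_activation_key are placeholder variables, not values from this run, and per the report hint, running leapp with --no-rhsm is the alternative when subscription-manager is not wanted:

    - name: Register the host so leapp can query subscription-manager
      community.general.redhat_subscription:
        state: present
        org_id: "{{ rhsm_org_id }}"                 # placeholder variable
        activationkey: "{{ rhsm_activation_key }}"  # placeholder variable
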
Nov 12 03:45:23 managed-node01 python3[23766]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 12 03:45:23 managed-node01 python3[23920]: ansible-ansible.builtin.file Invoked with path=/var/log/ripu state=directory owner=root group=root mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 12 03:45:24 managed-node01 python3[24045]: ansible-ansible.builtin.stat Invoked with path=/var/log/ripu/ripu.log follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 12 03:45:24 managed-node01 python3[24170]: ansible-ansible.legacy.stat Invoked with path=/var/log/ripu/ripu.log follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 12 03:45:25 managed-node01 python3[24270]: ansible-ansible.legacy.copy Invoked with dest=/var/log/ripu/ripu.log owner=root group=root mode=0644 src=/root/.ansible/tmp/ansible-tmp-1762937124.4902751-8779-152821623607786/source _original_basename=tmpjv17di7n follow=False checksum=10de7262f2b426fb0988d9dbe039dc88c32230b8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 12 03:45:25 managed-node01 python3[24395]: ansible-ansible.builtin.file Invoked with path=/etc/ansible/facts.d state=directory mode=0755 owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 12 03:45:25 managed-node01 python3[24520]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/pre_ripu.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 12 03:45:26 managed-node01 python3[24622]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/pre_ripu.fact mode=0644 owner=root group=root src=/root/.ansible/tmp/ansible-tmp-1762937125.5994902-8807-118245584112958/source _original_basename=tmpxrwv5ysh follow=False checksum=41d0c9182bde61145b6f28512b1df07c09893d85 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 12 03:45:26 managed-node01 python3[24747]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\.|+]el9' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 12 03:45:27 managed-node01 python3[24877]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/non_rhel_packages.fact follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 12 03:45:27 managed-node01 python3[24940]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/etc/ansible/facts.d/non_rhel_packages.fact _original_basename=tmpj4v6yxeg recurse=False state=file path=/etc/ansible/facts.d/non_rhel_packages.fact force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 12 03:45:28 managed-node01 python3[25065]: ansible-ansible.builtin.slurp Invoked with src=/var/log/leapp/leapp-report.txt
Nov 12 03:45:28 managed-node01 python3[25190]: ansible-ansible.builtin.slurp Invoked with src=/var/log/leapp/leapp-report.json
Nov 12 03:45:28 managed-node01 python3[25315]: ansible-ansible.legacy.command Invoked with _raw_params=awk '/\(inhibitor\)/,/^-------/' /var/log/leapp/leapp-report.txt _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 12 03:45:29 managed-node01 python3[25441]: ansible-ansible.legacy.command Invoked with _raw_params=awk '/high \(error\)/,/^-------/' /var/log/leapp/leapp-report.txt _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 12 03:45:29 managed-node01 sshd[25463]: Accepted publickey for root from 10.31.9.240 port 35532 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 12 03:45:29 managed-node01 systemd-logind[601]: New session 20 of user root.
░░ Subject: A new session 20 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 20 has been created for the user root.
░░
░░ The leading process of the session is 25463.
Nov 12 03:45:29 managed-node01 systemd[1]: Started Session 20 of User root.
░░ Subject: A start job for unit session-20.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-20.scope has finished successfully.
░░
░░ The job identifier is 2543.
Nov 12 03:45:29 managed-node01 sshd[25463]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Nov 12 03:45:29 managed-node01 sshd[25466]: Received disconnect from 10.31.9.240 port 35532:11: disconnected by user
Nov 12 03:45:29 managed-node01 sshd[25466]: Disconnected from user root 10.31.9.240 port 35532
Nov 12 03:45:29 managed-node01 sshd[25463]: pam_unix(sshd:session): session closed for user root
Nov 12 03:45:29 managed-node01 systemd[1]: session-20.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-20.scope has successfully entered the 'dead' state.
Nov 12 03:45:29 managed-node01 systemd-logind[601]: Session 20 logged out. Waiting for processes to exit.
Nov 12 03:45:29 managed-node01 systemd-logind[601]: Removed session 20.
░░ Subject: Session 20 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 20 has been terminated.
Nov 12 03:45:29 managed-node01 sshd[25487]: Accepted publickey for root from 10.31.9.240 port 35548 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Nov 12 03:45:29 managed-node01 systemd-logind[601]: New session 21 of user root.
░░ Subject: A new session 21 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 21 has been created for the user root.
░░
░░ The leading process of the session is 25487.
Nov 12 03:45:29 managed-node01 systemd[1]: Started Session 21 of User root.
░░ Subject: A start job for unit session-21.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-21.scope has finished successfully.
░░
░░ The job identifier is 2628.
Nov 12 03:45:29 managed-node01 sshd[25487]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
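
For reference, the inhibitor check that produced the failure above follows the flow visible in the parse_leapp_report tasks: slurp /var/log/leapp/leapp-report.json, decode and parse it, flag the upgrade as inhibited when any entry is grouped as an inhibitor or error, then assert. A condensed sketch reconstructed from this log output, not the role's actual source; the register variable name and the exact grouping condition are inferred:

    - name: Collect JSON report results
      ansible.builtin.slurp:
        src: /var/log/leapp/leapp-report.json
      register: leapp_report_json_slurp   # inferred name

    - name: Parse report results
      ansible.builtin.set_fact:
        leapp_report_json: "{{ leapp_report_json_slurp.content | b64decode | from_json }}"

    - name: Check for inhibitors
      ansible.builtin.set_fact:
        upgrade_inhibited: true
      loop: "{{ leapp_report_json.entries }}"
      when: "'inhibitor' in item.groups or 'error' in item.groups"   # inferred condition

    - name: Verify no inhibitor results found during preupgrade
      ansible.builtin.assert:
        that: not upgrade_inhibited
        msg: Inhibitors found, please investigate and rerun analysis.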