[WARNING]: Collection infra.leapp does not support Ansible version 2.14.18
[WARNING]: running playbook inside collection infra.leapp
ansible-playbook [core 2.14.18]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.9/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible-playbook
  python version = 3.9.25 (main, Nov 10 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3)
  jinja version = 3.1.2
  libyaml = True
Using /etc/ansible/ansible.cfg as config file
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tests/tests_default.yml

PLAY [Test] ********************************************************************

TASK [Gathering Facts] *********************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tests/tests_default.yml:2
ok: [managed-node01]

TASK [Test | Run role upgrade] *************************************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tests/tests_default.yml:10

TASK [infra.leapp.common : Log directory exists] *******************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:3
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/var/log/ripu", "secontext": "unconfined_u:object_r:var_log_t:s0", "size": 70, "state": "directory", "uid": 0}
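Further on in this run, the common role captures "non-rhel versioned packages" with an rpm/grep pipeline (`rpm -qa | grep -ve '[\.|+]el9' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort`). A minimal, self-contained sketch of that filter, using a made-up package list in place of the real `rpm -qa` output:

```shell
# Stand-in for `rpm -qa` output; these package names are illustrative only.
printf '%s\n' \
  'bash-5.1.8-9.el9.x86_64' \
  'gpg-pubkey-fd431d51-4ae0493b' \
  'custom-agent-2.4-1.x86_64' \
  'kernel-5.14.0-503.11.1.el9_5.x86_64' |
  grep -ve '[\.|+]el9' |                                      # drop RHEL 9 dist-tagged packages
  grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' |  # drop known non-package / tooling entries
  sort
# → custom-agent-2.4-1.x86_64
```

Note the `rc: 1` with `failed_when_result: false` in the logged result: on a clean system the first `grep` matches nothing and exits 1, which the role tolerates rather than treating as a failure.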
TASK [infra.leapp.common : Check for existing log file] ************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:11
ok: [managed-node01] => {"changed": false, "stat": {"exists": false}}

TASK [infra.leapp.common : Fail if log file already exists] ********************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:16
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.common : Create new log file] ********************************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:23
NOTIFIED HANDLER infra.leapp.common : Check for log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Add end time to log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Slurp ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Decode ripu.log file for managed-node01
NOTIFIED HANDLER infra.leapp.common : Rename log file for managed-node01
changed: [managed-node01] => {"changed": true, "checksum": "db43f4d5a1fa22b47bc6cb6aec3096d42aff147a", "dest": "/var/log/ripu/ripu.log", "gid": 0, "group": "root", "md5sum": "4d354104b0a9cb78c75a143d802b0bd7", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:var_log_t:s0", "size": 61, "src": "/root/.ansible/tmp/ansible-tmp-1764751800.1777818-8784-98660230793097/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : /etc/ansible/facts.d directory exists] **************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:35
ok: [managed-node01] => {"changed": false, "gid": 0, "group": "root", "mode": "0755", "owner": "root", "path": "/etc/ansible/facts.d", "secontext": "unconfined_u:object_r:etc_t:s0", "size": 57, "state": "directory", "uid": 0}

TASK [infra.leapp.common : Capture current ansible_facts for validation after upgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:43
changed: [managed-node01] => {"changed": true, "checksum": "3e43056650fd838d0e3304d9466d22375a0c3894", "dest": "/etc/ansible/facts.d/pre_ripu.fact", "gid": 0, "group": "root", "md5sum": "97c2bfc36bea702ca22dbb879d595e78", "mode": "0644", "owner": "root", "secontext": "system_u:object_r:etc_t:s0", "size": 15006, "src": "/root/.ansible/tmp/ansible-tmp-1764751801.3317106-8812-166473042912510/source", "state": "file", "uid": 0}

TASK [infra.leapp.common : Capture a list of non-rhel versioned packages] ******
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:51
ok: [managed-node01] => {"changed": false, "cmd": "set -o pipefail; export PATH=$PATH; rpm -qa | grep -ve '[\\.|+]el9' | grep -vE '^(gpg-pubkey|libmodulemd|katello-ca-consumer)' | sort", "delta": "0:00:00.215269", "end": "2025-12-03 03:50:02.589874", "failed_when_result": false, "msg": "non-zero return code", "rc": 1, "start": "2025-12-03 03:50:02.374605", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.common : Create fact with the non-rhel versioned packages list] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:65
ok: [managed-node01] => {"ansible_facts": {"non_rhel_packages": []}, "changed": false}

TASK [infra.leapp.common : Capture the list of non-rhel versioned packages in a separate fact file] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/main.yml:69
ok: [managed-node01] => {"changed": false, "checksum": "97d170e1550eee4afc0af065b78cda302a97674c", "dest": "/etc/ansible/facts.d/non_rhel_packages.fact", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "path": "/etc/ansible/facts.d/non_rhel_packages.fact", "secontext": "system_u:object_r:etc_t:s0", "size": 2, "state": "file", "uid": 0}

TASK [infra.leapp.upgrade : Include tasks for upgrade using redhat-upgrade-tool] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/main.yml:9
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.upgrade : Include tasks for leapp upgrade] *******************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/main.yml:13
[WARNING]: Collection community.general does not support Ansible version 2.14.18
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml for managed-node01

TASK [leapp-upgrade | Run parse_leapp_report to check for inhibitors] **********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:2
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.upgrade : leapp-upgrade | Verify no inhibitor results found during preupgrade] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:8
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.upgrade : leapp-upgrade | Register to leapp activation key] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:14
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [leapp-upgrade | Include custom_local_repos for local_repos_pre_leapp] ****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:25
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.upgrade : leapp-upgrade | Install packages for upgrade from RHEL 7] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:33
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.upgrade : leapp-upgrade | Install packages for upgrade from RHEL 8] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:40
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.upgrade : leapp-upgrade | Install packages for upgrade from RHEL 9] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:47
ok: [managed-node01] => {"changed": false, "msg": "Nothing to do", "rc": 0, "results": []}

TASK [infra.leapp.upgrade : leapp-upgrade | Include update-and-reboot.yml] *****
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:54
included: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/update-and-reboot.yml for managed-node01

TASK [infra.leapp.upgrade : update-and-reboot | Ensure all updates are applied] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/update-and-reboot.yml:2
ASYNC OK on managed-node01: jid=j250291207079.48167
changed: [managed-node01] => {"ansible_job_id": "j250291207079.48167", "changed": true, "finished": 1, "msg": "", "rc": 0, "results": ["Installed: python3-libxml2-2.9.13-12.el9_6.1.x86_64", "Installed: libxml2-2.9.13-12.el9_6.1.x86_64", "Removed: python3-libxml2-2.9.13-12.el9_6.x86_64", "Removed: libxml2-2.9.13-12.el9_6.x86_64"], "results_file": "/root/.ansible_async/j250291207079.48167", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}

TASK [infra.leapp.upgrade : update-and-reboot | Reboot when updates applied] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/update-and-reboot.yml:10
changed: [managed-node01] => {"changed": true, "elapsed": 120, "rebooted": true}

TASK [leapp-upgrade | Create /etc/leapp/files/leapp_upgrade_repositories.repo] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:58
skipping: [managed-node01] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [infra.leapp.upgrade : leapp-upgrade | Include disable-previous-repo-files.yml] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:69
skipping: [managed-node01] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.upgrade : leapp-upgrade | Include rmmod-kernel-modules.yml] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:75
skipping: [managed-node01] => {"changed": false, "skipped_reason": "No items in the list"}

TASK [infra.leapp.upgrade : leapp-upgrade | Start Leapp OS upgrade] ************
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:81
ASYNC FAILED on managed-node01: jid=j627872270803.4401
fatal: [managed-node01]: FAILED!
=> {"ansible_job_id": "j627872270803.4401", "changed": true, "cmd": "set -o pipefail; export PATH=$PATH; ulimit -n 16384; leapp upgrade --report-schema=1.2.0 2>&1 | tee -a /var/log/ripu/ripu.log\n", "delta": "0:00:27.502241", "end": "2025-12-03 03:53:38.103161", "finished": 1, "msg": "non-zero return code", "rc": 1, "results_file": "/root/.ansible_async/j627872270803.4401", "start": "2025-12-03 03:53:10.600920", "started": 1, "stderr": "", "stderr_lines": [], "stdout": "==> Processing phase `configuration_phase`\n====> * ipu_workflow_config\n IPU workflow config actor\n==> Processing phase `FactsCollection`\n====> * system_facts\n Provides data about many facts from system.\n====> * get_enabled_modules\n Provides data about which module streams are enabled on the source system.\n====> * scan_systemd_source\n Provides info about systemd on the source system\n====> * repository_mapping\n Produces message containing repository mapping based on provided file.\n====> * ifcfg_scanner\n Scan ifcfg files with legacy network configuration\n====> * transaction_workarounds\n Provides additional RPM transaction tasks based on bundled RPM packages.\n====> * selinuxcontentscanner\n Scan the system for any SELinux customizations\n====> * root_scanner\n Scan the system root directory and produce a message containing\n====> * rpm_scanner\n Provides data about installed RPM Packages.\n====> * udevadm_info\n Produces data exported by the \"udevadm info\" command.\n====> * trusted_gpg_keys_scanner\n Scan for trusted GPG keys.\n====> * satellite_upgrade_services\n Reconfigure Satellite services\n====> * scanclienablerepo\n Produce CustomTargetRepository based on the LEAPP_ENABLE_REPOS in config.\n====> * distribution_signed_rpm_scanner\n Provide data about distribution signed & unsigned RPM packages.\n====> * scancryptopolicies\n Scan information about system wide set crypto policies including:\n====> * scan_custom_modifications_actor\n Collects information about files in leapp 
directories that have been modified or newly added.\n====> * scan_custom_repofile\n Scan the custom /etc/leapp/files/leapp_upgrade_repositories.repo repo file.\n====> * scandasd\n In case of s390x architecture, check whether DASD is used.\n====> * scan_defined_ipu_paths\n Load defined IPU paths for the current major source system version\n====> * scan_dynamic_linker_configuration\n Scan the dynamic linker configuration and find modifications.\n====> * load_device_driver_deprecation_data\n Loads deprecation data for drivers and devices (PCI & CPU)\n====> * scan_mysql\n Actor checking for presence of MySQL installation.\n====> * scan_files_for_target_userspace\n Scan the source system and identify files that will be copied into the target userspace when it is created.\n====> * migrate_rpm_db\n Register a workaround to migrate RPM DB during the upgrade.\n====> * scan_grub_config\n Scan grub configuration files for errors.\n====> * scan_grub_device_name\n Find the name of the block devices where GRUB is located\n====> * network_manager_read_config\n Provides data about NetworkManager configuration.\n====> * scan_kernel_cmdline\n No documentation has been provided for the scan_kernel_cmdline actor.\n====> * read_openssh_config\n Collect information about the OpenSSH configuration.\n====> * scanmemory\n Scan Memory of the machine.\n====> * scan_pkg_manager\n Provides data about package manager (yum/dnf)\n====> * pci_devices_scanner\n Provides data about existing PCI Devices.\n====> * scan_sap_hana\n Gathers information related to SAP HANA instances on the system.\n====> * scan_pam_user_db\n Scan the PAM service folder for the location of pam_userdb databases\n====> * scan_source_files\n Scan files (explicitly specified) of the source system.\n====> * open_ssl_config_scanner\n Read an OpenSSL configuration file for further analysis.\n====> * scan_source_kernel\n Scan the source system kernel.\n====> * persistentnetnames\n Get network interface information for physical 
ethernet interfaces of the original system.\n====> * scan_subscription_manager_info\n Scans the current system for subscription manager information\n====> * scan_target_os_image\n Scans the provided target OS ISO image to use as a content source for the IPU, if any.\n====> * remove_obsolete_gpg_keys\n Remove obsoleted RPM GPG keys.\n====> * register_ruby_irb_adjustment\n Register a workaround to allow rubygem-irb's symlink -> directory conversion.\n====> * scanzfcp\n In case of s390x architecture, check whether ZFCP is used.\n====> * persistentnetnamesdisable\n Disable systemd-udevd persistent network naming on machine with single eth0 NIC\n====> * copy_dnf_conf_into_target_userspace\n Copy dnf.conf into target userspace\n====> * storage_scanner\n Provides data about storage settings.\n====> * repositories_blacklist\n Exclude target repositories provided by Red Hat without support.\n====> * get_installed_desktops\n Actor checks if kde or gnome desktop environments\n====> * checkrhui\n Check if system is using RHUI infrastructure (on public cloud) and send messages to\n====> * biosdevname\n Enable biosdevname on the target RHEL system if all interfaces on the source RHEL\n====> * rpm_transaction_config_tasks_collector\n Provides additional RPM transaction tasks from /etc/leapp/transaction.\n====> * ipa_scanner\n Scan system for ipa-client and ipa-server status\n====> * used_repository_scanner\n Scan used enabled repositories\n====> * scancpu\n Scan CPUs of the machine.\n====> * xfs_info_scanner\n This actor scans all mounted mountpoints for XFS information.\n====> * luks_scanner\n Provides data about active LUKS devices.\n====> * detect_kernel_drivers\n Matches all currently loaded kernel drivers against known deprecated and removed drivers.\n====> * scan_fips\n Determine whether the source system has FIPS enabled.\n====> * pes_events_scanner\n Provides data about package events from Package Evolution Service.\n====> * setuptargetrepos\n Produces list of 
repositories that should be available to be used by Upgrade process.\n\n============================================================\n ERRORS \n============================================================\n\n2025-12-03 03:53:35.606657 [ERROR] Actor: scan_subscription_manager_info\nMessage: A subscription-manager command failed to execute\nSummary:\n Details: Command ['subscription-manager', 'release'] failed with exit code 1.\n Stderr: This system is not yet registered. Try 'subscription-manager register --help' for more information.\n Hint: Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\n Link: https://access.redhat.com/solutions/6138372\n\n============================================================\n END OF ERRORS \n============================================================\n\nDebug output written to /var/log/leapp/leapp-upgrade.log\n\n============================================================\n REPORT OVERVIEW \n============================================================\n\nFollowing errors occurred and the upgrade cannot continue:\n 1. 
Actor: scan_subscription_manager_info\n Message: A subscription-manager command failed to execute\n\nReports summary:\n Errors: 1\n Inhibitors: 0\n HIGH severity reports: 0\n MEDIUM severity reports: 0\n LOW severity reports: 0\n INFO severity reports: 1\n\nBefore continuing, review the full report below for details about discovered problems and possible remediation instructions:\n A report has been generated at /var/log/leapp/leapp-report.txt\n A report has been generated at /var/log/leapp/leapp-report.json\n\n============================================================\n END OF REPORT OVERVIEW \n============================================================\n\nAnswerfile has been generated at /var/log/leapp/answerfile", "stdout_lines": ["==> Processing phase `configuration_phase`", "====> * ipu_workflow_config", " IPU workflow config actor", "==> Processing phase `FactsCollection`", "====> * system_facts", " Provides data about many facts from system.", "====> * get_enabled_modules", " Provides data about which module streams are enabled on the source system.", "====> * scan_systemd_source", " Provides info about systemd on the source system", "====> * repository_mapping", " Produces message containing repository mapping based on provided file.", "====> * ifcfg_scanner", " Scan ifcfg files with legacy network configuration", "====> * transaction_workarounds", " Provides additional RPM transaction tasks based on bundled RPM packages.", "====> * selinuxcontentscanner", " Scan the system for any SELinux customizations", "====> * root_scanner", " Scan the system root directory and produce a message containing", "====> * rpm_scanner", " Provides data about installed RPM Packages.", "====> * udevadm_info", " Produces data exported by the \"udevadm info\" command.", "====> * trusted_gpg_keys_scanner", " Scan for trusted GPG keys.", "====> * satellite_upgrade_services", " Reconfigure Satellite services", "====> * scanclienablerepo", " Produce CustomTargetRepository based 
on the LEAPP_ENABLE_REPOS in config.", "====> * distribution_signed_rpm_scanner", " Provide data about distribution signed & unsigned RPM packages.", "====> * scancryptopolicies", " Scan information about system wide set crypto policies including:", "====> * scan_custom_modifications_actor", " Collects information about files in leapp directories that have been modified or newly added.", "====> * scan_custom_repofile", " Scan the custom /etc/leapp/files/leapp_upgrade_repositories.repo repo file.", "====> * scandasd", " In case of s390x architecture, check whether DASD is used.", "====> * scan_defined_ipu_paths", " Load defined IPU paths for the current major source system version", "====> * scan_dynamic_linker_configuration", " Scan the dynamic linker configuration and find modifications.", "====> * load_device_driver_deprecation_data", " Loads deprecation data for drivers and devices (PCI & CPU)", "====> * scan_mysql", " Actor checking for presence of MySQL installation.", "====> * scan_files_for_target_userspace", " Scan the source system and identify files that will be copied into the target userspace when it is created.", "====> * migrate_rpm_db", " Register a workaround to migrate RPM DB during the upgrade.", "====> * scan_grub_config", " Scan grub configuration files for errors.", "====> * scan_grub_device_name", " Find the name of the block devices where GRUB is located", "====> * network_manager_read_config", " Provides data about NetworkManager configuration.", "====> * scan_kernel_cmdline", " No documentation has been provided for the scan_kernel_cmdline actor.", "====> * read_openssh_config", " Collect information about the OpenSSH configuration.", "====> * scanmemory", " Scan Memory of the machine.", "====> * scan_pkg_manager", " Provides data about package manager (yum/dnf)", "====> * pci_devices_scanner", " Provides data about existing PCI Devices.", "====> * scan_sap_hana", " Gathers information related to SAP HANA instances on the system.", "====> * 
scan_pam_user_db", " Scan the PAM service folder for the location of pam_userdb databases", "====> * scan_source_files", " Scan files (explicitly specified) of the source system.", "====> * open_ssl_config_scanner", " Read an OpenSSL configuration file for further analysis.", "====> * scan_source_kernel", " Scan the source system kernel.", "====> * persistentnetnames", " Get network interface information for physical ethernet interfaces of the original system.", "====> * scan_subscription_manager_info", " Scans the current system for subscription manager information", "====> * scan_target_os_image", " Scans the provided target OS ISO image to use as a content source for the IPU, if any.", "====> * remove_obsolete_gpg_keys", " Remove obsoleted RPM GPG keys.", "====> * register_ruby_irb_adjustment", " Register a workaround to allow rubygem-irb's symlink -> directory conversion.", "====> * scanzfcp", " In case of s390x architecture, check whether ZFCP is used.", "====> * persistentnetnamesdisable", " Disable systemd-udevd persistent network naming on machine with single eth0 NIC", "====> * copy_dnf_conf_into_target_userspace", " Copy dnf.conf into target userspace", "====> * storage_scanner", " Provides data about storage settings.", "====> * repositories_blacklist", " Exclude target repositories provided by Red Hat without support.", "====> * get_installed_desktops", " Actor checks if kde or gnome desktop environments", "====> * checkrhui", " Check if system is using RHUI infrastructure (on public cloud) and send messages to", "====> * biosdevname", " Enable biosdevname on the target RHEL system if all interfaces on the source RHEL", "====> * rpm_transaction_config_tasks_collector", " Provides additional RPM transaction tasks from /etc/leapp/transaction.", "====> * ipa_scanner", " Scan system for ipa-client and ipa-server status", "====> * used_repository_scanner", " Scan used enabled repositories", "====> * scancpu", " Scan CPUs of the machine.", "====> * 
xfs_info_scanner", " This actor scans all mounted mountpoints for XFS information.", "====> * luks_scanner", " Provides data about active LUKS devices.", "====> * detect_kernel_drivers", " Matches all currently loaded kernel drivers against known deprecated and removed drivers.", "====> * scan_fips", " Determine whether the source system has FIPS enabled.", "====> * pes_events_scanner", " Provides data about package events from Package Evolution Service.", "====> * setuptargetrepos", " Produces list of repositories that should be available to be used by Upgrade process.", "", "============================================================", " ERRORS ", "============================================================", "", "2025-12-03 03:53:35.606657 [ERROR] Actor: scan_subscription_manager_info", "Message: A subscription-manager command failed to execute", "Summary:", " Details: Command ['subscription-manager', 'release'] failed with exit code 1.", " Stderr: This system is not yet registered. Try 'subscription-manager register --help' for more information.", " Hint: Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.", " Link: https://access.redhat.com/solutions/6138372", "", "============================================================", " END OF ERRORS ", "============================================================", "", "Debug output written to /var/log/leapp/leapp-upgrade.log", "", "============================================================", " REPORT OVERVIEW ", "============================================================", "", "Following errors occurred and the upgrade cannot continue:", " 1. 
Actor: scan_subscription_manager_info", " Message: A subscription-manager command failed to execute", "", "Reports summary:", " Errors: 1", " Inhibitors: 0", " HIGH severity reports: 0", " MEDIUM severity reports: 0", " LOW severity reports: 0", " INFO severity reports: 1", "", "Before continuing, review the full report below for details about discovered problems and possible remediation instructions:", " A report has been generated at /var/log/leapp/leapp-report.txt", " A report has been generated at /var/log/leapp/leapp-report.json", "", "============================================================", " END OF REPORT OVERVIEW ", "============================================================", "", "Answerfile has been generated at /var/log/leapp/answerfile"]}

TASK [leapp-upgrade | Run parse_leapp_report to check for inhibitors] **********
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:97

TASK [infra.leapp.common : parse_leapp_report | Default upgrade_inhibited to false] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:12
ok: [managed-node01] => {"ansible_facts": {"upgrade_inhibited": false}, "changed": false}

TASK [infra.leapp.common : parse_leapp_report | Collect human readable report results] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:16
ok: [managed-node01] => {"changed": false, "content":
"UmlzayBGYWN0b3I6IGhpZ2ggKGVycm9yKQpUaXRsZTogQSBzdWJzY3JpcHRpb24tbWFuYWdlciBjb21tYW5kIGZhaWxlZCB0byBleGVjdXRlClN1bW1hcnk6IHsiZGV0YWlscyI6ICJDb21tYW5kIFsnc3Vic2NyaXB0aW9uLW1hbmFnZXInLCAncmVsZWFzZSddIGZhaWxlZCB3aXRoIGV4aXQgY29kZSAxLiIsICJzdGRlcnIiOiAiVGhpcyBzeXN0ZW0gaXMgbm90IHlldCByZWdpc3RlcmVkLiBUcnkgJ3N1YnNjcmlwdGlvbi1tYW5hZ2VyIHJlZ2lzdGVyIC0taGVscCcgZm9yIG1vcmUgaW5mb3JtYXRpb24uXG4iLCAiaGludCI6ICJQbGVhc2UgZW5zdXJlIHlvdSBoYXZlIGEgdmFsaWQgUkhFTCBzdWJzY3JpcHRpb24gYW5kIHlvdXIgbmV0d29yayBpcyB1cC4gSWYgeW91IGFyZSB1c2luZyBwcm94eSBmb3IgUmVkIEhhdCBzdWJzY3JpcHRpb24tbWFuYWdlciwgcGxlYXNlIG1ha2Ugc3VyZSBpdCBpcyBzcGVjaWZpZWQgaW5zaWRlIHRoZSAvZXRjL3Joc20vcmhzbS5jb25mIGZpbGUuIE9yIHVzZSB0aGUgLS1uby1yaHNtIG9wdGlvbiB3aGVuIHJ1bm5pbmcgbGVhcHAsIGlmIHlvdSBkbyBub3Qgd2FudCB0byB1c2Ugc3Vic2NyaXB0aW9uLW1hbmFnZXIgZm9yIHRoZSBpbi1wbGFjZSB1cGdyYWRlIGFuZCB5b3Ugd2FudCB0byBkZWxpdmVyIGFsbCB0YXJnZXQgcmVwb3NpdG9yaWVzIGJ5IHlvdXJzZWxmIG9yIHVzaW5nIFJIVUkgb24gcHVibGljIGNsb3VkLiIsICJsaW5rIjogImh0dHBzOi8vYWNjZXNzLnJlZGhhdC5jb20vc29sdXRpb25zLzYxMzgzNzIifQpLZXk6IDdlYzgyNjk3ODRkYjFiYmEyYWM1NGFlNDM4Njg5ZWYzOTdlMTY4MzMKLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLQpSaXNrIEZhY3RvcjogaW5mbyAKVGl0bGU6IEV4Y2x1ZGVkIHRhcmdldCBzeXN0ZW0gcmVwb3NpdG9yaWVzClN1bW1hcnk6IFRoZSBmb2xsb3dpbmcgcmVwb3NpdG9yaWVzIGFyZSBub3Qgc3VwcG9ydGVkIGJ5IFJlZCBIYXQgYW5kIGFyZSBleGNsdWRlZCBmcm9tIHRoZSBsaXN0IG9mIHJlcG9zaXRvcmllcyB1c2VkIGR1cmluZyB0aGUgdXBncmFkZS4KLSBjb2RlcmVhZHktYnVpbGRlci1iZXRhLWZvci1yaGVsLTEwLXBwYzY0bGUtcnBtcwotIGNvZGVyZWFkeS1idWlsZGVyLWJldGEtZm9yLXJoZWwtMTAtczM5MHgtcnBtcwotIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLWFhcmNoNjQtcnBtcwotIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLXBwYzY0bGUtcnBtcwotIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLXg4Nl82NC1ycG1zCi0gY29kZXJlYWR5LWJ1aWxkZXItZm9yLXJoZWwtMTAtczM5MHgtcnBtcwotIGNvZGVyZWFkeS1idWlsZGVyLWJldGEtZm9yLXJoZWwtMTAtYWFyY2g2NC1ycG1zCi0gY29kZXJlYWR5LWJ1aWxkZXItYmV0YS1mb3ItcmhlbC0xMC14ODZfNjQtcnBtcwpSZW1lZGlhdGlvbjogW2hpbnRdIElmIHNvbWUgb2YgZXhjbHVkZWQgcmVwb3NpdG9yaWVzIGFyZSBzdGlsbCByZXF1aXJlZCB
0YiBiZSB1c2VkIGR1cmluZyB0aGUgdXBncmFkZSwgZXhlY3V0ZSBsZWFwcCB3aXRoIHRoZSAtLWVuYWJsZXJlcG8gb3B0aW9uIHdpdGggdGhlIHJlcG9pZCBvZiB0aGUgcmVwb3NpdG9yeSByZXF1aXJlZCB0byBiZSBlbmFibGVkIGFzIGFuIGFyZ3VtZW50ICh0aGUgb3B0aW9uIGNhbiBiZSB1c2VkIG11bHRpcGxlIHRpbWVzKS4KS2V5OiAxYjkxMzJjYjIzNjJhZTc4MzBlNDhlZWU3ODExYmU5NTI3NzQ3ZGU4Ci0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0tLS0K", "encoding": "base64", "source": "/var/log/leapp/leapp-report.txt"}

TASK [infra.leapp.common : parse_leapp_report | Collect JSON report results] ***
task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:21
ok: [managed-node01] => {"changed": false, "content": "ewogICJlbnRyaWVzIjogWwogICAgewogICAgICAiYXVkaWVuY2UiOiAic3lzYWRtaW4iLAogICAgICAiZ3JvdXBzIjogWwogICAgICAgICJlcnJvciIKICAgICAgXSwKICAgICAgImtleSI6ICI3ZWM4MjY5Nzg0ZGIxYmJhMmFjNTRhZTQzODY4OWVmMzk3ZTE2ODMzIiwKICAgICAgInNldmVyaXR5IjogImhpZ2giLAogICAgICAic3VtbWFyeSI6ICJ7XCJkZXRhaWxzXCI6IFwiQ29tbWFuZCBbJ3N1YnNjcmlwdGlvbi1tYW5hZ2VyJywgJ3JlbGVhc2UnXSBmYWlsZWQgd2l0aCBleGl0IGNvZGUgMS5cIiwgXCJzdGRlcnJcIjogXCJUaGlzIHN5c3RlbSBpcyBub3QgeWV0IHJlZ2lzdGVyZWQuIFRyeSAnc3Vic2NyaXB0aW9uLW1hbmFnZXIgcmVnaXN0ZXIgLS1oZWxwJyBmb3IgbW9yZSBpbmZvcm1hdGlvbi5cXG5cIiwgXCJoaW50XCI6IFwiUGxlYXNlIGVuc3VyZSB5b3UgaGF2ZSBhIHZhbGlkIFJIRUwgc3Vic2NyaXB0aW9uIGFuZCB5b3VyIG5ldHdvcmsgaXMgdXAuIElmIHlvdSBhcmUgdXNpbmcgcHJveHkgZm9yIFJlZCBIYXQgc3Vic2NyaXB0aW9uLW1hbmFnZXIsIHBsZWFzZSBtYWtlIHN1cmUgaXQgaXMgc3BlY2lmaWVkIGluc2lkZSB0aGUgL2V0Yy9yaHNtL3Joc20uY29uZiBmaWxlLiBPciB1c2UgdGhlIC0tbm8tcmhzbSBvcHRpb24gd2hlbiBydW5uaW5nIGxlYXBwLCBpZiB5b3UgZG8gbm90IHdhbnQgdG8gdXNlIHN1YnNjcmlwdGlvbi1tYW5hZ2VyIGZvciB0aGUgaW4tcGxhY2UgdXBncmFkZSBhbmQgeW91IHdhbnQgdG8gZGVsaXZlciBhbGwgdGFyZ2V0IHJlcG9zaXRvcmllcyBieSB5b3Vyc2VsZiBvciB1c2luZyBSSFVJIG9uIHB1YmxpYyBjbG91ZC5cIiwgXCJsaW5rXCI6IFwiaHR0cHM6Ly9hY2Nlc3MucmVkaGF0LmNvbS9zb2x1dGlvbnMvNjEzODM3MlwifSIsCiAgICAgICJ0aXRsZSI6ICJBIHN1YnNjcmlwdGlvbi1tYW5hZ2VyIGNvbW1hbmQgZmFpbGVkIHRvIGV4ZWN1dGUiLAogICAgICAidGltZVN0YW1wIjogIjIwMjUtMT
ItMDNUMDg6NTM6MzUuNjA2ODU1WiIsCiAgICAgICJob3N0bmFtZSI6ICJtYW5hZ2VkLW5vZGUwMSIsCiAgICAgICJhY3RvciI6ICJzY2FuX3N1YnNjcmlwdGlvbl9tYW5hZ2VyX2luZm8iLAogICAgICAiaWQiOiAiOGVhZDczM2QyYjc4NTI0ZTMwZmVjYjNmYTg5ZDA5ZThmYWFlMGQ0NTgxODg1MTU3YWJlNTQ2OWE1OTc5ZTU4MiIKICAgIH0sCiAgICB7CiAgICAgICJhdWRpZW5jZSI6ICJzeXNhZG1pbiIsCiAgICAgICJkZXRhaWwiOiB7CiAgICAgICAgInJlbWVkaWF0aW9ucyI6IFsKICAgICAgICAgIHsKICAgICAgICAgICAgImNvbnRleHQiOiAiSWYgc29tZSBvZiBleGNsdWRlZCByZXBvc2l0b3JpZXMgYXJlIHN0aWxsIHJlcXVpcmVkIHRvIGJlIHVzZWQgZHVyaW5nIHRoZSB1cGdyYWRlLCBleGVjdXRlIGxlYXBwIHdpdGggdGhlIC0tZW5hYmxlcmVwbyBvcHRpb24gd2l0aCB0aGUgcmVwb2lkIG9mIHRoZSByZXBvc2l0b3J5IHJlcXVpcmVkIHRvIGJlIGVuYWJsZWQgYXMgYW4gYXJndW1lbnQgKHRoZSBvcHRpb24gY2FuIGJlIHVzZWQgbXVsdGlwbGUgdGltZXMpLiIsCiAgICAgICAgICAgICJ0eXBlIjogImhpbnQiCiAgICAgICAgICB9CiAgICAgICAgXQogICAgICB9LAogICAgICAiZ3JvdXBzIjogWwogICAgICAgICJyZXBvc2l0b3J5IiwKICAgICAgICAiZmFpbHVyZSIKICAgICAgXSwKICAgICAgImtleSI6ICIxYjkxMzJjYjIzNjJhZTc4MzBlNDhlZWU3ODExYmU5NTI3NzQ3ZGU4IiwKICAgICAgInNldmVyaXR5IjogImluZm8iLAogICAgICAic3VtbWFyeSI6ICJUaGUgZm9sbG93aW5nIHJlcG9zaXRvcmllcyBhcmUgbm90IHN1cHBvcnRlZCBieSBSZWQgSGF0IGFuZCBhcmUgZXhjbHVkZWQgZnJvbSB0aGUgbGlzdCBvZiByZXBvc2l0b3JpZXMgdXNlZCBkdXJpbmcgdGhlIHVwZ3JhZGUuXG4tIGNvZGVyZWFkeS1idWlsZGVyLWJldGEtZm9yLXJoZWwtMTAtcHBjNjRsZS1ycG1zXG4tIGNvZGVyZWFkeS1idWlsZGVyLWJldGEtZm9yLXJoZWwtMTAtczM5MHgtcnBtc1xuLSBjb2RlcmVhZHktYnVpbGRlci1mb3ItcmhlbC0xMC1hYXJjaDY0LXJwbXNcbi0gY29kZXJlYWR5LWJ1aWxkZXItZm9yLXJoZWwtMTAtcHBjNjRsZS1ycG1zXG4tIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLXg4Nl82NC1ycG1zXG4tIGNvZGVyZWFkeS1idWlsZGVyLWZvci1yaGVsLTEwLXMzOTB4LXJwbXNcbi0gY29kZXJlYWR5LWJ1aWxkZXItYmV0YS1mb3ItcmhlbC0xMC1hYXJjaDY0LXJwbXNcbi0gY29kZXJlYWR5LWJ1aWxkZXItYmV0YS1mb3ItcmhlbC0xMC14ODZfNjQtcnBtcyIsCiAgICAgICJ0aXRsZSI6ICJFeGNsdWRlZCB0YXJnZXQgc3lzdGVtIHJlcG9zaXRvcmllcyIsCiAgICAgICJ0aW1lU3RhbXAiOiAiMjAyNS0xMi0wM1QwODo1MzozNi4xMDEwMzFaIiwKICAgICAgImhvc3RuYW1lIjogIm1hbmFnZWQtbm9kZTAxIiwKICAgICAgImFjdG9yIjogInJlcG9zaXRvcmllc19ibGFja2xpc3QiLAogICAgICAiaWQiOiAiYjU2YmM0YzdiY2ViYjY3M2
NiM2QzODZlNmY3M2VkZmNkNGI2NTgyMGNlZWM0NDg5NzY5ZTM1ZGRkZGFkMmEyZCIKICAgIH0KICBdLAogICJsZWFwcF9ydW5faWQiOiAiOTUxOWI3Y2EtZjJiZS00ODQ4LTgyYTgtNjE0NGYxYzVmM2Q3Igp9Cg==", "encoding": "base64", "source": "/var/log/leapp/leapp-report.json"} TASK [infra.leapp.common : parse_leapp_report | Parse report results] ********** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:26 ok: [managed-node01] => {"ansible_facts": {"leapp_report_json": {"entries": [{"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "8ead733d2b78524e30fecb3fa89d09e8faae0d4581885157abe5469a5979e582", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. 
Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "timeStamp": "2025-12-03T08:53:35.606855Z", "title": "A subscription-manager command failed to execute"}, {"actor": "repositories_blacklist", "audience": "sysadmin", "detail": {"remediations": [{"context": "If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).", "type": "hint"}]}, "groups": ["repository", "failure"], "hostname": "managed-node01", "id": "b56bc4c7bcebb673cb3d386e6f73edfcd4b65820ceec4489769e35ddddad2a2d", "key": "1b9132cb2362ae7830e48eee7811be9527747de8", "severity": "info", "summary": "The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- codeready-builder-beta-for-rhel-10-ppc64le-rpms\n- codeready-builder-beta-for-rhel-10-s390x-rpms\n- codeready-builder-for-rhel-10-aarch64-rpms\n- codeready-builder-for-rhel-10-ppc64le-rpms\n- codeready-builder-for-rhel-10-x86_64-rpms\n- codeready-builder-for-rhel-10-s390x-rpms\n- codeready-builder-beta-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-x86_64-rpms", "timeStamp": "2025-12-03T08:53:36.101031Z", "title": "Excluded target system repositories"}], "leapp_run_id": "9519b7ca-f2be-4848-82a8-6144f1c5f3d7"}, "leapp_report_txt": ["Risk Factor: high (error)", "Title: A subscription-manager command failed to execute", "Summary: {\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. 
Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "Key: 7ec8269784db1bba2ac54ae438689ef397e16833", "----------------------------------------", "Risk Factor: info ", "Title: Excluded target system repositories", "Summary: The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.", "- codeready-builder-beta-for-rhel-10-ppc64le-rpms", "- codeready-builder-beta-for-rhel-10-s390x-rpms", "- codeready-builder-for-rhel-10-aarch64-rpms", "- codeready-builder-for-rhel-10-ppc64le-rpms", "- codeready-builder-for-rhel-10-x86_64-rpms", "- codeready-builder-for-rhel-10-s390x-rpms", "- codeready-builder-beta-for-rhel-10-aarch64-rpms", "- codeready-builder-beta-for-rhel-10-x86_64-rpms", "Remediation: [hint] If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).", "Key: 1b9132cb2362ae7830e48eee7811be9527747de8", "----------------------------------------", ""]}, "changed": false} TASK [infra.leapp.common : parse_leapp_report | Check for inhibitors] ********** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:32 ok: [managed-node01] => (item={'audience': 'sysadmin', 'groups': ['error'], 'key': '7ec8269784db1bba2ac54ae438689ef397e16833', 'severity': 'high', 'summary': 
'{"details": "Command [\'subscription-manager\', \'release\'] failed with exit code 1.", "stderr": "This system is not yet registered. Try \'subscription-manager register --help\' for more information.\\n", "hint": "Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.", "link": "https://access.redhat.com/solutions/6138372"}', 'title': 'A subscription-manager command failed to execute', 'timeStamp': '2025-12-03T08:53:35.606855Z', 'hostname': 'managed-node01', 'actor': 'scan_subscription_manager_info', 'id': '8ead733d2b78524e30fecb3fa89d09e8faae0d4581885157abe5469a5979e582'}) => {"ansible_facts": {"leapp_inhibitors": [{"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "8ead733d2b78524e30fecb3fa89d09e8faae0d4581885157abe5469a5979e582", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. 
Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "timeStamp": "2025-12-03T08:53:35.606855Z", "title": "A subscription-manager command failed to execute"}], "upgrade_inhibited": true}, "ansible_loop_var": "item", "changed": false, "item": {"actor": "scan_subscription_manager_info", "audience": "sysadmin", "groups": ["error"], "hostname": "managed-node01", "id": "8ead733d2b78524e30fecb3fa89d09e8faae0d4581885157abe5469a5979e582", "key": "7ec8269784db1bba2ac54ae438689ef397e16833", "severity": "high", "summary": "{\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. 
Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "timeStamp": "2025-12-03T08:53:35.606855Z", "title": "A subscription-manager command failed to execute"}} skipping: [managed-node01] => (item={'audience': 'sysadmin', 'detail': {'remediations': [{'context': 'If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple times).', 'type': 'hint'}]}, 'groups': ['repository', 'failure'], 'key': '1b9132cb2362ae7830e48eee7811be9527747de8', 'severity': 'info', 'summary': 'The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- codeready-builder-beta-for-rhel-10-ppc64le-rpms\n- codeready-builder-beta-for-rhel-10-s390x-rpms\n- codeready-builder-for-rhel-10-aarch64-rpms\n- codeready-builder-for-rhel-10-ppc64le-rpms\n- codeready-builder-for-rhel-10-x86_64-rpms\n- codeready-builder-for-rhel-10-s390x-rpms\n- codeready-builder-beta-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-x86_64-rpms', 'title': 'Excluded target system repositories', 'timeStamp': '2025-12-03T08:53:36.101031Z', 'hostname': 'managed-node01', 'actor': 'repositories_blacklist', 'id': 'b56bc4c7bcebb673cb3d386e6f73edfcd4b65820ceec4489769e35ddddad2a2d'}) => {"ansible_loop_var": "item", "changed": false, "item": {"actor": "repositories_blacklist", "audience": "sysadmin", "detail": {"remediations": [{"context": "If some of excluded repositories are still required to be used during the upgrade, execute leapp with the --enablerepo option with the repoid of the repository required to be enabled as an argument (the option can be used multiple 
times).", "type": "hint"}]}, "groups": ["repository", "failure"], "hostname": "managed-node01", "id": "b56bc4c7bcebb673cb3d386e6f73edfcd4b65820ceec4489769e35ddddad2a2d", "key": "1b9132cb2362ae7830e48eee7811be9527747de8", "severity": "info", "summary": "The following repositories are not supported by Red Hat and are excluded from the list of repositories used during the upgrade.\n- codeready-builder-beta-for-rhel-10-ppc64le-rpms\n- codeready-builder-beta-for-rhel-10-s390x-rpms\n- codeready-builder-for-rhel-10-aarch64-rpms\n- codeready-builder-for-rhel-10-ppc64le-rpms\n- codeready-builder-for-rhel-10-x86_64-rpms\n- codeready-builder-for-rhel-10-s390x-rpms\n- codeready-builder-beta-for-rhel-10-aarch64-rpms\n- codeready-builder-beta-for-rhel-10-x86_64-rpms", "timeStamp": "2025-12-03T08:53:36.101031Z", "title": "Excluded target system repositories"}, "skip_reason": "Conditional result was False"} TASK [infra.leapp.common : parse_leapp_report | Collect inhibitors] ************ task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:44 ok: [managed-node01] => {"changed": false, "cmd": ["awk", "/\\(inhibitor\\)/,/^-------/", "/var/log/leapp/leapp-report.txt"], "delta": "0:00:00.003887", "end": "2025-12-03 03:54:12.575201", "failed_when_result": false, "msg": "", "rc": 0, "start": "2025-12-03 03:54:12.571314", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} TASK [infra.leapp.common : parse_leapp_report | Collect high errors] *********** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/tasks/parse_leapp_report.yml:53 ok: [managed-node01] => {"changed": false, "cmd": ["awk", "/high \\(error\\)/,/^-------/", "/var/log/leapp/leapp-report.txt"], "delta": "0:00:00.003986", "end": "2025-12-03 03:54:12.928674", "failed_when_result": false, "msg": "", "rc": 0, "start": "2025-12-03 03:54:12.924688", "stderr": "", "stderr_lines": [], "stdout": "Risk Factor: high 
(error)\nTitle: A subscription-manager command failed to execute\nSummary: {\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}\nKey: 7ec8269784db1bba2ac54ae438689ef397e16833\n----------------------------------------", "stdout_lines": ["Risk Factor: high (error)", "Title: A subscription-manager command failed to execute", "Summary: {\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. 
Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "Key: 7ec8269784db1bba2ac54ae438689ef397e16833", "----------------------------------------"]} TASK [infra.leapp.upgrade : leapp-upgrade | Display inhibitors] **************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:102 skipping: [managed-node01] => {} TASK [infra.leapp.upgrade : leapp-upgrade | Display errors] ******************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:107 ok: [managed-node01] => { "results_errors.stdout_lines": [ "Risk Factor: high (error)", "Title: A subscription-manager command failed to execute", "Summary: {\"details\": \"Command ['subscription-manager', 'release'] failed with exit code 1.\", \"stderr\": \"This system is not yet registered. Try 'subscription-manager register --help' for more information.\\n\", \"hint\": \"Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.\", \"link\": \"https://access.redhat.com/solutions/6138372\"}", "Key: 7ec8269784db1bba2ac54ae438689ef397e16833", "----------------------------------------" ] } TASK [infra.leapp.upgrade : leapp-upgrade | Fail Leapp upgrade] **************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tasks/leapp-upgrade.yml:112 fatal: [managed-node01]: FAILED! 
=> {"changed": false, "msg": "Errors encountered running Leapp upgrade command. Review the tasks above or the result file at /var/log/leapp/leapp-report.txt."} TASK [Test | Check error] ****************************************************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tests/tests_default.yml:14 ok: [managed-node01] => { "msg": "errors {\n \"_ansible_no_log\": false,\n \"changed\": false,\n \"failed\": true,\n \"msg\": \"Errors encountered running Leapp upgrade command. Review the tasks above or the result file at /var/log/leapp/leapp-report.txt.\"\n}" } TASK [Test | Ensure correct error] ********************************************* task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/upgrade/tests/tests_default.yml:18 ok: [managed-node01] => { "changed": false, "msg": "All assertions passed" } RUNNING HANDLER [infra.leapp.common : Check for log file] ********************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/handlers/main.yml:3 ok: [managed-node01] => {"changed": false, "stat": {"atime": 1764751800.8532116, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 24, "charset": "us-ascii", "checksum": "6b90f0efacfe69d87fabed29b9b8f459207653d9", "ctime": 1764752018.0625565, "dev": 51716, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 209715818, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1764752018.0625565, "nlink": 1, "path": "/var/log/ripu/ripu.log", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 8256, "uid": 0, "version": "670229100", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}} RUNNING HANDLER [infra.leapp.common : Add end time to 
log file] **************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/handlers/main.yml:9 changed: [managed-node01] => {"backup": "", "changed": true, "msg": "line added"} RUNNING HANDLER [infra.leapp.common : Slurp ripu.log file] ********************* task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/handlers/main.yml:19 ok: [managed-node01] => {"changed": false, "content": "RIPU in-place OS upgrade
Job started at 2025-12-03T08:50:00Z
==> Processing phase `configuration_phase`
====> * ipu_workflow_config
        IPU workflow config actor
==> Processing phase `FactsCollection`
====> * system_facts
        Provides data about many facts from system.
====> * get_enabled_modules
        Provides data about which module streams are enabled on the source system.
====> * scan_systemd_source
        Provides info about systemd on the source system
====> * repository_mapping
        Produces message containing repository mapping based on provided file.
====> * ifcfg_scanner
        Scan ifcfg files with legacy network configuration
====> * transaction_workarounds
        Provides additional RPM transaction tasks based on bundled RPM packages.
====> * selinuxcontentscanner
        Scan the system for any SELinux customizations
====> * root_scanner
        Scan the system root directory and produce a message containing
====> * rpm_scanner
        Provides data about installed RPM Packages.
====> * udevadm_info
        Produces data exported by the "udevadm info" command.
====> * trusted_gpg_keys_scanner
        Scan for trusted GPG keys.
====> * satellite_upgrade_services
        Reconfigure Satellite services
====> * scanclienablerepo
        Produce CustomTargetRepository based on the LEAPP_ENABLE_REPOS in config.
====> * distribution_signed_rpm_scanner
        Provide data about distribution signed & unsigned RPM packages.
====> * scancryptopolicies
        Scan information about system wide set crypto policies including:
====> * scan_custom_modifications_actor
        Collects information about files in leapp directories that have been modified or newly added.
====> * scan_custom_repofile
        Scan the custom /etc/leapp/files/leapp_upgrade_repositories.repo repo file.
====> * scandasd
        In case of s390x architecture, check whether DASD is used.
====> * scan_defined_ipu_paths
        Load defined IPU paths for the current major source system version
====> * scan_dynamic_linker_configuration
        Scan the dynamic linker configuration and find modifications.
====> * load_device_driver_deprecation_data
        Loads deprecation data for drivers and devices (PCI & CPU)
====> * scan_mysql
        Actor checking for presence of MySQL installation.
====> * scan_files_for_target_userspace
        Scan the source system and identify files that will be copied into the target userspace when it is created.
====> * migrate_rpm_db
        Register a workaround to migrate RPM DB during the upgrade.
====> * scan_grub_config
        Scan grub configuration files for errors.
====> * scan_grub_device_name
        Find the name of the block devices where GRUB is located
====> * network_manager_read_config
        Provides data about NetworkManager configuration.
====> * scan_kernel_cmdline
        No documentation has been provided for the scan_kernel_cmdline actor.
====> * read_openssh_config
        Collect information about the OpenSSH configuration.
====> * scanmemory
        Scan Memory of the machine.
====> * scan_pkg_manager
        Provides data about package manager (yum/dnf)
====> * pci_devices_scanner
        Provides data about existing PCI Devices.
====> * scan_sap_hana
        Gathers information related to SAP HANA instances on the system.
====> * scan_pam_user_db
        Scan the PAM service folder for the location of pam_userdb databases
====> * scan_source_files
        Scan files (explicitly specified) of the source system.
====> * open_ssl_config_scanner
        Read an OpenSSL configuration file for further analysis.
====> * scan_source_kernel
        Scan the source system kernel.
====> * persistentnetnames
        Get network interface information for physical ethernet interfaces of the original system.
====> * scan_subscription_manager_info
        Scans the current system for subscription manager information
====> * scan_target_os_image
        Scans the provided target OS ISO image to use as a content source for the IPU, if any.
====> * remove_obsolete_gpg_keys
        Remove obsoleted RPM GPG keys.
====> * register_ruby_irb_adjustment
        Register a workaround to allow rubygem-irb's symlink -> directory conversion.
====> * scanzfcp
        In case of s390x architecture, check whether ZFCP is used.
====> * persistentnetnamesdisable
        Disable systemd-udevd persistent network naming on machine with single eth0 NIC
====> * copy_dnf_conf_into_target_userspace
        Copy dnf.conf into target userspace
====> * storage_scanner
        Provides data about storage settings.
====> * repositories_blacklist
        Exclude target repositories provided by Red Hat without support.
====> * get_installed_desktops
        Actor checks if kde or gnome desktop environments
====> * checkrhui
        Check if system is using RHUI infrastructure (on public cloud) and send messages to
====> * biosdevname
        Enable biosdevname on the target RHEL system if all interfaces on the source RHEL
====> * rpm_transaction_config_tasks_collector
        Provides additional RPM transaction tasks from /etc/leapp/transaction.
====> * ipa_scanner
        Scan system for ipa-client and ipa-server status
====> * used_repository_scanner
        Scan used enabled repositories
====> * scancpu
        Scan CPUs of the machine.
====> * xfs_info_scanner
        This actor scans all mounted mountpoints for XFS information.
====> * luks_scanner
        Provides data about active LUKS devices.
====> * detect_kernel_drivers
        Matches all currently loaded kernel drivers against known deprecated and removed drivers.
====> * scan_fips
        Determine whether the source system has FIPS enabled.
====> * pes_events_scanner
        Provides data about package events from Package Evolution Service.
====> * setuptargetrepos
        Produces list of repositories that should be available to be used by Upgrade process.

============================================================
                           ERRORS                           
============================================================

2025-12-03 03:53:35.606657 [ERROR] Actor: scan_subscription_manager_info
Message: A subscription-manager command failed to execute
Summary:
    Details: Command ['subscription-manager', 'release'] failed with exit code 1.
    Stderr: This system is not yet registered. Try 'subscription-manager register --help' for more information.
    Hint: Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.
    Link: https://access.redhat.com/solutions/6138372

============================================================
                       END OF ERRORS                        
============================================================

Debug output written to /var/log/leapp/leapp-upgrade.log

============================================================
                      REPORT OVERVIEW                       
============================================================

Following errors occurred and the upgrade cannot continue:
    1. Actor: scan_subscription_manager_info
       Message: A subscription-manager command failed to execute

Reports summary:
    Errors:                      1
    Inhibitors:                  0
    HIGH severity reports:       0
    MEDIUM severity reports:     0
    LOW severity reports:        0
    INFO severity reports:       1

Before continuing, review the full report below for details about discovered problems and possible remediation instructions:
    A report has been generated at /var/log/leapp/leapp-report.txt
    A report has been generated at /var/log/leapp/leapp-report.json

============================================================
                   END OF REPORT OVERVIEW                   
============================================================

Answerfile has been generated at /var/log/leapp/answerfile
Job ended at 2025-12-03T08:54:13Z
", "encoding": "base64", "source": "/var/log/ripu/ripu.log"} RUNNING HANDLER [infra.leapp.common : Decode ripu.log file] ******************** task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/handlers/main.yml:26 ok: [managed-node01] => {"ansible_facts": {"ripu_log_file": ["RIPU in-place OS upgrade", "Job started at 2025-12-03T08:50:00Z", "==> Processing phase `configuration_phase`", "====> * ipu_workflow_config", " IPU workflow config actor", "==> Processing phase `FactsCollection`", "====> * system_facts", " Provides data about many facts from system.", "====> * get_enabled_modules", " Provides data about which module streams are enabled on the source system.", "====> * scan_systemd_source", " Provides info about systemd on the source system", "====> * repository_mapping", " Produces message containing repository mapping based on provided file.", "====> * ifcfg_scanner", " Scan ifcfg files with legacy network configuration", "====> * transaction_workarounds", " Provides additional RPM transaction tasks based on bundled RPM packages.", "====> * selinuxcontentscanner", " Scan the system for any SELinux customizations", "====> * root_scanner", " Scan the system root directory and produce a message containing", "====> * rpm_scanner", " Provides data about installed RPM Packages.", "====> * udevadm_info", " Produces data exported by the \"udevadm info\" command.", "====> * trusted_gpg_keys_scanner", " Scan for trusted GPG keys.", "====> * satellite_upgrade_services", " Reconfigure Satellite services", "====> * scanclienablerepo", " Produce CustomTargetRepository based on the LEAPP_ENABLE_REPOS in config.", "====> * distribution_signed_rpm_scanner", " Provide data about distribution signed & unsigned RPM packages.", "====> * scancryptopolicies", " Scan information about system wide set crypto policies including:", "====> * scan_custom_modifications_actor", " Collects information about files in leapp directories that have been modified 
or newly added.", "====> * scan_custom_repofile", " Scan the custom /etc/leapp/files/leapp_upgrade_repositories.repo repo file.", "====> * scandasd", " In case of s390x architecture, check whether DASD is used.", "====> * scan_defined_ipu_paths", " Load defined IPU paths for the current major source system version", "====> * scan_dynamic_linker_configuration", " Scan the dynamic linker configuration and find modifications.", "====> * load_device_driver_deprecation_data", " Loads deprecation data for drivers and devices (PCI & CPU)", "====> * scan_mysql", " Actor checking for presence of MySQL installation.", "====> * scan_files_for_target_userspace", " Scan the source system and identify files that will be copied into the target userspace when it is created.", "====> * migrate_rpm_db", " Register a workaround to migrate RPM DB during the upgrade.", "====> * scan_grub_config", " Scan grub configuration files for errors.", "====> * scan_grub_device_name", " Find the name of the block devices where GRUB is located", "====> * network_manager_read_config", " Provides data about NetworkManager configuration.", "====> * scan_kernel_cmdline", " No documentation has been provided for the scan_kernel_cmdline actor.", "====> * read_openssh_config", " Collect information about the OpenSSH configuration.", "====> * scanmemory", " Scan Memory of the machine.", "====> * scan_pkg_manager", " Provides data about package manager (yum/dnf)", "====> * pci_devices_scanner", " Provides data about existing PCI Devices.", "====> * scan_sap_hana", " Gathers information related to SAP HANA instances on the system.", "====> * scan_pam_user_db", " Scan the PAM service folder for the location of pam_userdb databases", "====> * scan_source_files", " Scan files (explicitly specified) of the source system.", "====> * open_ssl_config_scanner", " Read an OpenSSL configuration file for further analysis.", "====> * scan_source_kernel", " Scan the source system kernel.", "====> * persistentnetnames", 
" Get network interface information for physical ethernet interfaces of the original system.", "====> * scan_subscription_manager_info", " Scans the current system for subscription manager information", "====> * scan_target_os_image", " Scans the provided target OS ISO image to use as a content source for the IPU, if any.", "====> * remove_obsolete_gpg_keys", " Remove obsoleted RPM GPG keys.", "====> * register_ruby_irb_adjustment", " Register a workaround to allow rubygem-irb's symlink -> directory conversion.", "====> * scanzfcp", " In case of s390x architecture, check whether ZFCP is used.", "====> * persistentnetnamesdisable", " Disable systemd-udevd persistent network naming on machine with single eth0 NIC", "====> * copy_dnf_conf_into_target_userspace", " Copy dnf.conf into target userspace", "====> * storage_scanner", " Provides data about storage settings.", "====> * repositories_blacklist", " Exclude target repositories provided by Red Hat without support.", "====> * get_installed_desktops", " Actor checks if kde or gnome desktop environments", "====> * checkrhui", " Check if system is using RHUI infrastructure (on public cloud) and send messages to", "====> * biosdevname", " Enable biosdevname on the target RHEL system if all interfaces on the source RHEL", "====> * rpm_transaction_config_tasks_collector", " Provides additional RPM transaction tasks from /etc/leapp/transaction.", "====> * ipa_scanner", " Scan system for ipa-client and ipa-server status", "====> * used_repository_scanner", " Scan used enabled repositories", "====> * scancpu", " Scan CPUs of the machine.", "====> * xfs_info_scanner", " This actor scans all mounted mountpoints for XFS information.", "====> * luks_scanner", " Provides data about active LUKS devices.", "====> * detect_kernel_drivers", " Matches all currently loaded kernel drivers against known deprecated and removed drivers.", "====> * scan_fips", " Determine whether the source system has FIPS enabled.", "====> * 
pes_events_scanner", " Provides data about package events from Package Evolution Service.", "====> * setuptargetrepos", " Produces list of repositories that should be available to be used by Upgrade process.", "", "============================================================", " ERRORS ", "============================================================", "", "2025-12-03 03:53:35.606657 [ERROR] Actor: scan_subscription_manager_info", "Message: A subscription-manager command failed to execute", "Summary:", " Details: Command ['subscription-manager', 'release'] failed with exit code 1.", " Stderr: This system is not yet registered. Try 'subscription-manager register --help' for more information.", " Hint: Please ensure you have a valid RHEL subscription and your network is up. If you are using proxy for Red Hat subscription-manager, please make sure it is specified inside the /etc/rhsm/rhsm.conf file. Or use the --no-rhsm option when running leapp, if you do not want to use subscription-manager for the in-place upgrade and you want to deliver all target repositories by yourself or using RHUI on public cloud.", " Link: https://access.redhat.com/solutions/6138372", "", "============================================================", " END OF ERRORS ", "============================================================", "", "Debug output written to /var/log/leapp/leapp-upgrade.log", "", "============================================================", " REPORT OVERVIEW ", "============================================================", "", "Following errors occurred and the upgrade cannot continue:", " 1. 
Actor: scan_subscription_manager_info", " Message: A subscription-manager command failed to execute", "", "Reports summary:", " Errors: 1", " Inhibitors: 0", " HIGH severity reports: 0", " MEDIUM severity reports: 0", " LOW severity reports: 0", " INFO severity reports: 1", "", "Before continuing, review the full report below for details about discovered problems and possible remediation instructions:", " A report has been generated at /var/log/leapp/leapp-report.txt", " A report has been generated at /var/log/leapp/leapp-report.json", "", "============================================================", " END OF REPORT OVERVIEW ", "============================================================", "", "Answerfile has been generated at /var/log/leapp/answerfile", "Job ended at 2025-12-03T08:54:13Z", ""]}, "changed": false} RUNNING HANDLER [infra.leapp.common : Rename log file] ************************* task path: /root/.ansible/collections/ansible_collections/infra/leapp/roles/common/handlers/main.yml:32 changed: [managed-node01] => {"changed": true, "cmd": "export PATH=$PATH\nmv /var/log/ripu/ripu.log /var/log/ripu/ripu.log-20251203T034958\n", "delta": "0:00:00.004713", "end": "2025-12-03 03:54:14.727003", "msg": "", "rc": 0, "start": "2025-12-03 03:54:14.722290", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []} PLAY RECAP ********************************************************************* managed-node01 : ok=29 changed=6 unreachable=0 failed=0 skipped=12 rescued=2 ignored=0
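The REPORT OVERVIEW above tallies one error, zero inhibitors, and one INFO-severity report out of the machine-readable report at /var/log/leapp/leapp-report.json. A minimal sketch of how such a tally could be automated is below. The JSON schema used here is an assumption (recent leapp releases mark error and inhibitor entries via a `groups` list alongside a `severity` field; older releases used `flags` instead), and the sample entries are illustrative, not copied from the real report file.

```python
import json
from collections import Counter


def summarize_report(report: dict) -> dict:
    """Tally a leapp-report.json-style document into the counts shown in
    the REPORT OVERVIEW. Schema is an assumption (see lead-in): each entry
    is expected to carry a "severity" string and a "groups" list that may
    contain "error" or "inhibitor"."""
    errors = inhibitors = 0
    severities = Counter()
    for entry in report.get("entries", []):
        groups = entry.get("groups", [])
        if "error" in groups:
            errors += 1
        elif "inhibitor" in groups:
            inhibitors += 1
        else:
            severities[entry.get("severity", "info")] += 1
    return {"errors": errors, "inhibitors": inhibitors, **severities}


# Illustrative sample mirroring the run above: one actor error, one
# informational report entry (titles/fields are assumptions, not real data).
sample = {"entries": [
    {"title": "A subscription-manager command failed to execute",
     "severity": "high", "groups": ["error"]},
    {"title": "An informational report entry",
     "severity": "info", "groups": []},
]}

print(summarize_report(sample))
# → {'errors': 1, 'inhibitors': 0, 'info': 1}
```

As the error's own Hint says, this particular failure clears either by registering the host with subscription-manager (with a valid RHEL subscription, and a proxy configured in /etc/rhsm/rhsm.conf if needed) or by rerunning leapp with the `--no-rhsm` option and supplying the target repositories yourself.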
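The "Rename log file" handler at the end of the run archives /var/log/ripu/ripu.log under a `-YYYYMMDDTHHMMSS` timestamp suffix (here, `ripu.log-20251203T034958`), so each role run starts with a fresh log. A minimal sketch of the same move, performed in a temporary directory rather than /var/log/ripu so it is safe to run anywhere:

```shell
# Sketch of the handler's rename step; the path and suffix format are
# taken from the log above, but we operate on a throwaway directory.
logdir="$(mktemp -d)"
touch "$logdir/ripu.log"

# Same timestamp shape as the handler's suffix, e.g. 20251203T034958.
ts="$(date -u +%Y%m%dT%H%M%S)"
mv "$logdir/ripu.log" "$logdir/ripu.log-$ts"

ls "$logdir"
```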